Sunday, December 02, 2007

Thoughts on Calculating Risks

Atomic weapons were used in war only in World War II, and only by the United States. But we know that the world came very close to nuclear war during the Cuban missile crisis, and apparently came close again in the Reagan years, when the Soviet Union mistook a U.S. military exercise for cover for a sneak attack. It occurs to me that in calculating risks we should consider not only the events that actually happened, but also the events that were narrowly averted. How, then, should we weight actual events as compared with nearly averted ones?
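One crude way to make that question concrete is to count each near miss as a fraction of an actual event, weighted by some guess at the probability that it would have escalated. Here is a toy sketch; every number in it (the near-miss count, the escalation probability) is my own assumption, not data.

```python
# A minimal sketch of one way to fold near misses into a frequency
# estimate. All numbers here are illustrative assumptions, not data.

def weighted_event_rate(actual_events, near_misses, escalation_prob, years):
    """Estimate an annual event rate, counting each near miss as a
    fractional event weighted by the probability that it would have
    escalated into the real thing."""
    effective_events = actual_events + escalation_prob * near_misses
    return effective_events / years

# One actual use of nuclear weapons in war (1945), two near misses
# (the Cuban missile crisis, the Reagan-era incident), over the ~62
# years from 1945 to 2007. The 0.3 escalation probability is a pure
# assumption chosen for illustration.
rate = weighted_event_rate(actual_events=1, near_misses=2,
                           escalation_prob=0.3, years=62)
print(f"Estimated annual rate: {rate:.3f} events/year")
```

Counting only the actual event gives about 0.016 events per year; folding in the near misses raises the estimate by more than half. The sensitivity to that escalation probability is exactly the difficulty.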

Not all nuclear attacks are equivalent. A nuclear war between the Soviet Union and the United States, with tens of thousands of high-yield weapons, would not be the same as a terrorist attack with a single low-yield weapon. Both would be unthinkable. How does one weigh the risks of various levels of unthinkable consequences?
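If one is willing to attach rough probabilities and magnitudes to each scenario, the standard move is to compare expected losses. The sketch below does that with invented numbers; it mainly shows that the answer turns entirely on probabilities we can barely estimate.

```python
# A toy expected-loss comparison across scenarios of very different
# magnitude. Probabilities and fatality figures are invented for
# illustration only.

scenarios = {
    # name: (assumed annual probability, fatalities if it occurs)
    "full-scale exchange": (1e-4, 200_000_000),
    "single terrorist weapon": (1e-2, 100_000),
}

for name, (p, deaths) in scenarios.items():
    print(f"{name}: expected {p * deaths:,.0f} deaths/year")
```

With these made-up numbers the large exchange dominates; lower its assumed probability tenfold and the two scenarios become comparable. The arithmetic is trivial; the probabilities are the hard part.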

The appropriate understanding of nuclear risks is obviously important, and an inappropriate understanding was clearly visible in the run-up to the Iraq war. We are now witnessing a great deal of public concern about the possibility of Iran gaining nuclear weapons, and (in my opinion) relatively little concern about the nuclear weapons already in the hands of the governments of Pakistan, India, and China. The story of Russia withdrawing from the conventional arms treaty got space inside my local newspaper, not on the front page.

Flu pandemics also come to mind. There is a global epidemic of flu every year; millions of people are infected, and hundreds of thousands die. Every decade or so a new flu variety replaces the old one, and a more virulent epidemic or pandemic occurs, killing a few million to many millions of people. We know of the Spanish flu, which started during World War I, when an estimated 50 to 100 million people died, in a world far less populous and less densely populated than today's. How do we calculate the risk of the next epidemic? Of the next pandemic?
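A first, very simple model treats each year as an independent draw with a fixed chance of producing a pandemic. The assumed 30-year interval below is my loose reading of the 20th century's pandemics, nothing more.

```python
# If a pandemic arrives roughly once every N years, a simple model
# treats each year as an independent draw. The 30-year interval is an
# assumption, loosely matching the 20th century's three pandemics.

interval = 30                     # assumed mean years between pandemics
p_year = 1 / interval             # annual probability under this model
p_decade = 1 - (1 - p_year) ** 10 # chance of at least one in ten years

print(f"Chance of a pandemic in any given year:    {p_year:.1%}")
print(f"Chance of at least one in the next decade: {p_decade:.1%}")
```

Under that crude assumption, the chance of a pandemic somewhere in the next decade is nearly thirty percent, a number that ought to concentrate the mind more than it seems to.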

Again, the Ford administration launched a major initiative to prevent a swine flu epidemic that never occurred, and the current Bush administration has made an important initiative to combat an avian flu epidemic that has not occurred (yet). On both occasions there were short-term political benefits to be had from predicting a risk and being seen to act vigorously in response to it.

The popular perception of risk depends on the availability of knowledge about the nature of the risk: whether we have scientific knowledge of its magnitude, or whether it is unknown. Thus, the risk of the HIV/AIDS epidemic has been unknowable in part because the statistics on the incidence and prevalence of the disease have been weak, but also because it has been impossible to predict when vaccines and other preventive measures would become available and how effective they would be, or when treatments would be developed, how effective they would be, and how drug resistance would develop.

There is also a known "availability bias" in human decision making. If people cannot easily and quickly recall an example of the kind of event in question, they tend not to give it high priority. People are more concerned about airline crashes than about auto accidents, although the latter cause many more deaths; indeed, the risk per mile traveled by automobile is much higher than the risk per mile in a commercial aircraft. Yet we read about and see on television any airliner crash anywhere in the world, while automobile accidents are too common to merit much coverage (unless they involve famous people or other special circumstances).
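The per-mile comparison is easy to put in numbers. The figures below are rough order-of-magnitude values for the mid-2000s United States, and should be treated as illustrative assumptions rather than statistics.

```python
# Rough per-mile comparison of driving versus commercial flying.
# These are order-of-magnitude figures from around the mid-2000s,
# used here as illustrative assumptions only.

auto_deaths_per_mile = 1.4e-8   # ~1.4 deaths per 100 million vehicle miles
air_deaths_per_mile = 2e-10     # commercial aviation, per passenger mile

trip_miles = 1000
print(f"Driving {trip_miles} miles: "
      f"{auto_deaths_per_mile * trip_miles:.1e} expected deaths")
print(f"Flying  {trip_miles} miles: "
      f"{air_deaths_per_mile * trip_miles:.1e} expected deaths")
print(f"Per-mile ratio: {auto_deaths_per_mile / air_deaths_per_mile:.0f}x")
```

On these assumed figures, a mile driven is something like seventy times more dangerous than a mile flown, yet it is the plane crash we remember.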

How do we overcome the availability bias for risks of infrequent but very serious events? How do we get an appropriate public response to politicians who pander to our fears or who focus on the wrong risks? One step in the right direction would be to improve education, teaching kids about risk analysis. Another would be for the media to do a better job: for every news story about an airliner crash, there should be equally riveting coverage of a risk that matters more to the average reader or viewer, say the risk of carcinogens or obesity.
