
Friday, October 30, 2015

Where are the terrorists? Oh, they are lumped in the small circles on the upper right.


Source: Business Insider
Most people have a wildly distorted idea of the risks that they are running. How did you do?

Wednesday, August 13, 2014

How do people do dangerous jobs? Some people seek them out!

I have been watching the Book TV program on U.S. Marshals: Inside America's Most Storied Law Enforcement Agency, and wanted to share a couple of thoughts. They relate to my post in early June on Drew Gilpin Faust's book, This Republic of Suffering.

The experienced U.S. marshals in the program were asked how they felt when standing in front of a door, having to knock while understanding that there was a significant probability that a felon facing a long spell in jail if apprehended would be on the other side. One of the marshals told of an instance when the knock on the door resulted in four gunshots coming through it; one hit the shield the marshal was holding, knocking him to the ground, and a second struck him in the stomach. Marshals are aware the situation is dangerous, and the two present described increases in heart rate.

One also described in some detail the thoughts that went through his head each time -- checking who was on his right, on his left and behind him, reviewing the arrangements for the confrontation, assuring both safety and completion of the job. Both of the experienced marshals described the importance of training, and both described trying to assure that the corps of marshals receives recent, well-done training. I suspect that in that critical moment, the marshals may fall back on the routine that they have been trained to perform, and concentrate on performing the tasks well.

I am reminded that many years ago I was reviewing reports on parachute jumpers who had been instrumented with heart rate monitors. (The research lab in which I was working had expertise in biomonitoring and automated analysis of the records from such monitoring, and was considering bidding on a project to make parachute jumping safer.) One of the reports I read described a jumper whose parachute had failed to open. He had escaped from the harness for that chute, gone into free fall for a time to clear the unopened chute, and then popped his backup chute and landed safely. During the entire jump his heart rate had remained unaffected; his training had kicked in and he had coolly done what he had been trained to do. After landing, however, his heart rate shot up, reflecting a high degree of stress, and stayed high for several hours.

It occurs to me that something of the kind may happen to the well trained marshal before knocking on the door he believes will be opened by a felon. Something like this may also happen to the well trained soldier at the moment of danger in battle. Training kicks in and the mind focuses on conducting the things for which one has been trained.

I also recalled a student in a course I taught some years ago. The university course was titled "Risk", and sought to acquaint the students with literature on risk and uncertainty, and also to give them some tools to make better decisions. The students in the course had been asked to describe a career decision, the criteria that were important in making that decision, how the alternatives ranked against each criterion, and the uncertainty involved in predicting the outcome of each alternative.

My student was a law enforcement officer. He was deciding between his current job in the Washington DC area and an offer to join the DEA in Florida. The Florida job involved a reduction in pay, a reduction in status, and moving away from his friends and family -- all of which were negative in his mind. He thanked me for the experience of the assessment, and went on to accept the Florida job in spite of all the drawbacks. What he was grateful for was the explicit realization that the job in Florida offered more frequent risks to his physical safety, and that he valued those experiences. Indeed he was willing to give up pay, job status, and closeness to family to seek more dangerous job experiences.

Author Faust, in her book, sought in part to understand how Civil War soldiers felt in battle. I was dubious that, lacking comparable experiences herself, she could really understand how those soldiers felt in battle. In fact, I also posted on this specific concern.

Training helps people think about what they must do to get their jobs done in times of stress, and what they must do to survive in times of danger. And while we think that people would normally try to avoid danger, some people seem to seek out dangerous jobs.


Tuesday, June 03, 2014

Thinking about Political Orientation and Risk Perception


I found this graph in a Mother Jones article on risk perception. The point of the article is that people of all political orientations tend to share the perception that the risk from childhood vaccines is low. Good to read.

The graph also shows, however, that attitudes on the risks of gun ownership, global warming and marijuana legalization differ greatly according to one's political orientation.

Climate Change

I think that few of us understand the science of global warming. So our concern for the risk that it poses depends in part on our knowledge of the warnings of scientists who have studied the issue; a huge majority of those scientists believe human activities are causing a rapid release of greenhouse gases, which in turn is causing global warming. Concern for the risk of global warming also must relate to the time scale in which one thinks, since the change will increase over decades (and centuries), and there is likely to be little damage in the next decade that can be attributed with certainty to climate change.

We also know that reduction of greenhouse gas emissions will have costs, and those costs are likely to affect the bottom line of oil companies and other large corporations negatively. They have been conducting a campaign for "the hearts and minds" of the public. So perhaps conservative citizens are more likely to give credence to the corporate messages than are liberal citizens.

Moreover, denying climate change has come to be a political platform for conservatives. Perhaps there is cognitive dissonance in believing oneself a conservative while also accepting anthropogenic climate change, and people change one belief or the other to reduce that dissonance. (Indeed, I think it has been suggested that they will change whichever belief is less central to their concept of their own identity.)

Guns and Pot

There is an experiential element in these areas that differs from the preceding ones. People who own and use guns probably have developed a certain belief that they can do so safely; I certainly did in the years when I was a gun owner, target shooter and occasional hunter. I suspect that people who have used a lot of marijuana also tend to believe that they did so safely, and that it is at least as safe as alcoholic drinks and tobacco that are sold freely to adults.

There is a gun industry, and it does support the NRA and its efforts to prevent regulation of firearms. There is also a body of easily available information indicating that guns cause a great many wounds and deaths in this country, and that the level of such dangers here is much higher than in countries that strongly regulate private ownership of firearms. As in the case of climate change, opposition to gun regulation has become a plank in conservative platforms, and the messages of industry, organized lobbyists and conservative politicians are more effective with conservatives than with liberals.

I suspect Nixon's War on Drugs was an effective conservative political strategy that continues to divide liberals and conservatives. That seems to be changing according to results from the Pew Research Center:


Thursday, January 30, 2014

As cars become more common, so do auto accidents.



There is an interesting point made in an article from The Economist:
Every 30 seconds someone, somewhere, dies in a road crash, and ten are seriously injured. The toll is rising: the World Health Organisation (WHO) expects the number of deaths globally to reach nearly 2m a year by 2030, up from 1.3m now. But the pain will fall far from equally. Rich countries are making roads safer and cutting casualties to rates not seen for decades, despite higher car use. Poor and middle-income ones will see crashes match HIV/AIDS as a cause of death by 2030 (see chart). In the very poorest, the WHO expects deaths almost to triple.
While death rates from communicable diseases such as malaria and TB are decreasing, those from automobile accidents are increasing in developing countries. Of course, that means that injuries and long-term disability related to auto accidents are also increasing. Moreover, there is a huge financial cost.

A thought on risk aversion



There is an article in The Economist on risk, pointing out that people tend to be risk averse. If you ask people whether they would rather have $50 in hand, or a ticket to a lottery that would pay $120 half the time and nothing the other half, most would prefer to take the cash. This is in spite of the fact that if one took the bet many times, one would do a lot better taking the gamble. (Ask any casino owner; they make mints of money with much more modest odds in their favor.)
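To make the arithmetic explicit, here is a minimal Python sketch of that choice, simulating repeated plays; the $50 and $120 figures are simply the ones from the example above.

    import random

    def simulate(trials=100_000, seed=0):
        """Average payoff per play: $50 for sure versus a 50/50 gamble on $120 or nothing."""
        rng = random.Random(seed)
        sure_total = 50 * trials
        gamble_total = sum(120 if rng.random() < 0.5 else 0 for _ in range(trials))
        return sure_total / trials, gamble_total / trials

    sure_avg, gamble_avg = simulate()
    # The expected value of the gamble is 0.5 * 120 = $60, versus $50 for the sure thing.
    print(f"sure thing: ${sure_avg:.2f} per play, gamble: ${gamble_avg:.2f} per play")

Over many plays the gamble averages close to $60 per play, which is why the casino analogy holds; it is only in a single play that the certainty of $50 looks so attractive.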

The article also notes:
Upbringing, environment and experience also play a part. Research consistently finds, for example, that the educated and the rich are more daring financially. So are men, but apparently not for genetic reasons. Alison Booth of Australian National University and Patrick Nolen of the University of Essex found that teenage girls at single-sex schools were less risk-averse than those at co-ed schools, which they think may be due to the absence of “culturally driven norms and beliefs about the appropriate mode of female behaviour”. 
People’s financial history has a strong impact on their taste for risk. Looking at surveys of American household finances from 1960 to 2007, Ulrike Malmendier of the University of California at Berkeley and Stefan Nagel, now at the University of Michigan, found that people who experienced high returns on the stockmarket earlier in life were, years later, likelier to report a higher tolerance for risk, to own shares and to invest a bigger slice of their assets in shares.
It has long been my opinion that we should do simulations of markets assuming that attitudes toward risk are randomly distributed. That seems intuitive to me. Consider parimutuel betting at a race track: some people would rather bet on a longshot, others on a favorite; the betting is not simply due to differences in estimates of the odds of winning.

Sunday, January 12, 2014

Embracing Uncertainty II

“I can live with doubt and uncertainty and not knowing. I think it is much more interesting to live not knowing than to have answers that might be wrong. If we will only allow that, as we progress, we remain unsure, we will leave opportunities for alternatives. We will not become enthusiastic for the fact, the knowledge, the absolute truth of the day, but remain always uncertain … In order to make progress, one must leave the door to the unknown ajar.” 
― Richard P. Feynman
In my previous post I recommended embracing uncertainty. I meant that in terms of big, important things. In trivial things, of course you should ignore uncertainty, acting as if it didn't exist. Would you prefer to wear this or that today? Ignore your uncertainty and choose. Would you prefer pancakes or waffles for breakfast? Ignore your uncertainty and choose.

How should you deal with uncertainty?

  • Recognize alternative theories of the nature of the situation, but also recognize that you may not have recognized all the relevant alternatives. Consider searching for other alternatives.
  • Recognize alternative courses of action, but also recognize that you may not have recognized all the relevant alternatives. Consider searching for other alternatives.
  • Consider the dangers involved in those alternative actions and their probabilities. Recognize that you may have failed to identify dangers. Recognize that your estimates of probabilities may be faulty. Consider more detailed risk assessment.
  • Consider the potential benefits in those courses of action. Recognize that you may have failed to identify potential benefits. Recognize that your estimates of the probabilities may be faulty.
  • Consider making your analysis of the situation more profound. Recognize that there are costs in time and effort in continuing analysis.
At some point you will decide that the state of your analysis justifies the decision to act or not to act, and you will do so. You will also decide to continue analysis or to discontinue analysis and go on to think about something else.
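As a rough illustration of the checklist above, a toy calculation might weigh each alternative's uncertain benefits against its uncertain dangers. The alternatives, probabilities and values below are hypothetical placeholders, not anything from this post.

    # Hypothetical alternatives, each with an uncertain benefit and an uncertain danger.
    alternatives = {
        "act now":      {"p_benefit": 0.6, "benefit": 100, "p_danger": 0.20, "danger": 80},
        "wait and see": {"p_benefit": 0.3, "benefit": 100, "p_danger": 0.05, "danger": 80},
        "do nothing":   {"p_benefit": 0.0, "benefit":   0, "p_danger": 0.10, "danger": 80},
    }

    def expected_value(a):
        """Expected benefit minus expected loss for one course of action."""
        return a["p_benefit"] * a["benefit"] - a["p_danger"] * a["danger"]

    for name, a in alternatives.items():
        print(f"{name:12s} expected value: {expected_value(a):6.1f}")

The numbers themselves matter less than the exercise of writing them down; as the list above notes, the estimates may be faulty, and the point of continued analysis is to decide whether refining them is worth the time.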



Tuesday, December 03, 2013

A thought about government funding of anti-terrorist activities.


The perception of risk seems to reflect many things other than the probability of death and injury. Familiar risks like those of gun violence and auto accidents don't seem to be as worrisome as risks related to recent events that got a lot of media attention -- 9/11 and mass shootings.

The news media of course focus on "newsworthy events" rather than continuing social problems; the more likely the images relating to the events are to generate an emotional response and draw viewers, the more likely they are to be played over and over on television, placed above the fold in newspapers, and featured on web pages.

Ideally we would like to see Congress allocate public funds in such a way as to do the most good. If it can save 1000 lives with one program but only 100 with an alternative program at the same cost, we want the most cost-effective program to be funded. But if voters mistakenly believe that their risks are greater from the less cost-effective program, legislators may get away with voting for the less cost effective alternative.

If voters don't know how much a program costs then they have even less opportunity to judge whether it is cost effective. Thus the intelligence agencies that have secret budgets don't get the voter scrutiny that would encourage them to be cost effective.

People also like simple solutions to simple problems. Drone strikes against terrorist leaders are easy to understand; the use of soft power to reduce extremist anger against the USA is harder to understand, and its impact is harder to measure.

And of course, politicians respond not only to the expressed concerns of their constituents, but also to the interests of the major donors to their campaigns and to the lobbyists who help them raise money and obtain (one-sided) information on issues. Put billions of dollars into a secret intelligence industry, and don't be surprised if politicians become seized with the urgency of funding the contracts given by the government intelligence agencies.

Friday, September 13, 2013

Apparently you are in more danger walking than from terrorist attacks!


According to the Washington Post Wonkblog:
In the last five years, the odds of an American being killed in a terrorist attack have been about 1 in 20 million (that's including both domestic attacks and overseas attacks). As the chart above from the Economist shows, that's considerably smaller than the risk of dying from many other things, from post-surgery complications to ordinary gun violence to lightning.
Source: The Economist

Sunday, August 04, 2013

A thought about risk assessment and government programs


Politician: Your taxes are protecting you from rampaging elephants.
Citizen: But there aren't any elephants here.
Politician: See how effectively we use your taxes!
We could look up the magnitude of risks such as automobile accidents, illegal drugs, or guns. They are all real, with thousands of deaths per year. However, most people do not do so.

We cannot get data on the magnitude of the risk from terrorism. We only know the results of terrorist attacks that succeed, while government officials tell us that terrorists are sometimes foiled.

We know that we are subject to cognitive biases. (Click here for a list of cognitive biases.) But we seldom seem to protect ourselves from those biases as we make decisions about risks and risk amelioration.

I wonder if the American public is not allowing the government to spend too much on security, when a different use of the money could save more lives, prevent more incapacity and improve health and welfare. For example, would we not have achieved more real security for our people by spending less on homeland security and war in this century, and more on health insurance and public health? Indeed, would we not have achieved more by having a health service financing system that spent less on paperwork, thereby devoting more to health services? Might we not have achieved more with government regulations that encouraged more spending on preventive services and less on defensive medicine (e.g. diagnostic procedures that protect the provider from suits rather than provide information to improve treatment)?

When a significant industry provides economic benefits for a lot of people by spending taxpayer money, those people have an interest in the flow of funds to their industry. If the taxpayers' fear of risk can be harnessed to support the creation of a government-financed program implemented through that industry, the people it employs benefit. Once such a program is created, lobbyists can be employed to encourage politicians to continue and expand the program. If (and unfortunately often, when) citizens don't accurately understand the risks that are ameliorated and the cost of the program, the lobbyists will win and the taxpayers lose.

One aspect of this phenomenon is mission creep. The war in Iraq was sold to the public on the basis that it would avert the risk of Iraqi weapons of mass destruction being used against Americans, and especially falling into the hands of non-state actors and used by terrorists. There were no WMDs in Iraq, but the public was told, "you broke it, you own it". The mission became control of insurgency and nation building.

Similarly, we promoted regime change in Afghanistan in order to install a regime that would oppose the use of that country as a safe haven for terrorists that targeted the United States. We took troops away from the fight against terrorists in Afghanistan in order to deploy them in the Iraq war. Then we found ourselves fighting insurgents in Afghanistan and trying to build sustainable political and economic institutions in that country. It is not clear how much security from terrorist attacks those actions actually bought us, but it seems very likely that other policies in Afghanistan would have brought greater benefits in terms of security of Americans.

On the other hand, the military industrial complex got huge amounts of funding from these war policies. Unfortunately, we don't seem to count the deaths, disabilities and injuries suffered by the people who actually fought these wars nor to compare the actual damages to them with the likely damage that might have been caused by terrorists in the United States.

You don't see this happening often in the United States.

Friday, May 24, 2013

Safety is everyone's business



It is one month since an eight-story building collapsed on the outskirts of Dhaka, killing more than 1000 people. Reporters have been asking who bears responsibility for preventing similar disasters. It occurs to me that the responsibility for safety is shared. Here are some of the responsibilities:

  • The workers themselves have a responsibility for their own safety. I might have suggested to any one of them that she not work in a dangerous facility. I might have suggested that collectively they might have gone on strike until the company provided a safer workplace. Unfortunately, a worker who refused to work in an unsafe workplace in Bangladesh might starve.
  • The workers in the garment factories are often unmarried young women, and one might think that their families should take more responsibility for their safety. Unfortunately, their families are also poor and poorly equipped to take economic risks to protect the young workers.
  • The managers and supervisors in the garment factories have a responsibility for the safety of the workers as well as for their own safety. They might insist that the companies provide safe working places. Of course, they would likely be replaced by less ethical people if they did so, and might have problems finding other jobs.
  • The owners and stockholders in the manufacturing companies have a responsibility for the safety of the workers. To the extent that that responsibility is shared, it can be avoided. Their profit margins are slim, and might disappear, driving them out of business, were they really to protect safety well. Moreover, some are greedy and unethical, and those are willing to make money off of the risks borne by their workers.
  • Building owners have a responsibility for the safety of the buildings that they are renting out. Their incentives are similar to those of the company owners.
  • Builders have a responsibility not to build unsafe buildings. Of course, if they refuse contracts to do so, other more greedy and less ethical builders may take those contracts, and the ethical builders may lose work or go out of business. The builder may also depend on architects and engineers to design safe buildings, and be professionally unable to judge the safety of the buildings that they are constructing. The builders who built the original five stories of the building that collapsed apparently constructed a safe structure; it was others who added the unsafe additional three stories that overstressed the foundations and the structure of the lower floors.
  • Architects and engineers have an ethical responsibility for the safety of buildings constructed under their professional oversight.
  • Governments have a responsibility for the safety of the workplace and of construction. However, that responsibility is limited by the legal authority given the government. More practically, it is limited by the expertise and resources available to the government agencies. And of course government employees may be corrupt or lazy.
  • Of course, people seeking to profit from the risks taken by the workers tend not to like government regulation, and often have the economic and political power to resist it.
  • The public in Bangladesh has the responsibility for its own institutions. It must assure that people who endanger workers through greed or corruption are prosecuted, that regulations are strong enough to protect the workers, and that government is strong enough and ethical enough to enforce the regulations. Unfortunately, that is a shared responsibility and thus easily avoided. Bangladesh is also poor in financial and human capital, and the public may not be able to demand a government that is adequate to protect the workers.
  • Downstream companies that profit from the risks borne by workers in Bangladesh's garment factories also have a responsibility for the safety of those workers. They are poorly placed to assure that safety both geographically and administratively. Of course, by avoiding that responsibility and taking advantage of low prices, they make more money. Moreover, the company that pays the price of the safety of those upstream workers depends on the ultimate consumers being willing to pay more for the final product.
  • International organizations don't have the authority to demand that products sold in international markets are produced by workers whose safety is adequately protected.
  • In theory governments could regulate to assure that imported products were produced under safe working conditions. The United States, for example, has regulations to assure that imported drugs are produced in such a way that they are safe for American consumers. We have not charged our governments to provide this kind of regulation for worker safety.
  • Consumers have an ethical responsibility not to save a few cents per garment if that saving comes at the expense of the safety of the workers producing those garments. Of course, an individual refusing to buy a shirt at Walmart is not going to protect a worker in Bangladesh. Consumer pressure can only be influential if it is organized.
  • So how about the civil society organizations that could organize consumers? How about the media that could inform the consumers enabling their anger to fuel organization and fund NGOs? Well I haven't seen reporters and editors blaming themselves for focusing on meretricious stories rather than important ones.
So workers in poor countries will continue to suffer injury and death, and we will continue to get cheap consumer goods via global corporations.

Sunday, May 05, 2013

A thought about the provision of information


The purpose of those long instruction sheets one gets with a bottle of pills from the drug store is supposed to be to inform the patient of the risks involved in taking that drug. On average they run to something like 20 pages of normal-size text. So, of course, patients don't understand them if they try to read them, and thus don't read them.

I think this is a case of different priorities of the regulators and the company lawyers. The regulators want to decrease bad side effects of medicines; the company lawyers want to avoid successful law suits against their companies by people suffering from those side effects.

Of course, one hopes that doctors and pharmacists have the information necessary to inform patients of what to watch out for. Back up systems for the provision of information help. But then the doctors and pharmacists also have their own interests to protect.

Maybe one way out would be to do (social science) research on ways of providing that information that minimize the negative side effects. Skull-and-crossbones labels, childproof tops, etc. seem to help.

Sunday, February 17, 2013

Risk and flu




This video is from a nice website, Risk Bites.

It has been estimated that the Spanish flu that hit during the First World War killed 50 million to 100 million people worldwide. Flu pandemics, of which that was the worst in centuries, occur every few decades. They occur when a new, highly infectious strain of the flu evolves to which there is little "herd immunity" in the human population. If that strain proves to be very lethal, lots of people die.

It is important that the world maintains epidemiological surveillance systems that would quickly identify the emergence of such a strain in order that public health efforts could be employed in a timely manner. These would include development of an appropriate vaccine and mass immunization campaigns using the vaccine. It is important that we inform the public that flu is more dangerous than immunization, and that they have a public duty to be immunized so that when the next potential pandemic emerges, people will respond.

The regular flu is dangerous. I quote from a January story in Bloomberg News:

The worst U.S. flu season since 2009 intensified last week, killing hundreds more people as the viral epidemic spread to additional states, health officials said. 
About 8.3 percent of all deaths nationwide were due to the flu and pneumonia for the week ended Jan. 12, more than the 7.3 percent level for an epidemic, the Centers for Disease Control and Prevention said today. About 90 percent of those deaths are people older than age 65, who are being hit particularly hard by this year’s flu strain, the Atlanta-based agency said.
Of course, it is important for people at high risk of serious cases of the flu to be immunized, but so too it is important for people who might infect the old or those with impaired immunological systems to be immunized.

Indeed, even if the immunizations are only 60 percent effective, 100 percent coverage can really benefit people. A lot of flu is spread in schools. If all kids and teachers were immunized, there would be a lot of classes in which flu was not spread from student to student. Even if only 60 percent of the kids in the classroom were protected, if one of them is yours, your whole family would be safer. Moreover, the immunizations may reduce the severity of infection even if the person is not completely immune and gets the flu.
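To see roughly how coverage and effectiveness interact, here is a back-of-the-envelope sketch. The classroom size, attack rate and effectiveness figures are illustrative assumptions only, and the calculation ignores the indirect (herd) protection that would make high coverage look even better.

    def expected_cases(n_students, coverage, effectiveness, attack_rate):
        """Rough expected flu cases in one classroom, ignoring indirect (herd) effects."""
        protected = n_students * coverage * effectiveness   # children the vaccine fully protects
        susceptible = n_students - protected
        return susceptible * attack_rate

    # 30 students, a 30% seasonal attack rate, a vaccine that is 60% effective.
    for coverage in (0.0, 0.5, 1.0):
        cases = expected_cases(30, coverage, effectiveness=0.60, attack_rate=0.30)
        print(f"coverage {coverage:.0%}: about {cases:.1f} expected cases")

Even under these deliberately conservative assumptions, full coverage with an imperfect vaccine cuts the expected cases substantially.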

Monday, November 15, 2010

TEDxPSU - Bruce Schneier - Reconceptualizing Security


Let me restate some ideas from this presentation.

In the real world there are dangers and actions which can be taken to reduce or eliminate those dangers. We react to real-world conditions emotionally. Schneier stresses the emotional reaction to the danger, but I would add that there are also emotional reactions to the means of dealing with dangers. (Some people react emotionally to police in ways that interfere with the accuracy of their perception of the efficacy of policing as a means of dealing with danger.) We also use intellect to react to danger and to choose ways to react to danger. Schneier is very good on what we know about cognitive bias in the intellectual models we make of real-world dangers and actions, although he cannot do justice to the topic in a brief discussion. For example, I think there is a difference between our being "risk averse" and our being "loss averse", although both biases can reduce the quality of decisions.

I like Schneier's differentiation between security measures and security theater. Security theater is meant to influence our emotions and our models. I would say that good security theater has the effect of bringing our perceptions closer to reality and bad security theater is meant to distort our perceptions from reality. Bad security theater is perpetrated by people who would sell us products that promise more security than they deliver. Those products can be commercial products, but I would say that the Bush administration also perpetrated some bad security theater with respect to the threats of terrorism or of weapons of mass destruction in Saddam's Iraq.

Schneier makes a great point that our emotional response can be conditioned by our intellectual model of dangers and actions, and that often with time our models disappear from our consciousness leaving us simply to perceive our emotional response.

I would draw the conclusion that we want to work to make our intellectual models as good as possible and to use effective analytic means to draw conclusions from those models. Schneier suggests that we know a lot about our local risks, but my experience on a local Grand Jury dealing with scores of people accused of local crimes made me aware that my intellectual model of local crime risk was seriously deficient. Awareness of common cognitive biases can be used to help overcome those biases.

While we can individually analyze security theater to help judge whether it is benign or malicious, we can also consciously seek factual evidence to improve our intellectual models. We need help from the news and other media to brand bad security theater from government as such, just as we have help from the government to protect us from bad security theater from the private sector (false advertising, false claims for drugs and medical treatments).

Thursday, July 29, 2010

Lab Safety

Source: "Danger in School Labs: Accidents Haunt Experimental Science," Beryl Lieff Benderly, Scientific American, August 2010.

There is a movement to improve laboratory safety in American universities. The article holds that there is a culture of safety in American industrial labs, but not in American university research labs. Even though there is little data on university lab accidents, the article cites anecdotal evidence that such accidents occur, sometimes involving students, and sometimes fatal.


My own experience in a small commercial research company many years ago suggested that there were severe problems. I recall a large gas cylinder, its regulator broken off, breaking loose and flying down the building, smashing through lab walls. I recall an accident in a fluorine research project that hospitalized two people, one for months. I recall a flock of animals killed by an accidental gas release. And I recall a B-52 that crashed in a test of an experimental flight warning system.

I suspect that there are more lab accidents per laboratory in developing nations, since in many developing nations safety procedures are relatively underdeveloped.

It seems to me that laboratory safety is an ethical issue for scientists, for university and research laboratory administrators, and for funding agencies. Indeed, it is one of several ethical issues including the ethical treatment of human subjects, the ethical treatment of animals involved in the research, and the containment of risks to others created by research (containment of poisons, pathogens and pests, containment of dangers to the environment).


I note that UNESCO has a program focused on the ethics of science, but has apparently never undertaken an effort to reduce these research-related risks in developing nations. Capacity building and policy advice are areas in which UNESCO might be useful.

Monday, March 22, 2010

Math Models: From financial meltdown to global warming

Computer modeling allows experts to extrapolate the implications of complex sets of assumptions and data in a timely fashion. In a few decades they have become a fundamental tool of the analyst. The computer models amplify analytical efforts in a manner analogous to the way mechanical engines amplify muscle power. Indeed we would no more be able to check the computer analysis without computers than we would be able to replace the machines in a coal mine with manual labor.

News reports suggest that an important factor in the financial meltdown at the end of the last decade was the use of mathematical models for risk management by the financial industry and its regulators. While I worked on mathematical models in another context, I don't know enough about those used in finance to comment intelligently. Still, the models that were in use must have failed to provide their users with adequate warning of the risks they were running. I would bet that the chief executives and boards of directors of the major firms involved neither had the expertise to understand the details of the models being used nor had taken the time to investigate in detail the models on which they were betting their firms. A lot of those firms lost those bets.

Climate change is a sufficiently complex subject that powerful models must be applied to its analysis. A lot of very good scientists and modelers are doing so. Again, I am not an expert in these models, but it is clear that there exist several different very strong models, indicating both that there is not now agreement on the perfect form for such a model and that the models are sufficiently robust to agree generally in their predictions. I am sure that none of the heads of government or major legislative bodies are experts in climate change or in these models. I suggest that there is a possibility that the existing models might fail to predict radical, non-linear climatic effects of factors that are not fully included in the theory or parameterization of the existing climate models.

While the bigwigs of the financial firms that got into such troubles made a lot of money in salaries and bonuses during the period in which they failed to take into account the risks that their models were wrong, the rest of us are paying for their failure in risk management.

The politicians who deny these risks on the claim that the current climate models are fatally flawed may reap short-term political benefits. Let us hope that our children and their children do not pay for the failure to act now to reduce the threat of climate change. Prudence suggests that we not bet the future of the planet and the human race that the models -- which already predict major climate changes -- are not too conservative in their predictions.

Monday, March 01, 2010

Science interpreted into policy

Source: "PUBLIC HEALTH: Brawling Over Mammography," Eliot Marshall
Science 19 February 2010:
Vol. 327. no. 5968, pp. 936 - 938
DOI: 10.1126/science.327.5968.936

Science magazine provides a news article on a dispute among epidemiologists and medical practitioners as to whether mammograms should be routinely given to women between 50 and 60 years of age. There are a number of studies done in different locations at different times, all of which seem to indicate a small reduction in breast cancer mortality among women in that age group who receive screening. On the other hand, there are a lot of false positives, each of which is likely to cause anxiety to a woman and the time and expense of follow-on treatment. Not included is the risk due to radiation exposure from repeated x-ray examinations.

The decision on the public policy is presumably based on the comparison not only of the benefits, risks and costs of the procedure, but on whether the resources that would be used for breast cancer screening might better be used for alternative interventions.

The article indicates not only that the standard-setting committees tend to have disagreements within and between committees, but that others (NGOs, medical associations, etc.) also go public with their comments seeking to influence decisions.

The epidemiological studies of course are based on average results over large numbers of women, rather than on the information provided for individuals with differing a priori risks (as might be measured by cancer rates in relatives, specific age, years since menopause, parity, etc.).

Two conclusions.
  • Women should consult and be guided by their physicians, who should in turn deal with the costs and benefits of screening as perceived by the patient as well as the risks faced by that specific patient.
  • Really good data systems of medical records should allow data mining to greatly improve the guidelines for breast cancer screening, as of course will improved genetic screening when it becomes available.

Saturday, January 02, 2010

There they go again worrying about the wrong things!

Matthew Yglesias writes:
Spencer Ackerman tries to problematize the conclusion that the underpants bomber incident really represents a grievous intelligence failure. You should read his whole analysis. But in brief, while by definition letting a bomber on an airplane is a failure, based on what was actually known about Abdulmutallab, excluding him from flying would involve erecting pretty substantial barriers to entering the United States in ways that would have real costs. As I said before, the key point about identifying al-Qaeda operatives is that there are extremely few al-Qaeda operatives so (by Bayes’ theorem) any method you employ of identifying al-Qaeda operatives is going to mostly reveal false positives.
Looking at it one way, he is right. Looking at it another way, he is quite wrong.
  • Any simple, affordable screening method that seeks to unfailingly identify anyone among nearly seven billion people who would commit a specific type of crime anytime and anywhere is bound to fail to achieve that hugely unreasonable goal.
  • I think the number of false positives among people who are convicted in the courts of attempted terrorist acts is very small indeed.
In fact screening is generally a sequential process. Simply saying that we are seeking Al Qaeda operatives is itself a screening step. Screening for Al Qaeda operatives is unlikely to identify another Oklahoma City bomber or serial killer. We screen lots of people with simple tools giving lots of false positives, and put them on a watch list. We screen the smaller number of people who ask for U.S. visas more carefully. Those who do receive visas are screened still more carefully for explosives and weapons when they seek to board an airplane. If charged with a crime, they should be investigated very thoroughly indeed. Ultimately, before finding a person guilty, prosecutors will have to decide that there is enough evidence to bring the person to trial, a judge will have to decide that the investigation and presentation of evidence has been fair, and a jury will have to decide if the case has been made beyond a reasonable doubt.

The point is, screening is a sequential process, with more expensive and more accurate processes being applied following prescreening with simpler, cheaper processes. Think about medical screening, where one goes through a physician's office visit and stages of clinical laboratory work -- chest x-ray followed by CT scan and then by exploratory surgery -- before a final diagnosis is made.
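To make the base-rate point concrete, here is a small sketch of sequential screening using Bayes' rule. The prevalence, sensitivities and false positive rates are hypothetical, and treating the stages as independent is itself a simplification.

    def update(prior, sensitivity, false_positive_rate):
        """Bayes' rule: probability of being a true operative, given one more positive screen."""
        true_pos = prior * sensitivity
        false_pos = (1 - prior) * false_positive_rate
        return true_pos / (true_pos + false_pos)

    # Hypothetical numbers: one operative per million travelers, then three screening
    # stages of increasing accuracy, each applied only to those flagged by the last.
    p = 1e-6
    stages = [("watch list", 0.95, 0.05),
              ("visa review", 0.95, 0.01),
              ("boarding inspection", 0.90, 0.001)]
    for name, sensitivity, fpr in stages:
        p = update(p, sensitivity, fpr)
        print(f"after {name}: P(operative | flagged so far) = {p:.6f}")

Each early stage flags mostly false positives, exactly as the Bayes' theorem argument says, but the chain of stages raises the posterior enough to justify the more expensive steps applied at the end.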

The more fundamental point is that we are probably doing a poor job of risk analysis, spending too much money to avoid relatively small risks and too little money to avoid relatively large risks. We probably worry too much about Al Qaeda and too little about traffic accidents, communicable diseases, and behavior that creates risks of non-communicable disease such as cardio-vascular disease and cancer. The publicity that attends a failed attack to blow up an airplane dwarfs that which attends the thousands of deaths a year from flu, not to mention heart attacks, strokes and cancer. So our political leaders pander to public opinion with new programs to prevent the uncommon, ignoring the more common risks!

Sunday, September 06, 2009

Science and Decisions Advancing Risk Assessment

There is a very positive review by Gary Brewer in Science magazine of this book published by the National Academies. Brewer says that it will be the standard reference on risk analysis for years to come.

I was impressed by Brewer's thoughtful discussion of five questions that are far easier to ask than to answer, and all too often are not addressed:
  1. What's the problem?
  2. For whom is something a problem?
  3. Who is the decision-maker?
  4. Why do risk analysis if we already know what to do?
  5. So what?

Wednesday, July 08, 2009

A Model to Improve Risk Analysis and Decision Making

Judea Pearl's article in Forbes describes the use of Bayesian networks in artificial intelligence applications.

The article is from the Forbes issue on Artificial Intelligence.

And there is free software, the Bayesian Network Editor and Tool Kit, that can be used to implement some of the approaches Pearl describes.
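For readers who want to see what such a network computes, here is a self-contained sketch of inference by brute-force enumeration on the textbook rain/sprinkler example often associated with Pearl's work; it does not use the toolkit mentioned above, and the probabilities are the usual illustrative ones.

    # A tiny Bayesian network (rain -> sprinkler, and both -> wet grass), with
    # inference done by brute-force enumeration rather than any particular toolkit.
    P_RAIN = 0.2
    P_SPRINKLER_GIVEN_RAIN = {True: 0.01, False: 0.40}
    P_WET_GIVEN = {(True, True): 0.99, (True, False): 0.90,   # keys are (sprinkler, rain)
                   (False, True): 0.90, (False, False): 0.00}

    def joint(sprinkler, rain, wet):
        """Joint probability of one full assignment of the three variables."""
        p = P_RAIN if rain else 1 - P_RAIN
        p *= P_SPRINKLER_GIVEN_RAIN[rain] if sprinkler else 1 - P_SPRINKLER_GIVEN_RAIN[rain]
        p_wet = P_WET_GIVEN[(sprinkler, rain)]
        return p * (p_wet if wet else 1 - p_wet)

    # P(rain | the grass is wet): sum the joint over the unobserved sprinkler variable.
    wet_and_rain = sum(joint(s, True, True) for s in (True, False))
    wet_total = sum(joint(s, r, True) for s in (True, False) for r in (True, False))
    print(f"P(rain | wet grass) = {wet_and_rain / wet_total:.3f}")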

Thursday, March 12, 2009

Science and Decisions: Advancing Risk Assessment


Committee on Improving Risk Analysis Approaches Used by the U.S. EPA, National Research Council, 2008

Description: "Risk assessment has become a dominant public policy tool for making choices, based on limited resources, to protect public health and the environment. It has been instrumental to the mission of the U.S. Environmental Protection Agency (EPA) as well as other federal agencies in evaluating public health concerns, informing regulatory and technological decisions, prioritizing research needs and funding, and in developing approaches for cost-benefit analysis.

"However, risk assessment is at a crossroads. Despite advances in the field, risk assessment faces a number of significant challenges including lengthy delays in making complex decisions; lack of data leading to significant uncertainty in risk assessments; and many chemicals in the marketplace that have not been evaluated and emerging agents requiring assessment.

"Science and Decisions makes practical scientific and technical recommendations to address these challenges. This book is a complement to the widely used 1983 National Academies book, Risk Assessment in he Federal Government (also known as the Red Book). The earlier book established a framework for the concepts and conduct of risk assessment that has been adopted by numerous expert committees, regulatory agencies, and public health institutions. The new book embeds these concepts within a broader framework for risk-based decision-making. Together, these are essential references for those working in the regulatory and public health fields."