Friday, November 08, 2013

Illusions of validity and skill, and misplaced confidence.


Daniel Kahneman writes about the illusion of validity. All too often the brain intuits a story that satisfies the available (inadequate) information and comes to a conclusion that is then (intuitively) believed to be valid. He describes a situation from early in his career in which he and a colleague were asked to predict which of the people performing a specific, difficult task would be most successful in a leadership course. They made their predictions with considerable confidence, only to discover that the predictions were almost no better than chance. They then made predictions for another group with equal confidence, even knowing that they had previously been wrong. They suffered from the illusion of validity of their predictions.

In the same New York Times article he writes about the illusion of skill. All too often the brain makes up a story that satisfies the available (inadequate) information for each prediction and comes to the (intuitive) conclusion that the predictor is skilled at the task. He describes an experience in which he analysed data on professional financial advisers and discovered that there was no correlation between their rank order for bonuses in one year and their rank order in other years; there was thus no repeatable difference in skill among these advisers. The results of their advice were about as good as random selections would have been. Sharing this evidence with the advisers and with the executives who awarded bonuses for performance did not shake their illusion that they were skilled at predicting stock price changes.

People who have frequent, timely opportunities to discover their mistakes may avoid such illusions of validity and skill. In both examples, however, people had their predictions positively reinforced. In the first example, Kahneman's colleague must have agreed on the recommendations made for training, and the admissions office must have accepted them. In the second case, the financial advisers must have had good salaries and received occasional bonuses "for the successful application of their skills". The much later information that their predictions had been wrong perhaps failed to reverse the psychological impact of the immediate positive reinforcement from peers and clients.

Kahneman has noted that people who jump to wrong conclusions are often as sure that they are right as people who arrive at conclusions after detailed analysis of lots of data. Perhaps this is why conservative politicians are as convinced that mankind is not causing climate change (on the basis of few facts and little understanding) as scientists are convinced of global warming after extensive analysis of huge data sets grounded in strong theory.
