Sunday, September 02, 2012

The evidence you accept depends on your stance.


Chris Mooney is one of my favorite science writers. He has an especially interesting article in Salon titled "Study: Right twists facts". I quote:
In recent years, the field of moral psychology has been strongly influenced by a theory known as “moral intuitionism,” which has been championed by the University of Virginia psychologist Jonathan Haidt. Dealing a blow to the notion of humans as primarily rational actors, Haidt instead postulates that our views of what is right and wrong are rooted in gut emotions, which fire rapidly when we encounter certain moral situations or dilemmas — responding far more quickly than our rational thoughts. Thus, we evaluate facts, arguments and new information in a way that is subconsciously guided, or motivated, by our prior moral emotions.
But are we all equally lawyerly? The new paper, by psychologists Brittany Liu and Peter Ditto of the University of California-Irvine, suggests that may not actually be the case. ...
Liu and Ditto found a strong correlation, across all of the issues, between believing something is morally wrong in all cases — such as the death penalty — and also believing that it has low benefits (e.g., doesn’t deter crime) or high costs (lots of innocent people getting executed). In other words, liberals and conservatives alike shaded their assessment of the facts so as to align them with their moral convictions — establishing what Liu and Ditto call a “moral coherence” between their ethical and factual views. Neither side was innocent when it came to confusing “is” and “ought” (as moral philosophers might put it). 
However, not everyone was equally susceptible to this behavior. Rather, the researchers found three risk factors, so to speak, that seem to worsen the standard human penchant for contorting the facts to one’s moral views. Two of those were pretty unsurprising: Having a strong moral view about a topic makes one’s inclination toward “moral coherence” worse, as does knowing a lot about the subject (across studies, knowledge simply seems to make us better at maintaining and defending what we already believe). But the third risk factor is likely to prove quite controversial: political conservatism. 
In the study, Liu and Ditto report, conservatives tilted their views of the facts to favor their moral convictions more than liberals did, on every single issue. And that was true whether it was a topic that liberals oppose (the death penalty) or that conservatives oppose (embryonic stem cell research). “Conservatives are doing this to a larger degree across four different issues,” Liu explained in an interview. “Including two that are leaning to the liberal side, not the conservative side.”
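
To make the "moral coherence" finding concrete: the measure amounts to a correlation between how morally wrong people rate a policy and how costly or ineffective they judge it to be. Below is a minimal sketch in Python of that kind of computation; the respondent ratings are invented for illustration and are not Liu and Ditto's data.

# Sketch of a "moral coherence" correlation, assuming hypothetical survey
# data: each respondent rates one policy's moral wrongness (1-7) and its
# practical cost/ineffectiveness (1-7). The ratings below are invented
# for illustration only.
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical ratings from ten respondents on one issue (e.g., the death penalty)
moral_wrongness = [7, 6, 2, 1, 5, 3, 6, 2, 4, 7]
perceived_cost = [6, 7, 3, 2, 5, 2, 6, 1, 4, 6]

r = correlation(moral_wrongness, perceived_cost)
print(f"Pearson r between moral conviction and factual assessment: {r:.2f}")

A strong positive r, in this framing, means respondents' factual assessments track their moral convictions; the study's finding is that this tracking was tighter for conservatives than for liberals on every issue tested.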
A continuing theme of this blog is that we think with our brains, not our conscious minds. My own experience is that I have trouble digesting a logical conclusion about a moral quandary if it does not correspond to my intuitive response to that quandary. (Sometimes logic is right, but I would guess that it is a pretty good idea to reexamine logical conclusions that seem intuitively wrong.)

I am sure that Liu and Ditto (faculty at my alma mater) would class me as a liberal. That may color my judgment, but I like their conclusion that conservatives are more likely to justify intuitive beliefs by selective reading of evidence than are liberals.

Of course, one's leaning toward conservatism or liberalism is a brain behavior. It is also likely to be, at least in part, a learned behavior of the brain. It cannot be a coincidence that there are more liberals in some places than in others, nor is it likely that conservatives have chosen en masse to move to red states and liberals to blue states. The culture of the society in which one grows up must influence one's core beliefs. Those beliefs must be coded in one's brain in some way, and must influence other functions of one's brain.

I would express a difference with one thing implied in the quotation above, concerning the behavior of people with a lot of knowledge about a subject. Someone who knows a lot about, say, the death penalty may actually have learned quite a bit about the costs and benefits that follow from legislation imposing the death penalty. That person may have come to believe that the death penalty is right (or wrong) by weighing the benefits against the costs and concluding that the policy is justified (or not). Moral judgment may be resistant to change, but it is not impervious to reason. For such a person, it would not be surprising to find that her judgment of relative costs and benefits corresponds more closely to her conclusion than does that of the average uninformed person.

