Wednesday, July 30, 2003


“Half of what one learns in medical school is not true,” according to rumor. The problem is, no one knows which half is true and which is not. So today, medical students (if they are lucky) are taught to be open to new knowledge and to reject existing knowledge as better scientific evidence and theory accumulate.

There is an old debate about when in history a visit to the doctor changed from being more likely to do the patient harm, to being more likely to help the patient – a medical tipping point or divide. The estimates I have seen range over the first half of the 20th century. There is enough medically caused illness to have a word for the situation: “iatrogenic”!

An interesting thing about medical knowledge is that some of the “untrue knowledge” can be useful. The “placebo effect” is well known – people often get better when given sugar pills (or placebos) when they believe them to be medicine. Tell someone he is getting a sugar pill that will do no good, and it will do no good; lie and say it is a medication that often helps conditions like those of the patient, and the patient may benefit.

It seems to me that Knowledge for Development practice has to be based on recognition that a lot of knowledge is not true. In areas like medicine, engineering, agronomy, etc., I think it is probably important to develop knowledge systems that allow for application of the best available knowledge, but also a continuing process of improvement of knowledge – of substitution of knowledge that is more true for knowledge that is less true.

Sometimes it is most important to provide systems for the timely construction of truth. The Garbage Can theories of management would suggest that it is more important for organizations to develop a consensus around which action can be programmed than it is to achieve more fundamental epistemological values. Courts in the United States hold that the innocence of a person convicted of a crime is not sufficient basis for appeal; only failure in the fairness of the process by which the conviction was arrived at qualifies. Certainly the knowledge processes in legislative bodies seem more focused on decision making than upon achieving truth.

I have been reading Local Knowledge: Further Essays in Interpretive Anthropology by Clifford Geertz. It includes a chapter about charisma, and the extremes to which kings went to establish the (false) knowledge in the people that they ruled that their monarchy was somehow inevitable. His examples are drawn from before the age of mass transportation and mass media, and he describes monarchs traveling for large chunks of time with enormous retinues to impress the people.

His examples bring to mind Saddam Hussein in Iraq, who used so many means to convince the people that he was special. The Iranian monarchy sought to link itself with thousands of years of history to show that it was somehow entitled to rule Iran. The British brought in descendants of the Prophet when they organized Transjordan and Iraq after World War I, seeking similar legitimacy.

But as a Yank, I have always wondered why the British put up with the monarchy there. (Bob Hope said his family was English, they were too poor to be British.) Would anyone agree to give a German immigrant family that had renamed itself Windsor the lands and properties it now holds, and the income it still receives, to do what the British Royals do? The idea of the divine right of kings ought long ago to have been tossed on the dump heap of history.

So how does a society decide when some political knowledge is wrong, and when knowledge is not only wrong but so costly that it should be corrected in the public domain? I think the United States should be making such a political decision soon.

When the Bush Administration took the U.S. to war in Iraq, it did so on the basis of two assertions: that the Iraqis had, or soon would have, weapons of mass destruction, and that the Iraqi Government was dealing with international terrorists. It went to war over the opposition of many Americans and many foreign governments.

A critical issue is timing. I think it is agreed that the Iraqi Government had failed in its obligation to demonstrate that it had destroyed any weapons of mass destruction that it had developed, and in its obligation to show it did not possess such weapons. I think it is clear that the United Nations and other international institutions were not effective in enforcing those obligations. Was there a threat so imminent that the United States could not wait for new institutions to evolve, or for the major powers to achieve consensus about war?

Frankly, I think that if a President of the United States had reason to believe that a foreign government was providing or was about to provide weapons of mass destruction to terrorists in order that those weapons would be used against the United States, it would be appropriate to regard that as an act of war, and intervene militarily.

But some of the assertions made by the Administration, and indeed by the President in the most formal of possible settings, seem not to have been true. The search for weapons of mass destruction has been unsuccessful to date, and the public has not seen any evidence that the Iraqi Government was supporting Al Qaeda. If the Administration could not distinguish true from false information on so important an issue, perhaps it should be replaced. If the Administration deliberately provided false or misleading information to enable it to take the United States to war, I would think it definitely should be replaced.

The U.S. presidential elections will be held in 2004. It will be interesting to see if better evidence accrues in the next year on Iraqi weapons programs and Iraqi support for terrorism. If it does, I wonder whether the electorate will care? If the situation remains opaque for Americans, as I suspect it will, I wonder what the result will be.
