Thursday, August 25, 2005

"Lessons of War"

C-SPAN program website:

"Robert McNamara, former Defense Secretary in the Kennedy and Johnson Administrations (1961-68) discussed lessons he learned from his involvement in the Cuban Missile Crisis and the Vietnam War. He also discussed clips from the documentary about his life The Fog of War, which recently won an Academy Award for Best Documentary Feature."

I found this a very interesting discussion. McNamara said that it was not until 30 years after the Cuban missile crisis that he learned from Fidel Castro himself that there had been missiles in Cuba with nuclear warheads, and that Castro had recommended to Khrushchev that they be used, even recognizing what such use would mean for Cuba. The comment raises a number of questions: Did McNamara remember correctly? Did Castro tell the truth? But it clearly illustrates McNamara's point about how difficult it is to find "truth" in the fog in which crucial decisions are made.

McNamara also pointed out that in the Cuban missile crisis, there were men at the elbows of the key actors in the decision process who had lived in and studied Russia for a long time, and could advise out of their understanding of Russians and Russian culture. He said that during the Viet Nam decision making, there were many fewer people who understood North Viet Nam and its leaders' thinking, and that they were at a far greater distance from the key decision-making venues. In the case of Iraq, the situation seems to be even worse. Only a few people come out of the universities each year with an understanding of Arabic and Muslim culture, and very few people with long experience of Arab and Middle Eastern culture are senior advisors in the U.S. government.

I had been wondering about Japanese involvement in World War II. It seems that it was based on the belief that the United States and United Kingdom would make a reasonable settlement, and not engage in full-scale, protracted war after the initial successes of the Japanese military. If so, the belief was fatally flawed, and millions died as a result. How could a government come to such a false belief? If I understand McNamara's point, governments can make such mistakes easily, and often do. He says it is important for decisions to be made in the knowledge that they may be wrong, based on wrong assumptions and/or wrong information. He suggests that it is often better to delay action, gather more information, and analyze more deeply rather than move immediately. He stresses how difficult it is, and how necessary, to get people with alternative views to speak up in policy-making fora, and how important and how difficult it is to reexamine basic assumptions (such as the “domino theory” – that if one country turned Communist, others would necessarily follow – which was not challenged until 1967, during the Viet Nam war, and proved false). And he suggests that if all your allies disagree with your analysis and the resulting policy, it is best to reexamine it in the understanding that governments are fallible.

One can assume that for McNamara it is critical to have an exit strategy in case the decision to go forward was erroneous, and to have the means to make mid-course corrections.

All of this seems reasonable enough in theory. But it must be hard to achieve in practice, since so many problems arise from failure to follow such steps.
