Tuesday, August 31, 2010

Asteroid Discovery From 1980 - 2010

This amazing little video charts the location of every asteroid discovered since 1980. As we move into the 1990s, the rate of discovery picks up quite dramatically because we’re now working with vastly improved sky scanning systems. And that means that you will especially want to watch the second half of the video. Below the jump, I’ve pasted some more information that explains what you’re seeing. Thanks to Innovation Daily for posting this video.

To read the full, original article click on this link: Open Culture

Sunday, August 29, 2010

More on the Human-Built World

This is my second posting on reading Thomas Hughes' Human-Built World: How to Think about Technology and Culture. (Click here to see the first.) The third chapter of the book is titled "Technology as Machine". In that chapter, Hughes surveys a number of public intellectuals who wrote about technology and society/culture during the Second Industrial Revolution -- from 1880 to the end of the 1920s. The writers discussed are primarily from the United States and Germany, countries that were among the most technologically advanced at the time, but that differed significantly in culture. The Second Industrial Revolution was marked by the spread of electrification and of the internal combustion engine in vehicles, phenomena that must have been observed by most Americans and most Germans. I would note, however, that some observers thought that at the end of the 1920s the average patient visiting a doctor would be worse for the visit -- the revolution in medical technology had not yet occurred.

While Hughes has provided a valuable historical survey of thinking at the time, the effect on me was quite negative. The authors he considers were obviously men of considerable intelligence and learning, widely respected in their own time. Not only did they disagree widely among themselves, but they all seem to have been wrong in important respects in their predictions of the impact of technology. I note that the discussion also demonstrates "temporal relativism". That is, many of the authors display values (which they clearly must have believed to be widely shared) which are not widely shared today.

I don't know whether Hughes missed it or chose not to emphasize it, but the thinking of the time seems to have ignored the fact that in the late 19th and first half of the 20th century the Second Industrial Revolution, like the original Industrial Revolution, had not reached most of the world's population.

I have also been reading Dangerous Games: The Uses and Abuses of History by Margaret MacMillan. In the book she writes about the popular interest in history, defending the role of professional historians, and complaining not only about those who deliberately falsify or misuse history, but also about the authors of popular histories who, despite not knowing enough history and making errors in their texts, are increasingly widely read. In the text, MacMillan also makes a large number of brief references to historical events and/or interpretations. The book, like that of Thomas Hughes discussed above, leaves me impressed not only by the amount of false history in the average mind, but by how often very good historians seem to have misperceived major trends or misinterpreted the meaning of those trends. What chance have we to do better now?

I have finished four chapters.
  • "The History Craze" in which MacMillan seeks to convince the reader that there is a lot of interest in the past. Surprisingly, she does not tell us whether there is more or less interest in the past now than there used to be. I also am not sure that kids confronting history in school are necessarily interested in it, nor that those interested in tracing their own genealogy are interested in history in any larger sense. As I look at the material covered in the History Channel, the Military History Channel, and the Biography Channel I wonder whether the interest is general or limited to certain periods of history in certain locations (have you ever tried to find good histories of ancient Africa?).
  • "History for Comfort": The theme of the chapter is that we often seek to read history to obtain comfort, and I suppose that may explain why histories of events in which our ancestors behaved well or even heroically are so popular. Of course, there is nothing wrong with taking comfort from consideration of happy or successful times in the past, but there is a danger that in so doing one will bias one's understanding of the world and the challenges it presents.
  • "Who Owns the Past": While MacMillan says we all own the past (perhaps implying that no one owns it), she clearly believes that those who have the best claim are those who study it carefully, preferably academically trained to do so, and who convey their findings honestly. She makes an important plea for professional historians to write history that is not only accessible to the intelligent reader, but also likely to attract wide attention to relevant history.
  • "History and Identity": MacMillan does not say so, but our identities are determined in significant part by the culture we inherit, and that culture has evolved over time in response to historical events and trends; thus one better understands one's identity by understanding how its cultural components resulted from the history of one's society. She seems more interested in the manipulation of identity of nations, ethnic groups, and other groups (religions, genders, etc.) by the use and misuse of history. Clearly Hitler and the Nazis provide an extreme example of misuse of a meretricious historicism for propaganda. Equally clearly, MacMillan values historical fact per se, and I suspect she would promulgate the truth especially when it challenges widely held ideological concepts. I wonder however whether founding myths can not play a useful role for some social purposes such as nation building in tribal societies, or whether it might be useful sometimes to emphasize virtues and whitewash defects in founding fathers to encourage faith and adherence to social institutions.
The first rule of the hard sciences is not to believe an experimental result until it has been replicated and replicated again. Unfortunately, while we may interpret history as repeating itself, we can not replicate historical events under controlled conditions.

The hard sciences are like history, however, in that scientists like historians have often been wrong.

Looking at the history of technology, I find some comfort in numbers. It is clear that people are living longer today than 100 or 200 years ago. There are more people who live comfortably, and a smaller percentage of people who live in dire poverty. We can chart the trends, and have some confidence that they will continue for at least a limited time in the future, remembering of course that people believed the stock market trends of the 1920s would continue right up to the crash.

I suppose the lessons of Hughes and MacMillan are to seek truth, to believe evidence, to be humble in one's pretensions toward wisdom, and to hold beliefs conditionally. Scientists believe theories that have strong support, distrust new data until it has been replicated, and always hope for the conventional wisdom to be demonstrated wrong and thus to allow exciting new intellectual opportunities. Not a bad habit of thought. Still, in the real world, it may be best to act as if we believe in global warming, anthropogenic decimation of biodiversity, and other environmental problems since such actions can help avoid those very problems.

Saturday, August 28, 2010

More on the stem cell research decision

Bob Parks writes:
On Monday federal judge Royce Lamberth, appointed to the federal bench by Ronald Reagan, blocked the use of federal funds for research using embryonic stem cells on the grounds that extracting the cells kills human embryos. It is of course true that, for good or ill, every embryo has the potential to become a totally unique human being. The same is true of every zygote created by the fusion of gametes in an in-vitro fertilization Petri dish. One or more of the resulting embryos will be transferred to the patient's uterus a few days later. There will typically be many embryos left over. They are stored cryogenically in case a second transfer is necessary. By 2008 about 500,000 frozen embryos had accumulated in cryogenic facilities around the United States. That would be closer to 1 million by now, all of which retain the potential to become unique human beings. Does Judge Lamberth’s decision mean that society must now assume responsibility for the continued viability of this growing population of potential people?
Sally Lehman writes in the September edition of Scientific American:
When researchers first demonstrated in 2007 that human skin cells could be reprogrammed to behave like stem cells that can fully differentiate into other cells, scientists and politicians alike rejoiced. All the potential of embryonic stem cells might be harnessed with the new techniques—without the political and moral controversy associated with destroying a fertilized egg.

That optimism, however, may be misplaced; these transformed cells, known as induced pluripotent stem cells (iPS cells), actually present equally troubling ethical quandaries, according to bioethicists who met at the International Society for Stem Cell Research annual meeting in June. Not only do many of the ethical challenges posed by embryonic stem cells remain, but the relative ease and low cost of iPS techniques, combined with the accessibility of cells, accelerate the need to address futuristic-sounding possibilities such as creating gametes for reproduction. Scientists have already reported progress in growing precursor cells for eggs and sperm from both iPS and embryonic stem cell lines.

Clearly we need to discuss in some detail the ethics of stem cell research and of the techniques that will be generated by this research. Of course those whose ethical positions are influenced by their religious views (either on the embryo or the responsibility to people who might be helped by new medical techniques but will not be due to delays in the research) have the right and perhaps the responsibility to put forward their views in that discussion. On the other hand, let's not let Congress slip policy into amendments to funding bills without proper debate, nor let judges engage in judicial activism without full discussion and the participation of those scientists who can best estimate the importance of stem cell research.

Thursday, August 26, 2010

Money is the Mother's Milk of Art

Last night I saw an interview with Philippe de Montebello, the former Director of the Metropolitan Museum of Art in New York. He mentioned that when the Met bought Rembrandt's Aristotle Contemplating the Bust of Homer ($2.3 million in 1961) there were only three bidders in the running, the major museums in New York, Washington and London. He went on to say that in the subsequent half century billionaires had entered the market in a big way and the auction prices for important works were more than the museums could afford.

He said that museum directors had enjoyed a couple of centuries as the arbiters of public taste in art, but now -- as prices set the standard for taste -- those with the most money become the arbiters. That apparently was bad news for those hired by the print media to be art critics.

In the West it seems to me that in the Middle Ages it was the church which had the money to pay for art and the say as to what was hung in the churches. In the Renaissance, it was the aristocracy that supported artists and set the taste in cooperation with the church. Later, as trading cities developed wealth and an independent merchant class, one found schools of art that met their taste. The major art museums that seek to provide their visitors with a historical view of Western art thus track the taste of those who had the money and power to commission art over history.

I note that taste has changed in the past couple of hundred years. The impressionists were a group that challenged the dominant taste of their day. In the early part of the 20th century, 18th century paintings by Reynolds and Gainsborough set price records that were not matched for decades. The latter part of the 20th century went through a series of schools of art, each out-competing the last by being newer and more revolutionary.

In the interim, schools of art created by local groups or ethnic groups in the United States gained their own audiences, even if left out of the major museums in the capitals of art. Thus the third largest art market in the United States is in Santa Fe, New Mexico, a relatively small city that has a strong market for art by native Americans. There is a market for western art focusing on cowboys, for California art, for paintings of the New England sea coast, etc.

The experts who appreciate works of art and artists that had significant influence over later artists have a justifiable point of view, but de gustibus non est disputandum (in matters of taste there is no dispute).

Where does all the computer power go?

According to the New York Times, the U.S. Department of Defense now operates "15,000 networks and 7 million computing devices in dozens of countries". While many, probably the majority, of these 7 million computing devices are personal computers or their equivalent in power, some are undoubtedly large-scale machines. That is a lot of computer power!

Gray Wolves Back on the Endangered List

Article Source: Virginia Morell, ScienceInsider, August 6, 2010

"Hunted last year in Montana and Idaho, the Northern Rocky Mountain gray wolf (Canis lupus) is once again on the federal endangered species list. Yesterday, a federal judge in Helena overturned the U.S. Fish and Wildlife Service's (FWS's) decision last year to remove the wolves from the list in those two states but leave them on it in Wyoming.

"Conservationists applauded U.S. District Judge Donald Molloy's decision, but state wildlife officials in Montana and Idaho argue that the wolves' rebounding population needs to be better managed, including being hunted."

This is good news for those who are happy just to know that these animals can continue to live in the wild in the United States. There are now close to 2000 wolves in the three mountain states. The wolves were eradicated in the United States, but some came down on their own from Canada and others were reintroduced. 2000 is a very small number of animals to maintain a species, and the existence of a subpopulation in Canada, while great, is not a substitute for one in the contiguous 48 states.

Behind the Scenes of "Hummingbirds"

Congress should act to restore stem cell research

Image source: Scientific American

A federal judge this week found that federal funds could not be used to finance any stem cell research that involved embryonic stem cells, finding that doing so would contravene the Dickey-Wicker Amendment that has been added to appropriation bills for the last decade. The judge, Royce Lamberth, has been responsible for a number of highly visible cases since assuming his post, and it seems likely that he has made a reasonable interpretation of the law, albeit one that is more draconian than that of either the Bush or the Obama administration.

Since stem cell research offers major advances in both scientific knowledge and medical technology, and since embryonic stem cells still seem to have properties that can not be duplicated by treatment of stem cells from other tissue sources, I would hope NIH will soon resume funding embryonic stem cell based projects.

The solution is clear. The fiscal year is almost over. The Congress, in passing the next fiscal year's appropriations, should either not include the Dickey-Wicker Amendment or it should revise the Amendment to make clear that Congressional intent is not to ban embryonic stem cell research.

The fall of Constantinople

1453: The Holy War for Constantinople and the Clash of Islam and the West

I just finished reading 1453: The Holy War for Constantinople and the Clash of Islam and the West by Roger Crowley. Crowley tells of the decline of the Byzantine empire and the rise of the Ottoman empire leading to the final siege and conquest of Constantinople. The long section of the book dealing with the actual battle for the city is as interesting as a novel, and is pieced together from sparse and sometimes conflicting accounts.

I had associated the fall of Constantinople with the end of the eastern trade of Venice and Genoa, which this book implies is a gross simplification. The Genoese city of Galata across the Golden Horn from Constantinople/Istanbul was left intact, and Latin populations continued to be allowed to live in Istanbul after the conquest. Apparently trade trailed off and became more expensive, encouraging Europeans to seek alternative trade routes.

Similarly, I had read that the conquest had led to the flow of Byzantine scholars into Italy, but that flow and the Renaissance had started long before the fall of Constantinople; it does seem reasonable that there might have been an increment to the flow with the fall of the capital.

Crowley is not a professional academic historian, but he does provide an interesting historical narrative which seems authoritative. I recommend it.

Monday, August 23, 2010

Thoughts on the Human-Built World

I have been reading Human-Built World: How to Think about Technology and Culture by Thomas Hughes. Hughes is a historian who specializes in histories of technological developments, and I previously read his book, Networks of Power: Electrification in Western Society, 1880-1930. Hughes defines "technology" as the technology that engineers work on -- civil works, electrical works, etc. Thus Hughes appears to think in terms of examples from infrastructure development.

I agree that engineers are professionals trained to work in certain technological fields. I think Hughes and I would also agree that manufacturing is based on technologies, including craft and other technologies that are not normally associated with engineers. I also think of agricultural, forestry and fishery technologies involved in primary production industries. Increasingly service industries are considered in terms of their technologies, such as medical technology, educational technology and the information technology underlying modern financial services.

Of course most of what I perceive on a day to day basis is a human-built world. I live in a house, in a neighborhood in the suburbs of a city. I take roads to shop in a shopping center, or the Metro to the city to talk to people with whom I used to work. On the other hand, I am lucky enough to live in the United States, and can drive through huge areas of forest and desert, or visit the coast to view the ocean, or on my lucky days visit a national park. It is easy enough for me to recognize that I live in a world that has been heavily influenced by man with some areas (on the surface) that are human constructed.

Hughes points out that the first colonists of what is now the United States thought that they were going to civilize a howling wilderness. Their attitude bemused the native Americans, who did not perceive themselves to be living in a wilderness, since they lived in villages with paths between them, growing some crops, and surrounded by what they found comfortable if more natural environments. Of course, humans had occupied North America for some 13,000 years and were the keystone species. Charles Mann, in his book 1491: New Revelations of the Americas Before Columbus, writes that native Americans had selectively enriched forests with trees that produced mast which formed an important part of their diet, and selectively hunted so as to help assure game animals would be found near their villages. Their technology was more of a husbanding of natural resources than of terraforming lands to produce farms.

In his second chapter, "Technology and the Second Creation", Hughes appreciates that modern people with our secular view of technology find it hard to understand or appreciate the religious views of technologies of some of our ancestors. He is certainly right about me! Still he seeks to make the case that while some important thinkers saw human technological efforts as diabolic, linking men with the fallen angels in their hubris, other thinkers saw human technological efforts as realizing a divine plan, and indeed preparing a second Garden of Eden in preparation for the second coming. He cites writings of early American colonists who saw the conquest of the wilderness in that sense of preparing a second Garden of Eden. I can appreciate the diabolic and godly concepts of human technological efforts as a powerful metaphor for modern technology policy debates.

Having written some pieces on technology in the past in which I was trying to influence the opinions of my readers, I can't help suspecting that what people in the past wrote about technology was not what they really believed. Even more to the point, I suspect that the earliest colonists, seeking to scratch a living out of an unfamiliar land for which they were not prepared, and to survive a sequence of diseases made more difficult by malnourishment, did not spend as much time thinking about their divine mission as about how to get the next meal. Indeed, as colonists became more successful in later generations, I suspect that they were thinking more about expanding that success and living better lives -- focusing on day to day problems and leaving religion to Sunday meeting and perhaps a limited period of daily prayer.

On the other hand, I have just read the chapter of 1453 about portents and omens perceived by the Byzantines and Ottomans during the final days of the siege of Constantinople. I find it easy to believe that those people were consumed by fears and hopes based on the mystical, and I find it easy to believe we are better off today dealing with a more secular view of technology. Still, Hughes' history is helpful as we try to understand history and how we got to where we are now.

Friday, August 20, 2010

We lucked out on the last flu pandemic, but don't count on doing so on the next!

John Barry, the author of The Great Influenza: The story of the deadliest pandemic in history, has an article in the current Foreign Policy magazine with some good common sense on flu as a public health problem. The World Health Organization data on the number of cases and number of fatalities is based on laboratory-confirmed cases of H1N1 flu, and thus massively undercounts the real burden of the disease. Still, we are lucky that the impact of that flu was much less than that of previous pandemics.

We don't know when a new strain of flu will emerge capable of causing a major pandemic, but the world is still not ready for it despite progress made in recent decades. Early case finding, perhaps the most critical capacity, has improved but is still not at the level one desires. Vaccine production technology has improved, but in February 95 countries told WHO that they had no flu vaccine at all. New approaches to applying the vaccine are in the works, but the ability to get the vaccine into huge numbers of vulnerable people, as is required to limit a pandemic, is just not there.

On the other hand, a number of public health officials overreacted to the H1N1 (swine flu) epidemic:
Egypt, for example, slaughtered its entire pig population; Singapore warned citizens that violating a quarantine order would result in jail time. Mexico, where the 2009 outbreak began, was punished harshly for its transparency: France demanded that the European Union cancel flights to the country, and some U.S. commentators wanted the border shut. In total, the Mexican economy lost nearly $3 billion. This kind of overreaction only encourages governments to keep quiet the next time a virulent flu strain hits.

Will personal computers have a capacity of 10^16 cps by 2025?

Source: Nic Brisbourne in The Equity Kicker

Brisbourne is writing a series of articles on Ray Kurzweil's The Singularity Is Near: When Humans Transcend Biology. This one deals with Kurzweil's projections of the future of the personal computer. Kurzweil's prediction is that by 2025 personal computers (perhaps costing less than $1,000 in today's money) will have the power of the human brain:
personal computers, or at least those back in 2005 when The Singularity Is Near was published, provided 10^9 cps, and an extrapolation of historical increases in computing power going forward yields the prediction that personal computers will have a capacity of 10^16 cps by 2025.
The question is, what will we do with a personal computer that is 10,000,000 times faster than the one on which I am typing this? Of course, I can imagine (easily) that the average user of personal computers will be a few orders of magnitude smarter and faster than I am. I can also imagine that software developers will continue to create packages that provide small advances in features to the consumer but require a lot more computer to run. Still, one hopes that there will be new killer apps that utilize the stunning computer power to come to do something new and really useful! I hope some really bright guys, maybe at MIT or Stanford or maybe at the Indian Institute of Technology, are working on those apps right now!
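Kurzweil's extrapolation implies a specific, checkable growth rate. The little sketch below (my own back-of-the-envelope illustration, not code from the book or from Brisbourne's post) computes the annual growth factor and doubling time implied by going from 10^9 cps in 2005 to 10^16 cps in 2025:

```python
import math

cps_2005 = 1e9    # personal computer capacity in 2005, per Kurzweil
cps_2025 = 1e16   # predicted capacity in 2025
years = 2025 - 2005

# Overall speedup across the 20-year span: 10^7, i.e. 10,000,000x
speedup = cps_2025 / cps_2005

# Constant annual growth factor implied by speedup = factor ** years
annual_factor = speedup ** (1 / years)

# Doubling time in years, from 2 = annual_factor ** t
doubling_time = math.log(2) / math.log(annual_factor)

print(f"speedup: {speedup:.0e}")                        # 1e+07
print(f"annual growth factor: {annual_factor:.2f}")     # about 2.24
print(f"doubling time: {doubling_time:.2f} years")      # about 0.86
```

In other words, the prediction assumes capacity doubles roughly every ten months, noticeably faster than the classic 18-to-24-month Moore's law rhythm.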

Thinking about the Green Revolution

The Green Revolution was basically the result of the application of science and technology to the problem of improving agricultural productivity in developing nations. I think the narrative line is something like this:

  • The densely populated countries of Asia were facing major food shortages in the early 1960s. They were not producing enough food to feed themselves and did not have the huge resources that would have been necessary to make up the shortfall. This was only the most extreme example, and in most of the developing world, most people worked in low yield agriculture and their productivity had to be increased to enable national economic development.
  • With Rockefeller and Ford Foundation support, the International Maize and Wheat Improvement Center (CIMMYT) in Mexico had shown that new varieties of these basic grain crops could be developed with greatly increased yields, and that grain production in Mexico had been increased by widespread use of the new varieties. It was recognized that the approach offered a chance to improve yields all over the developing world.
  • The International Rice Research Institute was created in the Philippines to expand the CIMMYT work to that crop, and other International Agricultural Research Centers were later created and joined together under the coordination of the Consultative Group on International Agricultural Research (CGIAR). Long-term support of the CGIAR was achieved through a collaborative approach of several donors.
  • Potentially revolutionary advances in technology were achieved in getting important qualities necessary to improve grain production into improved cultivars of rice and other grains.
The key innovation apparently was the transfer of dwarfing genes into high yielding varieties of grains. Shorter varieties with strong stems could hold more grain weight without lodging (falling over, which ruins the grain). Other traits such as disease resistance, insect pest resistance, the ability to grow in different conditions and improved nutritional characteristics were then incorporated into improved varieties over time.
  • There was then a huge job of adapting these varieties to the local conditions in individual countries, done by national agricultural research services. Then seed for the improved varieties had to be produced in industrial quantities. Farmers had to be taught to use the new varieties, a job done largely by new agricultural extension services. The human resources to run the field stations and extension services had to be trained, largely by a new network of national agricultural universities. National governments with the support of the donor community carried out the institution building of the research services, extension services and agricultural colleges. There was a lot of training of developing country agricultural leaders in U.S. and other developed country graduate schools.
  • The improved varieties, in order to achieve their potential yields, required irrigation, fertilizers and pesticides. Road systems had to be improved to facilitate movement of inputs and outputs of farming. Input and output markets had to be developed. Grain storage facilities had to be built to handle the product.  The investment needs were huge in this effort, and the donor community helped developing countries build the capacity to provide these inputs for their farmers, especially in Asia.
  • The result has been a huge increase in grain yields in Asia and Latin America, and a lesser increase in Africa where for many reasons the improved varieties were less widely accepted and used.
Thus the story is one in which innovative scientific research in IARCs proved successful. Then developing countries innovated through the use of the improved genetic materials naturalizing them to their own conditions and disseminating them to their farmers. Finally, heavy investments had to be made in the rural sector to enable farmers to realize the potential in their new materials.

The result was that the food production per farmer increased in much of the developing world. Since each farmer could produce more food, the largest movement from rural to urban areas in history was made possible. The new urban populations took jobs in manufacturing and service industries, fueling economic development. Perhaps most important, the feared famines were averted.

The United States was supportive at all stages. That support involved a partnership among government (especially USAID), foundations, land grant colleges and NGOs, not to mention the large number of individuals who made careers in this form of development assistance. The model of U.S. agricultural development was widely shared, and U.S. experience of improved agriculture fueling overall economic development was replicated in many nations.

The story illustrates an important aspect of the role of science and technology in development. A body of scientific information has accumulated over time, including information about the genetics of the grain species and their varieties, the diseases and pests that affect these crops, soils and growing conditions, etc. That information includes understanding from the social sciences about the economics of farming and food markets and understanding of the dissemination of innovations.

This scientific wealth informed specific efforts to improve varieties of the major grain crops. This effort could be seen as applied science or technology development. The social expenditures on science and improvement of varieties were fairly small, as compared to either the total value of global grain production or the value of the increased grain production since the 1960s. It was also small as compared to the total investment needed to transfer the improved varieties from the experimental field stations to the tens of millions of farmers who adopted improved varieties, plus the investments in improved agricultural infrastructure needed to use the improved varieties effectively, plus the increased inputs of fertilizers and pesticides needed to achieve high yields with the improved varieties. However, without the science and technology development, the improved yields would not have been realized, nor would most of the investments have been economically warranted.

When the effort to improve grain yield in developing nations began, U.S. scientific involvement was absolutely critical. The agricultural research capacity of the world was highly centralized in the United States. In the last half century the scientific capacity in developing nations has been greatly increased, but that in the United States still plays a critically important role globally. Focusing U.S. agricultural science in part on the needs of developing nations was a critically important role for the development assistance donor agencies, and remains important today.

Thursday, August 19, 2010

Choosing our technological future

The horseman serves the horse,
The neat-herd serves the neat,
The merchant serves the purse,
The eater serves his meat;
'Tis the day of the chattel,
Web to weave, and corn to grind,
Things are in the saddle,
And ride mankind.

There are two laws discrete
Not reconciled,
Law for man, and law for thing;
The last builds town and fleet,
But it runs wild,
And doth the man unking.

Ralph Waldo Emerson
Ode to William H. Channing
The final chapter of David Nye's book Technology Matters: Questions to Live With is titled "Not Just One Future". Summarizing the book's argument that technology is not the sole determinant of the way we live, and considering some of the angst in literature and film about the future impact of technology, Nye suggests that we should choose our technological future with some care.

We as individuals feel we have considerable choice of technology in many of our common decisions. Consider the shoe as a device for moving on foot and protecting the feet and ankles; this is an apparently simple technology, yet there are many varieties of athletic shoes, of business shoes, of boat shoes, etc. There are high-heeled shoes for women and germ-fighting shoes for diabetics. That choice is affected by advertising; it may be more of a social choice than an individual one, as peer pressure is brought to bear or as people seek to emulate cultural norms in footwear. If there is not enough demand for blue suede shoes, the market will intervene and no manufacturer will produce them. If the government decides to levy a tax against the import of shoes from a certain country (as might happen in a trade dispute), then the consumer will not have the choice of purchasing those shoes at a competitive price.

Of course, in many ways we have little choice of technology. We cannot individually choose the technology used to build the roads we must use, nor the technology that provides the water to our house and office. As workers in a large corporation we may not have a choice of the computer brand we use nor the software with which it is loaded. Such decisions are made by institutions.

In theory, one could commute using an airplane, but in practice flying an airplane requires skills that few people have mastered. If we want people to have free choice of technologies, then we have to provide them with the skills to utilize those technologies. (All too often the trend is to make technological changes to deskill activities, a process that has made work more unpleasant for many and which can create a sense of anomie.)

Similarly, we might want to drive an electric car rather than a car running on a gasoline fired internal combustion engine. To enable people to do so, there would have to be systems developed to manufacture and sell electric cars and to service and maintain them. One can imagine needs to retrain drivers and to modify laws to handle the new forms of traffic.

Nye raises the issue of how we might make better decisions on technology. We might improve the information on which technological decisions are made. That may not be so easy, as it has been difficult to predict the impact of technologies in the past. Who in the days of foot traffic and bicycles would have thought to worry about lack of exercise in a motor vehicle based society? The people who commercialized the telephone did not understand that it would be used as much for family and community life as for business life. Even if the information were theoretically available, it can be hard to get it to the people who make technological decisions, or to get those people to use that information. Think how much we spend to get a medical system to provide adequate information to patients to enable them to make good technological choices on the health interventions available to them. Think about the reluctance of the Bush administration to use the best available scientific information on environmental and regulatory issues.

We might also improve the processes by which technological decisions are made. The use of advisory committees in the legislative, executive and judicial branches of government is one way to do so. So too would be support for civil society organizations, such as think tanks, that analyze technological options and educate the public. Increased media attention to such information could be encouraged to inform both voters and individuals in their technological decision making. Schools could help students develop technological decision-making approaches and skills that could be used throughout life. Much could be done, and I suggest should be done.

This will be the final post on Nye's book, which I have found very thought provoking and worthy of attention. He packs a lot of information and a great bibliography in a short book!

Here are links to the other postings I have made on reading Technology Matters: Questions to Live With:

History of things can surprise!

New technologies seem odd at first but eventually come to seem natural. The Internet and the World Wide Web are new enough to most of us online that we can recall them as novelties, but they are now part of everyday life. The process by which technological novelties come to seem natural has been termed "naturalization". I came across some examples recently.

Before the Civil War there was no standard currency in the United States. The "greenbacks" issued by the Union to deal with the financial crisis created by the war were the first national currency. Prior to the war, many banks issued their own currencies.

Before the Civil War there was no standard for the design of the flag of the United States. Flags had stars and stripes, but the stars on a blue field were in different patterns and the blue fields of different sizes in flags taken into battle by different units of the Union forces. Indeed, early in the Civil War units of the Union and Confederacy sometimes were taken for units of the other force as their flags were mistaken one for another.

Things that seemed natural in the past may no longer seem natural. In Constantinople the general population would often discuss matters of religious dogma in detail. There could be riots in the streets over decisions of the church hierarchy on points of dogma. That too seemed natural to the inhabitants of Constantinople, although it seems strange to us today.

Things that seem natural in one culture may not in another. Governments calculate the inflation rate based on price changes in a "market basket" of goods and services. When I lived in Chile, which at the time was experiencing very high rates of inflation that made life difficult for lots of people, the newspapers would cover the decisions as to the composition of the market basket of goods on which the government would calculate the inflation rate; common citizens would follow the discussion and argue whether the decision was good or bad. It all seemed quite exceptional to my Yank cultural biases.
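The Chilean debates over the market basket make more sense when one sees how mechanical the calculation itself is: the measured inflation rate depends directly on which goods are in the basket and what weights they carry. Here is a minimal sketch of a fixed-basket (Laspeyres-style) price index; the items, quantities, and prices are invented for illustration:

```python
# Sketch of a fixed-basket price index, as used to compute an
# inflation rate. The basket contents here are purely illustrative.
basket = {
    "bread":    {"qty": 10, "p_base": 1.00,  "p_now": 1.15},
    "bus fare": {"qty": 40, "p_base": 0.50,  "p_now": 0.60},
    "rent":     {"qty": 1,  "p_base": 300.0, "p_now": 330.0},
}

def inflation_rate(basket):
    """Percent change in the cost of the fixed basket between two periods."""
    cost_base = sum(i["qty"] * i["p_base"] for i in basket.values())
    cost_now = sum(i["qty"] * i["p_now"] for i in basket.values())
    return 100.0 * (cost_now / cost_base - 1.0)

print(f"inflation over the period: {inflation_rate(basket):.1f}%")
# → inflation over the period: 10.8%
```

Because the quantities are fixed weights, adding or dropping an item (or changing its weight) changes the measured rate; that is exactly why the composition of the official basket was worth arguing about in the newspapers.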

Wednesday, August 18, 2010

Still more on Nye's "Technology Matters"

Chapter 10 of David Nye's Technology Matters: Questions to Live With is titled "Expanding Consciousness, or Encapsulation". Like the previous chapters, the discussion surveys a large number of studies, referring to each briefly.

Nye is concerned that as we live increasingly in a human built environment, we may be losing something that was ours living in a natural environment. It occurs to me that Homo sapiens have been living in a modified environment for a very long time. The settlers who came to America hundreds of years ago found an environment that native Americans had significantly modified. Native Americans and immigrants both lived through the use of technology. Indeed, humans may have evolved into hominid built environments.

Nye may well be tempocentric in his concern for what we are losing as we adapt to new technologies. In the industrial revolution I am sure that people objected to the fact that workers no longer developed great physical strength because machines were increasingly doing the hard work; now values have changed. People who want to develop great strength spend a lot of their free time in the gym, and the rest of us are satisfied with the strength we need to cope with our modern human-built environment. Our valuation of physical strength is not that of our ancestors, who needed it more; our valuation of many mental abilities is greater than that of our ancestors, since those abilities are now more important to success in our society. At one time women developed great manual dexterity spinning and weaving; today lots of us develop a lot of manual dexterity typing.

Nye focuses on the individual and the way the individual thinks. We are social animals, and we think collectively as well as individually. Homo sapiens evolved in small groups, but today we deal with information not only in small groups but in large collective institutions up to the state and intergovernmental organizations. Individually we think differently than we once did. With the Internet at hand, we no longer have to recall all the information we need to use, so we don't train our memories. My father would have seen that as a loss, while the young generation will not.

Clearly small groups today function better in many situations than the most comparable small groups would have in the past. Small groups form on the Internet although they are not geographically close together, and such groups can bring more and better abilities to bear on a task than would a group constrained to be locally available. They can do so with more knowledge and information at hand, with more data processing ability, not to mention more technology of other kinds, than would groups in the past. (A small military unit armed with today's weapons probably would defeat a large force armed as were Roman legions or troops of the Middle Ages.)

I would suggest that not only do large institutions such as global markets and powerful corporations exist because they can and because they are successful where older simpler forms would not be, but new technologies have in my lifetime made such institutions still more effective. Clearly we think better as a civilization as more people work together; thus our knowledge of the world is vastly increased over that of the past.

There is an interesting question as to how people will think in the near future, and whether really major problems will occur. Think of obesity as a result of the change of lifestyle relating to food availability and exercise, leading to increased rates of diabetes, heart disease, etc. So too, the individual in the upcoming technological age may be less successful than he/she might have been in the past.

So too it is possible that as people adapt to new technologies they will prove less able to interact successfully in small groups. Indeed, it is possible that our civilization will not survive the problems created by the way we use technology; technology based threats range from global warming to nuclear winter to a world in which population growth exceeds the growth of food resources.

Alternatively, multi-tasking, plugged in individuals in the future may have evolved more effective ways of living in their evolved technologically-managed environment, while groups, institutions and societies may also live more successfully with the combination of new and retained technologies.

While the observers of evolving technology and human response to that evolution are interesting to read, and their ruminations often are thought provoking, it seems to me that such efforts are often not very accurate. Recall the people who thought that printing, by making the Bible more accessible to all, would make civilization more saintly, or the failure of the people contemplating the implications of computers in their early days to predict the flood of pornography, spam, and online gambling on the Internet.

This is one of a series of postings on Technology Matters: Questions to Live With:

Tuesday, August 17, 2010

Freedom of Expression

The history of freedom of expression is quite long, and apparently early examples of societies providing such freedom include not only the classical Greek city states but also the madrasahs and universities of classical Islamic society (where the university was invented). Of course, historically churches were not pleased by apostates or heretics, and for most of history kings and sultans were not open to criticism. It would seem that censorship gained new urgency for religious and government leaders after the invention of printing, when opinions could be widely shared. Of course, with radio, television, and the Internet added to the print media, there is even more technological capacity to share opinions, and consequently considerable enthusiasm on the part of those who would coerce agreement for censoring free speech.

In the West, it would seem that the resistance to censorship gained importance in the Reformation, with a concern for freedom of religious expression. It gained currency in political documents with the American Bill of Rights and French Declaration of the Rights of Man and of the Citizen.

I quote from Wikipedia:

Areopagitica, published in 1644, was John Milton's response to the Parliament of England's re-introduction of government licensing of printers, hence publishers. Milton made an impassioned plea for freedom of expression and toleration of falsehood, stating:
"Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties."
Today, freedom of expression is enshrined in a number of international agreements. Again, quoting from Wikipedia:

Article 19 of the Universal Declaration of Human Rights, adopted in 1948, states that:
"Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."
Today freedom of speech, or the freedom of expression, is recognized in international and regional human rights law. The right is enshrined in Article 19 of the International Covenant on Civil and Political Rights, Article 10 of the European Convention on Human Rights, Article 13 of the American Convention on Human Rights and Article 9 of the African Charter on Human and Peoples' Rights.
It is hard to see how one can promote knowledge for development without also promoting freedom of expression!

Monday, August 16, 2010

From the modern "Hippocratic Oath"

I excerpt a couple of items from the oath now used on graduation from many medical schools:
I swear to fulfill, to the best of my ability and judgment, this covenant:

I will respect the hard-won scientific gains of those physicians in whose steps I walk, and gladly share such knowledge as is mine with those who are to follow...

I will not be ashamed to say "I know not," nor will I fail to call in my colleagues when the skills of another are needed for a patient's recovery.
These are pretty good rules for any practitioner of knowledge for development.

Technological Progress: Has it made us safer or at higher risk?

Chapter 9 of Technology Matters: Questions to Live With by David Nye is titled "More Security, or Escalating Danger." While the chapter did provoke my thinking, I did not find it well reasoned. (This is in distinction to the previous chapters, all of which I have found to be useful.)

The obvious answer is that technological progress has made us safer. Life expectancy has increased rapidly for a century or more. Generally, the more technologically advanced a region, the healthier the inhabitants and the longer they live.

More basically, advanced technology has harnessed more power and more information to achieve its purposes more effectively. Technological systems are more complex, but more modern technologies are safer to use. Medical technology has advanced in its ability to prevent, detect and treat diseases. Transportation technology has advanced to move more people faster, more cheaply and more safely. Information technology has advanced to obtain and process more information more rapidly and more affordably. Military technology has also advanced to make it possible to kill more people, faster.

The man falling from the top of the Empire State Building was heard to say, as he fell past the 3rd floor, "So far, so good."
The question, of course, is whether our technological systems, our population, and our societies are evolving in such a way as to create risks of later crashes. This has happened to many societies in the past, as Jared Diamond has demonstrated in his book Collapse: How Societies Choose to Fail or Succeed, and it is not clear that we are exempt from that risk.

This is one of a series of postings I have made on David Nye's Technology Matters: Questions to Live With. Here are links to the previous postings:
Chapter 8 of David Nye's book, Technology Matters: Questions to Live With, is titled "Should 'the Market' Select Technologies?". The chapter, like the others discussed so far, is a fairly easy read, referring to scores of items from the literature on technology with references to their sources. As such it can serve as an introduction to the subject.

The chapter mentions American experience with patents and the history of the Office of Technology Assessment. It may have focused a bit too much on poor technology choices by government (suggesting that Congressional support for the Federal Highway system was badly planned and contributed to energy overuse, and suggesting that Swedish and Danish legislative decisions on nuclear energy were better than those of the U.S. Congress.) The Congress funded the first telegraph lines and subsidized the first railroads in the United States, and government bodies supported the development of canals, potable water supplies and sewerage -- all of which seem to have been good decisions.

It might have been useful to point out that decisions on public goods (rather than private goods) have to be made by governments. The issue for these decisions may not be whether they should be made by the public sector, but rather how they should best be made by the public sector. It makes a big difference if the decision is made by legislative bodies, by executive bodies, or by judicial bodies. (The resolution of tort cases might, I suppose, be in effect a social choice for a country, as for example strongly encouraging auto manufacturers to produce safer autos.) In all three cases, it matters whether the actual decision makers are provided with good advice rendered through a good process. I personally favor peer review at the micro level when choosing which technology development projects to subsidize. Science and technology advisory bodies are found in all three branches of government. While these may be composed entirely of experts in technology, they may also include experts from the social sciences to help predict technology impacts, ethicists to deal with ethical advice, and members of the community to deal with cultural values.

We seem to feel that public safety issues often require government oversight. Thus we have drug licensing laws and building codes. In part this is a recognition that the people affected by the technological choice often do not have the knowledge and skill to make such decisions well, nor would it be efficient to require all members of the public to make these decisions from scratch. Think about each commuter making an individual decision as to whether a bridge was safe enough to traverse that day, or an airplane safe enough to ride. Incidentally, the Federal Aviation Administration seems to be the very model of a government agency which has helped society make technological choices to assure public safety, as the safety record of commercial airlines attests.

In the area of medicine, American culture holds that individual choices of drugs and procedures be informed by consultation with physicians. Physicians are licensed, and bodies such as The Cochrane Collaboration (in Civil Society) and the FDA seek to assure that physicians have adequate information to provide good advice. The profession is self regulating, with certification involving proof of continuing education, as well as regulated by government. I belong to an HMO, and the HMO also has bodies which review developments in medical technology, defining norms and standards informing the physician members of the organization. Of course, pharmaceutical manufacturing company executives are making technological choices that affect future health care decisions, as are officials in government agencies such as NIH and officials in foundations funding biomedical research. In short, the choice of technology in the health care field is quite complex, and a dichotomy between Market and Government does not do justice to how choices are made. Still, Nye's discussion raises further questions as to how society should make longer range decisions such as how much to spend subsidizing health care or whether to seek drugs to enhance mental performance. The Bush administration's decision process with respect to stem cell research might have been adduced as an example of what not to do in such decision making.

I was surprised that Nye did not address the literature on diffusion of innovations. It focuses on how decisions are made by large numbers of people who are potential users of a technology. The prototypical study in this field is of how new cultivars spread in farming communities. Each farmer decides which seed he/she will use, but studies indicate that the decision is made communally through a system in which early adopters try out a new variety, and if it is successful, the use spreads to other farmers in the community.
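The communal adoption dynamic in that literature is often modeled as an S-curve. Here is a minimal sketch in the spirit of a Bass-style diffusion model, in which each season non-adopters adopt with a probability driven partly by external information and partly by imitation of the neighbors who have already adopted; the coefficients are invented for illustration, not estimated from any data:

```python
# Sketch of a Bass-style diffusion model. Each season, the expected
# number of new adopters among remaining non-adopters is driven by a
# small constant (external influence, p) plus a term proportional to
# the fraction of prior adopters (imitation, q). Values illustrative.
def diffuse(n_farmers=1000, p=0.02, q=0.35, seasons=20):
    adopters = 0.0
    history = []
    for _ in range(seasons):
        frac = adopters / n_farmers
        prob = p + q * frac                         # per-farmer adoption probability
        adopters += (n_farmers - adopters) * prob   # expected new adopters this season
        history.append(adopters / n_farmers)
    return history

curve = diffuse()
# The result traces the classic S-curve: a slow start among early
# adopters, rapid communal spread, then saturation near full adoption.
```

The imitation term is what makes the process communal rather than individual: a farmer's chance of adopting rises as more of the surrounding community has already tried the new variety successfully.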

Note, however, that new varieties, once developed through traditional practice, were later developed by governmental agricultural research stations, and more recently are increasingly developed by the private sector. The switch from public to private agricultural research was in part due to the development of hybrid varieties that did not breed true, allowing private sector seed companies to protect their investments in new varieties.

Technological advice to farmers can be provided by government, as in the case of agriculture extension services. It can also be provided by the private sector, as in the case of seed stores and retailers of fertilizers, pesticides and farm machinery. The private sector providers may not only provide advice related directly to the products they sell, but may provide more general information as a sales promotion vehicle.

In India, there are networks of community Internet providers which allow for farmers to ask technical questions of farmers in other connected communities. As I mentioned in a previous posting on this book, decisions on building new irrigation facilities, on the allocation of irrigation water, and even on irrigation schedules are made in the rice fields of Bali by water temple priests.

Again, the ways in which technological decisions are made in agriculture are very complicated. Another example comes to mind. The dust bowl phenomenon in the United States was caused by millions of farmers making individual decisions which left large areas of soil poorly protected. Government intervened with a number of programs, from providing soil conservation advice via agricultural extension to developing a soil bank program.

Nye perhaps did not do justice to technology decision making in industry. In a previous chapter he mentioned the mistakes made by manufacturers in failing to respond early and strongly to technological change, as when IBM failed to recognize the shift from mainframe to mini- and then microcomputers. When IBM was making those decisions it held a dominant position in the computer market. Even more to the case were the decisions made by AT&T on handset design when it held a monopoly position in telecoms. In such cases the dominant technological decision maker is a large bureaucratic organization. As Henry Ford famously said, customers could buy a car in any color they wanted, as long as it was black.

I would note that today Intel, Cisco, Boeing and other firms seem to be making very good technological decisions, holding large market shares and advancing the technologies in which they hold major positions. The U.S. military system seems to be making good enough technological decisions that U.S. forces have held a technological advantage in recent engagements. Bureaucratic technological decision making, like that in other sectors, can be effective or ineffective.

This is one of a series of postings on Technology Matters: Questions to Live With:

A Comment on U.S. Higher Education

Source: "Higher Education and the Economy," Grover J. "Russ" Whitehurst, Brookings, AUGUST 09, 2010

The United States has fallen from 1st to 12th place internationally in the percentage of young adults with postsecondary degrees. The question is what this datum means, and what we should do about it. One thing to note is that we are comparing apples to oranges. The United States is perhaps more comparable in this respect to the European Union than to individual European nations. The United States is larger than any European nation, and the country-to-country differences in educational levels are paralleled by state-to-state differences in the United States.

Whitehurst points out:
The relationship between years of schooling and economic output at the national level is complex, to say the least. A small but consistently positive relationship between long-term growth and years of schooling is found in econometric studies, but there are many caveats and exceptions that are relevant to designing higher education policy in the U.S. For one thing there is tremendous variability in the relationship. For example, Germany has a stronger economy than France but half the percentage of young adults with a college degree. Further, France has increased its percentage of young adults with college degrees by 13 percentage points in the last 10 years whereas Germany’s output of college graduates has hardly budged, yet the economic growth rate of Germany has exceeded that of France over this same period. Obviously increasing educational attainment is not a magic bullet for economic growth. Education credentials operate within boundaries and possibilities that are set by other characteristics of national economies. We must attend to these if more education is to translate into more jobs.

A growing body of research suggests that policymakers should pay more attention to the link between job opportunities and what people know and can do, rather than focusing on the blunt instrument of years of schooling or degrees obtained. In international comparisons, for example, scores on tests of cognitive skills in literacy and mathematics are stronger predictors of economic output than years of schooling. Within the U.S. there is evidence that for many young adults the receipt of an occupational certificate in a trade that is in demand will yield greater economic returns than the pursuit of a baccalaureate degree in the arts and sciences.
It seems to me that we need to focus on continuing education. A 21-year-old entering the job market after whatever training is going to face nearly 50 years of work before him/her. I suspect that many of these people should plan for further education to advance within career paths, and even more of them will need further education or training to change jobs/careers in the future.

Surely a lot of this education and training will be provided by institutions of higher education, but a lot of it should be provided by other organizations that provide targeted services to meet specific needs. A lot of it should be online!