Friday, March 02, 2007

"The Science of Getting It Wrong: How to Deal with False Research Findings"

Read the full article subtitled "The key may be for researchers to work closer and check one another's results" on ScientificAmerican.com (February 27, 2007).

Thanks to Julianne for this reference:
In his widely read 2005 PLoS Medicine paper, Ioannidis, a clinical and molecular epidemiologist, attempted to explain why medical researchers must so frequently retract past claims. In the past few years alone, researchers have had to backtrack on the health benefits of low-fat, high-fiber diets and on the value and safety of hormone replacement therapy, as well as the arthritis drug Vioxx, which was pulled from the market after being found to cause heart attacks and strokes in high-risk patients.

Using simple statistics alone, without data from published research, Ioannidis argued that the results of large, randomized clinical trials—the gold standard of human research—were likely to be wrong 15 percent of the time, and that smaller, less rigorous studies were likely to fare even worse.
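
The "simple statistics" behind that kind of estimate amount to Bayes' rule: how often a significant result is actually true depends on the pre-study odds that the hypothesis is right, the significance threshold, and the study's power. A minimal sketch in Python (the parameter values are illustrative assumptions, not figures from the article or from Ioannidis's paper):

# Positive predictive value of a single "statistically significant" finding.
# All numbers here are illustrative assumptions, not values from the article.
def ppv(prior_odds, alpha=0.05, power=0.8):
    """Probability that a positive result reflects a true relationship."""
    true_positives = power * prior_odds  # true relationships that test positive
    false_positives = alpha              # null relationships that test positive
    return true_positives / (true_positives + false_positives)

# A well-powered trial of a hypothesis with even (1:1) pre-study odds:
print(round(1 - ppv(prior_odds=1.0), 2))              # ~0.06 of positives are wrong
# A small, underpowered study of a longer-shot hypothesis:
print(round(1 - ppv(prior_odds=0.25, power=0.4), 2))  # ~0.33, far worse

Under generous assumptions a positive finding is rarely wrong; with long-shot hypotheses and low power, a large fraction of "significant" results are false, which is the heart of the argument.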

Among the most likely reasons for mistakes, he says: a lack of coordination among researchers and biases such as a tendency to publish only results that mesh with what they expected or hoped to find. Interestingly, Ioannidis predicted that having more researchers in a field is not necessarily better—especially if they are overly competitive and furtive, like the fractured U.S. intelligence community, which failed to share information that might have prevented the September 11, 2001, terrorist strikes on the World Trade Center and the Pentagon.

But Ioannidis left out one twist: The odds that a finding is correct increase every time new research replicates the same result, according to a study published in the current issue of PLoS Medicine. Lead study author Ramal Moonesinghe, a statistician at the Centers for Disease Control and Prevention, says that for simplicity's sake his group ignored the possibility that results can be replicated by repeating the same biases. The presence of bias, he adds, reduces but does not erase the value of replication.
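
A back-of-the-envelope way to see how much independent replication helps, in the same Bayesian spirit (the prior odds, significance level, and power below are illustrative assumptions, not numbers from the Moonesinghe paper, and the sketch treats the replications as independent, i.e. it ignores shared biases, much as his group did for simplicity):

# How the odds that a finding is correct grow with each independent replication.
def ppv_after_replications(prior_odds, n_positive, alpha=0.05, power=0.8):
    """PPV after n_positive independent studies all come back positive."""
    # Each independent positive result multiplies the odds of a true
    # relationship by the likelihood ratio power/alpha.
    odds = prior_odds * (power / alpha) ** n_positive
    return odds / (1.0 + odds)

for n in range(1, 4):
    print(n, round(ppv_after_replications(prior_odds=0.25, n_positive=n), 3))
# Prints roughly 0.8, 0.985, 0.999: each replication sharply raises the odds
# that the finding is real, provided the studies do not simply repeat the same biases.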
Comment: Replication of results is important! JAD
