The Hidden Ways Manipulated Science Harms Our Health, From Measles To Organics

The current outbreak of measles, the largest since the disease was declared eliminated in the U.S. more than a decade ago, was made possible in large part by a single black mark in the medical research literature — a discredited 1998 study from Dr. Andrew Wakefield that purported to link the measles, mumps and rubella (MMR) vaccine to autism.

The Lancet, the journal in which Wakefield’s study appeared, pulled the study after investigations by a British journalist and a medical panel uncovered cherry-picked data and an array of financial conflicts of interest, among other hallmarks of fraudulent science. Wakefield, a British gastroenterologist, had gone so far as to pay children at his son’s birthday party to have their blood drawn for the research. He had also collected funds for his work from personal injury lawyers who represented parents seeking to sue vaccine makers.

Despite the journal’s retraction and Wakefield being stripped of his medical license in the U.K., the study still succeeded in generating fear and doubt about vaccines. The public health repercussions are still being felt today, as evidenced by the ongoing measles outbreak, which has sickened at least 121 people, according to the latest numbers from the Centers for Disease Control and Prevention. A separate outbreak of mumps, another disease the MMR vaccine protects against, is also emerging in Idaho and Washington state.

Wakefield isn’t the only scientist to leave a legacy of discredited work and serious health threats — although his case may be the most famous and the least ambiguous. The results of fabricated data and other forms of research misconduct often make their way into our policy and public discourse before they are identified and addressed within the scientific community. An analysis published this week in the journal JAMA Internal Medicine found that the U.S. Food and Drug Administration commonly identifies problematic research — from the fraudulent to the mistaken — during its systematic reviews of relevant studies, but rarely reports its findings to the publications in which the studies appeared. Simple sloppiness can result in damaging misinformation and misinterpretations, as can scientists exaggerating their findings in the hopes of gaining publicity or securing future funding. Then, too, there are mainstream journalists who may over- or under-emphasize certain aspects of new research, or who may not fully understand the science they’re writing about.

Combine all of that with a population whose general grasp of science appears to be middling at best, and you have a recipe for an echo chamber of misinformation. “We don’t have a particularly scientifically astute society,” said Dr. Margaret Moon, a pediatrician and bioethicist at Johns Hopkins University. “We need to do a better job helping people understand good versus bad data.”

The Internet seldom helps the situation. Type “vaccine autism” into Google, and you’d think the jury was still out on the MMR vaccine. The first listed link, a paid advertisement, reads: “Vaccines cause autism.” Other links concern an ongoing “controversy.” For the record: Among scientists, there is no controversy. Vaccines are safe.

There is no shortage of scientific data in the world today. Nor is there any shortage of people with stakes in how all that data is created, analyzed and interpreted. Finding better ways to distinguish between honest and faulty science is therefore a matter of growing interest. For example, Retraction Watch, a blog that tracks retractions of scientific papers, is keeping an eye out for such threats — and warning of their potential downstream effects.

Only one to three papers per 10,000 published are ever retracted, according to one 2010 study. But as Retraction Watch reports, many more cases of scientific misconduct are “swept under the rug.”

Moon also noted a growing scrutiny of the peer-review process, especially where financial conflicts of interest may be involved. As some recent examples have shown, corporate money can be funneled into the pockets of academic researchers — unbeknownst to the public, or even to other scientists.

While the great majority of scientific researchers are honest people trying to do good work, Moon acknowledged that there are other scientists who are not.

“That’s a fact,” she said. “The good news is that science is set up to find these things. But is it set up to find them before damage is done? No way.”

The Huffington Post