The Manila Times

MOST SCIENTIFIC STUDIES ARE WRONG


WASHINGTON: A few years ago, two researchers took the 50 most-used ingredients in a cookbook and studied how many had been linked with a cancer risk or benefit, based on a variety of studies published in scientific journals.

The result? Forty out of 50, including salt, flour, parsley, and sugar. “Is everything we eat associated with cancer?” the researchers wondered in a 2013 article based on their findings.

Their investigation touched on a known but persistent problem in the research world: too few studies have large enough samples to support generalized conclusions.

But pressure on researchers, competition between journals, and the media’s insatiable appetite for new studies announcing revolutionary breakthroughs have meant such articles continue to be published.

“The majority of papers that get published, even in serious journals, are pretty sloppy,” said John Ioannidis, professor of medicine at Stanford University, who specializes in the study of scientific studies.

This sworn enemy of bad research published a widely cited article in 2005 entitled: “Why Most Published Research Findings Are False.”

Since then, he said, only limited progress has been made.

Some journals now insist that authors pre-register their research protocol and supply their raw data, which makes it harder for researchers to manipulate findings in order to reach a certain conclusion. It also allows others to verify or replicate their studies.

That matters because when studies are replicated, they rarely come up with the same results. Only a third of the 100 studies published in three top psychology journals could be successfully replicated in a large 2015 test.

Medicine, epidemiology, population science, and nutritional studies fare no better, Ioannidis said, when attempts are made to replicate them.

“Across biomedical science and beyond, scientists do not get trained sufficiently on statistics and on methodology,” Ioannidis said.

Too many studies are based on only a few individuals, making it difficult to draw wider conclusions because such small samples have little hope of being representative.

Coffee and red wine

“Diet is one of the most horrible areas of biomedical investigation,” Ioannidis added, and not only because of conflicts of interest with various food industries.

“Measuring diet is extremely difficult,” he stressed. “How can we precisely quantify what people eat?”

In this field, researchers often go hunting for correlations within huge databases, without so much as a starting hypothesis.

Even when the methodolog­y is good, with the gold standard being a study where participan­ts are chosen at random, the execution can fall short.

A famous 2013 study on the benefits of the Mediterranean diet against heart disease had to be retracted in June by the most prestigious of medical journals, the New England Journal of Medicine, because not all participants were randomly recruited; its results have been revised downwards.

So what should we take away from the flood of studies published every day?

Ioannidis recommends asking the following questions: Is this something that has been seen just once, or in multiple studies? Is it a small or a large study? Is this a randomized experiment? Who funded it? Are the researchers transparent?

These precautions are fundamental in medicine, where bad studies have contributed to the adoption of treatments that are at best ineffective, and at worst harmful.

In their book “Ending Medical Reversal,” Vinayak Prasad and Adam Cifu offer terrifying examples of practices adopted on the basis of studies that went on to be invalidated, such as opening a brain artery with stents to reduce the risk of a new stroke.

It was only after 10 years that a robust, randomized study showed that the practice actually increased the risk of stroke.

The solution lies in the collective tightening of standards by all players in the research world, not just journals but also universities and public funding agencies. But these institutions all operate in competitive environments.

“The incentives for everyone in the system are pointed in the wrong direction,” Ivan Oransky, co-founder of Retraction Watch, which covers the withdrawal of scientific articles, told Agence France-Presse. “We try to encourage a culture, an atmosphere where you are rewarded for being transparent.”

The problem also comes from the media, which according to Oransky needs to better explain the uncertainties inherent in scientific research, and resist sensationalism.

“We’re talking mostly about the endless terrible studies on coffee, chocolate and red wine,” he said. “Why are we still writing about those? We have to stop with that.”
