No, Chocolate Probably Isn’t a Superfood
Posted September 29, 2018 5:00 p.m. EDT
Not too long ago, Brian Wansink was one of the most respected food researchers in America.
He founded the Food and Brand Lab at Cornell University, where he won attention for studies that showed that small behavioral changes could influence eating patterns. He found that large plates lead people to eat more food because they make portions look smaller and that children eat more vegetables when they have colorful names like “power peas.” Wansink wrote best-selling books and published hundreds of studies. For over a year, he served in a top nutrition policy role at the Department of Agriculture under George W. Bush, where he helped shape the government’s influential dietary guidelines. His research even led the government to spend almost $20 million redesigning school cafeterias, an initiative known as the Smarter Lunchrooms Movement.
But this month, Wansink’s career at Cornell came to an unceremonious end. On Sept. 20, the university announced that a yearlong investigation had found that he committed “academic misconduct in his research and scholarship, including misreporting of research data,” and that he had tendered his resignation. The announcement came one day after the prestigious medical journal JAMA retracted six of Wansink’s studies because of questions about their “scientific validity.” Seven of his other papers had previously been retracted for similar reasons.
“I think the extent of misconduct that has occurred with this author is unique,” Dr. Howard Bauchner, JAMA’s editor-in-chief, said in an interview. “There are literally millions of authors, and there’s very few who have had numerous papers retracted.”
For more than a year, Wansink had been dogged by accusations that many of his studies were riddled with errors, data inconsistencies and evidence of fraud. In a statement, Wansink admitted to making “typos, transposition errors and some statistical mistakes” in his papers. But he defended his work and said none of his mistakes “changed the substantive conclusions” of any of his papers. “I’m very proud of all of these papers,” he said, “and I’m confident they will be replicated by other groups.”
But as news of the scandal reverberated through academic circles, some experts said they feared it was symptomatic of a broader problem in food and health research. While very few scientists are accused of misconduct or misreporting data, critics have long contended that nutrition research is plagued by a credibility problem. They argue that an alarming number of food studies are misleading, unscientific or manipulated to draw dubious conclusions.
Wansink’s lab was known for data dredging, or p-hacking, the process of running exhaustive analyses on data sets to tease out subtle signals that might otherwise be unremarkable. Critics say it is tantamount to casting a wide net and then creating a hypothesis to support whatever cherry-picked findings seem interesting — the opposite of the scientific method. For example, emails obtained by BuzzFeed News showed that Wansink prodded researchers in his lab to mine their data sets for results that would “go virally big time.”
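The statistical trap behind p-hacking is the arithmetic of multiple comparisons: run enough tests on pure noise and some will look "significant" by chance. A minimal sketch (a hypothetical simulation, not drawn from Wansink's data; the variable count and threshold are illustrative assumptions):

```python
import random

random.seed(42)

# Under the null hypothesis (no real effect), a p-value is
# uniformly distributed on [0, 1]. Simulate testing 100 unrelated
# food variables against one health outcome, where NONE of them
# actually has an effect.
NUM_TESTS = 100   # hypothetical number of variables mined
ALPHA = 0.05      # conventional significance threshold

p_values = [random.random() for _ in range(NUM_TESTS)]
false_positives = [p for p in p_values if p < ALPHA]

print(f"'Significant' findings from pure noise: {len(false_positives)}")
# At alpha = 0.05, about 5 of 100 null tests are expected to clear
# the bar by chance. A lab that reports only those hits, and then
# invents a hypothesis to explain them, has p-hacked.
```

The simulation shows why pre-registering a single hypothesis matters: the 5 percent error rate applies per test, so mining a data set for dozens of comparisons all but guarantees a publishable-looking fluke.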
“P-hacking is a really serious problem,” said Dr. Ivan Oransky, a co-founder of Retraction Watch, who teaches medical journalism at New York University. “Not to be overly dramatic, but in some ways it throws into question the very statistical basis of what we’re reading as science journalists and as the public.”
Data dredging is fairly common in health research, and especially in studies involving food. It is one reason contradictory nutrition headlines seem to be the norm: One week coffee, cheese and red wine are found to be protective against heart disease and cancer, and the next week a new crop of studies pronounces that they cause it. Marion Nestle, a professor of nutrition, food studies and public health at New York University, said that many researchers are under enormous pressure to churn out papers. One recent analysis found that thousands of scientists publish a paper every five days.
“You can’t get a job if you don’t have papers,” she said. “I see this at my university. We expect assistant professors to be hired with already a record of scholarship.”
In 2012, Dr. John Ioannidis, chairman of disease prevention at Stanford, published a study titled “Is Everything We Eat Associated With Cancer?” He and a co-author randomly selected 50 recipes from a cookbook and discovered that 80 percent of the ingredients — mushrooms, peppers, olives, lobster, mustard, lemons — had been linked to either an increased or a decreased risk of cancer in numerous studies. In many cases a single ingredient was found to be the subject of questionable cancer claims in more than 10 studies, a vast majority of which “were based on weak statistical evidence,” the paper concluded.
Nutrition epidemiology is notorious for this. Scientists routinely scour data sets on large populations looking for links between specific foods or diets and health outcomes like chronic disease and life span. These studies can generate important findings and hypotheses. But they also have serious limitations. They cannot prove cause and effect, for example, and collecting dietary data from people is like trying to catch a moving target: Many people cannot recall precisely what they ate last month, last week or even in the past 48 hours.
Plenty of other factors that influence health can also blur the impact of diet, such as exercise, socioeconomic status, sleep, genetics and environment. All of this makes the most popular food and health studies problematic and frequently contradictory.
In one recent example, an observational study of thousands of people published in The Lancet last year made headlines with its findings that high-carb diets were linked to increased mortality rates and that eating saturated fat and meat was protective. Then in August, a separate team of researchers published an observational study of thousands of people in a related journal, The Lancet Public Health, with contrasting findings: Low-carb diets that were high in meat increased mortality rates.
“You can analyze observational studies in very different ways and, depending on what your belief is — and there are very strong nutrition beliefs out there — you can get some very dramatic patterns,” Ioannidis said.
He and other experts have called for reform in nutrition science. They say that researchers should publicly register their study protocols beforehand to eliminate data dredging, share their raw data to increase transparency, focus on large randomized controlled trials to produce better results, and refrain from slicing and dicing large observational data sets into multiple papers that magnify weak findings.
Experts say the problem extends to science journalists as well: Many reporters are encouraged to produce articles that get lots of attention. That is another reason researchers and universities feel pressure to put out studies and news releases with exaggerated findings.
Oransky said that while Wansink’s behavior was egregious, it is not isolated to nutrition. Wansink would not even make Retraction Watch’s list of the top 30 scientists with the most retracted papers. One person on the list, an anesthesiologist, has had 183 papers retracted. Oransky estimated that every year roughly 1,400 scientific papers are retracted out of the 2 million to 3 million that are published. What made this case stand out, he said, was Wansink’s prominence as a media darling. “I don’t know too many reporters who’ve covered health and nutrition who’ve never quoted him,” said Oransky.