What can reporters learn from the chocolate diet study hoax?

About Tara Haelle

Tara Haelle (@TaraHaelle) is AHCJ's medical studies core topic leader, guiding journalists through the jargon-filled shorthand of science and research and enabling them to translate the evidence into accurate information.

Photo: BlueRidgeKitties via Flickr

You’ve been fooled. You thought eating chocolate while dieting could help you shed the pounds faster because a study supposedly said so, and outlets all over the place covered it – but it was based on an intentionally faulty, hyped study.

At least, that’s the story that journalist John Bohannon, who was the first author, partial architect and promoter of the study, told in a viral io9 piece. The story exploded in social media as readers, journalists, scientists, ethicists and others argued over what he really proved, whether he should have done it and what lessons can be gleaned from the stunt.

Of course, the article’s timing isn’t random. The documentary for which he pulled the scam comes out this week, as New York Magazine noted. But that doesn’t mean there aren’t lessons to glean here. There are takeaways from both the experiment and the hoax.

Many of the lessons from the experiment are ones Bohannon discusses in his piece: journalists didn’t ask critical questions, didn’t seek comment from outside experts, didn’t confirm the validity of the journal, relied on the press release instead of looking at the study and therefore overlooked key details such as the small study size and the very small effect. His piece merits a careful read for journalists wanting to know what red flags they should look for in covering studies. Importantly, journalists did not need extensive training in reporting on medical studies to have discovered the study was bunk.

“One of the first rules of journalism is that if your mother says she loves you, check it out,” said Ivan Oransky, M.D., global editorial director at MedPage Today, vice president of AHCJ and co-founder of Retraction Watch, about what it takes to report responsibly on medical research.

“People need to be educated reporters. They need to know how to ask the right questions,” he told me. Although Oransky attended medical school, he explained that he didn’t fully learn how to read medical studies critically and “put all the right skeptical tools in my toolbox” until he was a working journalist (and he learned a good deal of it at AHCJ meetings).

Journalists don’t need an advanced science degree to report on medical studies, he said. Instead it’s about “taking a step back and thinking about how this could possibly be wrong.” And journalists don’t need to figure that out for themselves if they are seeking insights from outside experts, an important part of the job.

It’s also worth noting that many journalists in this case – most, actually – did exactly that. Many of the outlets Bohannon mentions that published the study’s supposed findings are not exactly top-tier publications. Most are part of the churnalism already known to be a persistent plague in the media landscape. The most “rigorous” publication he mentions is Shape magazine, which led Faye Flam, in her analysis of Bohannon’s stunt, to suggest that “rigor is in the eyes of the beholder.” She points out that Bohannon offers little evidence that he “fooled millions” since most major U.S. news outlets ignored the study.

Likewise, Health News Review founder Gary Schwitzer, in an analysis at Retraction Watch (which also describes an interesting twist in which the journal says it never meant to publish the story), points out that Bohannon didn’t need to fool millions: “he may have directly fooled only a few – not millions. And those few – whom I will politely call ‘journalists’ – did the rest of the fooling for him.”

Bohannon didn’t even need to pull the stunt to prove his point. Those are the shortcomings in health news coverage that Health News Review has been addressing for close to a decade, and Schwitzer describes similar media feeding frenzies (including ones about, yes, chocolate).

There’s another aspect of Bohannon’s hoax that’s worth a thought. As many have pointed out on Twitter and as Poynter Institute media ethicist Kelly McBride told NPR, Bohannon arguably committed a number of ethical violations. McBride was referring more to media ethics: “It’s totally not ethical. It’s deliberate deception in a way that could harm people,” she told NPR. “There’s so much bad information out there, especially around diet and science, that when you make it worse you become part of the problem.”

But many have pointed out ethical issues with the “research” as well. Knowing some of the ethical lines this study might have crossed may help journalists keep an eye out for red flags in studies they cover. I spoke with Hilda Bastian, a scientist, PLOS blogger and academic editor at PLOS Medicine who has served on journal ethics committees and worked as a consumer advocate on research ethics committees. She helped me understand several of the possible ethical concerns here, based on what we know. Since addressing all of these could become a lengthy essay, I’ll lay out a few bullet points of this ethical quagmire:

  • The study did not receive approval from an Institutional Review Board (IRB) or similar entity, a committee which formally reviews research proposals involving humans and assesses whether the methodology and overall study is ethical, worthwhile and valid. Bohannon told Retraction Watch the journal did not require it (also of ethical concern), but that doesn’t change the fact that he conducted a study without oversight from an ethics board.
  • The participants did not provide fully informed consent; they could not have, since they were not told the study’s true purpose. Some research, particularly in psychology, may involve deception by design, but ethical guidelines dictate how such studies must be conducted; those guidelines tend to be more stringent and cover deceiving the participants themselves, not the wider public.
  • The study involved drawing blood, and other aspects of the study carried the potential for harm, but it’s unclear what recourse participants would have if they had been harmed in a study without IRB or similar approval.
  • The study involved a physician, which brings up questions about his involvement in the deception and the extent to which he was able to prioritize the health and safety of participants in an experiment designed to dupe.
  • The journal had ethical obligations to inquire about the trial’s registration and whether ethical procedures were followed.

So, aside from the valid points that Bohannon raises regarding what journalists need to be doing (and, apparently, what most were doing when they didn’t cover his study), this stunt highlights ethical questions journalists can look for in future studies:

  • Was the study approved by an IRB or similar entity?
  • Is the journal reputable?
  • Was informed consent obtained?
  • If it’s a clinical trial, is it registered anywhere?

Again, journalists don’t need specialized training or a science or medical degree to investigate these issues or those that Bohannon raised. They just need a very healthy dose of skepticism and a willingness to learn.
