Diet soda and stroke: A reminder on reporting preliminary studies


Over at the NeuroLogica blog, Steven Novella, M.D., explores the absurdity of a poster presentation at the American Stroke Association's International Stroke Conference, one obviously intended to showcase raw, preliminary research on soda consumption, somehow making headlines in health and science sections around the world. The poster in question, of course, was the one reporting a 61 percent increase in self-reported cardiovascular disease among those who drank diet soda versus those who drank no soda at all.

It’s the sort of headline that should set off alarms in the subconscious of a health journalist, and Novella is quick to acknowledge the study’s flaws: It was observational, relied on self-reporting, didn’t control for many obvious confounders and measured consumption at only a single point in time. However, Novella writes, that doesn’t necessarily make it a bad study. It just makes it a preliminary one, suited to something like a poster presentation. Which is exactly what it was. (Emphasis mine.)

While this study has serious flaws that preclude any confident interpretation, it is a reasonable preliminary study – the kind of study that gets presented as a poster at a meeting, rather than published in a high-impact peer-reviewed journal. Such preliminary research is mostly an exercise in data dredging – looking at data sets for any interesting signals. The purpose of such preliminary research is to determine whether or not more definitive follow up research is worth the time and effort. If there were no signal in this data, then don’t bother designing and executing a tightly controlled several year prospective trial.

So if the study was just a necessary but early step in the research process, how did things get so out of hand? Frequent consumers of health media criticism already know what’s coming next.

The problem is in the reporting of these studies. The mainstream media probably should just ignore any study that is deemed preliminary, especially if it’s just an isolated study. Perhaps in a thorough feature article it would be reasonable to give an overview of the state of the research into a question, including preliminary studies, because in a feature time can be taken to put the evidence into perspective. But reporting a single preliminary study as science news is a highly problematic approach.

Novella points out that people who saw stories about the study – even those stories that included reasonable reporting – will likely just remember the headline linking diet sodas and stroke risk. He also reminds us that “Over-reporting of preliminary results also has the effect of confusing the public with lots of noisy information, most of which is not true.”

Online guide focuses on covering medical studies


Reporters are inundated with pitches to cover the latest medical study or scientific conference paper. And there are significant milestones being reached in medical research. But, more often, the information reaching the public is far too preliminary, or even misleading, say those behind a new AHCJ reporting guide on covering health studies.

The guide will help journalists analyze and write about health and medical research studies. It offers advice on recognizing and reporting a study’s problems, limitations and backstory, as well as publication biases in medical journals, and it includes 10 questions you should answer to produce a meaningful and appropriately skeptical report. The guide, supported by the Robert Wood Johnson Foundation, will be a road map to help you do a better job of explaining research results to your audience.
