Study: Weak science in hospital press releases


Reporters often come under fire for subpar or sensationalized coverage of medical science, but a new study by researchers from the Department of Veterans Affairs and Dartmouth Medical School indicates that hospital PR departments may be, at the very least, willing accomplices in the dissemination of articles with eye-catching headlines and inflated, out-of-context reports.

From the study (emphasis added):

Academic medical centers issued a mean of 49 press releases annually. Among 200 randomly selected releases analyzed in detail, 87 (44%) promoted animal or laboratory research, of which 64 (74%) explicitly claimed relevance to human health. Among 95 releases about primary human research, 22 (23%) omitted study size and 32 (34%) failed to quantify results. Among all 113 releases about human research, few (17%) promoted studies with the strongest designs (randomized trials or meta-analyses). Forty percent reported on the most limited human studies—those with uncontrolled interventions, small samples (<30 participants), surrogate primary outcomes, or unpublished data—yet 58% lacked the relevant cautions.

The study says that these press releases tend to dictate which studies receive attention and how the studies that do get covered are framed. In support of that assertion, the authors cite findings that about a third of medical stories rely heavily on press releases.

And because, as we all know, no report on a scholarly study would be complete without information on context and methodology, here’s how the authors selected their 200-press-release sample:

We selected the 10 highest-ranked and 10 lowest-ranked of the academic medical centers covered in U.S. News & World Report‘s medical school research rankings that issued at least 10 releases in 2005. In addition, we identified each medical school’s affiliates by using an Association of American Medical Colleges database. The Appendix Table lists the centers and their affiliated press offices. We initially intended to compare press releases by research ranking, but because we found few differences, we report findings across the entire study sample, highlighting the few differences by rank where they exist.

Other interesting points from the study:

  • 26 percent of the releases exaggerated the study’s importance (exaggeration was more common in releases about animal research)
  • 24 percent used the word “significant,” though only one distinguished between statistical and clinical significance
  • The study’s authors don’t explicitly blame hospitals for feeding reporters sketchy information, though they do speculate that this could be a contributor to poor health reporting
  • Authors suggest that “the quickest strategy for improvement would be for centers to issue fewer releases about preliminary research, especially unpublished scientific meeting presentations.”
  • They also recommend “centers should limit releases about animal or laboratory research. …  Two thirds of even highly cited animal studies fail to translate into successful human treatments.”

We’d like to hear from our readers: what press releases have you seen that were overblown or out of context? And, for those of you who work in hospital PR, what do you do to ensure your releases don’t fall into these traps?


Jennifer Huget of The Washington Post‘s Checkup blog posted about this study, concluding that journalists need to read the studies and not rely upon press releases.


Andrew Van Dam