Read the abstract of a randomized controlled trial of doula home visits for low-income mothers and you might conclude that the intervention is a solid public policy win.
“Among higher risk-taking mothers, the intervention was related to less intrusiveness during early infancy, less psychological and physical aggression during toddlerhood, more sensitive parenting attitudes, and greater toddler social relatedness,” it stated.
The findings “suggest that doula home visiting may be a particularly effective model for enhancing sensitive, non-aggressive parenting among young mothers with a history of risk-taking behavior,” researchers wrote.
But a recently launched publication called the No-Spin Evidence Review — which aims to counter overly rosy study abstracts by offering more balanced plain-language summaries of research findings — had a different take.
The review published a revised version of the abstract, which it deemed “not accurately reported.” It noted that after 30 months, home visits had a statistically significant impact on just one of the study’s nine measured outcomes, and that single positive finding could have occurred by chance.
Further, the review said the idea that some moms with a history of risky behaviors like smoking could benefit from doula visits “should be considered preliminary” until confirmed by further research.
Exaggerated benefits are pervasive
Journalists should be aware that misleading abstracts are “pervasive in reporting on social programs in lots of different areas,” Jon Baron, president of the review’s publisher, the Coalition for Evidence-Based Policy, said in an interview.
The coalition is a nonprofit, nonpartisan organization that bills itself as an independent source of expertise on evidence-based policy.
Baron pointed to strong incentives to put a positive spin on findings.
For sponsors of social programs, spreading the perception that their project works well generates additional funding and enables expansion. For researchers, positive findings are easier to get published in top journals than null findings, and they help attract future research opportunities.
On the other hand, there are almost no incentives to report disappointing findings or limitations that could deflate confidence in a program’s effectiveness, Baron noted. As a result, “There is no one that is effectively pushing back against all of this spin.”
Exaggerated claims in abstracts “lead to widespread over-labeling of programs as ‘evidence-based’ and, in some cases, the expansion of ineffective programs,” according to the No-Spin Review’s website.
The review, which is funded by the philanthropy Arnold Ventures, employs public policy experts who examine the abstracts of large randomized controlled trials of social programs and, if necessary, rewrite them to be more neutral. A rewritten summary may report the primary outcome, flag other outcomes as exploratory and unreliable, and note important limitations such as a high dropout rate.
Policymakers and foundations interested in promoting evidence-based programs make up the review’s primary audience of 3,200 subscribers, Baron said.
A resource for journalists
Unfortunately, journalists are sometimes guilty of reporting just the abstract of a study rather than delving into its specific findings and examining the rigor of the methodology.
Reaching out to independent experts in the field for comment may not bring balance to a story, because even experts don’t always have time to check every detail, Baron noted.
“How many people reading the study, even somebody in the field, are going to check the study registration, or whether there is a study registration?” he said. “People don’t have the time to do that, even if they are the top people in the field.”
Journalists can cite the review’s plain-language summaries “as a resource for a straight account on what the study found,” Baron said.
He also urged journalists to reach out to the coalition if they need input on a randomized trial of a health-related social program. Journalists can email Baron or use the coalition’s contact link.
The No-Spin Review replaced the coalition’s Evidence-Based Policy Newsletter, which reviewed clinical trials. The newsletter’s archives are online.
Among those related to health:
- Million Hearts program to prevent heart attack and stroke in high-risk Medicare patients
- A smartphone app to reduce drinking among university students
- Program to help older adults in affordable housing to successfully age in place
- U.S. Lung Health Study of intensive smoking cessation program for smokers with early lung disease
- Low-cost program led by non-physician community healthcare providers to reduce hypertension in rural China
Baron hopes that by reading evidence reviews, journalists and others will familiarize themselves with the ways investigators can manipulate data. “It would be great if reporters have a better sense of how pervasive this problem is so they can be more skeptical when they see a claim,” he said.