A now-retracted study in the journal Science once again reveals how important it is that journalists find appropriate expert sources to weigh in on findings before publishing stories about them.
The well-publicized paper, co-authored by Columbia researcher Donald Green and UCLA graduate student Michael LaCour, suggested that opponents of same-sex marriage were more likely to change their minds after talking with gay and lesbian canvassers. But, as Retraction Watch reported last week, LaCour faked the data. The journal initially posted an “Editorial Expression of Concern” but officially retracted the paper Thursday. Green had specifically requested the retraction; LaCour did not agree with it. Continue reading
Few areas of medical research are as challenging to study as nutrition. Randomized controlled nutrition trials are very difficult to conduct, and individual variation among participants can be much greater than in other areas. Add to that the urgency of the “obesity epidemic” and the multibillion dollar industry of diets, supplements and other weight-loss schemes, and it becomes clear how competing ideologies make it tough to parse the evidence. Continue reading
A $400,000 grant from the MacArthur Foundation will be used to create a database of retractions from scientific journals, extending the work done by Adam Marcus and AHCJ Vice President Ivan Oransky on their Retraction Watch blog.
The grant was awarded to the Center for Scientific Integrity, a nonprofit organization set up by Marcus and Oransky. Continue reading
Recently, Dr. Ben Goldacre (@bengoldacre), a prominent critic of drug studies, wanted to find out how often side effects reported by users of cholesterol-lowering drugs called statins were genuinely caused by the medications.
The study he co-authored concluded that most side effects reported by statin users are not due to the drugs themselves, but to other causes. The study generated front-page headlines in the U.K., with an article in The Telegraph declaring, “Statins have virtually no side effects, study finds.”
Outcry ensued. Patients who experienced side effects on statins begged to differ, and Goldacre’s fans wondered if he had suddenly gone soft on pharmaceutical companies.
In response, Goldacre penned a nuanced explanation of the study findings, explaining* that its conclusions were flawed because it was based on incomplete data.
The statin study controversy aside, his blog post makes some key points about how side effects are reported in medical journals that are helpful for health reporters to keep in mind when covering the downsides of new drugs. I’ve boiled some important points down and included them in this tip sheet for AHCJ members.
*Editor’s note: An earlier version of this post used the word “admitting.”
Image by Mark Robinson via flickr.
The military uses the phrase “the fog of war” to describe the miscalculations and botched decisions that get made in the heat of combat.
But you need not sign up for active duty to run into foggy thinking. Just call a scientist and interview them about their own research.
One of my favorite examples of this is when researchers conduct observational studies that can’t show cause and effect, yet interpret their findings to reporters as if they do. Continue reading
Photo: BlatantNews.com via Flickr
Recently, an editor sent me a study to cover on concussions in teenagers. At least, that’s what we thought the research was about, based on the title of its press release: “Teenagers who have had a concussion also have higher rates of suicide attempts.”
And I was excited to cover the study. Like gut bacteria and anything to do with chocolate or coffee or stem cells, concussion is a hot topic right now. That’s partly because brain scientists are just beginning to understand the lasting impacts of these sometimes subtle but probably cumulative injuries.
And they affect everybody from pro athletes to pee-wee football players. So when parents and coaches see the word “concussion,” their thoughts rightfully turn to young athletes. About half of concussions in kids ages 8 to 19 are sports-related, according to a nationwide study of concussions published in 2010 in the journal Pediatrics.
The press release said the study found that kids who have had concussions were not only more likely to attempt suicide, but also to engage in other high-risk behaviors such as taking drugs, stealing cars, setting fires and bullying.
The message here is that a kid who gets hit in the head too many times – presumably playing sports – might turn to drug abuse, self-injury and other sorts of criminal behaviors. And that’s the way it was covered in the press. Continue reading
When writing about medical studies, reporters should always ask researchers about any financial relationships with drug companies or device manufacturers. That was one of the main lessons from a panel on conflicts of interest on Saturday at Health Journalism 2014.
Starting in September, sunshine provisions in the Affordable Care Act will require drug companies to disclose most payments to doctors. Some companies have already started to publicize their financial relationships with doctors. But most medical journal articles do not give accurate information on researchers’ potential conflicts of interest, said panelist Susan Chimonas of the Institute of Medicine as a Profession at Columbia University.
“You shouldn’t be uncomfortable asking these questions,” Chimonas said. “They owe you this information. They owe everyone this information.” Continue reading
Jonathan Latham, Ph.D.
Remember the burger grown from stem cells? It might be a great idea, except a single patty grown using today’s technology, at least, cost a whopping $332,000.
In a new AHCJ tip sheet, Jonathan Latham, Ph.D., executive director of the Bioscience Resource Project, asks whether discoveries like that are breakthroughs or “fakethroughs” – scientific advances that will never progress to new treatments or beneficial products. He also talks about his brand of investigative science journalism and why reporting on new discoveries should probably be more muted.
He has two tips for reporters and advice about what research journalists should cover.
Spend any significant amount of time reporting on research and you’re bound to run across a real stinker of a study.
Too often, the studies that become clickbait on the web or turn up in women’s magazines – usually boiled down to a surprising health tip – are just, well, how do I put this? Crap.
There are a lot of those kinds of studies in the world: studies that are too small to be meaningful, that ask bad or useless questions, that are poorly designed, or that essentially answer a question that has already been answered repeatedly.
These kinds of studies exist because the publish-or-perish culture of academia rewards volume over value. And let’s accept our part in this, too. There’s always a media outlet that’s willing to trumpet a surprising, if completely unsound, study.
Taken individually, a bad study or two might raise an eyebrow or a chuckle. In the aggregate, however, the situation is dire. Continue reading
One of the most important skills required of reporters who cover medical research is the ability to find and discuss the limits of the studies we cover.
To that end, a trio of professors at Cambridge University recently published a helpful comment in the journal Nature: “Twenty Tips for Interpreting Scientific Claims.” (If you don’t subscribe, you can read the full article for free here.)
Some of my favorites (in no particular order):
- Study relevance limits generalizations – a great reminder that the conditions of any study will limit how its findings can be applied in the real world.
- Bias is rife – we talk about several types of bias in the topic section, like reporting bias and the healthy user effect. The article reminds us that even the color of a tablet can shade how study participants feel. Continue reading