Journal’s retraction highlights value of keeping ‘a biostatistician in your back pocket’

About Tara Haelle

Tara Haelle (@TaraHaelle) is AHCJ's medical studies core topic leader, guiding journalists through the jargon-filled shorthand of science and research and enabling them to translate the evidence into accurate information.

A now-retracted study in the journal Science once again reveals how important it is that journalists find appropriate expert sources to weigh in on findings before publishing stories about them.

The well-publicized paper, co-authored by Columbia researcher Donald Green and UCLA graduate student Michael LaCour, suggested that opponents of same-sex marriage were more likely to change their minds after talking with gay and lesbian canvassers. But, as Retraction Watch reported last week, LaCour faked the data. The journal initially posted an “Editorial Expression of Concern” but officially retracted the paper Thursday. Green had specifically requested the retraction, but LaCour does not agree with it.

The fraud came to light when two graduate students at the University of California at Berkeley ran into difficulties as they attempted to replicate the findings. When they contacted the firm that allegedly had worked on the study, the firm denied knowledge of the research. In addition, the person listed in the research paper as having worked on the study was not even an employee of the firm. The Retraction Watch story – whose popularity briefly crashed their servers – includes an excellent chronology of events.

Making it through peer review does not mean a study is 100 percent reliable, even at highly respected publications. In fact, as Retraction Watch founders Adam Marcus and Ivan Oransky, an AHCJ board member, explain in a New York Times op-ed about the situation, journals with higher impact factors retract papers more often than those with lower impact factors, though it’s not clear why.

These most prestigious journals also garner the most press when highlighting a juicy topic such as sex or race. “And yet, many reporters fail to do the necessary due diligence before publishing their work,” Marcus and Oransky said. “The drive for scoops is even greater in journalism than it is in science.”

In this case, the fraud surfaced due to the Berkeley grad students’ efforts. But that doesn’t mean a conscientious journalist could not have found an expert able to point out red flags, or at least the significant limitations of the study. In fact, a thorough analysis by Tim Groseclose, economics professor at George Mason University, on the conservative community blog Ricochet provides exactly that. Groseclose explained in his post how the methodology described in Green and LaCour’s paper was sketchy and how the confidence intervals reported were suspicious. Dubious confidence intervals call into question the statistical significance of the findings.
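For readers unfamiliar with the statistics Groseclose was scrutinizing, here is a minimal sketch of how a confidence interval relates to statistical significance. The numbers are invented for illustration, not drawn from the retracted paper:

```python
import math

# Hypothetical example: a canvassing study reports an average change
# in attitude scores. All numbers below are made up for illustration.
mean_change = 0.8   # average shift on a 0-100 feeling thermometer
std_dev = 9.5       # sample standard deviation of the shifts
n = 400             # number of respondents

# Standard error of the mean, and a 95% confidence interval
# using the normal approximation (z = 1.96).
se = std_dev / math.sqrt(n)
lower = mean_change - 1.96 * se
upper = mean_change + 1.96 * se

# If the interval excludes zero, the shift is statistically
# significant at roughly the 5 percent level; if it straddles
# zero, the data are also consistent with no real effect.
significant = not (lower <= 0 <= upper)
print(f"95% CI: ({lower:.2f}, {upper:.2f}); significant: {significant}")
```

A reviewer (or a biostatistician source) can run this kind of sanity check in reverse: if a paper’s reported intervals look too narrow or too regular for the stated sample size, that is a red flag worth asking about.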

Groseclose’s discussion of confidence intervals brings up a suggestion Oransky makes nearly every time he speaks publicly. In an On the Media interview about the Science incident, Oransky told journalists, “You should keep a biostatistician in your back pocket. They could rip these studies apart, find out what was wrong (and) give me the questions to ask at the least.” Most universities will have biostatisticians willing to talk to reporters, and Oransky encourages journalists to develop a good relationship with them.

He also cautions journalists against taking studies at face value, even from top journals. “If this makes a single reporter think twice about covering a study without any skepticism next time, we’ve accomplished something,” he told On the Media host Brooke Gladstone.

The Poynter Institute also offered some insights on the lessons this incident holds for journalists. It quotes Bill Marimow, editor of The Philadelphia Inquirer, pointing out one of the biggest potential pitfalls:

“The concern is that it is very easy for a journalist who has not spent a career focusing on one subject to be misled by data that has been massaged for a particular cause, not just the truth.”

Poynter’s James Warren acknowledges with his own anecdote that sometimes journalists will make mistakes. One solution, Columbia University political scientist and economist Chris Blattman tells Warren, is for reporters to convey in their stories that “there’s a chance this isn’t true, even if [the researchers] did everything right.”
