How reporters suss out research misconduct


Photo by Zachary Linhares

By Alexander Castro/Rhode Island Health Journalism Fellowship

  • Moderator: Nidhi Subbaraman, science reporter, The Wall Street Journal
  • Mark Barnes, J.D., LLM, partner, Ropes & Gray LLP
  • Stephanie M. Lee, senior reporter, The Chronicle of Higher Education
  • Ivan Oransky, M.D., co-founder of Retraction Watch, editor-in-chief of The Transmitter, and distinguished journalist in residence at New York University’s Arthur Carter Journalism Institute. 

Scientific research and data are supposed to make a journalist’s story better, and more deeply tied to reality. Significant new data can become a story in its own right. 

Another way for research to become a story? The data isn’t true.

Scientific fraud, misconduct and deception were the subjects of a panel at Health Journalism 2024. Moderated by Nidhi Subbaraman, a science reporter at The Wall Street Journal, the panel illuminated how to work a beat that alternates between "extremely public" and "secretive and opaque."

Deceit and subpar practices in research are not a new trend, Subbaraman explained, pointing to a high-profile debacle reported in 2005, in which a researcher alleged that the raw data for his study on heart attacks and diet was unavailable: It had been stored on a cupboard's wooden shelves and nibbled away by termites. More than a decade passed before the study, published in 1992, came under scrutiny.

“For all its demands of scientists, publishers acknowledge that, by and large, catching errors is not their primary game,” Subbaraman said.  

Sites like Retraction Watch, which studiously catalogs and investigates rescinded papers, have helped fill the gap left by publishers' blind spots.

“You go back to 2002, more than 20 years ago now, you had about one in 5,000 papers being retracted,” said Ivan Oransky, M.D., co-founder of Retraction Watch. “If you go to last year, we’ve had about one in 500 papers being retracted.”

Oransky attributed 2023's strikingly high retraction rate in part to paper mills, which he compared to essay mills for college students: operations that churn out research articles with a questionable basis in reality for publication in reputable journals. Academic pressure to publish regularly means some articles are written too hastily, carelessly or deceptively.

Bad science could be more easily tucked away in journals, or a termite-infested cupboard, in the pre-Twitter age, Subbaraman noted. But research articles now attract more eyes than they did before, and more scrutiny means it’s more likely that slip-ups, hyperboles or outright lies can be caught. 

Panelist Stephanie M. Lee has been a frequent chronicler of sloppy science and data manipulation — previously at BuzzFeed News and now at The Chronicle of Higher Education.

"The truth is that producing knowledge is a human endeavor," Lee said. "And so that process has flaws and bad incentives, shortcomings. Sometimes the process of how something gets done is what the real story's about." Or, as a slide title in her presentation put it: "Bad behavior = good story."

Mark Barnes, J.D., LLM, is a partner at Ropes & Gray, a Boston-based law firm with a practice in research integrity and misconduct. He's often tasked with representing universities and academic medical centers in the legal fallout from research scandals.

“There really is a very careful and very thorough process,” Barnes said. “The people who are accused in this process of research misconduct, they have incredible due process rights … They have more due process rights than criminals.”

Panelists pointed to a number of resources to aid reporters' searches. OpenSecrets can be used to follow the money that flows into suspect studies. Paperwork like conflict-of-interest disclosure forms and filings from publicly traded companies can lend insight into researchers' possible motivations. (These can be easier to obtain if you're covering research at a public university.) Clinical trial registrations are submitted to federal regulators before a study begins, so they can show whether a study's reported findings align with its stated goals. Finally, PubPeer hosts discussion of specific research articles and offers a browser extension that notifies you if you're looking at a retracted article.


Alexander Castro writes about policy and statewide trends in health, education and humanities for Rhode Island Current, part of States Newsroom. A 2024 AHCJ Fellow from Rhode Island, he lives and works in the state’s capital city of Providence.

