One of the most important skills required of reporters who cover medical research is the ability to find and discuss the limits of the studies we cover.
To that end, a trio of professors at Cambridge University recently published a helpful comment in the journal Nature: “Twenty Tips for Interpreting Scientific Claims.” (If you don’t subscribe, you can read the full article for free here.)
Some of my favorites (in no particular order):
- Study relevance limits generalizations – a great reminder that the conditions of any study will limit how its findings can be applied in the real world.
- Bias is rife – We talk about several types of bias in the topic section, like reporting bias and the healthy user effect. The article reminds us that even the color of a tablet can shade how study participants feel.
- Data can be dredged or cherry picked – Remember the line Mark Twain liked to use, “There are three kinds of lies: lies, damned lies, and statistics.” When considering numbers, the article reminds us to ask “What am I not being told?”
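The data-dredging point can be made concrete with a short simulation. This is a hypothetical sketch, not from the article: it runs 100 "studies" comparing two groups drawn from the exact same distribution, so any difference is pure noise. A handful will still clear the conventional 5% significance bar by chance alone, which is exactly why a researcher (or a reporter) who sifts through many comparisons for the "significant" ones will find spurious findings.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable

def fake_study(n=50):
    """One 'study': compare two groups drawn from the SAME distribution,
    so the true effect is zero. Returns True if a crude z-test at the
    5% level nonetheless calls the difference 'significant'."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(a) - statistics.mean(b)
    se = (statistics.stdev(a) ** 2 / n + statistics.stdev(b) ** 2 / n) ** 0.5
    return abs(diff / se) > 1.96

# Dredge 100 comparisons of pure noise; expect roughly 5 false positives.
false_positives = sum(fake_study() for _ in range(100))
print(f"{false_positives} of 100 noise comparisons look 'significant'")
```

Reporting only the handful of "significant" comparisons, while staying silent about the dozens of null ones, is the cherry-picking the article warns about.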
The concepts are intended to help politicians, who need a basic understanding of science to make sound policy decisions. But David Spiegelhalter, a biostatistician who is the University’s Winton Professor for the Public Understanding of Risk, thinks they’re useful for reporters, too.
“I think all of these are relevant to journalists, particularly on the potential for bias, imperfections of science, and dangers of data-dredging,” he said in an email.
I asked Spiegelhalter, who keeps a close eye on the news, if there were other tips he would add specifically for reporters. He offered these two:
- Extraordinary claims require extraordinary evidence
- The single study that goes against accepted scientific wisdom is probably wrong
You can read more of Professor Spiegelhalter’s musings on the news at his blog, Understanding Uncertainty.