It’s been 15 years since BMJ published the most rigorous type of study there is — a systematic review of randomized controlled trials — to assess the evidence for using a parachute to prevent death and major injury when jumping from a plane. RCTs are considered the gold standard in research, and systematic reviews claim the top spot of the evidence pyramid.
As those familiar with this now-famous study know, the authors of that 2003 Christmas issue study found no RCTs testing the safety of jumping from an airplane with a parachute. They had to conclude, “As with many interventions intended to prevent ill health, the effectiveness of parachutes has not been subjected to rigorous evaluation by using randomized controlled trials.”
But at long last, an RCT on this topic exists. In BMJ’s 2018 Christmas issue, a randomized controlled trial tested the effectiveness of using a parachute to prevent death and major trauma when jumping from an airplane. And like the authors of the original parachute study, the new study’s authors make an important point.
First, let’s return to the earlier research. Obviously the first parachute study was tongue-in-cheek, but the authors’ point was important: The absence of randomized controlled trials for a certain medical intervention does not mean that the intervention cannot be safe and effective. And an RCT often would be unethical, as I noted in a past blog on the “tyranny of the RCT.” Sometimes, well-designed observational studies must suffice.
And common sense matters too. After all, if you had 100 people diagnosed with dehydration, you wouldn’t give water to only half of them and see what happens to the other half.
The authors then poked fun at evidence-based medicine advocates who “have criticized the adoption of interventions evaluated by using only observational data” with a recommendation: “We think that everyone might benefit if the most radical protagonists of evidence-based medicine organized and participated in a double-blind, randomized, placebo-controlled, crossover trial of the parachute.” (Note: A crossover trial means every participant will eventually jump from the plane without a parachute.)
Cardiologist Robert W. Yeh, M.D., an assistant professor at Harvard Medical School, and his colleagues now have taken on that challenge — and as Yeh shared on Twitter, found that parachutes made no difference to injury or death rates! Their study was small, involving just 23 brave (or crazy) volunteers from the 92 aircraft passengers they invited to participate. But they randomized those 23 so that half jumped from a plane with a parachute and half jumped with a plain backpack (unblinded). Within both groups, 0 percent suffered death or traumatic injury.
Ah, but the method of madness matters! The non-participating passengers flew at 800 km/hr at an altitude of 9,146 m, but the trial participants jumped a whopping 0.6 meters (2 feet) from a plane traveling at an incredible 0 km/hr. The authors point out their trial’s glaring limitation — an inability to generalize to higher altitude jumps — and use it to make a point that health journalists would be wise to remember:
“When beliefs regarding the effectiveness of an intervention exist in the community, randomized trials might selectively enroll individuals with a lower perceived likelihood of benefit, thus diminishing the applicability of the results to clinical practice.”
Put plainly, if most people already think an intervention works, then an RCT may end up with enough bias in its design that its conclusion is clinically meaningless. Sometimes an RCT is truly unethical, and other times an RCT really might be needed to test an intervention taken for granted. Either way, health journalists should scrutinize an RCT’s methods closely.