The United States Preventive Services Task Force (USPSTF) has changed the way it presents its recommendations in hopes of making them more user-friendly for physicians. The result is also clearer and easier to follow for journalists and consumers.
The changes, outlined in the September issue of JAMA, include better use of plain language, making the recommendations more easily scannable, and emphasizing top-line recommendations without repetitive or marginally relevant information. You can still get the nitty-gritty details of a recommendation and its supporting evidence from the site, but for those needing a quick summary, it’s now easier to find what you need.
If you aren’t already familiar with the task force, it’s worth taking the time to peruse the site — it’s one of the few U.S. government websites I’ve found to be intuitive and easy to navigate. The USPSTF is an independent volunteer panel of physicians who are experts in prevention research and in interpreting and practicing evidence-based medicine.
Every year, the panel reviews all the evidence on a specific topic — such as screening for a particular cancer or clinical activities that aim to prevent a disorder — and issues recommendations for doctors on which practices the evidence supports.
The experts most often come from primary care backgrounds, and the rules to avoid conflicts of interest are strict. Members can have no recent industry-related disclosures and must undergo rigorous vetting — publicly available — before serving on the panel, which also makes them great as sources with a low likelihood of bias.
The most significant limitation of the panel’s recommendations is that they’re only as good as the evidence they review, but a 2018 study in JAMA suggests that the evidence they use is unlikely to have a strong funding-related bias. The researchers looked at funding sources for all studies included in the systematic reviews that the USPSTF relied on for its recommendations from January 2014 to February 2016.
Among the 79% of studies with funding sources listed, more than half (56%) were funded by government agencies. A third (32%) had nonprofit or university funding, and only 17% had industry funding. While the funding proportions varied by topic — screening for chlamydia and gonorrhea, for example, had 75% industry funding — the diversity of funding types and geography is encouraging. The study as a whole is a good reminder for journalists to check the funding of studies included in the systematic reviews they rely on.