Tag Archives: hospital quality

Series reveals gaps in communication of hospital inspection results

Jodie Jackson Jr. of the Columbia (Mo.) Daily Tribune took an in-depth look at patient safety at University Hospital, part of the University of Missouri Health Care system.

Jackson found that inspections by CMS and the FDA have repeatedly turned up systemic practices that compromised patient safety. At the same time, the Joint Commission awarded the hospital full accreditation, raising questions about why the agencies don’t share information.

In a blog post, Jackson, a Midwest Health Journalism Program Fellow, says he has “examined some 700 pages of documents” and has “had national infection control leaders examine the reports that formed the basis for the series.”

Critics point out issues in patient satisfaction ratings

About Andrew Van Dam

Andrew Van Dam of The Wall Street Journal previously worked at the AHCJ offices while earning his master’s degree at the Missouri School of Journalism.

On the heels of a government proposal to tie hospital incentive payments to patient satisfaction ratings, a few outlets have started looking at the validity of such measurements.

At HealthLeaders Media, Cheryl Clark reports that regional differences in tendency to be satisfied (the numbers show that New Yorkers are harder to please than Midwesterners and New Englanders, for instance) mean that any absolute number thresholds issued by the feds would penalize hospitals in parts of the country where folks are less likely to respond well to surveys.

And on KevinMD.com, William Sullivan, D.O., J.D., takes a few swings of his own, first taking aim at the ratings’ sampling and statistical grounding, then moving on to what he says is hospitals’ over-reliance on percentile quality ratings.

The problem, according to Sullivan? Overall patient satisfaction is quite high, so doctors’ ratings cluster tightly around the low 90s on a 100-point scale. That means even a small shift in absolute rating causes a huge jump in percentile: on at least one system, a 4-percentage-point absolute drop takes a doctor from the 90th percentile to the 50th. And, thanks to the aforementioned sampling issues, that drop can be caused by a handful of particularly ornery patients, who, Sullivan writes, are thus given massive leverage.

With our employment and our compensation hinging on every “5” we can get, doctors are being coerced into giving patients whatever they want, regardless of medical appropriateness. When we cater to satisfaction scores more than we cater to proper medical care, we are violating our oath, devaluing our education, and potentially harming our patients.
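The percentile arithmetic Sullivan describes can be sketched with a toy simulation. The score distribution below is hypothetical, chosen only to mirror the “clustered around the low 90s” pattern he describes, but it shows how a 4-point absolute drop can plunge a doctor dozens of percentile ranks when everyone’s scores are bunched together:

```python
import random
from bisect import bisect_left

# Hypothetical population of satisfaction scores, clustered tightly
# around the low 90s on a 100-point scale (mean 91, small spread).
random.seed(0)
scores = sorted(min(100.0, max(0.0, random.gauss(91, 3))) for _ in range(10_000))

def percentile_rank(score, population):
    """Percentage of the (sorted) population scoring strictly below `score`."""
    return 100 * bisect_left(population, score) / len(population)

# A doctor rated 95 sits near the top of the pack; a doctor rated 91,
# only 4 absolute points lower, lands near the middle.
print(round(percentile_rank(95, scores)))
print(round(percentile_rank(91, scores)))
```

Because the distribution is so narrow, absolute differences that would be trivial on a spread-out scale translate into enormous percentile swings, which is exactly the leverage effect Sullivan warns about.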

AHCJ Resource: Analyze patient satisfaction surveys for your local hospitals

“Numbers can be a start – not the end – of a story,” the AHCJ website notes. Remember that patient satisfaction scores only mean so much. Sometimes the best doctors have gruff demeanors while those with inferior skills have a great bedside manner. Patients may not recommend hospitals to friends because they dislike the food or think their roommates were too loud. But if patients report that doctors or nurses didn’t communicate well, that very well could affect the care the patients received. Using the data can give you a valuable tip sheet to generate ideas and questions in your pursuit of a story.

For overall hospital survey results, AHCJ includes a comparison of data first released in March 2008 and updated quarterly since, allowing journalists to compare results over a lengthy timeline.

Yale doc knows the right data is out there


Writing for Forbes magazine, Matthew Herper profiles Harlan Krumholz, the pioneering quality-of-care researcher and Yale cardiologist. During his career, Krumholz, 52, has been at the leading edge of everything from Hospital Compare and angioplasty delivery times to recent headline grabbers like the Vioxx suit and the 2009 study on radiation exposure during routine scans. The highlight of the profile is Herper’s account of Krumholz’s knack for picking out just the right metrics with which to hold feet to the fire.

Harlan M. Krumholz, M.D.

By figuring out what to measure and how, he showed that even top hospitals were systematically underperforming, largely because no one was tracking the results.

Krumholz’s basic idea is that if you ask the right question and pick the right measurement, you can figure out a way to get the answer, often using billing records or existing databases. This frequently involves partnering with insurers or Medicare. He has a knack for focusing on performance metrics that hold hospitals accountable.

Mass. won’t post hospitals’ death rates


The Boston Globe‘s Liz Kowalczyk reports that, two years after it was first proposed by a consumer group, the Massachusetts Health Care Quality and Cost Council has decided it won’t publish hospital-wide mortality rates. The problem, it seems, is the lack of an accurate, universal method of computing such numbers.

Health and Human Services Secretary Dr. JudyAnn Bigby, who heads the group that made the decision, said the current methodology for calculating hospital-wide mortality rates is so flawed that officials do not believe the numbers would be useful to hospitals and patients, and that publishing them could harm public trust in government.

It appears, Kowalczyk writes, that general hospital mortality rates just aren’t “ready for prime time” quite yet.

The council convened an expert panel, which worked with researchers to evaluate four companies’ software for measuring hospital mortality. The problem was that the researchers came out with vastly different results when they used the various methodologies to calculate hospital mortality in Massachusetts between 2004 and 2007, and they could not tell which company’s results, if any, were accurate.

Analysis of billing record data reveals hospital quality issues in Las Vegas

About Pia Christensen

Pia Christensen (@AHCJ_Pia) is the managing editor/online services for AHCJ. She manages the content and development of healthjournalism.org, coordinates AHCJ's social media efforts and edits and manages production of association guides, programs and newsletters.

Using data from hospital billing records, Marshall Allen and Alex Richards of the Las Vegas Sun have been able to identify “hospital-acquired patient harm,” that is, events in which patients are harmed while in the hospital.

Because Medicare does not pay for these “never events,” they are reflected in hospital billing codes. Such events include leaving foreign objects in a patient, bed sores, falls, infections related to catheters or surgical sites, blood clots and poor glycemic control.

Nevada, like 40 other states, collects such data for analysis, Allen and Richards report. But the state had not yet analyzed the data, so the reporters requested it and did the analysis themselves.

The pair requested and received records for “every Nevada hospital inpatient visit going back a decade — 2.9 million in all. The information, coupled with interviews with more than 150 patients and health care insiders, has yielded a sweeping and detailed portrait of hospital care in Las Vegas.”

The project includes stories about patients who were harmed while hospitalized, the documents behind the reporting, data tables, interactive graphics and more.