Tag Archives: hospital compare

Disclosure of hospital infections still in its infancy

About Andrew Van Dam

Andrew Van Dam of The Wall Street Journal previously worked at the AHCJ offices while earning his master’s degree at the Missouri School of Journalism.

On Forbes.com, Gergana Koleva evaluated the woeful state of national hospital-associated infection reporting with the help of recently published research. As Koleva writes, such infections account for more than 8,000 deaths each year in the United States and add an estimated $10 billion in annual costs. Hospitals routinely collect valuable data on such infections for internal use, yet no clear national reporting standards exist.

The report … shows that only 21 states currently have legislation that requires monitoring and public reporting for surgical site infections. Of those, only eight states actually make the data publicly available, and only a total of 10 procedures – out of 250 possible types of surgeries – get reported.

And even among those states that reported some surgical infection rates as of late 2010 (Colorado, Massachusetts, Missouri, New York, Ohio, Oregon, South Carolina, and Vermont) …


Ohio’s hospital transparency law under fire


Thanks are due to blogger and one-time hospital executive Paul Levy for drawing our attention to the Ohio hospital industry’s recent push to overturn much of the state’s recently passed transparency legislation.

The law required hospitals to post performance data, such as infection rates and patient satisfaction, on the Ohio Hospital Compare site.

According to Brandon Glenn’s report in the MedCity News, the hospital industry opposes the site, online since Jan. 1, 2010, because it serves the same purpose as the federal Hospital Compare site.

The OHA supports the new legislation… because it wants to remove “duplicative” reporting requirements on the state’s hospitals. Ohio hospitals already report the same data to a federal Hospital Compare website maintained for the public by the Centers for Medicare & Medicaid Services, said OHA spokeswoman Tiffany Himmelreich.

The new legislation “doesn’t reduce reporting. It just eliminates reporting the same information to two different places,” she said. “We don’t want the public to feel that this is taking a step backwards in terms of data availability.”

For their part, consumer advocates say that maintaining the website is not an onerous burden and that the hospital association’s push is part of a larger, statewide anti-transparency trend.

As an interesting side note, Glenn found the Ohio Hospital Compare site inoperable on an initial visit, apparently because of bugs; after he made inquiries to the state health department, the site was put into working order.

Dallas reporters use AHRQ data to measure patient safety


The Dallas Morning News continues its 19-month investigation into patient safety at UT Southwestern Medical Center and Parkland Memorial Hospital.

The project, “First, Do No Harm: An investigation of patient safety in Dallas hospitals,” is behind the website’s paywall but The Dallas Morning News has granted AHCJ members access. To find out how to access the stories, please click here and log in as an AHCJ member.

Among the latest reporting:

Dallas Morning News reporters Ryan McNeill and Daniel Lathrop took advantage of AHRQ’s Patient Safety Indicator (PSI) software, typically used internally by hospitals, to process 9 million publicly available patient records from Texas hospitals.

Parkland, the prominent local hospital that has drawn scrutiny on numerous prior occasions, was just the most notable of several area hospitals that came up short (and generated headlines). But our interest lies more in the reporters’ investigative methodology and the path they’ve blazed for broader hospital quality reporting.

All their work was done in consultation with experts in the field, including academics, government officials and hospital administrators. An outside review indicated McNeill and Lathrop used the software properly, and their results were in line with a similar public analysis. But that’s not to say it was a simple process.

The newspaper spent six months analyzing nearly 9 million state hospital discharge records using Patient Safety Indicators, or PSI, software. This highly sophisticated system was designed for the federal government as a tool to measure potentially preventable complications among hospital patients.

The PSIs do not present a complete safety picture because they are based on administrative data — a summary of diagnoses, procedures and outcomes derived from patients’ medical charts, as opposed to a complete review of all medical records.

It’s not a perfect measure, but it’s one of the best available.

PSIs “reflect quality of care inside hospitals,” according to the Agency for Healthcare Research and Quality, a division of the U.S. Department of Health and Human Services. It released the PSI software in 2003 and periodically updates it, most recently in August. The News used that version for its final analysis.

The software analyzes the administrative data that nearly every hospital in Texas reports to the state. No patient-identifying information is included.

The results on 15 PSIs are statistically “risk-adjusted” because some hospitals treat a disproportionate share of unhealthy patients, who face a greater risk of potentially preventable complications. Rates from eight of the indicators are used to determine a hospital’s patient safety “composite score.”
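
To make the risk-adjustment step concrete, here is a minimal sketch in Python of the arithmetic involved. It is not the AHRQ software’s actual algorithm, and every indicator name, count, reference rate and weight in it is hypothetical; the point is simply that a hospital’s observed complication count gets compared with the count expected for its particular patient mix, and that several indicators can then roll up into one composite.

# Illustrative sketch only -- NOT the AHRQ PSI algorithm.
# All counts, reference rates and weights below are hypothetical.

def risk_adjusted_rate(observed, expected, reference_rate):
    """Scale the observed-to-expected ratio onto the reference rate's scale."""
    return (observed / expected) * reference_rate

# Three hypothetical indicators: (observed events, events expected given
# the hospital's patient mix, national reference rate per 1,000 discharges).
indicators = {
    "postop_sepsis":       (14, 10.0, 11.5),
    "accidental_puncture": (22, 25.0, 3.1),
    "postop_clot":         (30, 24.0, 9.0),
}

for name, (obs, exp, ref) in indicators.items():
    print(f"{name}: {risk_adjusted_rate(obs, exp, ref):.2f} per 1,000")

# One common way to build a composite: a weighted average of the
# observed-to-expected ratios (1.0 = performing as expected; made-up weights).
weights = {"postop_sepsis": 0.5, "accidental_puncture": 0.2, "postop_clot": 0.3}
composite = sum(weights[n] * obs / exp for n, (obs, exp, _) in indicators.items())
print(f"composite: {composite:.2f}")

A score above 1.0 in this toy setup would mean more complications than the hospital’s patient mix predicts, which is the intuition behind flagging hospitals that “come up short.”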

The AHRQ has just started posting some PSI measures on Hospital Compare, and the Texas health department plans to follow suit in 2013, but reporters looking to get their hands on a broader swath of the data will still have to follow the Dallas duo’s do-it-yourself approach.

The reporters’ work drew criticism from the Texas Hospital Association, which said the methodology was “not intended for use in public reporting.” McNeill rebuts its claims in a blog post. Daniel K. Podolsky, president of UT Southwestern Medical Center, also sent a letter criticizing the reporting; George Rodrigue, managing editor of The Dallas Morning News, published a point-by-point response to Podolsky’s letter.

Data shows disconnect between patient perception, hospital performance


Sifting through Medicare hospital rating data, USA Today reporters Steve Sternberg and Christopher Schnaars found an enlightening disconnect between patients’ subjective ratings of hospitals and hospital performance on quantitative measures such as death and readmission rates.

“This is a very important finding,” says Donald Berwick, director of the Centers for Medicare & Medicaid Services, adding that though patient-survey data offer critical insights into how it feels to be a patient at different hospitals, patients’ perceptions don’t tell the whole story.

The story is packaged with an infographic that allows readers to look up ratings for local hospitals.
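
For reporters who want to test the same disconnect with their own market’s hospitals, here is a hedged sketch of the underlying comparison in Python. The file name and column names are hypothetical stand-ins for whatever extract of CMS Hospital Compare data you assemble; the idea is simply to correlate a subjective measure (patients’ overall ratings) with a quantitative one (30-day readmission rates).

# Sketch: does patient satisfaction track a quantitative outcome measure?
# File and column names are hypothetical; substitute your own extract.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("hospital_compare_extract.csv")  # hypothetical file
df = df.dropna(subset=["pct_rating_9_or_10", "readmission_rate_30d"])

rho, p = spearmanr(df["pct_rating_9_or_10"], df["readmission_rate_30d"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3g}) across {len(df)} hospitals")

# A rho near zero would echo the USA Today finding: patients' perceptions
# don't necessarily line up with measured performance.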


Data: Hospital performs ‘combination’ CT scans at 10 times national rate

About Pia Christensen

Pia Christensen (@AHCJ_Pia) is the managing editor/online services for AHCJ. She manages the content and development of healthjournalism.org, coordinates AHCJ's social media efforts and edits and manages production of association guides, programs and newsletters.

Lisa Chedekel, of the Connecticut Health Investigative Team, used Hospital Compare data from the Centers for Medicare & Medicaid Services to find that patients at the University of Connecticut’s John Dempsey Hospital are getting “combination” CT scans far more often than the national average.

Photo by Akira Ohgaki via Flickr

Combination scans mean that patients get two scans, which of course subjects them to more radiation than a single scan.

For chest scans, a patient’s radiation exposure from a double scan is 700 times higher than from a simple chest X-ray. For abdominal scans, the radiation dose is comparable to that of approximately 400 chest X-rays.

Nationally, the rate of patients getting a combination scan is 5 percent for chest scans and 19 percent for abdominal scans. At Dempsey, 48 percent of patients receiving chest scans had combination scans. For abdominal scans, it was more than 72 percent.
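
The headline’s “10 times” figure falls straight out of those numbers; here is the arithmetic as a quick Python sketch, using only the percentages reported in the story.

# Ratio of Dempsey's combination-scan rates to the national rates.
rates = {
    "chest":     {"dempsey": 48.0, "national": 5.0},
    "abdominal": {"dempsey": 72.0, "national": 19.0},
}

for scan_type, r in rates.items():
    print(f"{scan_type}: {r['dempsey'] / r['national']:.1f}x the national rate")

# chest: 9.6x the national rate  (roughly the headline's "10 times")
# abdominal: 3.8x the national rate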

The hospital’s chief of radiology said he was “absolutely staggered” by the high rates but that “his own internal review last year had flagged a high incidence of the multiple scans – a trend that the hospital is now addressing.”
