STAT’s “Denied by AI” series a model of solid investigative journalism


Photo by Laura James via Pexels: a nurse holding a patient’s chart

Last week, we shared nine follow-up stories journalists could pursue in the wake of STAT News’ excellently reported series, “Denied by AI.” The articles by Bob Herman and Casey Ross detailed the dangers of using an artificial intelligence-driven algorithm to assist doctors and insurance case managers in making health care decisions for patients. 

Their reporting showed how UnitedHealth Group used the algorithm to override clinical judgment in many cases and deny rehabilitation care to seriously ill people. The series was a Pulitzer Prize finalist for investigative reporting (the full text of the articles can be found at that link) and also took second place in AHCJ’s Awards for Excellence in Health Care Journalism. 

Today, we take a closer look at the series and the flaws it exposed, particularly how the algorithm’s care denials yielded profits for insurers while the company authorized sending home patients who could barely care for themselves. It’s an example of investigative reporting at its finest, and of AI usage at its worst. 

Algorithm above all 

In recent years, insurers have increasingly adopted algorithms to predict how many hours of therapy patients will need, which types of doctors they might see, and when they will be able to leave a hospital or nursing home, the series said. 

The algorithm UnitedHealth Group used, called nH Predict, factors in details such as a person’s diagnosis, age, living situation and physical function to match them with similar patients in a database of 6 million people compiled over years of working with providers, Herman and Ross reported. Then, it generates an assessment of the patient’s mobility and cognitive capacity, along with a prediction of their medical needs, estimated length of stay and discharge date. 
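STAT did not publish nH Predict’s internals, but the description above amounts to a similarity-matching approach: compare a new patient’s details against historical patients and read estimates off the closest matches. Below is a minimal, purely illustrative sketch of that idea in Python; every field, weight and function name is a hypothetical stand-in, not NaviHealth’s actual model.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    # Hypothetical features standing in for the details the series mentions:
    # diagnosis, age, living situation and physical function.
    diagnosis_code: str
    age: int
    lives_alone: bool
    mobility_score: float    # 0 (immobile) to 100 (fully mobile)
    observed_stay_days: int  # known only for historical patients

def similarity(a: PatientRecord, b: PatientRecord) -> float:
    """Crude similarity score: reward a shared diagnosis and closeness in age,
    mobility and living situation. The weights here are arbitrary."""
    score = 3.0 if a.diagnosis_code == b.diagnosis_code else 0.0
    score += 1.0 - min(abs(a.age - b.age) / 50.0, 1.0)
    score += 1.0 - abs(a.mobility_score - b.mobility_score) / 100.0
    score += 0.5 if a.lives_alone == b.lives_alone else 0.0
    return score

def predict_stay_days(new_patient: PatientRecord,
                      history: list[PatientRecord],
                      k: int = 3) -> float:
    """Estimate length of stay by averaging the observed stays of the k most
    similar historical patients (history must be non-empty)."""
    ranked = sorted(history, key=lambda p: similarity(new_patient, p), reverse=True)
    nearest = ranked[:k]
    return sum(p.observed_stay_days for p in nearest) / len(nearest)
```

Any real model of this kind would be far more elaborate, but even the toy version shows how such an estimate can look precise while knowing nothing about a patient beyond the handful of fields it was fed.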

It sounds good in theory, for planning purposes. But the series exposed how a UnitedHealth subsidiary called NaviHealth set a target for case managers to keep nursing-home rehabilitation stays for patients enrolled in Medicare Advantage (MA) plans within 3% of the days the algorithm projected, and later narrowed that target to within 1%. At the same time, there was no accounting for variances in the patients’ health status, Herman and Ross explained. 
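To make that target concrete, here is a minimal sketch of what “within 3% (later 1%) of the projected days” means for a single stay, assuming the target is measured as a simple percentage deviation per stay; the series does not spell out exactly how NaviHealth calculated compliance.

```python
def within_target(actual_days: float, projected_days: float, tolerance: float) -> bool:
    """Return True if the actual stay deviates from the algorithm's projection
    by no more than `tolerance` (0.03 for the 3% target, 0.01 for 1%)."""
    if projected_days <= 0:
        raise ValueError("projected_days must be positive")
    deviation = abs(actual_days - projected_days) / projected_days
    return deviation <= tolerance

# A 21-day stay against a 20-day projection is a 5% deviation: outside both targets.
print(within_target(21, 20, 0.03))    # False
print(within_target(20.5, 20, 0.03))  # True  (2.5% deviation)
print(within_target(20.5, 20, 0.01))  # False (the tightened 1% target)
```

At a 1% tolerance, a 20-day projection leaves a window of roughly a fifth of a day, which helps explain the next point: care coordinators were told to time their reviews to the exact projected discharge date.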

Even in instances when employees argued that patients needed more time in rehab, the company’s physician medical reviewers deferred to the algorithm, “fueling internal dissent that the denials were inappropriate and contrary to clear medical evidence,” the reporters wrote. The company also directed frontline care coordinators to time their reviews of patients’ progress so they could be discharged on the exact date the algorithm projected. 

The reporting was based on a review of internal communications at NaviHealth, interviews with five former employees, patients’ family members, lawyers, and experts in insurance coverage and mathematical modeling. 

“Our investigation revealed that the algorithm’s use was resulting in premature payment denials for patients recovering from strokes, cancer, and other serious illnesses,” Herman and Ross wrote. “We pursued this story because people deserve to know what they are signing up for, and how their Medicare Advantage coverage works when they are at their most vulnerable.” 

All about profit

STAT’s series demonstrated that MA insurers already generate healthy profits, and that when the algorithm said care should be cut off, those plans could boost their profits further, even as some patients suffered serious consequences. 

Health insurers denying care is not new, of course, as the federal Department of Health and Human Services’ Office of the Inspector General showed in 2022. But Herman and Ross found that AI drove health insurers’ denials to new heights in MA plans. Among the more than 61.2 million Medicare members, 54% (32.8 million people) are enrolled in MA plans this year, KFF reported in August.

“Behind the scenes, insurers are using unregulated predictive algorithms, under the guise of scientific rigor, to pinpoint the precise moment when they can plausibly cut off payment for an older patient’s treatment,” Herman and Ross explained. “The denials that follow are setting off heated disputes between doctors and insurers, often delaying treatment of seriously ill patients who are neither aware of the algorithms, nor able to question their calculations.”

Seniors or their loved ones who would otherwise have to pay for care out of pocket can appeal the denials, then spend months waiting for a decision that may not go in their favor, the reporters explained.

Perhaps most concerning about AI in health care in this scenario: the algorithms were already in place when patient care began, Herman and Ross reported. The algorithms then made recommendations that not only failed to account for each patient’s individual circumstances, but also conflicted with what Medicare required plans to cover, they wrote.

Patients with stroke complications whose symptoms were so severe they needed care from multiple specialists were getting blocked from stays in rehabilitation hospitals, Herman and Ross added. Amputees were denied access to care meant to help them recover from surgeries and learn to live without their limbs, they wrote. And efforts to reverse what seemed to be bad decisions were going nowhere, they explained. 

NaviHealth has not published any scientific studies assessing the real-world performance of its nH Predict algorithm, Ross and Herman wrote. To the extent the company tests the algorithm’s performance internally, those results had not been shared publicly when STAT published the series.

The company also did not respond to STAT’s questions about the use of its algorithm, Herman and Ross wrote. However, a spokesperson said in a statement that the algorithm “is not used to make coverage determinations” but “is a guide to help us inform providers, families and other caregivers about what sort of assistance and care the patient may need both in the facility and after returning home.”


Karen Blum and Joseph Burns