Care for some of the sickest Americans is decided in part by algorithm. New research shows that software guiding care for tens of millions of people systematically privileges white patients over black patients. Analysis of records from a major US hospital revealed that the algorithm used effectively let whites cut in line for special programs for patients with complex, chronic conditions such as diabetes or kidney problems.
The hospital, which the researchers didn't identify but described as a “large academic hospital,” was one of many US health providers that employ algorithms to identify primary care patients with the most complex health needs. Such software is often used to recommend patients for programs that offer extra support—including dedicated appointments and nursing teams—to those with a tangle of chronic conditions.
Researchers who dug through nearly 50,000 records discovered that the algorithm effectively low-balled the health needs of the hospital's black patients. Using its output to help select patients for extra care favored white patients over black patients with the same health burden.
When the researchers compared black patients and white patients to whom the algorithm assigned similar risk scores, they found the black patients were significantly sicker, with, for example, higher blood pressure and less well-controlled diabetes. Because the hospital automatically enrolled patients above certain risk scores into the program, or referred them for consideration by doctors, the skewed scores had the effect of excluding people from the extra care program on the basis of race.
The researchers calculated that the algorithm's bias effectively reduced the proportion of black patients receiving extra help by more than half, from almost 50% to less than 20%. Those missing out on extra care potentially faced a greater chance of emergency room visits and hospital stays.
“There were stark differences in outcomes,” says Ziad Obermeyer, a physician and researcher at UC Berkeley who worked on the project with colleagues from the University of Chicago and Brigham and Women's and Massachusetts General hospitals in Boston.
The paper, published Thursday in Science, does not identify the company behind the algorithm that produced those skewed judgments. Obermeyer says the company has confirmed the problem and is working to address it. In a talk on the project this summer, he said the algorithm was developed by a subsidiary of an insurance company and is used in the care of 70 million patients. That suggests the algorithm may be from Optum, owned by insurer UnitedHealth, which says its product that attempts to predict patient risks, including costs, is used to “manage more than 70 million lives.” Asked by WIRED if its software was the one in the study, Optum said in a statement that doctors should not use algorithmic scores alone to make decisions about patients. “As we advise our customers, these tools should never be viewed as a substitute for a doctor's expertise and knowledge of their patients' individual needs,” it said.
The algorithm studied did not take account of race when estimating a person's risk of health problems. Its skewed performance shows how even putatively race-neutral formulas can still have discriminatory effects when they lean on data that reflects inequalities in society.
The software was designed to predict patients future health costs as a proxy for their health needs. It could predict costs with reasonable accuracy for both black patients and white patients. But that had the effect of priming the system to replicate unevenness in access to healthcare in America—a case study in the hazards of combining optimizing algorithms with data that reflects raw social reality.
When the hospital used risk scores to select patients for its complex care program, it was selecting the patients likely to cost the most in the future—not those with the greatest actual health needs. People with lower incomes typically run up smaller health costs because they are less likely to have the insurance coverage, free time, transportation, or job security needed to easily attend medical appointments, says Linda Goler Blount, president and CEO of the nonprofit Black Women's Health Imperative.
Because black people tend to have lower incomes than white people, an algorithm concerned only with costs sees them as lower risk than white patients with similar medical conditions. “It is not because people are black, it's because of the experience of being black,” she says. “If you looked at poor white or Hispanic patients, I'm sure you would see similar patterns.”
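The dynamic Blount describes can be sketched in a toy simulation. Everything here is hypothetical and not drawn from the study's data: two groups are given the same distribution of underlying health needs, but one incurs lower costs per unit of need because of access barriers, so a score that ranks patients by predicted cost both under-selects that group and requires its members to be sicker to qualify.

```python
import random

random.seed(0)

def simulate(n=10_000):
    """Toy cohort: identical health-need distribution in both groups,
    but group B incurs lower costs per unit of need (access barriers).
    All parameters are illustrative assumptions."""
    rows = []
    for _ in range(n):
        group = random.choice("AB")
        need = random.gauss(5, 2)              # true health need, same distribution for A and B
        access = 1.0 if group == "A" else 0.7  # hypothetical access penalty for group B
        cost = need * access * 1000            # spending reflects access, not need alone
        rows.append((group, need, cost))
    return rows

rows = simulate()

# Score patients by predicted cost (here, cost itself) and admit the top 25%.
cutoff = sorted(r[2] for r in rows)[int(0.75 * len(rows))]
selected = [r for r in rows if r[2] >= cutoff]

share_b = sum(r[0] == "B" for r in selected) / len(selected)
avg_need = {
    g: sum(r[1] for r in selected if r[0] == g) / sum(r[0] == g for r in selected)
    for g in "AB"
}
print(f"Group B share of program slots: {share_b:.0%}")
print(f"Average need of selected patients: A={avg_need['A']:.1f}, B={avg_need['B']:.1f}")
```

In this sketch, group B ends up with far fewer program slots than its half of the population, and the group B patients who do clear the cutoff have higher average need than the selected group A patients—the same pattern, directionally, that the researchers reported when comparing patients with equal risk scores.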
Blount recently contributed to a study that suggested there may be similar problems in “smart scheduling” software used by some health providers to increase efficiency. The tools try to assign patients who previously skipped appointments into overbooked slots. Research has shown that approach can maximize clinic time.