By Talia Kruger
SACRAMENTO, CA – California Attorney General Rob Bonta last week launched an inquiry into the prevalence of racial and ethnic bias in healthcare algorithms.
According to a statement from Bonta’s office, “(H)ealthcare algorithms are becoming more commonplace tools” used to assist healthcare professionals in determining patients’ medical needs, in administrative work, and in health-related decision-making.
“The complexity of healthcare decision-making technology can range anywhere from simple charts for decision-making to complex AI programming, (and these tools) allow patient care and outcomes to become more efficient and effective,” the statement said.
But as healthcare algorithms become more widely used, the AG’s office added, “there is growing concern that they will affirm long-standing racial and ethnic bias in the healthcare industry…leading to inequitable outcomes for patients.”
One example comes from a study in which researchers found that a widely implemented algorithmic tool referred Black patients to enhanced services less often than White patients with the same medical issues.
The disparity arose because the algorithm relied on patients’ healthcare histories but failed to account for gaps in care caused by racial inequities, the AG’s office said.
According to Bonta, “We know that historic biases contribute to the racial health disparities we continue to see today. It’s critical that we work together to address these disparities and bring equity to our healthcare system.”
He said the investigation aims to bring hospitals and other healthcare systems into compliance with state non-discrimination laws, adding he hopes to “(e)nsure that all Californians can access the care they need to lead long and healthy lives.”
Findings such as these, the statement said, have highlighted the demand for clarity in how algorithms are constructed and used, and “the need to eliminate any biases” in technological decision-making tools that may perpetuate healthcare inequities for historically disadvantaged populations.
In an effort to eliminate these biases, AG Bonta is asking hospitals across the state to provide information regarding how they are working to address racial and ethnic bias in their decision-making technology.
The Attorney General is requesting information on which algorithms, software systems, and decision-making programs are used in clinical decision-making; population health management; care and utilization management; appointment and operations scheduling; and billing practices and approvals.