An Overdue Fix: Racial Bias and Pulse Oximeters
The invention of pulse oximeters in the 1980s reshaped healthcare. While tracking blood oxygen levels (commonly recognized as the “fifth vital sign”) once required a painful blood draw and time-delayed analysis, pulse oximeters deliver nearly instantaneous data simply by sending a pulse of light through the skin. Today, pulse oximeters are ubiquitous: built into smartwatches, purchased at pharmacies for home health monitoring, and used by clinicians to inform treatment of everything from asthma to heart failure to COVID-19. Emerging algorithms even incorporate pulse ox data to predict future illness.
There is a huge caveat. Pulse oximeters are medically transformative, but racially biased. The devices work less accurately on dark-skinned populations because melanin, the pigment that gives skin its color, interferes with light-based pulse ox measurements. This means that dark-skinned individuals can exhibit normal pulse ox readings yet be suffering from hypoxemia (dangerously low blood oxygen) or other critical conditions.
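For readers who want the mechanism in concrete terms, the sketch below (Python, illustrative only) shows the general approach pulse oximeters take: compare the pulsatile portion of red and infrared light passing through the fingertip, form a ratio, and map that ratio to an oxygen-saturation estimate using an empirically fit calibration curve. The “110 − 25R” line is a textbook approximation, not any manufacturer’s algorithm; the point is that anything that shifts the measured light ratio, including melanin, shifts the estimate if darker-skinned patients were underrepresented in the calibration data.

```python
# Illustrative sketch only, not a medical algorithm: how a pulse oximeter
# turns raw light measurements into an SpO2 estimate, and why the
# calibration data matter.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) from pulsatile (AC) and baseline (DC) light
    absorbances at red (~660 nm) and infrared (~940 nm) wavelengths."""
    # "Ratio of ratios": normalizes the pulsatile signal at each wavelength.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Empirical calibration curve (textbook approximation). If the curve was
    # fit mostly on light-skinned volunteers, additional absorbance from
    # melanin can skew the measured ratio -- and therefore the estimate.
    return 110.0 - 25.0 * r

# Example: a measured ratio of 0.5 maps to roughly 97.5% saturation.
print(spo2_estimate(ac_red=0.010, dc_red=1.00, ac_ir=0.020, dc_ir=1.00))
```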
But because regulations to this day do not require diversity in medical device evaluation, many pulse ox manufacturers don’t test their devices on diverse populations. And because the Food and Drug Administration (FDA) has created streamlined pathways to approve new medical devices that are “substantially equivalent” to already-approved technology, the racial bias embedded in ‘80s-era pulse ox technology continues to pervade pulse oximeters on the market today.
COVID-19 illustrated, in devastating fashion, the consequences of this problem. Embedded bias in pulse oximeters demonstrably worsened outcomes for patient populations already disproportionately impacted by COVID-19. Studies show, for instance, that during the pandemic Black COVID-19 patients were 29% less likely to receive supplemental oxygen on time and three times as likely to suffer occult hypoxemia (low blood oxygen that pulse ox readings fail to detect).
Similar inequities persist across the health-innovation ecosystem. Women suffer from the lack of sex-specific prescription drug dosing. Minorities increasingly suffer from biased health risk-assessment algorithms. Children and people with varying body types suffer from medical equipment not built for their physical characteristics. Across the board, these inequities create greater risks of morbidity and mortality and contribute to ballooning national healthcare costs.
This need not be the status quo. If health stakeholders—including patient advocates, medtech companies, clinicians, researchers, and policymakers—collectively commit to systematic evaluation and remediation of bias in health technology, change is possible.
An excellent example is the eGFR (estimated glomerular filtration rate) algorithm. This calculation, used to assess kidney function, previously included a faulty “correction factor” to account for patient race. But the correction did not reflect biological reality; instead of treating patients more effectively, it increased disparities in care. Motivated by the data, advocacy and industry organizations issued broad recommendations to stop using the race-adjusted eGFR calculation. Hospitals and medical systems listened, dropping the race adjustment from practice, and the National Institutes of Health (NIH) is now committing funding to investigate alternative calculations.
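To make the “correction factor” concrete: the widely used 2009 CKD-EPI creatinine equation (the article does not name a specific equation, so this is offered only as an illustrative example) multiplied the estimate by 1.159 for patients recorded as Black; the 2021 race-free refit removed that term. The snippet below is a simplified sketch for illustration, not clinical code.

```python
# Illustrative sketch only: how a race "correction factor" changes an eGFR
# estimate. Constants follow the 2009 CKD-EPI creatinine equation; the 2021
# refit removed the race term. Not for clinical use.

def egfr_ckd_epi_2009(scr_mg_dl, age, female, black):
    """Estimated GFR (mL/min/1.73 m^2) from serum creatinine."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1) ** alpha * max(ratio, 1) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race coefficient dropped by the 2021 refit
    return egfr

# Same labs, same patient; the race term alone raises the estimate ~16%,
# which could delay a specialist referral or transplant listing.
print(egfr_ckd_epi_2009(1.4, 60, female=False, black=False))  # without race term
print(egfr_ckd_epi_2009(1.4, 60, female=False, black=True))   # ~16% higher
```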
We as a society must continue to root out bias in health technology, from development to testing to deployment.
When we develop new medical tools, we should consider all the populations who could ultimately need them.
When we test tools, we should rigorously evaluate outcomes across subgroup populations, looking for groups that might fare better or worse from their use in care.
And when we deploy technologies, we need to be ready to track the outcomes of their use at scale.
Engineers, researchers, and clinicians can support these goals by designing medical devices with equity in mind. The UK recently launched an evidence-gathering process on equity in medical devices, examining the impacts of bias and ways to build more equitable solutions. The FDA’s meeting reviewing the evidence on pulse oximetry is a start toward auditing technologies for their performance across different populations.
Advocacy organizations can support these goals by providing input to ongoing policy processes. The Federation of American Scientists (FAS), alongside the University of Maryland Medical System, submitted a public comment to the FDA calling for regulations that encourage the development of low-bias and bias-free tools. FAS is also convening a Forum on Bias in Pulse Oximetry to examine the consequences of bias, build an evidence base for bias-free pulse oximetry, and identify approaches for building more equitable devices.
“Do no harm,” a central tenet of medicine, is becoming increasingly difficult to uphold in our technological age. Yet with an evidence-based approach that ensures technologies serve every group in a population equitably, and that corrects them when they do not, we can come closer to achieving this age-old goal.