An Overdue Fix: Racial Bias and Pulse Oximeters
The invention of pulse oximeters in the 1980s reshaped healthcare. While tracking blood oxygen content (commonly recognized as the “fifth vital sign”) once required a painful blood draw and time-delayed analysis, pulse oximeters deliver nearly instantaneous data by simply sending a pulse of light through the skin. Today, pulse oximeters are ubiquitous: built into smartwatches, purchased at pharmacies for home health monitoring, and used by clinicians to inform treatment of everything from asthma to heart failure to COVID-19. Emerging algorithms are even incorporating pulse ox data to predict future illness.
There is a huge caveat: pulse oximeters are medically transformative, but racially biased. The devices work less accurately on dark-skinned patients because melanin, the pigment that gives skin its color, interferes with light-based pulse ox measurements. This means that dark-skinned individuals can show normal pulse ox readings while actually suffering from hypoxemia (dangerously low blood oxygen) or other critical conditions.
But because regulations to this day do not require diversity in medical device evaluation, many pulse ox manufacturers don’t test their devices on diverse populations. And because the Food and Drug Administration (FDA) has created streamlined pathways to approve new medical devices based on technology that is “substantially equivalent” to already-approved devices, the racial bias embedded in ’80s-era pulse ox designs continues to pervade pulse oximeters on the market today.
COVID-19 illustrated, in devastating fashion, the consequences of this problem. Embedded bias in pulse oximeters demonstrably worsened outcomes for patient populations already disproportionately impacted by COVID-19. Studies show, for instance, that during the pandemic Black COVID-19 patients were 29% less likely to receive supplemental oxygen on time and three times as likely to suffer occult hypoxemia, meaning dangerously low blood oxygen that the pulse oximeter fails to flag.
Similar inequities persist across the health-innovation ecosystem. Women suffer from the lack of sex-specific prescription drug dosing. Minority patients increasingly suffer from biased health risk-assessment algorithms. Children and people with varying body types suffer from medical equipment not built for their physical characteristics. Across the board, these inequities create greater risks of morbidity and mortality and contribute to ballooning national healthcare costs.
This need not be the status quo. If health stakeholders—including patient advocates, medtech companies, clinicians, researchers, and policymakers—collectively commit to systematic evaluation and remediation of bias in health technology, change is possible.
An excellent example is eGFR algorithms. These algorithms, used to assess kidney function, previously applied a faulty “correction factor” to account for patient race. But this correction did not actually correlate with biological reality, and instead of making treatment more effective, it widened disparities in care. Motivated by the data, advocacy and industry organizations issued broad recommendations to stop using race in the eGFR calculation. Hospitals and medical systems listened, dropping the race-based adjustment from practice, and the National Institutes of Health (NIH) is now committing funding to investigate alternative calculations.
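To make the mechanics concrete, here is a minimal Python sketch of how such a race coefficient shifts a kidney-function estimate. It assumes the 2009 CKD-EPI creatinine equation, one of the widely used eGFR formulas that carried a race term, with the commonly published coefficients; it is illustrative only, not a clinical tool.

```python
# Illustrative sketch only (not for clinical use): how a race "correction factor"
# shifts an eGFR estimate. Coefficients follow the 2009 CKD-EPI creatinine equation.

def egfr_ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine, age, sex, and race."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race coefficient later removed in the 2021 race-free refit
    return egfr

# Identical creatinine, age, and sex; the race term alone inflates the estimate ~16%.
print(round(egfr_ckd_epi_2009(1.4, 60, female=False, black=False)))  # ~54
print(round(egfr_ckd_epi_2009(1.4, 60, female=False, black=True)))   # ~63
```

For two otherwise identical patients, the race term alone raises the estimate by roughly 16%, enough to move a patient across the clinical thresholds that govern referral and treatment decisions.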
We as a society must continue to root out bias in health technology, from development to testing to deployment.
When we develop new medical tools, we should consider all the populations who could ultimately need them.
When we test tools, we should rigorously evaluate outcomes across subgroup populations, looking for groups that might fare better or worse from their use in care (a simple sketch of such subgroup evaluation appears below).
And when we deploy technologies, we need to be ready to track the outcomes of their use at scale.
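One way to operationalize this kind of subgroup evaluation, both in premarket testing and in tracking deployed devices, is sketched below in Python. Given paired pulse ox and arterial blood gas readings, it reports how often the device looks reassuring while the blood gas shows hypoxemia, broken out by subgroup. The field names, cutoffs, and toy data are illustrative assumptions that loosely follow the occult-hypoxemia definition used in the pulse oximetry literature, not a validated protocol.

```python
# A minimal sketch of subgroup outcome evaluation for pulse oximeter data.
# Field names, cutoffs, and data below are illustrative assumptions.
from collections import defaultdict

def occult_hypoxemia_rates(readings, spo2_ok=92.0, sao2_low=88.0):
    """Per-subgroup share of paired readings where the pulse oximeter looks
    reassuring (SpO2 >= spo2_ok) but the arterial blood gas shows hypoxemia
    (SaO2 < sao2_low)."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [occult events, total pairs]
    for r in readings:
        tallies = counts[r["subgroup"]]
        tallies[1] += 1
        if r["spo2"] >= spo2_ok and r["sao2"] < sao2_low:
            tallies[0] += 1
    return {group: occult / total for group, (occult, total) in counts.items()}

readings = [
    {"subgroup": "A", "spo2": 95, "sao2": 93},
    {"subgroup": "A", "spo2": 93, "sao2": 87},  # device reads fine, patient is hypoxemic
    {"subgroup": "B", "spo2": 96, "sao2": 94},
    {"subgroup": "B", "spo2": 97, "sao2": 95},
]
print(occult_hypoxemia_rates(readings))  # {'A': 0.5, 'B': 0.0}
```

The key design choice is that the metric is computed per subgroup rather than on the pooled population, so a device that performs well on average cannot hide a failure mode concentrated in one group.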
Engineers, researchers, and clinicians can support these goals by designing medical devices with equity in mind. The UK just launched its evidence-gathering process on equity in medical devices, looking into the impacts of bias and ways to build more equitable solutions. The FDA’s meeting reviewing the evidence on pulse oximetry is a start toward auditing technologies for their performance across different populations.
Advocacy organizations can support these goals by providing input to ongoing policy processes. The Federation of American Scientists (FAS), alongside the University of Maryland Medical System, submitted a public comment to the FDA to call for regulations that will encourage the development of low-bias and bias-free tools. FAS is also convening a Forum on Bias in Pulse Oximetry to examine the consequences of bias, build an evidence base for bias-free pulse oximetry, and look ahead to approaches to build more equitable devices.
“Do no harm,” a central oath in medicine, is becoming increasingly difficult to uphold in our technological age. Yet with an evidence-based approach, one that ensures technologies equitably serve all groups in a population and corrects them when they do not, we can come closer to achieving this age-old goal.