People from ethnic minorities, women and people from disadvantaged communities are at risk of poorer quality healthcare due to bias within medical tools and devices, a new report has found.
Among other findings, Equity in Medical Devices: Independent Review raised concerns about devices that use artificial intelligence (AI), as well as those that measure oxygen levels.
The team behind the review said urgent action was needed.
“We would like to see an equity lens applied across the entire life cycle of medical devices, from initial testing to the recruitment of patients in hospital or the community for early-phase studies, through to their roll-out after they are licensed,” said Professor Frank Kee, director of the Centre for Public Health at Queen’s University Belfast and a co-author of the review.
The health minister, Andrew Stephenson, said: “Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”
The government-commissioned review was set up in 2022 by Sajid Javid, then health secretary, after concerns were raised about the accuracy of pulse oximeter readings in black and minority ethnic people.
These widely used devices were thrust into the spotlight by their importance during the Covid pandemic, when low blood oxygen levels were a key sign of serious illness.
The new report confirmed fears that pulse oximeters overestimate the amount of oxygen in the blood of people with darker skin. While it found no evidence that this had affected care in the NHS, harms have been reported in the United States, where such biases led to delayed diagnosis and treatment, as well as worse organ function and deaths, in black patients.
The team stressed that it is not calling for these devices to be avoided. Instead, the review proposes a number of steps to improve the use of pulse oximeters in people of different skin tones, including looking at changes in readings rather than single readings, alongside guidance on how to develop and test new devices to ensure they work well for patients of all ethnicities.
The report also raised concerns about AI-based devices, including the potential for such technology to exacerbate the underdiagnosis of heart disease in women, to discriminate against patients on the basis of socioeconomic status, and to contribute to the underdiagnosis of skin cancer in people with darker skin. The last concern, the authors say, arises because AI devices are largely trained on images of lighter skin.
The report also noted problems with polygenic risk scores, which are used to estimate an individual’s risk of disease based on their genes.
“The main genetic datasets used for polygenic risk scores come predominantly from people of European ancestry, which means they may not be applicable to people of other ancestries,” said Enitan Carrol, professor of paediatric infection at the University of Liverpool and a co-author of the review.
However, attempts to correct bias can themselves be problematic. Examples highlighted by the report include race-based corrections applied to readings from spirometers, devices used to assess lung function and diagnose respiratory conditions; these corrections have themselves been found to embed bias.
Professor Habib Naqvi, chief executive of the NHS Race and Health Observatory, welcomed the findings, adding that the review recognised the need for immediate action, equity assessments, and stronger guidance and regulation for pulse oximeters and other medical devices.
“Access to better health should not be determined by your ethnicity or the colour of your skin; medical devices must be suitable for all communities,” he said.
“It is clear that the lack of diverse representation in health research, the absence of robust equity considerations, and the scarcity of co-production approaches have led to racial bias in medical devices, clinical evaluations, and other healthcare interventions.”