Innovation & Industry

‘Immediate Action’ Needed On Racially Biased Medical Devices In U.K.

News Room · March 12, 2024

Campaigners have called for “immediate action” on racial bias in medical devices like pulse oximeters following the publication of a U.K. report into health inequality.

Biased technology can lead to missed diagnoses and delayed treatment for ethnic minority groups and women, experts warn in the investigation led by Dame Margaret Whitehead.

Pulse oximeters

The British government commissioned an independent review into bias and tech after experts raised concerns that tools used to measure oxygen levels might not work well in people with darker skin tones.

Ethnic minority groups in the U.K. were hit particularly hard by Covid-19, and organisations like the NHS Race and Health Observatory asked at the time if the devices might be a factor.

The tools — called pulse oximeters — work by sending light through a person’s finger. They estimate how much oxygen is in that person’s blood based on how much light passes through.
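The estimate described above is commonly derived from the "ratio of ratios" of pulsatile to baseline absorbance at two wavelengths. A minimal sketch, using the textbook linear approximation rather than any particular device's calibration (the function name and coefficients here are illustrative, not from the report):

```python
def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate blood oxygen saturation (SpO2, %) from red and
    infrared light readings.

    AC is the pulsatile (varying) part of the signal, DC the steady
    baseline. The ratio of the two wavelengths' AC/DC ratios is mapped
    to SpO2 via an empirically fitted curve; a common linear
    approximation is SpO2 ~ 110 - 25 * R.
    """
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r
```

Because the mapping from R to SpO2 is fitted empirically to a study population, any skew in that population, such as an over-representation of lighter skin tones, is baked into every subsequent reading.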

Monday’s review found “extensive evidence of poorer performance” of the devices in people with darker skin tones. They tend to overestimate oxygen levels, which could lead to treatment delays.

To make matters worse, pulse oximeters and other optical devices are often tested on people with light skin tones, whose results are then “taken as the norm.”

Recalibrating the devices or adjusting guidance around them could help produce more reliable results for what is still a “valuable clinical tool”, the report authors wrote.

Usage guidance should be updated immediately to reduce racial bias in current clinical practice, they added.

Artificial intelligence

AI has massive potential within healthcare and is already used for some clinical applications.

But it’s well recognised that AI tools can produce biased results on the basis of the data they’re fed.

In medicine, that could mean underdiagnosing skin cancer in people with darker skin if an AI model is trained mostly on lighter-skinned patients, or failing to recognise heart disease in X-rays of women if a model is trained on images of men.
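One way such bias is surfaced in practice is by evaluating a model's accuracy separately for each demographic group rather than in aggregate. A minimal sketch of that kind of equity audit (the data layout and helper are hypothetical illustrations, not anything described in the report):

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute model accuracy broken down by demographic group.

    records: iterable of (group, predicted_label, actual_label) tuples.
    Returns {group: accuracy}, so a model that scores well overall but
    poorly for one group cannot hide behind a single aggregate number.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {group: correct[group] / total[group] for group in total}
```

An aggregate accuracy across these records would be 75%, masking the fact that the model is right only half the time for group "B":

```python
results = per_group_accuracy([
    ("A", 1, 1), ("A", 0, 0),   # both correct for group A
    ("B", 1, 0), ("B", 1, 1),   # one of two correct for group B
])
```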

With relatively little data yet available to measure the impact of such potential biases, the authors call for a taskforce to be set up to assess how large language models (like ChatGPT) could affect health equity.

‘Immediate action’ needed

Jabeer Butt, chief executive of the Race Equality Foundation, told me the review’s findings “demand immediate action.”

“We must swiftly implement change, including conducting equity assessments and enforcing stricter regulations on medical devices,” he said. “Equal access to quality healthcare should be a right for all, regardless of race or skin colour. We need to get rid of biases and ensure our medical tools perform effectively for everyone.”

Although he welcomed the report, he said it was "crystal clear" that "we need greater diversity in health research, enhanced equity considerations, and collaborative approaches."

Otherwise racial biases in medical devices, clinical evaluations and healthcare interventions “will persist,” he added. “Rather than improve patient outcomes, they could lead to harm.”

Butt’s concerns echo those of NHS Race and Health Observatory chief executive Professor Habib Naqvi.

“It’s clear that the lack of diverse representation in health research, the absence of robust equity considerations, and the scarcity of co-production approaches, have led to racial bias in medical devices, clinical assessments and in other healthcare interventions,” Naqvi said in a statement.

He added that the research sector needed to tackle the under-representation of ethnic minority patients in research. “We need better quality of ethnicity data recording in the [public health system] and for a broad range of diverse skin tones to be used in medical imaging databanks and in clinical trials,” he said.

Medical devices should “factor equity from concept to manufacture, delivery and take-up,” he added.



Copyright © 2026. Innovation & Industry. All Rights Reserved.