
Overcoming the challenges of AI to improve health inequity

The conversation

In a recent article, "For fair and equal healthcare, we need fair and bias-free AI," Henk van Houten, Executive Vice President and Chief Technology Officer of Royal Philips, wrote that artificial intelligence (AI) has the potential to make healthcare more accessible, affordable, and effective. He pointed out that it can also inadvertently lead to erroneous conclusions and thereby amplify existing inequalities. Mitigating these risks requires awareness of the bias that can creep into AI algorithms; with that awareness, and through careful design and implementation, AI holds the promise of a healthier and fairer future for all.

  • Researchers at the University of California revealed that an algorithm used to flag high-risk patients for enhanced medical attention displayed significant racial bias.
  • Bias can occur at any phase of AI development and deployment, from using biased datasets to deploying algorithms in a different context than the one they were trained for.

Our perspective

Why this matters for sleep and respiratory patients

 

The role of AI in identifying at-risk populations in the community could potentially open up new treatment and care pathways for people suffering from sleep and respiratory conditions. However, in the quest to bring equality to health systems, a growing issue is whether AI can be effective in identifying the needs of underserved communities, where the prevalence of respiratory conditions is higher. Already, we have witnessed how the algorithms that AI technologies rely on can be skewed against certain populations.

The importance of building trust

In a widely reported study published in Science in 2019,1 a group of researchers at the University of California revealed that an algorithm used by US hospitals to flag high-risk patients for enhanced medical attention exhibited significant racial bias: Black patients were less likely to be flagged for additional care than white patients, even when they were equally sick. By addressing the disparities in health data that often lead to a one-size-fits-all approach, we can help bridge some of the divides in healthcare outcomes.
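The disparity the researchers uncovered is the kind of gap a simple subgroup audit can surface: at the same level of illness, are patients in different groups flagged for extra care at the same rate? The sketch below is a minimal illustration of such an audit, assuming a hypothetical dataset with a group label, a risk score, and a count of chronic conditions as a stand-in for how sick a patient is; it is not Philips code and not the algorithm examined in the study.

```python
# Illustrative subgroup audit (hypothetical data, not the algorithm from the study).
# It compares how often equally sick patients in each group are flagged for extra care.
import pandas as pd

# Assumed columns: 'group', 'risk_score', and 'n_chronic_conditions'
# (number of active chronic conditions, a rough proxy for how sick a patient is).
df = pd.read_csv("patients.csv")  # hypothetical file

# Flag the top 3% of patients by risk score, mirroring a "high-risk" referral cutoff.
cutoff = df["risk_score"].quantile(0.97)
df["flagged"] = df["risk_score"] >= cutoff

# At each level of illness burden, compare flag rates across groups.
audit = (
    df.groupby(["n_chronic_conditions", "group"])["flagged"]
      .mean()
      .unstack("group")
)
print(audit)  # Large gaps between columns at the same illness level point to possible bias.
```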


Sleep and respiratory patients could embrace the role of AI technology in their healthcare, and the platforms and systems they use, if they trust that those methods will help treat their sleep and respiratory conditions in the long run. If we get this right, we can reach patients in all corners of our communities and change millions of lives.


More from "Addressing health inequity: the experts speak"

  • Remote care: Distance may not be the biggest challenge

    Dr. Frederic Seifer discusses how remote monitoring brings patients into the health system and is transforming the doctor-patient relationship.

  • Using precision medicine to address health disparities

    In this CME/CE-approved presentation, Dr. Azizi Seixas discusses how population health approaches can help drive new ways to solve health disparities.

  • A conversation on Global health inequities: how sleep and nutrition can play a part in possible solutions

    Professor Shantha Rajaratnam and Dr. Sandro DeMaio discuss and share evidence of the importance of addressing health inequalities and how it factors into sleep health.


References:

 

1. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366:447–453. https://doi.org/10.1126/science.aax2342
