Tackling Bias in Health AI Systems from a Human Rights Lens

Published on 07/13/23

Speaker:
Jake Okechukwu Effoduh (He/Him)
Vanier Scholar, Osgoode Hall Law School
York University

Abstract:
The healthcare industry is witnessing an explosion of innovation, driven in part by the increasing use of artificial intelligence (AI) in healthcare contexts. Although the technology is still relatively new, there are already promising examples of AI systems improving diagnostics, treatment, and the speed of healthcare delivery. For example, some AI systems can predict disease outbreaks, cancers, and heart disease long before any signs or symptoms appear. AI is also advancing the practice of telemedicine and medical informatics and improving clinical operations, such as interpreting staining images and assisting in high-risk surgeries. Many of these innovations are unprecedented.

However, one of the biggest challenges in the use of AI for healthcare is bias: instances where the application of an AI algorithm compounds existing inequities in socioeconomic status, race, ethnic background, religion, gender, disability, sexual orientation, or other characteristics in ways that amplify discrimination or exacerbate inequities in health systems. One of the many ways this bias can arise is when an AI algorithm produces results that are systematically prejudiced because of erroneous assumptions in the machine learning process.

In this session, I will speak on bias in health AI systems. I hope to discuss how AI algorithms become biased, how those biases enter health systems, and the harmful effects of algorithmic bias in AI for healthcare. Taking a critical human rights approach, I will also explore legal and regulatory responses to bias in AI for health purposes.
