{"id":48806,"date":"2024-06-28T11:24:02","date_gmt":"2024-06-28T10:24:02","guid":{"rendered":"https:\/\/www.innovationnewsnetwork.com\/?p=48806"},"modified":"2024-06-28T11:28:13","modified_gmt":"2024-06-28T10:28:13","slug":"study-reveals-ai-models-that-analyse-medical-images-can-be-biased","status":"publish","type":"post","link":"https:\/\/www.innovationnewsnetwork.com\/study-reveals-ai-models-that-analyse-medical-images-can-be-biased\/48806\/","title":{"rendered":"Study reveals AI models that analyse medical images can be biased"},"content":{"rendered":"
AI models often play a role in medical diagnoses, especially when analysing medical images such as X-rays.
However, studies have found that these models don't always perform well across all demographic groups, typically performing worse for women and people of colour.
In 2022, MIT researchers reported that AI models can make accurate predictions about a patient's race from their chest X-rays, something that even the most skilled radiologists cannot do.
That research team has now found that the models that are most accurate at making demographic predictions also show the biggest 'fairness gaps': discrepancies in their ability to accurately diagnose images of people of different races or genders.
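As a rough illustration of the idea (this is not the study's own code or metric), a fairness gap can be quantified as the spread in diagnostic accuracy across demographic subgroups. The short Python sketch below uses made-up labels and group names to show one simple way to compute it.

```python
# Minimal sketch, assuming a 'fairness gap' is measured as the largest
# difference in per-group diagnostic accuracy (a simplification; the
# study itself may use other metrics such as per-group error rates).
import numpy as np

def fairness_gap(y_true, y_pred, group):
    """Return the largest pairwise difference in accuracy across groups."""
    accuracies = []
    for g in np.unique(group):
        mask = group == g
        accuracies.append(np.mean(y_true[mask] == y_pred[mask]))
    return max(accuracies) - min(accuracies)

# Hypothetical example: diagnoses for two demographic groups, A and B.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 0, 0, 0, 1, 0])
group  = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

print(fairness_gap(y_true, y_pred, group))  # 0.25: group A at 75%, group B at 50%
```

A model with a large gap of this kind may appear accurate overall while still serving some patient groups markedly worse than others.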
The findings suggest that these models may be using 'demographic shortcuts' when making their diagnostic evaluations, which can lead to incorrect results for certain groups.