Digital Mental Health Resources

How Artificial Intelligence Could Exacerbate Inequalities in Diagnosing and Treating Mental Illness in Women and Other Patient Populations.

By: Samantha Kayne

The promise of improving mental health care with Artificial Intelligence (AI) has generated significant excitement in the industry. While research on the use of AI in treating mental illness is in its early stages, researchers are hopeful that AI can be used to analyze electronic health records, brain imaging data, and other information to better understand and treat mental illnesses.

Some scholars have raised concerns that “the perceived epistemic superiority of AI to guide clinical judgments” may further intensify structural vulnerabilities of some patient populations (McCradden, Hui & Buchman). Epistemic superiority refers to a clinician giving undue authoritative weight to AI’s findings and interpretations, while disregarding that these findings may be flawed due to unintended bias in AI algorithms.

One example of this phenomenon is in the reporting of sexual harassment. As researcher Miranda Fricker explained, sexual harassment was a concept that was not well articulated before the 1970s. If a person experienced sexual harassment in the workplace prior to 1970, they might interpret unwanted sexual advances as socially acceptable behavior (Fricker). Victims had not yet formulated the concepts necessary to understand and communicate their experience, a gap in shared interpretive resources that Fricker terms hermeneutical injustice.

Researchers have found degrees of gender discrimination in AI determinations, with evidence that stereotypes are embedded in AI algorithms (Wei & Zhou). Thus, AI-driven mental health software and devices could contain data-driven sexist bias with the potential to impede the mental health care of women (Fiske, Henningsen & Buyx). The potential use of AI systems in psychiatric clinical decision-making could exacerbate epistemic injustice (McCradden, Hui & Buchman). 

Among the flaws of relying on AI in the treatment of mental illness are algorithmic classification of psychiatric diagnoses and algorithmic predictions of who will benefit from certain treatments, both of which may unduly influence caregivers and contribute to inequity of care (McCradden, Hui & Buchman).

AI’s growing popularity in healthcare, including psychiatry, has the potential to advance care for all populations. However, we must carefully weigh the benefits of using AI against the dangers of over-reliance on its outputs in diagnosis and treatment.