Published Online: 24 September 2024

Addressing Bias and Inclusivity in AI-Driven Mental Health Care

AI-powered platforms are only as unbiased as the systems they are trained on—and the people who developed those systems.
Rapid advancements in artificial intelligence (AI) and other technologies promise to have a significant effect on the practice and delivery of mental health care. However, AI platforms are only as unbiased as the systems they are trained on, and because those systems were developed by a society in which biases around gender identity, race, ethnicity, and sexual orientation are deeply ingrained, they are not free of those biases. Psychiatrists owe it to their patients to consider who is being included in, and who is being excluded from, these advancements.
A panel on “Innovations in Mental Health Disparities and Inequities” at APA’s 2024 Mental Health Innovation Zone (MHIZ), held as part of the APA Annual Meeting in New York City earlier this year, discussed the nuances of bias within the context of increasing AI use in psychiatry. Panelists included Christina Mangurian, M.D., M.A.S., vice dean and professor of psychiatry at the UCSF School of Medicine, and Jacques Ambrose, M.D., M.P.H., M.B.A., chief clinical integration officer for the psychiatry department at Columbia University Medical Center and senior medical director at ColumbiaDoctors Psychiatry. The panel was moderated by Steven Chan, M.D., M.B.A., clinical assistant professor at the Stanford University School of Medicine and immediate past chair of APA’s Committee on Innovation.

From Input to Throughput

The panelists noted that to ensure marginalized communities are part of the conversation and have access to mental health care, multiple layers of bias embedded in the infrastructure of AI programs must be addressed, starting with the fact that training data are generated predominantly by white, educated, upper-middle-class users in industrialized nations. As a result, the input is inherently biased, and the voices and needs of people in non-Western and less affluent countries, as well as those of minority and other underrepresented populations, go uncaptured.
Bias can also be introduced during “throughput,” the step in which input data are refined after training and algorithm generation. For example, the teams behind conversational AI platforms are not necessarily diverse, which has contributed to the gender bias on display in voice assistants such as Siri, Alexa, and Google Home, whose default voices cast them as female servants. A series of tests conducted by The Washington Post in collaboration with Globalme and Pulse Labs showed that these products have a harder time recognizing nonnative accents than U.S. regional accents, whether Western, Eastern, Southern, or Midwestern. “Just the fact that a default was set just injects bias into how we use these systems,” Chan said.
Ambrose connected this bias to the stigma around discussing mental health. “When you’re using publicly available chatbots, they’re very reticent to comment on anything psychiatry-related,” he said. “I think it’s inadvertently injecting the bias of stigma surrounding mental health and how we are actually functioning on a day-to-day basis.”

Listening to Patients

Additionally, Ambrose highlighted functional biases related to visual, hearing, motor, and cognitive impairments. “If you’re thinking about a chat-based agent where you have to constantly use your hands in order to type something,” he said, “people who don’t necessarily have the manual dexterity—like people who have major or mild neurocognitive disorders, who don’t have the fine motor tuning, people who have cerebral palsy—those populations are extremely stigmatized in using this technology.”
There is also the question of how people access AI functionalities. While about 68% of the world’s population has some sort of smartphone, it is important to consider underserved rural communities, Black and brown populations, and people in non-industrialized nations who are being left out of the technological revolution in mental health; for those without a smartphone or internet access, these tools are out of reach.
The MHIZ panelists encouraged mental health providers to be part of the conversation regarding the navigation and usage of AI platforms, as well as the training of AI algorithms. Mangurian recalled co-authoring a paper some years ago on how smartphone voice assistants respond to statements about suicide and rape, noting that at the time Siri did not know how to respond when a user disclosed rape. While the interface now provides guidance, Mangurian’s example underscored the importance of having advocates for patient needs in AI advancements.
Indeed, a key point for the audience to take home was that evidence-based practices and patient needs must be prioritized over profit. In order to continue to destigmatize mental health care and prevent discrimination toward individuals experiencing mental illness, physician leaders must remain engaged with AI development. ■

Biographies

Caroline Liu, M.S., is a third-year medical student at the University of California, Davis, School of Medicine.

Information & Authors

Information

Published In

History

Published online: 24 September 2024
Published in print: October 1, 2024 – October 31, 2024

Keywords

  1. Artificial intelligence
  2. AI
  3. Mental health disparities
  4. Mental health inequities
  5. Cultural bias
  6. Christina Mangurian, M.D., M.A.S.
  7. Jacques Ambrose, M.D., M.P.H., M.B.A.
  8. Steven Chan, M.D., M.B.A.
