Psychiatrists and mental health professionals need to proactively help shape the future of artificial intelligence (AI) as it relates to psychiatric practice—or AI may end up shaping psychiatric practice instead.
That’s the message Jina Suh, Ph.D., principal researcher in the Human Understanding and Empathy Group at Microsoft Research, brought to APA’s Board of Trustees at its March meeting in Washington, D.C. She was joined in her presentation by Tim Althoff, Ph.D., an assistant professor of computer science at the University of Washington, where he directs the Behavioral Data Science Group, which conducts research on AI, language models, and their application to mental health.
Suh described a future in which generative AI—machine learning systems, such as ChatGPT, that generate content derived from vast amounts of data—will loom large in all areas of psychiatric training and practice. But it is a future that APA and allied mental health organizations have the opportunity to help mold and direct. (At the meeting, trustees approved the Position Statement on the Role of Augmented Intelligence in Clinical Practice and Research; see box below.)
“Psychiatrists must thoughtfully and proactively envision the future of AI in mental health to support the patients and communities they serve and to train the next generation of psychiatrists with AI literacy,” Suh said.
The growth of AI-generated mental health products is projected to be enormous. A global health care consulting firm, Towards Healthcare, reported that the market value of mental health “chatbots” is estimated to surpass $6.51 billion by 2032. This growth is being driven by the shortage of mental health professionals and the demand for scalable, accessible, convenient, and affordable mental health services, Suh told trustees.
But predictably, there are many ways that AI can go wrong if it is untested or used for purposes other than those for which it was tested. A report by the Center for Countering Digital Hate found that popular AI tools generate harmful content about 41% of the time when prompted to provide information on eating disorders.
Althoff, in comments to Psychiatric News, noted that current generative AI technology makes overly simplistic assumptions and said his lab has been working to address these shortcomings. “Current AI technology too often assumes that a third person, without any psychiatric expertise, often embedded in a different socio-cultural context, can judge what is harmful,” he said. “That doesn’t make any sense, and psychiatrists have known this for a long time, which is why the best version of this technology will come from multidisciplinary teams that integrate [their] expertise.”
Suh said that APA and individual psychiatrists should collaborate with other mental health professionals and vested organizations to do the following:
• Collect and share AI failures and strategies for mitigating the possible harm those failures may cause.
• Develop guidelines for how AI is applied to psychiatry and to mental health–related products accessible to the public.
• Develop a checklist for guiding the design of chatbots.
• Develop a framework for evaluating AI safety, including long-term effects on mental health professionals and patients, especially children.
Suh explained that popular products such as ChatGPT are built on foundation models, or “general-purpose AI systems.” These are capable of a range of general tasks (such as text synthesis, image manipulation, and audio generation). Notable examples of foundation models are OpenAI’s GPT-3 and GPT-4, which underpin the conversational chat agent ChatGPT.
“Because foundation models can be built ‘on top of’ to develop different applications for many purposes, this makes them difficult—but important—to regulate,” according to the Ada Lovelace Institute, an independent AI research institute in the United Kingdom. “When foundation models act as a base for a range of applications, any errors or issues at the foundation-model level may impact any applications built on top of that foundation model.”
For these reasons, Suh said, the foundation models upon which AI applications are built are far from perfect; they are associated with “hallucinations” (unrealistic, false, or nonexistent content) and may be prone to misinformation and bias. These models require the oversight of vested professionals, including mental health professionals, who can think strategically about where, when, and how AI should be integrated into various settings.
Suh posed some questions that are ripe for the input of psychiatry:
• How can conversational data be mined by AI to improve patient-provider communication, patient understanding of diagnosis and treatment, and/or utilization of patient-generated data for personalized treatment?
• How can humans collaborate with AI to augment the therapeutic power of human therapists?
• How can AI be used to support reflective thinking by clinicians and the training of new physicians?
“There are exciting new opportunities in treatment delivery, especially when we focus on the generative capabilities that can aid in personalized brainstorming and planning, act as provocateurs to challenge thoughts or behaviors, or participate in role-playing and skills practice,” Suh told trustees.
A striking example is a simulation model that uses generative AI to provide real-time feedback to clinicians practicing dialectical behavior therapy (DBT). Suh and Althoff were coauthors of a report on the model published on arXiv, an open-access archive for scholarly articles in physics, mathematics, computer science, statistics, and other fields.
“We built a system that performs bespoke simulation and role-play and gives expert-level feedback through generative AI in the context of teaching interpersonal effectiveness skills in DBT,” Althoff told Psychiatric News.
In her remarks to the Board, Suh said AI should be applied to clinical practice selectively. “Because off-the-shelf generative AI models have demonstrated only surface-level knowledge of psychotherapy, it is important to [apply AI] strategically to select aspects of treatment rather than attempting to replace therapy or treatments entirely.”
She emphasized that the incorporation of AI into clinical workflows, whatever the setting, needs to enhance, not replace, human participation. “When considering AI innovation in clinical workflows, it is important to design for human augmentation through collaboration, reflection, or training rather than human replacement to preserve the importance of genuine human connection that is a cornerstone of psychiatry.”
The future of AI will be astonishing—in ways both exciting and possibly surprising—with interactive effects on the human mind and brain. What happens, she asked, when we have intelligence at our fingertips that completes our thoughts before they are fully formed?
“We need to anticipate and monitor short- and long-term effects of generative AI use on individuals’ cognition and mental health, including AI risks to vulnerable populations,” Suh said. “We also need to observe the impact of AI innovation in psychiatry on the psychiatric profession itself to avoid the future where mental health professionals are working on behalf of AI.”
Other Board Actions
In other business, the Board approved several recommendations from the APA Nominating Committee to increase member awareness of opportunities to serve on the Board of Trustees. These include expanding communication about elections through social media, APA’s website, and videos posted on both; working with the Nominating Committee to host workshops, webinars, Q&A sessions, and other forums; and establishing mentorship opportunities between Board members and interested APA members.
Trustees also approved the following:
• A 5% increase in member dues for 2025.
• Participation in the FDA Total Product Life Cycle (TPLC) Advisory Program (TAP). TAP is intended to help ensure that U.S. patients have access to high-quality, safe, effective, and innovative medical devices for years to come by promoting early, frequent, and strategic communications between the FDA and medical device sponsors.
• Reappointment to the APA Foundation Board of Directors for three-year terms of Michelle Durham, M.D., M.P.H., Ben Zobrist, Edmond Pi, M.D., and Monica Taylor-Desir, M.D., M.P.H., and appointment of Farha Abbasi, M.D.
• Reappointment for five-year terms of Lisa Dixon, M.D., as editor of Psychiatric Services; Kimberly Yonkers, M.D., as editor of the Journal of Psychiatric Research and Clinical Practice; and Laura Roberts, M.D., as editor in chief of APA’s Publishing Book Division. ■