Editorial
Published Online: 15 June 2023

Artificial Intelligence (AI) Psychotherapy: Coming Soon to a Consultation Room Near You?

News stories about generative artificial intelligence (AI) abound, alternately predicting that AI will cause humanity’s demise and touting it as the solution to everything from climate change to health care scarcity (1). In 2022, OpenAI’s release of ChatGPT, an easy-to-use AI interface that produces fluent text in response to prompts, intensified highly ambivalent discussions about AI. Pundits and knowledgeable experts alike marveled at ChatGPT’s ability to produce text that resembles human writing while acknowledging that the speed of its evolution was outstripping our ability to understand its risks (2). This technology performed better on law school and medical school entrance exams than many human applicants, leading to predictions that AI will eventually render some professions obsolete (3). Psychotherapists, too, seem at risk for being replaced by AI (4). Are fears of AI supplanting psychotherapists reasonable?
To explore how AI might respond to a patient in distress, I typed this prompt into ChatGPT: “I feel sad. Can you help me?” It quickly generated the following text:
I am sorry to hear that you are feeling sad. I’ll do my best to help you. It’s important to remember that I’m an AI language model and may not fully understand or empathize with your emotions, but I can offer some general suggestions and support. . . . Remember, everyone experiences sadness from time to time, and it’s okay to seek help when you need it. You don’t have to face it alone.
Although I was impressed by the speed and conversational tone of ChatGPT’s text generation, and by the convenience of receiving an immediate response at hours when a human therapist would likely be unavailable, the formulaic exchange and canned feedback it provided left me with doubts about its ability to provide genuine soothing to humans experiencing depression.
Conversational agents, or chatbots, such as Woebot, Wysa, and Koko have capabilities similar to ChatGPT’s and are widely available commercially to support mental health. These chatbots use natural language processing to converse with humans. Interactions take place via text-based messages or voice, creating a “therapist-like” dialogue with users. Rather than engaging in deep psychological exploration, however, most chatbots focus on well-operationalized tasks and structured exercises that may help individuals better manage their symptoms. Woebot, for instance, provides self-help content based on principles of cognitive-behavioral therapy (CBT). In a 2-week randomized controlled trial of Woebot conducted with young adults, depression scores improved significantly for those assigned to the intervention compared with those assigned to the control condition (5). A meta-analysis that included 32 studies of conversational agent interventions found statistically significant beneficial short-term effects, compared with control conditions, on numerous mental health domains, including depression, anxiety, distress, quality of life, and negative affect (6). No data are available on the long-term effects of these AI applications.
For individuals who may not otherwise have access to therapists, AI can boost self-management skills that improve symptoms, and chatbots may improve health outcomes at low cost and at scale. For instance, in low-resource settings where access to mental health specialists remains limited, technology-enabled interventions have been identified as possible solutions to psychotherapy scarcity (7). Thus, chatbots represent a promising option for addressing problems of access, especially when they are included as part of a comprehensive stepped care model that allows for escalation of services for those who do not achieve adequate benefit from automated programs (8). Chatbots may also be appealing to some individuals with anxiety about talking to other humans (9), providing them with a safe space to build skills and confidence before (hopefully) moving to interpersonal support.
AI-enabled platforms are not without risk, however. For instance, a large study of individuals with frequent suicidal ideation (N=18,882) found that assignment to an interactive, online dialectical behavior therapy intervention was associated with significantly increased risk for self-harm compared with usual care (10). Legal scholars and ethicists have raised concerns about the use of AI in health care settings, highlighting the need to proactively address data privacy, cybersecurity, health equity, and patient safety (2). Regulators recognize the growing need for careful vetting of proposed interventions, calling attention to the need for objective assessment of technology-enabled mental health interventions before approval for consumer use (11). Current generations of AI are also notoriously unreliable and have a penchant for inaccuracies and “hallucinations” (12).
Studies of self-guided mental health interventions consistently show that at least a modicum of human contact is essential for successful user engagement and efficacy (13, 14). Thus, future interventions will likely adopt blended approaches that include both human coaching and AI technology (15). Interestingly, as with conventional psychotherapy (16), nonspecific psychotherapy processes such as empathy and accountability are more important to outcomes with conversational agents than are specific factors such as CBT exercises (5, 6). Perhaps not surprisingly, automated chatbots like Wysa, described as an “AI-based emotionally intelligent mobile chatbot,” are free to use, but additional fees are required to access a human coach (17). When Koko, an online emotional support service, substituted AI-generated text-based responses for human responses, users felt betrayed. The company’s cofounder subsequently acknowledged that “simulated empathy feels weird, empty” (18).
Conversational agents and other AI-informed applications seem best suited to administer personalized psychoeducation and skills-focused activities. For deeper, more exploratory work, AI cannot replace its human progenitors. Psychotherapies that involve therapeutic engagement with complex interpersonal dynamics and an in-depth exploration of emotions, interpersonal relationships, and unconscious conflicts are unlikely to be meaningfully delivered by AI. Thus, humans will continue to be essential to psychotherapy delivery, at least for the foreseeable future.
But AI is fast and convenient, so I asked it to help me wrap up this editorial. When I directed ChatGPT to briefly review the benefits and risks of AI for psychotherapy, it obligingly summarized them as follows:
AI in psychotherapy has pros and cons. On the positive side, it offers increased accessibility, anonymity, consistency, and scalability. It can provide support 24/7, reduce stigma, deliver standardized care, and reach more people. However, it lacks human connection, struggles with contextual understanding, raises ethical and privacy concerns, and may be inflexible or biased. While AI has potential, careful attention is needed to address these limitations and ensure its responsible and effective use in psychotherapy. It is crucial that individuals have access to high-quality mental health care provided by trained and licensed human therapists.
Not bad, ChatGPT! For most of us, AI technology will increasingly be useful for mundane tasks like drafting closing paragraphs. It may also be helpful in teaching self-management skills to individuals in underresourced areas. However, AI applications are unlikely to replace humans as psychotherapists.

References

1.
Chubb J, Reed D, Cowling P: Expert views about missing AI narratives: is there an AI story crisis? AI Soc (Epub Aug 25, 2022)
2.
McGreevey JD III, Hanson CW III, Koppel R: Clinical, legal, and ethical aspects of artificial intelligence–assisted conversational agents in health care. JAMA 2020; 324:552–553
3.
Haupt CE, Marks M: AI-generated medical advice—GPT and beyond. JAMA 2023; 329:1349–1350
4.
Griffin E: My weekend with an emotional support AI companion. New York Times, 2023. https://www.nytimes.com/2023/05/03/technology/personaltech/ai-chatbot-pi-emotional-support.html
5.
Fitzpatrick KK, Darcy A, Vierhile M: Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health 2017; 4:e19
6.
He Y, Yang L, Qian C, et al: Conversational agent interventions for mental health problems: systematic review and meta-analysis of randomized controlled trials. J Med Internet Res 2023; 25:e43862
7.
Hoeft TJ, Fortney JC, Patel V, et al: Task-sharing approaches to improve mental health care in rural and other low-resource settings: a systematic review. J Rural Health 2018; 34:48–62
8.
Mak WWS, Ng SM, Leung FHT: A Web-based stratified stepped care platform for mental well-being (TourHeart+): user-centered research and design. JMIR Form Res 2023; 7:e38504
9.
Vaidyam AN, Wisniewski H, Halamka JD, et al: Chatbots and conversational agents in mental health: a review of the psychiatric landscape. Can J Psychiatry 2019; 64:456–464
10.
Simon GE, Shortreed SM, Rossom RC, et al: Effect of offering care management or online dialectical behavior therapy skills training vs usual care on self-harm among adult outpatients with suicidal ideation: a randomized clinical trial. JAMA 2022; 327:630–638
11.
Kahane K, François J, Torous J: The digital health app policy landscape: regulatory gaps and choices through the lens of mental health. J Ment Health Policy Econ 2021; 24:101–108
12.
Metz C: Why do AI chatbots tell lies and act weird? Look in the mirror. New York Times, 2023. https://www.nytimes.com/2023/02/26/technology/ai-chatbot-information-truth.html
13.
Cuijpers P, Donker T, Johansson R, et al: Self-guided psychological treatment for depressive symptoms: a meta-analysis. PLoS One 2011; 6:e21274
14.
Swartz HA, Rollman BL, Mohr DC, et al: A randomized pilot study of Rhythms And You (RAY): an Internet-based program for bipolar disorder administered with and without clinical helper support in primary care. J Affect Disord
15.
Lungu A, Jun JJ, Azarmanesh O, et al: Blended care–cognitive behavioral therapy for depression and anxiety in real-world settings: pragmatic retrospective study. J Med Internet Res 2020; 22:e18723
16.
Laska KM, Gurman AS, Wampold BE: Expanding the lens of evidence-based practice in psychotherapy: a common factors perspective. Psychotherapy 2014; 51:467–481
17.
Inkster B, Sarda S, Subramanian V: An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: real-world data evaluation mixed-methods study. JMIR mHealth uHealth 2018; 6:e12106
18.
Ingram D: A mental health tech company ran an AI experiment on real users. Nothing’s stopping apps from conducting more. NBC News, 2023. https://www.nbcnews.com/tech/internet/chatgpt-ai-experiment-mental-health-tech-app-koko-rcna65110

Information & Authors

Information

Published In

American Journal of Psychotherapy
Pages: 55 - 56
PubMed: 37317570

History

Published in print: June 01, 2023
Published online: 15 June 2023

Keywords

  1. Psychotherapy
  2. Artificial intelligence
  3. Health policy
  4. AI language model
  5. Conversational agents

Authors

Details

Holly A. Swartz, M.D. [email protected]
Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh.

Notes

Send correspondence to Dr. Swartz ([email protected]).
