Technology in Psychiatry
Published Online: 20 October 2023

AI in Psychiatry: What APA Members Need to Know

Caution is the watchword for now in trying to use artificial intelligence for various purposes in psychiatry.
Psychiatrists have been inundated with ideas and information about how artificial intelligence (AI) is going to impact—even revolutionize—the future of psychiatry. To help members understand AI better, APA hosted a webinar on the subject in August. Here, I am going to discuss some of the material presented as well as answer questions about AI that we have received from APA members.
APA uses the term “augmented intelligence” when referring to AI to focus on AI’s assistive role in augmenting human decision-making, not replacing it. Augmented or artificial intelligence (AI) has been proposed for a variety of clinical uses: assisting with documentation, automating elements of billing and prior authorizations, detecting potential medical errors, supporting literature reviews, and more. Clinicians wonder whether the technology is already available to support these tasks and how to harness it to improve their patient care and workflows. However, generative AI and other large language models (LLMs) can also propagate biased or substandard care and pose new challenges to protecting patient privacy.
The webinar was led by me; Khatiya Moon, M.D., an assistant professor of psychiatry at Zucker Hillside Hospital and a member of APA’s Committee on Mental Health Information Technology; and Abby Worthen, APA’s deputy director of digital health. In the webinar we addressed clinical, ethical, and legal considerations for AI, specifically LLMs such as ChatGPT and Google’s Bard. Here are the main takeaways from the webinar:

Clinical Considerations

Output from AI can be misleading or incorrect. It can draw conclusions that may lead to bias-related harm.
Knowing tech sources, algorithm features, and training methods may provide some insight into the accuracy of output and what biases may exist, but this information is often not disclosed by tech companies.
New evaluation metrics and benchmarks are needed to assess generative AI performance and utility of specific models in psychiatry.
We need to educate patients on the risks of using LLMs to answer personal health questions and make clear that LLMs do not maintain confidentiality.
If AI is used to make clinical decisions, patients must be informed.

Ethical and Legal Considerations

APA urges caution in the application of untested technologies in clinical settings. Clinicians should approach AI technologies cautiously, remain aware of potential biases and inaccuracies, and ensure that they continue to comply with HIPAA in all uses of AI.
Physicians remain responsible for the care they provide and can be liable for treatment decisions they make relying on AI that result in patient harm. As such, physicians should always carefully review any output guided by AI before implementing it into a treatment plan.
Physicians should ensure that they are transparent with patients about how AI is being used in their practice, particularly if AI is acting in a “human” capacity.
Regulatory guardrails and best practices exist to protect patient privacy under HIPAA, including informed consent, data minimization, data security, and accountability. To use LLMs or generative AI, health care entities generally need to enter into business associate agreements with technology companies to safeguard protected health information.
Prompts entered into LLMs are stored on company servers and subject to the company’s privacy policy. Prompts containing private health information could be leaked or sold to third parties, compromising patient privacy.
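As a concrete illustration of the data-minimization principle mentioned above, the sketch below redacts a few obvious identifier patterns from text before it could be sent to any third-party model. The patterns here are hypothetical and deliberately incomplete; a real de-identification workflow must address all 18 HIPAA identifiers and should be validated, not assembled from a handful of regular expressions.

```python
import re

# Hypothetical, incomplete identifier patterns for illustration only.
# A production de-identifier must cover all 18 HIPAA identifier categories.
PHI_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # SSN-like numbers
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # phone-like numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email-like strings
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),   # date-like strings
]

def redact(text: str) -> str:
    """Replace obvious identifier patterns before text leaves the practice."""
    for pattern, token in PHI_PATTERNS:
        text = pattern.sub(token, text)
    return text

print(redact("DOB 3/4/1980, SSN 123-45-6789"))
```

Even with such filtering in place, free-text prompts can still contain identifying details (names, rare diagnoses, locations), which is why a business associate agreement remains the baseline safeguard rather than redaction alone.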

FAQs

Q  What are some available tools?
A  There are many LLMs available to the public. The most popular are ChatGPT, Google Bard, and Bing Chat, which is powered by GPT-4. GPT4All is an open-source ecosystem of chatbots that includes uncensored models that can run locally and offline. Some LLMs focus on medical applications, including BioBERT, ClinicalBERT, Med-BERT, and Google’s Med-PaLM 2. There are also generative AI models that can create images, video, and audio. A multitude of apps and services use generative AI to offer specific functionalities such as editing photos, creating presentation slides, summarizing journal articles, and more. Regardless of which model you try, keep the privacy considerations in mind to avoid HIPAA violations. References provided by LLMs are often fabricated, so double-check output for accuracy.
Q  How can we use AI to our advantage especially regarding documentation without violating HIPAA or patient trust?
A  While publicly available models can minimally assist with documentation, the risks of HIPAA violations and inaccurate output are too great. Entering into a business associate agreement with a company focused on developing generative AI for clinical use may offer a HIPAA-compliant way to harness the technology as it continues to improve. ■
APA members who have questions about AI may send them to [email protected].

Biographies

Darlene King, M.D., is an assistant professor in the Department of Psychiatry at UT Southwestern Medical Center, deputy medical information officer at Parkland Health, and the chair of APA’s Committee on Mental Health Information Technology. She graduated from the University of Texas at Austin with a degree in mechanical engineering prior to attending medical school and residency at UT Southwestern.

History

Published in print: November 1, 2023 – November 30, 2023

Keywords

  1. Darlene King
  2. Khatiya Moon
  3. Abby Worthen
  4. Artificial intelligence
  5. Augmented intelligence
  6. Generative AI
  7. HIPAA
  8. Protected health information
  9. Privacy
  10. ChatGPT
  11. Google Bard
  12. Bing Chat
  13. GPT-4
  14. BioBERT
  15. ClinicalBERT
  16. Med-BERT
