As the digital revolution finally reaches the mental health clinic, we can point to actual examples of modern information technology ready to improve mental health care. The Veterans Health Administration is implementing population-based outreach to prevent suicide, driven by machine learning–derived risk prediction models (1). The Food and Drug Administration recently approved the first “prescription digital therapeutic,” a mobile phone app officially endorsed as safe and effective treatment for substance use disorders. Computerized voice processing may soon be capable of providing real-time feedback to psychotherapists regarding quality of treatment and therapeutic alliance (2).
Excitement about the potential of new information technologies, however, has sometimes focused more on the high-tech tools than on the problems we hope they can solve. Focusing on the technology, we might orient around finding new mental health applications for eHealth or machine learning rather than around unmet treatment needs or gaps in mental health care. In our excitement about new technology, we can become too much like the young child with a shiny new hammer who’s looking to pound anything that might be a nail.
If we hope to find the maximum benefit from exciting new tools, we should first identify the jobs that need doing. As readers of this journal are well aware, we need not look far to identify important areas of unmet need or “pain points” in our delivery of services for mental or substance use disorders.
Recent work on suicide risk prediction illustrates the relationship between jobs and tools. The unmet need is clear; rates of suicide continue to increase, and traditional clinical assessment is hardly better than chance for identifying people at highest risk. This could be a job for machine learning or artificial intelligence tools. If we hope to inform a population-based outreach program, then we would start with a case-control design comparing all people who attempt or die by suicide with a control group drawn from the same at-risk population (3). We would use the resulting prediction models to identify the highest-risk individuals in the population. If we hope to deliver accurate individual risk predictions to clinicians at the point of care, then we would start with a cohort design including all types of visits for which we hope to deliver predictions (4). We would use the resulting prediction models to estimate risk at future visits. In either case, we might use any of several machine learning tools, such as penalized or regularized parametric models, decision tree–based models, or deep learning models. Available evidence suggests that those model development tools have generally similar performance for this specific job. No amount of artificial intelligence, however, can replace good clinical epidemiology, which entails clearly identifying an aim or question and matching that aim or question to the appropriate research design. In neither of the above scenarios would we use machine learning or artificial intelligence tools to simply confirm long-established risk factors for suicidal behavior. We don’t need a more complicated, more expensive, and less transparent tool for that relatively simple job.
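To make the distinction concrete, here is a minimal sketch, for illustration only, of how one of those model families (an L1-penalized logistic regression) might be fit under the visit-level cohort design described above. The data are synthetic and the predictor names are hypothetical; this is not the pipeline behind the models cited in references 3 and 4.

# Minimal, hypothetical sketch (synthetic data, invented predictor names):
# fitting an L1-penalized logistic regression to predict suicide attempt
# within 90 days of an outpatient visit, treating every visit as a
# prediction point, as a cohort design would.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_visits = 50_000

# Hypothetical visit-level predictors that might come from the health record.
X = np.column_stack([
    rng.integers(0, 2, n_visits),   # prior self-harm diagnosis (0/1)
    rng.integers(0, 2, n_visits),   # depression diagnosis at the visit (0/1)
    rng.poisson(1.0, n_visits),     # mental health visits in the past year
    rng.normal(15, 6, n_visits),    # PHQ-9 score recorded at the visit
])

# Synthetic outcome: attempt within 90 days, generated from a known model
# purely so the example runs end to end.
logit = -6.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.2 * X[:, 2] + 0.05 * X[:, 3]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# One of the "penalized or regularized parametric models" named above.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(X_train, y_train)

# Discrimination on held-out visits; in practice, calibration and usefulness
# at the point of care matter as much as the AUC.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.3f}")

The same scaffolding would change very little if a tree-based or deep learning model were swapped in; the design choices that matter most are made before any model is fit: which visits count as prediction points, which outcome window defines a case, and which predictors could plausibly be available at the moment of prediction.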
A philosophy of user-centered design (5) begins the development of any product or service with understanding the needs, priorities, preferences, and constraints of end users. This approach is most often applied to the development of interventions, especially technology-enabled interventions. But it is equally applicable to the development of prediction models or computerized decision support systems. Development of a prediction model would begin with questions such as: Which decision maker do I aim to help? What decision(s) does this person need to make and when/where must the decision be made? What information could be available to inform those decisions? What is the most helpful way to deliver any recommendation or prediction?
Unfortunately, user engagement often comes late in the process of developing new mental health (or general health) technologies. Rather than ask what consumers or patients might want or need, late-stage user engagement focuses on how to entice consumers or patients to use a tool we’ve already developed. User engagement regarding the look and feel of a product or tool is fundamentally different from user engagement regarding what product is actually needed. To continue the hammers-and-nails analogy, late-stage user engagement is akin to fine-tuning the shape of the hammer rather than asking what (if anything) needs to be built.
We should also remember that technology trends change nearly as fast as the seasons and that human nature evolves quite slowly. If we had been designing “digital therapeutics” as recently as 2015, we would have been focused on the exciting potential of Google Glass. At the height of the Bitcoin craze, anything involving Blockchain would have seemed unstoppable—no matter how irrelevant Blockchain technology was to the problem at hand. Adding “Blockchain” to the name of the Long Island Iced Tea Corporation boosted the stock price by 200%, followed soon by a crash and a federal investigation. Those are instructive cautionary tales about technology fads. In contrast, our human problems with fear, hopelessness, impulsivity, apathy, or distraction are likely to be just as important generations from now.
Although I am old enough to remember when computers filled whole rooms and were controlled by punch cards, I am definitely not a Luddite. I spend much of my time developing and improving machine learning tools to identify people at risk for suicide. And I have helped develop and test eHealth interventions for dialectical behavior therapy and effective self-management of bipolar disorder. I am genuinely excited about the potential for artificial intelligence to support better human decision making and the potential for eHealth and mHealth interventions to deliver empirically supported psychosocial treatments when and where they are actually needed. But I hope to stay focused on the unmet needs themselves: inconsistent clinical decision making and the limited current reach of empirically supported psychosocial treatments. I should have no allegiance to any specific tools—only to finding the right tools to solve those problems.
We will focus our attention in the right places if we remember who we work for. We call our work psychiatric services (or mental health services) because we aim to serve people who live with mental health conditions—and their families and caregivers.
I’m sure the manager of my local home improvement store would prefer that I start my weekend shopping for new tools rather than first identifying which jobs I need to do. If I were serving the home improvement store, I’d end my weekend with a basement full of shiny new tools and many jobs left undone. Let’s make sure that the jobs rule over the tools and not the other way around.

References

1. Reger GM, McClure ML, Ruskin D, et al: Integrating predictive modeling into mental health care: an example in suicide prevention. Psychiatr Serv 2019; 70:71–74
2. Imel ZE, Caperton DD, Tanana M, et al: Technology-enhanced human interaction in psychotherapy. J Couns Psychol 2017; 64:385–393
3. Kessler RC, Hwang I, Hoffmire CA, et al: Developing a practical suicide risk prediction model for targeting high-risk patients in the Veterans Health Administration. Int J Methods Psychiatr Res 2017; 26(3)
4. Simon GE, Johnson E, Lawrence JM, et al: Predicting suicide attempts and suicide deaths following outpatient visits using electronic health records. Am J Psychiatry 2018; 175:951–960
5. Lyon AR, Bruns EJ: User-centered redesign of evidence-based psychosocial interventions to enhance implementation—hospitable soil or better seeds? JAMA Psychiatry (Epub ahead of print, Nov 14, 2018)

Information

Published in: Psychiatric Services, pages 642–643
PubMed: 31138055

History

Received: 29 April 2019
Accepted: 29 April 2019
Published online: 29 May 2019
Published in print: 1 August 2019

Keywords

  1. Suicide & self-destructive behavior
  2. Computer technology

Authors

Gregory E. Simon, M.D., M.P.H. [email protected]
Kaiser Permanente Washington Health Research Institute, Seattle.

Notes

Send correspondence to Dr. Simon ([email protected]).
