Technology in Mental Health
Published Online: 10 October 2018

Integrating Predictive Modeling Into Mental Health Care: An Example in Suicide Prevention

Abstract

Recent advances in statistical methods and computing power have improved the efficiency and accuracy with which risks associated with mental illness can be predicted. However, integrating statistical prediction into a clinical setting poses new challenges that need creative solutions. A case example explores the challenges and innovations that emerged at a Department of Veterans Affairs hospital while implementing REACH VET (Recovery Engagement and Coordination for Health—Veterans Enhanced Treatment), a suicide prevention program that is based on a predictive model that identifies veterans at statistical risk for suicide.
The advent of predictive modeling offers a novel tool to supplement clinical judgment when identifying risks associated with mental illness. Recent advancements in computing power and statistical techniques offer a unique opportunity to model complex processes with higher accuracy (1). New research is exploring how predictive models can improve treatment for depression (2), schizophrenia (3), and many other conditions. Predictive models offer several notable benefits, including efficiently integrating data that may span years or come from multiple sources, using objective data that do not rely on human subjectivity, and including risk factors that may offer little clinical insight in isolation (e.g., gender). Predictive modeling is particularly suited for large health care systems where patients interact with multiple providers in a variety of ways and where integrated electronic patient records are used.
The general challenge of reliably identifying psychiatric conditions is exacerbated when it comes to identifying suicide risk. Clinicians cannot predict suicide with any acceptable degree of accuracy (4). Despite significant research conducted on suicide screening and assessment, systematic reviews suggest that predictive abilities have not improved in the past 50 years of research and the likelihood of an accurate prediction remains close to chance (5). To address this challenge, many organizations are now working to develop predictive models for suicide risk. This opportunity has been broadly recognized, and several groups are developing advanced statistical procedures to analyze electronic health records (6). In fact, the National Action Alliance for Suicide Prevention has prioritized such efforts (7).
Although the potential benefits of a predictive model are clear, the mental health field does not yet have a set of “best practices” for implementing the results from predictive modeling in clinical practice. Once a predictive model is established, mental health systems are left with the task of intervening with individuals who may not actually be at risk (because predictive models are imperfect), and, depending on the goals of the program, many individuals identified by the model will not have self-identified as needing intervention. The clinical use of results from statistical models is therefore challenging.
In this column, we describe local implementation of a predictive model for suicide in a large Department of Veterans Affairs (VA) health care system to identify the clinical and administrative challenges of implementing such a model in a real-world clinical setting and to illustrate the solutions that were developed.

Implementation Example

A large, local VA health care system implemented an intervention for veterans identified as “high-risk for suicide” by a predictive model that analyzes data from medical records. The implementation described below is intended to represent a single example of how a predictive model was integrated into a hospital setting and does not represent the broader national VA implementation of the program.
The VA, in collaboration with the National Institute of Mental Health, developed the predictive model that identifies veterans at statistical risk for suicide (8). The statistical details of the model have been previously described (8, 9). Briefly, an initial development and validation study was conducted (8), followed by retesting and a comparison of the potential benefits of a variety of machine learning approaches (9). The final statistical model uses penalized logistic regression with 61 variables collected from electronic health records that cover a variety of risk factors for suicide (e.g., demographic characteristics, mental health diagnoses, prior suicide attempts, and medications) (9). In the validation study that examined risk over one year (8), the predictive model identified a group with suicide rates 30 to 60 times higher than the rate in the overall sample. Because only 31% of the veterans identified in the highest risk group had been flagged by clinicians as “high-risk for suicide” through the VA’s medical records flag system, the model identified additional veterans at risk (8). In addition, veterans at high risk for suicide were also at elevated risk for a variety of other adverse outcomes, including other external causes of mortality and all-cause mortality.
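To make the modeling approach concrete, the sketch below fits a penalized (L2-regularized) logistic regression to a table of patient-level predictors and a binary outcome and then reports discrimination on held-out data. The input file, column names, penalty choice, and tuning values are hypothetical placeholders for illustration; they are not the variables or settings of the model described in references 8 and 9.

```python
# Minimal sketch of a penalized logistic regression risk model trained on
# EHR-derived predictors. All file names and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical extract: one row per patient, a binary outcome column, and
# predictor columns (demographic characteristics, diagnoses, prior attempts, medications).
df = pd.read_csv("ehr_extract.csv")
outcome = "suicide_death_12mo"
predictors = [c for c in df.columns if c not in ("patient_id", outcome)]

X_train, X_test, y_train, y_test = train_test_split(
    df[predictors], df[outcome], test_size=0.2, stratify=df[outcome], random_state=0
)

# L2 ("ridge") penalization is one common form of penalized logistic regression;
# the penalty strength C here is an arbitrary placeholder, not a tuned value.
model = LogisticRegression(penalty="l2", C=1.0, class_weight="balanced", max_iter=1000)
model.fit(X_train, y_train)

# Discrimination on held-out data; predicted probabilities serve as risk scores.
risk_scores = model.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, risk_scores):.3f}")
```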
A national prevention program known as REACH VET (Recovery Engagement and Coordination for Health—Veterans Enhanced Treatment) was rolled out to local VA facilities to make use of the predictive model results. The predictive model produces a monthly list of veterans who are identified as being at high risk for suicide. At program launch, each VA facility was asked to name a local coordinator responsible for engaging providers and making them aware of patients identified as “high risk,” helping providers understand and choose next steps, and answering staff and veteran questions about the program. A Web-based dashboard was developed to identify local REACH VET patients and to help track required suicide prevention steps. Once veterans are identified as high risk, guidance from the national REACH VET program includes a review of the treatment plan (e.g., frequency of mental health appointments, use of evidence-based treatments, and reevaluation of pharmacotherapy) and evaluation of any treatment enhancements that might reduce risk (e.g., creating a safety plan that details coping strategies and social and professional supports available during a crisis, increased monitoring of stressful life events, peer support, telehealth, and periodic caring letters to at-risk patients). Providers are asked to reach out to veterans to offer support, assess risk, and collaboratively consider treatment changes.
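The monthly identification step can be pictured as scoring the current patient population and keeping only the highest-scoring tier for coordinator and provider review. The sketch below uses a hypothetical top 0.1% cutoff, illustrative column names, and per-facility CSV output; the actual REACH VET threshold, data pipeline, and dashboard are not reproduced here.

```python
# Sketch of a monthly scoring run that flags the highest-risk tier for review.
# The 0.1% cutoff, column names, and per-facility CSV output are illustrative
# assumptions, not the actual REACH VET pipeline.
import pandas as pd

def flag_top_tier(scored: pd.DataFrame, fraction: float = 0.001) -> pd.DataFrame:
    """Return rows whose risk score falls in the top `fraction` of the population."""
    cutoff = scored["risk_score"].quantile(1.0 - fraction)
    return scored[scored["risk_score"] >= cutoff].sort_values("risk_score", ascending=False)

# Hypothetical monthly scoring output: one row per patient currently in care.
scored = pd.read_csv("monthly_scores.csv")  # columns: patient_id, facility, risk_score
flagged = flag_top_tier(scored)

# One list per facility for the local coordinator's dashboard review.
for facility, group in flagged.groupby("facility"):
    group.to_csv(f"reach_vet_list_{facility}.csv", index=False)
```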

Challenge 1: coordinator identification.

In line with the nationwide rollout of REACH VET, the first task was to identify a coordinator. The time requirement for this role was estimated at 20% of a full-time position. The initial challenge was to specify the knowledge, skills, and abilities needed for an effective coordinator and to identify an available staff member who best met the specified criteria. We ultimately selected a nurse in our same-day access mental health clinic on the basis of a combination of factors, including the provider’s significant institutional knowledge, interpersonal and communication skills, experience with acute mental health care, and occupational background, which included a history of successfully engaging both prescribing and nonprescribing mental health providers.
The REACH VET coordinator’s style of communicating with providers and dedication to strong teamwork were instrumental to the program’s early success. For instance, the REACH VET program often identified patients who were already known to be at risk. In these cases, some providers understandably wondered how the program might improve care. Accordingly, the coordinator routinely validated high-quality care and substantial efforts to engage veterans when these were already evident in the medical record. She was effective at developing rapport with providers, partnering with them to work through the program’s steps and to consider possible treatment enhancements. The coordinator reviewed the program’s Web-based dashboard simultaneously with the provider and explained the program and associated documentation requirements in a collaborative manner.

Challenge 2: staff preparations and communication.

There are many demands placed on providers in every health care setting, and the VA facility described here is no different. We anticipated diverse reactions from providers upon notification that their patient was judged to be at statistical risk by a VA algorithm. To increase the likelihood of collaborative responses, several actions were taken. First, the REACH VET coordinator briefed the facility’s mental health leadership team on the program, emphasizing its innovative nature and potential for success. The facility’s director of mental health encouraged supervisory support of the initiative and requested staff support. Briefings on the program were also given at facility mental health staff meetings, and the REACH VET coordinator authored an article in the mental health service newsletter describing the program and emphasizing the promise of such innovations for reducing veteran suicide. Efforts were made to ensure that all providers in this large health care system were aware of the program and its leadership support before their first contact with it.
In addition, technology was important in both communicating with providers and minimizing provider burden. Providers were alerted to veterans identified by REACH VET by encrypted e-mail. Internal, secure instant-messaging software was available through remote access, enabling efficient communication between the coordinator and provider and permitting different parties to share a view of their computer desktop for synchronous education and problem solving on the REACH VET dashboard. The enterprise electronic medical record enabled the coordinator and provider to immediately review the veteran’s treatment plan and care received. Technology was critical, not only to the statistical side of the program but to the implementation tasks as well.

Challenge 3: intervention.

A key issue for implementation was the selection of interventions and actions providers should consider when one of their patients is identified by the statistical model.
Although the national REACH VET program had helpful recommendations for interventions, we needed to develop specific guidance that could be used by mental health providers when reviewing a medical chart. A local team of mental health leaders collaborated to discuss potential clinical actions and best practices, acknowledging that no list would fit every situation and that providers would need to select appropriate actions on the basis of their clinical judgment. The list of potential actions included a range of potentially useful considerations, including referral for more intensive therapy services, creating a new or updated safety plan, providing information about the Veterans Crisis Line, and referral for intensive case management.

Challenge 4: ethical considerations.

Although REACH VET was mandated nationally for VA, our facility sought to ensure that our implementation was thoughtful and considered applicable ethical principles, given that new suicide prevention efforts were being initiated based on results from this predictive model. A team that included our integrated ethics program officer and psychologists who have published papers on mental health ethical issues considered the implementation of suicide prevention programs that are based on statistical prediction. The team analyzed the issues and developed recommendations (10). In general, the recommendations emphasize careful safety planning, staff training, ensuring that use of patient data complies with applicable law and regulations, and acknowledging that additional research is needed to guide providers in how to discuss suicide prediction results with patients (10).

Challenge 5: documentation.

One need identified early was a standardized approach to documentation. A REACH VET note template for our electronic medical record was prioritized, not only to ensure responsible documentation but also to provide a simple means of cuing providers on potentially useful actions to consider. Therefore, we elected to embed within all REACH VET note templates an itemized list of potential clinical responses. We believed that recommending a list of potential actions would help mitigate provider anxiety and uncertainty and facilitate adoption of the REACH VET program. The process of creating and revising templates was iterative throughout the program rollout.
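As a rough illustration, a note template of the kind described here can be generated as boilerplate text with an embedded checklist of candidate responses. The actions below are drawn from the list discussed under Challenge 3; the template text itself is a generic sketch, not the VA note template.

```python
# Illustrative sketch of a note template that embeds an itemized list of
# candidate clinical responses; not the actual VA note template.
POTENTIAL_ACTIONS = [
    "Referral for more intensive therapy services",
    "New or updated safety plan",
    "Veterans Crisis Line information provided",
    "Referral for intensive case management",
    "Other action based on clinical judgment (describe below)",
]

def build_note_template(actions=POTENTIAL_ACTIONS) -> str:
    """Return template text with a checkbox for each potential clinical response."""
    lines = ["REACH VET treatment review", "", "Potential clinical responses considered:"]
    lines += [f"[ ] {action}" for action in actions]
    lines += ["", "Narrative and plan:"]
    return "\n".join(lines)

print(build_note_template())
```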

Initial Metrics and Results

At the time of writing, 313 veterans in our health care system had been identified as at high risk for suicide by the statistical model. Providers considered additional prevention services for 100% of these patients. In 66% of cases (N=207), the provider elected to reach out to the veteran as part of the treatment review. Interpretation of this statistic is challenging, because many patients may have had upcoming appointments. Interestingly, in only 7% of cases (N=22) did providers document that they informed the veteran that he or she had been identified as being at high statistical risk for suicide or other adverse outcomes as part of that outreach. It is possible that this step was not specifically documented, or providers may be unsure how to communicate the complex meaning of statistical risk to veterans. Alternatively, providers may have believed that sharing this information was clinically contraindicated.
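For readers checking the denominators, the percentages above are simple fractions of the 313 identified veterans; a brief calculation using the counts reported in this column is shown below.

```python
# Quick check of the proportions reported above, using counts from this column.
identified = 313   # veterans flagged by the predictive model at our facility
outreach = 207     # cases in which the provider reached out to the veteran
disclosed = 22     # cases with documented disclosure of the risk identification

print(f"Outreach: {outreach / identified:.0%}")                # approximately 66%
print(f"Disclosure documented: {disclosed / identified:.0%}")  # approximately 7%
```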
One of the beneficial outcomes of our local efforts was that the national REACH VET program subsequently modified and adopted the standardized note template described above for use across VA facilities nationwide.

Conclusions and Recommendations

Our facility learned important lessons during the implementation of REACH VET and established some best practices that were later adopted nationally. Rapid nationwide implementation of an innovative suicide prevention effort is a complex endeavor. One key to local implementation was the national REACH VET leadership and facilitator team, which provided critical support to regional coordinators. The national REACH VET team was highly responsive to coordinators’ e-mails and calls, rapidly adapted the program based on feedback, and efficiently fixed technical problems. It is difficult to overstate the positive impact of the national team on our local implementation.
Several lessons from the implementation may apply broadly to mental health programs seeking to implement interventions that are based on predictive modeling results. Without effective program leadership at the top, our local implementation would not have been as successful. Additional keys to success included local leader prioritization of the program, clear communication with and preparation of the clinical staff before rollout, and dedicated and protected time for the coordinator. Local implementation often requires ongoing innovation, and our initial program data suggest areas for additional research, including whether and how providers should explicitly discuss suicide prediction results with patients. Additional research should address both clinical outcomes (e.g., effect of the intervention on suicide) and implementation outcomes (e.g., fidelity of the intervention).

Acknowledgments

This column is the result of work supported with resources and the use of facilities at the VA Puget Sound Health Care System, located in Seattle and Tacoma, Washington. The contents do not represent the views of the U.S. Department of Veterans Affairs or the U.S. government.

References

1. Darcy AM, Louie AK, Roberts LW: Machine learning and the profession of medicine. JAMA 2016; 315:551–552
2. Chekroud AM, Zotti RJ, Shehzad Z, et al: Cross-trial prediction of treatment outcome in depression: a machine learning approach. Lancet Psychiatry 2016; 3:243–250
3. Mikolas P, Hlinka J, Skoch A, et al: Machine learning classification of first-episode schizophrenia spectrum disorders and controls using whole brain white matter fractional anisotropy. BMC Psychiatry 2018; 18:97
4. Hughes DH: Can the clinician predict suicide? Psychiatr Serv 1995; 46:449–451
5. Franklin JC, Ribeiro JD, Fox KR, et al: Risk factors for suicidal thoughts and behaviors: a meta-analysis of 50 years of research. Psychol Bull 2017; 143:187–232
6. Kessler RC, Warner CH, Ivany C, et al: Predicting suicides after psychiatric hospitalization in US Army soldiers: the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). JAMA Psychiatry 2015; 72:49–57
7. National Action Alliance for Suicide Prevention, Research Prioritization Task Force: A Prioritized Research Agenda for Suicide Prevention: An Action Plan to Save Lives. Rockville, MD, National Institute of Mental Health and the Research Prioritization Task Force, 2014
8. McCarthy JF, Bossarte RM, Katz IR, et al: Predictive modeling and concentration of the risk of suicide: implications for preventive interventions in the US Department of Veterans Affairs. Am J Public Health 2015; 105:1935–1942
9. Kessler RC, Hwang I, Hoffmire CA, et al: Developing a practical suicide risk prediction model for targeting high-risk patients in the Veterans Health Administration. Int J Methods Psychiatr Res 2017; 26(3) (Epub July 4, 2017)
10. Tucker RP, Tackett MJ, Glickman D, et al: Ethical and practical considerations in the use of a predictive model to trigger suicide prevention interventions in healthcare settings. Suicide Life Threat Behav (Epub ahead of print, Jan 18, 2018)

Published In

Psychiatric Services
Pages: 71–74
PubMed: 30301448

History

Received: 17 May 2018
Revision received: 9 July 2018
Accepted: 7 August 2018
Published online: 10 October 2018
Published in print: January 01, 2019

Keywords

  1. Computer technology
  2. Suicide
  3. Self-destructive behavior
  4. veterans
  5. predictive models

Authors

Affiliations

Greg M. Reger, Ph.D., M.A. [email protected]
Mary Lou McClure, R.N., B.S.N.
David Ruskin, M.D.
Sarah P. Carter, Ph.D.
Mark A. Reger, Ph.D.

Mental Health Service, Veterans Affairs Puget Sound Health Care System, Seattle/Tacoma, Washington (all authors); Psychiatry and Behavioral Sciences, University of Washington School of Medicine, Seattle (G. Reger, Ruskin, M. Reger).

Notes

Send correspondence to Dr. G. Reger ([email protected]). Dror Ben-Zeev, Ph.D., is editor of this column.
