Measurement-based care (MBC), the systematic use of patient-reported outcome measures to inform treatment, is associated with improved treatment outcomes and guideline concordance (1–3) as well as with increased patient communication and engagement in care (4, 5). Furthermore, MBC has been shown to improve the cost-effectiveness and efficiency of care in both pharmacotherapy (1, 3, 6) and psychotherapy (2, 7, 8). As a result, multiple organizations have promoted MBC as a worthy goal for implementation (7, 9, 10). Despite its many benefits and widespread enthusiasm for its implementation, MBC is underused (11–14) and has been challenging to implement (9, 15). Multiple barriers to MBC have been documented, including lack of informatics tools, lack of training on its use, lack of incentives (such as performance metrics), providers' perceptions of burden, insufficient time in session, and lack of alignment with patient goals (7, 11, 13–19).
In addition to barriers to MBC implementation experienced across all mental health care settings, clinicians integrated into primary care use brief appointments to remain available for warm handoffs from primary care providers (20, 21) and therefore face heightened in-session time constraints. These constraints may make MBC especially challenging to conduct in integrated primary care settings, such as the U.S. Department of Veterans Affairs' (VA's) Primary Care Mental Health Integration (PCMHI) program. VA expects that all PCMHI programs will include both embedded mental health providers and collaborative care management (22), the latter being inherently measurement based (23, 24). However, although MBC was encouraged in PCMHI training events, no clear expectation for its use emerged until 2017, when VA began requiring national PCMHI competency training for all PCMHI staff (21). At the end of the training period, all providers and care managers must demonstrate their competency, including by explaining MBC and using results from patient-reported outcome measures (PROMs) with patients in a 30-minute appointment. For many PCMHI staff, participation in this training was the first time they had considered using PROMs with patients. For example, a 2015 study reported that only 23% of 8,000 PCMHI patient records showed evidence of use of a PROM (25); clear linkage of PROM use to treatment decisions was found in only 8.5% of charts reviewed (26). Thus, MBC remains a relatively new and potentially challenging practice for many PCMHI providers, and implementation support may be needed to overcome barriers and achieve full implementation of this complex, novel practice.
To support the uptake of MBC in all mental health care settings, VA has been rolling out a phased national MBC initiative for mental health since 2015 (9). VA defines MBC as consisting of three essential components: collect, share, and act. Collect involves use of standardized self-report measures administered repeatedly to track treatment progress. Share involves sharing data with patients and other providers. Act is defined as using data to engage veterans in shared decision making to individualize goals, collaboratively develop treatment plans, assess progress over time, and adjust treatment as appropriate. The first phase, which ended in 2017, consisted of identifying volunteer champion sites and supporting them through provision of a national dashboard, education materials, didactics, and minimal coaching. Champion sites were encouraged to implement MBC at the start of and repeatedly throughout an episode of care by using four PROMs: the Patient Health Questionnaire–9 (PHQ-9) (27), the Generalized Anxiety Disorder–7 (GAD-7) instrument (28), the PTSD Checklist for DSM-5 (PCL-5) (29), and the Brief Addiction Monitor (BAM) (30). PROM results were expected to be recorded in the electronic medical record (EMR). All other details of MBC implementation were left to the sites' discretion. The second phase of the initiative began in 2018 and focused on spreading MBC implementation resources through a community of practice. The third phase began in 2020, when the initiative formed field-based work groups for various clinical settings (e.g., substance use disorder and posttraumatic stress disorder treatments) that developed recommendations for setting-specific national MBC policies. Although early results were promising (9), VA's implementation efforts were further challenged by the onset of the COVID-19 pandemic. In response to the pandemic, most of the VA mental health workforce moved to provision of telephone- or video-based care by April 2020, requiring significant adaptations in all areas of health care, including the novel and complex practice of MBC.
Implementation facilitation is an interactive process of problem solving and support (31) that has been widely used, especially in primary care settings, to address contextual challenges and promote successful implementation of complex evidence-based practices and programs (32, 33). Implementation facilitators help stakeholders identify and address implementation barriers and leverage site strengths to maximize the potential for successful implementation. We conducted a large project that sought to understand whether and how an intensive facilitation strategy could effectively support implementation of MBC in VA PCMHI care (34). In this study, we aimed to compare the effectiveness of facilitated support, consisting of an implementation facilitation strategy plus resources from the national MBC initiative, with that of receiving the national MBC initiative resources alone. We focused our evaluation of the facilitation strategy on implementation outcomes by using the RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance) framework (35, 36). This article focuses on the effects of implementation facilitation, compared with national MBC initiative resources alone, on four RE-AIM domains: implementation, adoption, reach, and effectiveness.
Methods
This pragmatic, randomized, multisite study was approved by the VA Central Institutional Review Board as a combination of research and quality improvement (QI). Paired sites were randomly assigned to receive facilitation support plus national MBC initiative resources or MBC initiative resources alone. Site-level activities to support implementation of MBC were considered QI, whereas centralized data collection activities measuring implementation outcomes constituted research. Site staff who participated in study interviews provided informed consent. A detailed description of all planned methods has been previously published (34).
Site Recruitment and Randomization
Ten sites were recruited. Potential sites were identified from the MBC initiative’s phase 1 champion sites as well as other sites with well-implemented PCMHI programs and interest in improving MBC implementation. Potential sites were grouped by key similarities, such as hospital and PCMHI program size. The first two sites within a group with all required permissions were paired and randomly assigned to receive facilitation or the comparison condition. As sites began the project, site leaders selected the specific PCMHI clinic to participate. As a result, facilitation sites consisted of three VA medical center (VAMC) clinics and two community-based outpatient clinics (CBOCs). Comparison sites consisted of four VAMC clinics and one CBOC. The average size of facilitation sites was 17,872 (range 15,663–22,705) unique patients in the previous fiscal year, and the average size of comparison sites was 15,339 patients (range 8,131–21,101). At recruitment, the mean percentage of PCMHI visits with PROMs was 27.7% for facilitation sites and 29.9% for comparison sites. Two site pairs completed all study participation before the COVID-19 pandemic. The remaining three pairs experienced the onset of the pandemic during the study intervention.
National MBC Initiative Resources
The study intervention took place between May 2018 and June 2020, so national MBC initiative resources from phases 1 and 2 (described above) were available to all sites. Comparison sites were free to use these resources, QI teams, or other implementation support strategies.
Implementation Facilitation Strategy
The implementation strategy consisted of facilitation from external MBC experts combined with a local QI team at each facilitation site. QI teams included PCMHI leadership, primary care and mental health care leadership (or their designees), and other stakeholders selected by PCMHI site leadership. Facilitation was provided by two highly experienced external facilitators (EFs), one per site, who had expertise in implementation science and PCMHI models of care. Three subject matter experts (SMEs) provided consultation on MBC to the EFs and QI teams as needed. Over 12–18 months, EFs and SMEs met regularly with QI teams and supported their efforts to increase MBC use. EFs provided a mean±SD of 65.7±11.3 hours per site, and SMEs provided 8.4±4.2 hours per site.
EFs applied the facilitation strategy, tailoring it to each site's needs and resources, across three phases: preparation, design, and implementation (Figure 1). During the preparation phase, EFs engaged PCMHI leadership, conducted needs assessments, and helped identify QI team members. The design phase started with a QI team meeting, during which the EF introduced the study, assessed knowledge of and perceptions about the evidence, and led initial planning discussions. Throughout this phase, EFs and SMEs focused on providing training to the QI team on MBC practice and systems change and on helping the QI team assess current MBC practice and develop a customized implementation plan. When this plan was completed, the site began the implementation phase, during which the QI team, with EF and SME support, educated and mentored clinicians and encouraged them to adopt MBC. Throughout this approximately 6-month phase, EFs and SMEs also helped the QI team monitor implementation progress, troubleshoot challenges, use national MBC initiative resources, and adjust local MBC practices to overcome problems and enhance the sustainability of MBC.
Evaluation
Using concurrent mixed methods to comprehensively describe the complexity of practice changes (37), we evaluated the implementation facilitation strategy with the RE-AIM (35, 36) domains of implementation (MBC processes that the sites implemented or changed), adoption (uptake of MBC by PCMHI providers), reach (proportion of PCMHI veterans who received MBC), and effectiveness (effect of MBC on care). Although we based our definition of MBC practice on the collect, share, and act components of the VA's MBC initiative, our definitions of the implementation of these components were more specific.
Table 1 provides definitions of the RE-AIM domains and evaluation measures and describes data source specifications for this study. Data collection was organized around the phases of implementation described above (Figure 1). Within each site pair, the timing of data collection at both sites was based on the facilitation site's phase of facilitation.
Qualitative Data Collection and Analysis
Qualitative interviews (N=14) with PCMHI leaders for each site (as well as MBC champions at two sites) were designed to capture detailed descriptions of MBC practices in each of the collect, share, and act components (i.e., the RE-AIM domain of implementation) and perceptions about the adoption of those practices (i.e., adoption domain). Time 1 (T1) interviews were conducted during the preparation phase, before initiation of QI team activities. Time 2 (T2) interviews were conducted after the completion of the implementation phase. Interviewees were asked to describe their own views on MBC and MBC practice across their teams and, at T2, to describe changes that had occurred in MBC implementation. All interviews were approximately 1 hour long and were recorded.
We conducted a directed content analysis (38) of verbatim transcripts of these interviews. The analysis team consisted of two experienced qualitative researchers (L.O.W., M.J.R.) and a trained research assistant. First, we developed a priori codes on the basis of the interview guide. The team met regularly to discuss and resolve differences in application of codes and to refine the codes and definitions. We extracted coded material into templated documents organized by codes and subcodes and then summarized the material for each site. We entered code summaries for each data collection period into a composite matrix, created site summaries, and compared pre- and postimplementation summaries to examine changes in MBC practice over the study period. We then aggregated our composite matrix by facilitation and comparison sites and further compared practices and changes in practice on the basis of sites' assigned study conditions.
Clinical Administrative Data Collection and Analysis
Administrative data were captured retrospectively from the VA EMR stored in the VA Corporate Data Warehouse (CDW) (39). For each analysis, study cohorts were designed to include patients or providers who were actively engaged in PCMHI care during each observation period. CDW data captured the visit type (PCMHI or specialty mental health care), diagnoses, and provider for each mental health visit; all PROM administrations (type, score, and date); and prescription data (Table 1).
To determine which PROMs would be included in the study, we identified instruments that were administered on the same day as a PCMHI visit, were brief and repeatable, and were used in at least 2% of all PCMHI visits. The resultant measures were the PHQ-9, the GAD-7, the PCL-5, and the Insomnia Severity Index (40). Although use of the BAM is encouraged by the MBC initiative, the BAM was not used frequently enough for inclusion.
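To illustrate the selection rule, the following is a minimal SAS sketch of the 2% threshold; the table and variable names (pcmhi_visits, prom_admin, instrument, admin_date) are hypothetical assumptions and do not reflect the study's actual CDW queries.

proc sql;
  /* Keep instruments administered on the same day as a PCMHI visit */
  /* and used in at least 2% of all PCMHI visits (hypothetical schema). */
  create table prom_candidates as
  select a.instrument,
         count(*) / (select count(*) from pcmhi_visits) as visit_share
  from pcmhi_visits as v
       inner join prom_admin as a
         on a.patient_id = v.patient_id
        and a.admin_date = v.visit_date
  group by a.instrument
  having calculated visit_share >= 0.02;  /* 2% threshold */
quit;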
CDW data were captured in two 6-month data pulls. The T1 data pull captured data for the 6 months before the start of the preparation phase. The T2 data pull captured 6 months of data, beginning 4 months before the end of the implementation phase.
Descriptive statistics were used to evaluate changes in site-level reach and adoption. For reach, paired t tests were applied to compare facilitation sites and comparison sites in terms of the proportion of patients with two or more PCMHI visits in which PROMs were used. For MBC effectiveness, we modeled the odds of change in depression treatment by using a generalized mixed-model regression, applying a logit link function, and nesting study site within study pair. Treatment condition, time, and the interaction between treatment condition and time were included as predictors in the model. PHQ-9 administration and PHQ-9 severity history were tested as moderators and main effects. Regression analyses were performed with PROC GLIMMIX (SAS, version 8.2).
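As an illustration of these analyses, the following SAS sketch shows a paired t test of site-level reach and a logit-link generalized linear mixed model of treatment change. The data sets and variable names (site_reach, pt_outcomes, rx_change, phq9_high) are hypothetical assumptions, not the study's actual code.

/* Paired t test on the T1-to-T2 change in the proportion of patients */
/* with >=2 PCMHI visits with PROMs, pairing each facilitation site   */
/* with its matched comparison site (hypothetical variables).         */
proc ttest data=site_reach;
  paired facilitation_change*comparison_change;
run;

/* Logit-link generalized linear mixed model of the odds of a change  */
/* in antidepressant treatment, with condition, time, their           */
/* interaction, and PHQ-9 severity as fixed effects and site nested   */
/* within study pair as a random intercept (hypothetical variables).  */
proc glimmix data=pt_outcomes;
  class pair site condition time phq9_high;
  model rx_change(event='1') = condition time condition*time phq9_high
        / dist=binary link=logit solution oddsratio;
  random intercept / subject=site(pair);
run;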
Results
Implementation: Qualitative Findings
Implementation findings are described by MBC component and are summarized in
Table 2.
Collect.
At T1, at all sites except for one facilitation site that used tablet computers, the predominant mode of PROM administration was on paper. Paper administration meant that, most often, entering scores into the EMR required clinicians to perform a separate step beyond documenting progress notes. Strategies for accomplishing this data entry varied among clinicians, regardless of the site's assigned study condition. By T2, one additional facilitation site and one comparison site had successfully implemented collection through tablet computers that entered data automatically. With the COVID-19–related transition to virtual care, all sites found workarounds for administration; typically, clinicians either read PROM questions to patients or used fillable PDF files.
At T1, expectations for the frequency of PROM administration varied widely across sites, irrespective of study condition. One facilitation site and one comparison site expected PROM administration at every visit. By T2, one additional facilitation site had increased the expectation to collect at every visit, and the other facilitation sites did not change. One comparison site increased its expectation to collect at 50% of visits, one decreased its expectation, and the others remained unchanged. At all three facilitation sites that were active during the COVID-19 pandemic, expectations did not change when clinicians moved to virtual care. In contrast, two of three active comparison sites expressly decreased their reported emphasis on MBC, including data collection.
Share.
At T1, PCMHI leaders either did not know how often clinicians were sharing data with patients or reported significant variability across clinicians. By T2, most sites, regardless of study condition, reported that sharing, or clinician motivation to share, had increased. All facilitation sites and no comparison sites reported increased discussion at team meetings about sharing data with patients.
Act.
The ways in which data were used in patient care were similar across sites and did not appear to vary by site study condition or over time. Clinicians used data to inform treatment decisions, determine level of care (i.e., PCMHI vs. specialty referral), and track progress. Data were also used as a springboard for patient discussions about lack of progress, ambivalence about treatment, or termination of therapy. PCMHI leaders also occasionally highlighted the importance of MBC in helping clinicians keep their sessions focused and improve the efficiency of care. At T1, no sites reported use of algorithms to guide care, and by T2, only one facilitation site had created an algorithm for depression care.
Although a national PROM dashboard and care management software capabilities were available, use of aggregate PROMs for program evaluation, QI, or marketing was limited at T1 across all sites. One facilitation site was reviewing data to identify patients for recruitment to group treatment, and one facilitation site was using dashboard data to discuss the frequency of PROM collection at team meetings. By T2, national dashboards were being used by four facilitation sites, but only one comparison site, to track the frequency of PROM administration and to give feedback to clinicians. Notably, after the onset of COVID-19, PCMHI leaders at two of three comparison sites commented that use of programmatic data had stopped, whereas none of the three facilitation sites changed their use of such data.
Adoption: Quantitative and Qualitative Findings
Quantitative data indicated that overall, from T1 to T2, the total number of providers who increased their rates of PROM administration was greater at facilitation sites (N=15) than at comparison sites (N=10) (Figure 2). Because of the small sample, no further quantitative analysis was possible. However, four of the five facilitation sites showed some increased adoption, whereas three of the five comparison sites showed an increase. Of the six sites still engaged in the study during the pandemic, two facilitation sites and one comparison site increased adoption. Qualitative findings (Table 2) indicated generally high enthusiasm for MBC across sites and showed that more facilitation sites than comparison sites perceived adoption to be high at T1. Notably, the importance of MBC to providers remained unchanged after the onset of the COVID-19 pandemic at all facilitation sites, whereas its importance decreased at all comparison sites that actively participated in the study during the pandemic.
Reach: Quantitative Findings
The change in the number of veterans with at least two visits during which PROMs were collected was significantly greater at facilitation than at comparison sites (change in number=24.8, standard error=4.4, 95% CI=12.7–35.9, t=5.69, df=4, p=0.005). Because the sample was small, site-level quantitative analysis was not possible. However, as seen in Figure 3, from T1 to T2, three facilitation sites maintained (defined as being within ±10 percentage points) and two facilitation sites increased the proportion of patients with at least two visits with PROMs collected. Over the same period, two comparison sites maintained and three sites decreased the reach of MBC. At the six sites that were still active after the onset of the pandemic, one facilitation site increased and two maintained the reach of MBC, whereas one comparison site maintained and two decreased the reach of MBC to the PCMHI population.
Effectiveness: Quantitative Findings
Antidepressant change.
In the T1 data, 104 (35%) of 301 and 205 (52%) of 397 patients who had a PCMHI visit with a PHQ-9 score had a change to their antidepressant medication prescription at facilitation and comparison sites, respectively. In the T2 data, 81 (32%) of 254 and 175 (41%) of 429 such patients had an antidepressant change at facilitation and comparison sites, respectively. We observed a significant main effect of PHQ-9 score severity (F=20.05, df=1 and 9, p=0.002), such that the odds of experiencing a change in antidepressant prescription increased by a factor of 1.9 (95% CI=1.4–2.6) when a patient received at least one high PHQ-9 score (≥11) or expressed any suicidal ideation, compared with patients having PHQ-9 scores <11. No significant interaction effects were found for study condition, and no main effects were found for study condition or time.
Specialty mental health referral.
In the T1 data, 54 (17%) of 311 and 40 (9%) of 432 patients who had a PCMHI visit with a PHQ-9 score were referred to specialty care at facilitation and comparison sites, respectively. In the T2 data, 53 (20%) of 263 and 47 (10%) of 453 such patients were referred to specialty care at facilitation and comparison sites, respectively. As with the medication changes, we noted a significant effect of PHQ-9 severity (F=8.4, df=1 and 9, p=0.02), such that the odds of referral to more intensive care increased by a factor of 1.8 (95% CI=1.1–3.0) when a patient received at least one high PHQ-9 score (≥11) or expressed any suicidal ideation, compared with patients having PHQ-9 scores <11. We found no significant interaction effects for study condition and no main effects for study condition or time.
Integration of Qualitative and Quantitative Results
Overall, the qualitative data examining MBC implementation revealed more frequent improvements in the collect and share components of MBC at facilitation sites. These findings were underscored by quantitative data, which showed that more providers increased their use of MBC (adoption) and a significantly greater number of veterans received MBC in facilitation than in comparison sites (reach). Qualitative (implementation) and quantitative (effectiveness) findings were also aligned for the act component; when data collection occurred, clinicians used the collected data in similar ways across the study conditions and without significant changes over time.
Discussion
Taken together, our findings provide a snapshot of the highly variable successes and challenges experienced by 10 VA sites as they attempted to implement MBC with or without facilitation. Although they showed variability, all sites improved MBC implementation. Overall, our findings suggest that facilitation helped sites to implement or improve aspects of MBC (implementation), and associated improvements were seen in adoption by providers and reach to more patients. The implementation intervention had no direct effect on the effectiveness of MBC. However, when PROMs were collected, clinicians responded to elevated PROM scores (indicating significant symptoms of depression and the need for initiation of, or change in, treatment). Because significantly more PROMs were collected at facilitation sites than at comparison sites, the impact of MBC on the quality of depression care was greater at those sites.
Our findings replicate and extend what other studies have reported. Consistent with the literature from within and outside VA, we found numerous challenges to implementing MBC in mental health care (7, 9, 41–43). In VA, increased use of PROMs has been associated with VA's national MBC initiative (9). We observed increases in use of PROMs at some sites that used national MBC initiative resources alone, but gains were significantly greater at sites that received facilitation support. Throughout the literature, definitions of MBC have been inconsistent, and implementation studies have infrequently measured clinicians' sharing of data with patients (9, 43, 44). Our project advances the field by considering more aspects of MBC, including the sharing and use of data. Through a qualitative approach, we could capture PCMHI leaders' perceptions of how the clinicians on their teams discussed MBC with patients. Finally, the effectiveness of MBC has rarely been demonstrated in naturalistic settings. Importantly, the findings of our study, which were obtained outside of randomized controlled trials, show that clinicians are responsive to PROM scores.
Our mixed-methods study was well equipped for the naturalistic experiment that occurred when six of our 10 sites participated during the COVID-19 pandemic, between the baseline and implementation observation periods. Qualitative findings showed that facilitation sites overwhelmingly "stayed the course" of MBC implementation despite unprecedented changes to care delivery. As a result, adoption and reach at facilitation sites grew or were maintained despite COVID-19–related challenges, whereas the opposite was seen at comparison sites. These results highlight the importance of implementation facilitation for sustaining focus on a practice innovation despite a massive upheaval in care provision. Nearly every large practice change effort is challenged by competing initiatives, although rarely by one so drastic. Thus, this finding is widely applicable.
Despite leaders' and clinicians' strong desire to implement MBC, multiple barriers remain and are applicable to all settings. Limited time and the lack of simple technological tools to collect and enter PROMs were significant challenges across sites. Even sites that had successfully implemented tablet-based collection were affected after care shifted to the virtual environment. Technological solutions that automate data entry and are independent of the physical presence of patients, such as the text- and e-mail–based PROM collection recently implemented in VA, are essential. Community settings require similar tools but may struggle in the absence of integrated medical record systems. A second lingering challenge that applies to both VA and the community is the essentially private nature of sharing data with patients within the confines of therapy sessions. In VA, providers are trained to share PROMs with patients, yet measuring sharing activity remains challenging. PCMHI leaders at facilitation sites tried to overcome this barrier by modeling and discussing sharing at team meetings, an approach that would be applicable across settings.
Study limitations included our use of PCMHI leaders as informants to gather the perspectives of all PCMHI providers on their teams; these reports may have been biased by the leaders' own, typically very favorable, views of MBC. However, this methodological choice limited burdens on busy clinicians. Generalizability of our findings may be limited by our use of sites that could be considered relatively early adopters of MBC; implementation of MBC is likely to be more challenging at less motivated sites. Our quantitative data were limited by the nature of clinical administrative data, especially at the majority of sites, where paper-and-pencil administration may have created extra challenges to entering PROM results into the EMR; our findings likely underestimated PROM collection at those sites. Our quantitative analyses of reach and adoption were limited by the small sample, which did not allow study of the sizable variations in team structure and roles. Finally, sustainment of MBC implementation was not addressed in this article.
Conclusions
External facilitation was associated with increased implementation of MBC and helped sites overcome even the extraordinary challenges of the COVID-19 pandemic. The results of our naturalistic study also indicate that MBC is effective even outside of randomized controlled trials. Although continued systemwide and more intensive support remains essential to overcome barriers and provide new technological solutions, once MBC is implemented, it is well positioned to improve mental health care.