Even though many evidence-based practices have been developed, few clients with severe mental illness receive effective services (1). Evidence-based practices are often implemented poorly and rarely endure beyond initial enthusiasm and grant funding. The empirical literature on sustainment of evidence-based practices is weak and fragmented (2). Factors commonly hypothesized to influence sustainment include political support, funding stability, community partnerships, organizational capacity, fidelity monitoring, technical assistance, workforce development, and supervision (3,4). Although many individual factors have modest empirical support (3), research findings have been inconsistent.
Monitoring the quality of services is crucial; without mechanisms to ensure adherence to model standards, programs will vary widely in how services are delivered (5). Therefore, fidelity scales, defined as measures that assess adherence to a program model, have become essential tools for program implementation (6) and sustainment (7). Strategies to maintain and improve the quality of services include establishment of learning collaboratives (8) and technical assistance centers that provide training, resource materials, fidelity monitoring, on-site consultation, and other support (9,10). At the local agency level, quality improvement efforts, including fidelity and outcome monitoring, are associated with better implementation (11). The frequency and quality of supervision also affect the quality of implementation and the sustainability of services (3,12). Another factor, workforce development, ensures that practitioners have the skills to deliver the practice (13). At the state level, ensuring a well-trained workforce requires systematic, large-scale methods for initial and booster training; online training is one such method (14).
One evidence-based practice increasingly implemented throughout the United States is the Individual Placement and Support (IPS) model of supported employment (15,16). In recognition of the many factors influencing the long-term survival of any program model, a comprehensive learning community was developed to sustain IPS. Beginning in 2001, the Dartmouth Psychiatric Research Center and the Johnson & Johnson Office of Corporate Contributions partnered to develop a comprehensive program to strengthen state and local infrastructures to promote access to IPS throughout the United States. After starting as a small demonstration in three states, the program has evolved into a network of 20 states and three European countries known as the IPS learning community (17).
Historically, the term “learning collaborative” has been used to describe a network of organizations with a shared goal of improving treatment for a specific medical condition, facilitated by regular communication (for example, meetings, teleconferences, and newsletters) and by collection and dissemination of objective information about procedures and outcomes, typically over a few months (18,19). Some collaboratives provide training and technical assistance and facilitate research and innovation (20). The IPS group adopted the term “learning community” to signify its long-term commitment to quality and expansion; the term differentiates our approach from time-limited quality improvement learning collaboratives (21).
The purpose of this descriptive study was to examine sustainment of U.S. programs in the IPS learning community over a two-year period. We also examined changes over that time in the infrastructure supporting sustainment. We hypothesized that participation in the learning community would promote sustainment, program fidelity, and employment outcomes.
Methods
Overview
This prospective study examined changes over a two-year period in 129 programs participating in the IPS learning community in 2012. In addition to examining sustainment, fidelity, and employment outcomes, we examined funding and quality improvement efforts. The Dartmouth Institutional Review Board approved the study, which followed the principles outlined in the Declaration of Helsinki.
IPS Learning Community
The IPS learning community provides a set of strategies, interventions, and activities intended to promote the dissemination, implementation, and sustainment of IPS services. The learning community uses a two-tiered, decentralized approach. In the United States, Dartmouth trainers and researchers bring together state leaders and help them build a viable infrastructure for implementing and sustaining IPS services within their states (17). Dartmouth provides resources, such as brochures, posters, policy bulletins, newsletters, videos, and online training for employment specialists and supervisors. In each state, the leadership team includes three key roles: liaisons from the two state agencies responsible for employment services (that is, mental health and vocational rehabilitation) and one or more state trainers. State leaders create parallel learning communities consisting of IPS programs within their public mental health systems. As part of their participation in the learning community, state leaders submit quarterly employment outcome reports for IPS programs within their states; Dartmouth analyzes the data and distributes them back to the states (22). State trainers conduct periodic fidelity reviews of both new and established IPS programs by using a validated fidelity scale (23). IPS programs are considered active participants once they begin submitting outcome reports, typically about nine months after start-up.
Sample
The sample consisted of 129 U.S. sites in 13 states that were active in the IPS learning community as of January 2012. All sites meeting this criterion agreed to participate in the study. At the 2012 interview, the sites had participated in the learning community for a mean±SD of 4.5±2.7 years (median=3.9). The 13 learning community states were Connecticut, District of Columbia, Illinois, Kansas, Kentucky, Maryland, Minnesota, Missouri, Ohio, Oregon, South Carolina, Vermont, and Wisconsin. The number of sites per state ranged from three to 21; four states had 16 or more sites, and six states had six or fewer sites. Sites were located in both urban and rural communities (23). Most local programs had a single IPS team (N=105, 81%), but 24 sites had two (N=12, 9%), three (N=7, 5%), or four or more (N=5, 4%) teams. The mean job tenure for IPS team leaders was 5.0±4.9 years (median=3.3). Although the learning community does not compile statistics on client background characteristics, the people served in these programs reflect the clients served by the public system.
Measures
We operationally defined sustainment as follows: a program is sustained if it continues to employ staff, maintains an active client caseload, and provides direct services. Programs sometimes continue in name only, without adhering to the program model that they originally implemented (3). Therefore, a more meaningful standard is sustaining a program at good fidelity (that is, adhering to the core principles of the program model).
Fidelity.
The 25-item Individual Placement and Support Fidelity Scale (IPS-25) assesses adherence to the evidence-based principles of IPS (24). Each item is rated on a 5-point, behaviorally anchored dimension, ranging from 1, representing lack of adherence, to 5, indicating close adherence to the model. The total score on the IPS-25, the sum of the item scores, ranges from 25 to 125. A score of 100 or more is considered good fidelity.
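Stated as a simple formula (a restatement of the scoring rule above, not an addition to it), the total score is

\[ \text{IPS-25 total} = \sum_{i=1}^{25} x_i, \qquad x_i \in \{1, 2, 3, 4, 5\}, \]

so the minimum is 25 (all items rated 1), the maximum is 125 (all items rated 5), and the good-fidelity threshold of 100 corresponds to an average item rating of 4.0.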
The fidelity manual recommends that two trained fidelity assessors conduct a 1.5-day site visit to collect the necessary information to complete the IPS-25 (www.dartmouthips.org). The assessment procedures include interviews with the IPS program leader, employment specialists, clinicians, agency leaders, family members, and clients; observation of team meetings and community contacts with employers; and review of client charts. For quality improvement purposes, assessors prepare a fidelity report with fidelity ratings for each of the 25 items and recommendations for improvement.
Competitive employment rate.
Competitive employment is defined as employment in integrated work settings in the competitive job market at prevailing wages, with supervision provided by personnel employed by the business. The learning community tracks the quarterly competitive employment rate, which is based on at least one day of competitive employment during a specified three-month period (that is, a calendar quarter). All competitive jobs are counted, regardless of hours worked per week. The site-level competitive employment rate is calculated as the number of clients employed divided by the number of clients active on the caseload during the quarter in which the fidelity assessment is completed. The benchmark for a good employment outcome is a quarterly employment rate of 41% or more (17).
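As a worked example of this calculation (the counts are hypothetical, not study data), the site-level rate is

\[ \text{quarterly employment rate} = \frac{\text{clients with} \geq 1 \text{ day of competitive work in the quarter}}{\text{clients active on the caseload in the quarter}}. \]

For instance, a site with 38 of 90 active clients holding at least one day of competitive work in the quarter would have a rate of 38/90 ≈ 42%, just above the 41% benchmark.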
Data collection procedures.
As part of the IPS learning community, individual sites agree to participate in annual fidelity assessments and submit quarterly competitive employment outcomes (17,22). Each state identifies a group of qualified fidelity reviewers who independently assess IPS programs in the state on adherence to IPS standards. The fidelity assessors for each state include trainers from technical assistance centers, coordinators from state mental health and vocational rehabilitation agencies, and IPS supervisors from other programs. Dartmouth provides training and technical assistance in performing fidelity assessments through a three-day workshop, bimonthly teleconferences, individual consultation, and on-site training.
Also, as part of the agreement for participating in the learning community, each state appoints a coordinator to compile quarterly outcome data and report them to the Dartmouth team, which prepares and distributes a detailed quarterly report of outcomes aggregated at the state and learning community levels (17). For this study, we used quarterly outcomes for June 2012 and June 2014. For three sites missing second-quarter outcome data, we substituted data submitted in the following quarter.
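A minimal sketch of the kind of state-level aggregation such a quarterly report involves is shown below; the column names, state abbreviations, and counts are hypothetical illustrations, not the learning community's actual reporting format.

```python
# Illustrative sketch of aggregating site-level quarterly outcomes to
# the state level; all names and values are hypothetical.
import pandas as pd

reports = pd.DataFrame({
    "state": ["CT", "CT", "OH", "OH"],
    "site": ["A", "B", "C", "D"],
    "on_caseload": [80, 95, 110, 70],  # clients active during the quarter
    "employed": [35, 40, 48, 28],      # worked >= 1 day competitively
})

# Sum clients across sites within each state, then compute the state rate.
by_state = reports.groupby("state")[["employed", "on_caseload"]].sum()
by_state["employment_rate"] = by_state["employed"] / by_state["on_caseload"]
print(by_state)
```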
For this study, state coordinators provided site-level fidelity data from their most recent on-site fidelity reviews by using a standardized spreadsheet. With the exception of Maryland, all states followed the fidelity procedures described above, with minor variations. (Maryland used an earlier version of the IPS Fidelity Scale [25] that is not directly comparable to the IPS-25.) The dates of the site fidelity reviews ranged from January 2011 to August 2012 for the initial period and from January 2013 to August 2014 for the second period.
Interview Procedures
Initial team leader interviews.
Between February and May of 2012, interviewers contacted team leaders for the 129 IPS programs active in the IPS learning community as of January 2012. Telephone interviews averaged one hour and consisted of open-ended questions and factual short-answer questions. [The interview guide is included in an online supplement to this article.]
The interview included a checklist of funding sources, and respondents were asked to identify which sources were used by their agency to fund IPS services and to rank the top three revenue sources. The interview guide also included questions about use of technical assistance, training, fidelity monitoring, outcome monitoring, and field mentoring. A final question asked, “Do you have any worries about IPS being discontinued in the next year?” These responses were dichotomized as either yes (worried about discontinuation) or no (not worried about discontinuation).
Follow-up team leader interviews.
Between February and July of 2014, interviewers recontacted program coordinators from the same sites included in the 2012 sample. We interviewed the current IPS team leader for 122 sites that were sustained by 2014, in addition to the team leader for one site representing the merger of two sites that also continued to offer IPS services. The interview protocol was a shortened version of the initial interview with the key questions unchanged. We completed initial and follow-up interviews with IPS team leaders (or another knowledgeable staff member) at all the study sites.
Results
Sustainment
Of the 129 sites active in 2012, 124 (96%) continued to offer IPS services in 2014. Two of the 124 continuing sites were no longer functioning as independent programs because their parent organizations had merged in 2013. The five sites that discontinued IPS services were located in three states. In this article, we report changes over time for the 122 sites that were active at both time points and operating as independent sites.
Funding Sources for IPS
According to 2012 team leader interviews, the most common funding sources for IPS were vocational rehabilitation agencies, state or county mental health budgets, Medicaid, and Ticket to Work (a Social Security Administration program for people receiving disability payments), as shown in Table 1. Other revenue sources were rare. Of the 122 active sites in 2014, the four most common funding sources remained the same. Utilization of Medicaid and vocational rehabilitation funding significantly increased, and utilization of Ticket to Work funding significantly declined during this time.
The importance of different funding sources (that is, the proportion of total program revenue) yielded a different rank order, as shown in Table 1. In 2012, team leaders most often ranked state or county budgets as their top revenue source for IPS programs, followed by Medicaid and vocational rehabilitation agencies. By 2014, Medicaid had overtaken state or county budgets as the most frequently top-ranked revenue source. Dependence on funding sources varied by state: in 2012, the top revenue source reported by most sites was state or county budgets in four states, Medicaid in four states, and vocational rehabilitation agencies in four states. In one state, 13 (68%) of 19 sites relied exclusively on state funding for IPS services, whereas in the other states nearly all sites reported multiple funding sources.
We defined “funding diversity” as the number of funding sources used among the three major funding sources (that is, vocational rehabilitation, state or county budgets, and Medicaid). Funding diversity increased between 2012 and 2014 (t=4.17, df=121, p<.001). The percentage of sites accessing all three sources of funding increased from 28% to 49%.
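For readers who want to reproduce this kind of comparison, a minimal sketch of a paired t-test on site-level funding diversity follows; the per-site counts are hypothetical, not the study data (the study's actual test used the 122 sites, yielding t=4.17, df=121).

```python
# Sketch of the paired comparison of funding diversity; the per-site
# counts below are hypothetical illustrations, not the study data.
from scipy import stats

# For each site, the number of the three major funding sources used
# (vocational rehabilitation, state or county budgets, Medicaid), 0-3.
diversity_2012 = [1, 2, 1, 3, 2, 1, 2, 2]
diversity_2014 = [2, 3, 2, 3, 2, 2, 3, 2]

# Paired (repeated-measures) t-test: the same sites at both time points.
t_stat, p_value = stats.ttest_rel(diversity_2014, diversity_2012)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```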
Quality Improvement Activities
From 2012 to 2014, the number of sites completing fidelity reviews in the past year increased from 79 (65%) to 92 (75%) (McNemar’s test [N=122], p=.07), as shown in Table 2. The number of sites completing online training and the number receiving on-site technical assistance were similar in 2012 and 2014.
Fidelity and Employment Outcomes
Between 2012 and 2014, mean IPS-25 fidelity scores showed a modest but significant increase, from 103.8±9.5 to 108.4±7.6 (t=3.25, df=62, p=.002), at the 63 sites with fidelity reviews in both years. Among sites with fidelity reviews in both years (including Maryland), the number meeting the standard for good fidelity increased from 56 (78%) to 65 (90%) (McNemar’s test [N=72], p=.05). Among the 122 sites that sustained IPS services in 2014, the mean quarterly employment rate increased significantly, from 41%±15.0% to 43%±13.1% (t=2.13, df=121, p=.04). The number of sites achieving the benchmark standard for good employment increased from 62 (51%) to 77 (63%) (McNemar’s test [N=122], p=.02).
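As an illustration of the McNemar test used throughout this section, the sketch below tests the paired change in meeting the employment benchmark; the 2×2 cell counts are hypothetical (chosen only to match the reported marginal totals of 62 sites in 2012 and 77 in 2014), not the study's actual paired data.

```python
# Sketch of a McNemar test on paired yes/no outcomes (benchmark met in
# 2012 vs. 2014). Cell counts are hypothetical; only the marginal totals
# (62 of 122 in 2012, 77 of 122 in 2014) match the text.
from statsmodels.stats.contingency_tables import mcnemar

table = [
    [50, 12],  # met benchmark in 2012: still met / no longer met in 2014
    [27, 33],  # not met in 2012: newly met / still not met in 2014
]
result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(f"statistic = {result.statistic}, p = {result.pvalue:.3f}")
```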
Worries About Discontinuation of IPS Services
Between 2012 and 2014, the number of IPS team leaders worried about IPS services being discontinued declined from 29 (24%) to 17 (14%) (McNemar’s test [N=121], p=.06). The most common worry was future funding.
Discussion
The two-year sustainment rate was 96% for the 129 programs in the IPS learning community. This sustainment rate is higher than the 80% rate over a two-year period after the termination of the formal implementation phase in a national study of 49 sites implementing a new evidence-based practice aided by systematic technical assistance (7). Data on sustainment of evidence-based practices are rarely published. Many studies make it clear, however, that enthusiasm for an innovative program model, as well as model fidelity, often fades over time (3,26–28). Funding initiatives targeting specific program models often spur growth, followed by rapid dissolution when state-sponsored funding ends. For example, over a span of less than a decade, one state experienced rapid growth and then a collapse of services for an evidence-based practice when the targeted funding for that program was abruptly curtailed (3,29). To our knowledge, no one has examined the empirical literature on sustainability to establish benchmark target rates for sustaining programs over time (2).
Beyond the high sustainment rate, most sustaining sites met benchmark standards for good fidelity and good employment outcomes. Improvements on both criteria suggest that the learning community enhanced the quality of services at participating sites. Model drift has not occurred in the IPS learning community, probably because of fidelity and outcome monitoring. Fidelity monitoring is crucial for gauging how well programs are sustained; once a state discontinues fidelity reviews, program leaders may introduce changes that compromise services (3).
Another key to maintaining IPS services has been ongoing attention to funding. Developing IPS services commensurate with need is especially formidable because IPS lacks a single reliable funding source (30,31). Fortunately, all of the states in the IPS learning community have survived state budget cuts and other challenges, primarily through the work of creative and persistent state leaders who have ensured continuous (and increasing) funding for IPS. Support and advice from the network of trainers and other state leaders in the learning community have helped avert program discontinuations precipitated by budgetary shortfalls. During the study period, sites diversified their funding, with more sites accessing vocational rehabilitation and Medicaid funding over time. Despite similarities in the broad categories of funding sources used to fund IPS programs, the specific state funding algorithms varied greatly (31). State regulations vary considerably, and states need to be creative in developing viable funding models.
During the two-year period, states maintained the frequency of quality improvement activities, such as fidelity reviews, training, and technical assistance. Moreover, national, state, and program leaders provided extensive technical assistance during this time that was not captured by the survey. The finding of improved employment outcomes between 2012 and 2014 is especially encouraging, because it points to positive changes in the lives of clients, which is the reason for providing IPS services.
Recently, the New York State Office of Mental Health funded an initiative modeled on the IPS learning community (32). This initiative provides technical assistance and a process for monitoring fidelity and employment outcomes, and it has achieved outcomes similar to those in the national learning community. New York also has targeted funds for IPS, promoting the growth and sustainment of IPS services. By the end of 2014, 59 (69%) of 86 eligible programs had joined the New York initiative.
This study had several limitations. First, the interview data relied on a single respondent from each site, without confirmation of interrater reliability or validity. Second, the sampling method may have biased the sustainment rate because programs that dropped out before 2012 were not included. Third, fidelity reviews were not completed for all sites. Fourth, the relatively brief two-year follow-up period warrants caution. Fifth, increased employment rates could have been affected by the economic recovery under way during this period. Sixth, generalizability was limited by the sampling of states that were among the early adopters of IPS. Finally, because this study included no comparison group, no causal inferences can be made about the impact of the learning community.
Conclusions
Sustainment of evidence-based practices appears to be enhanced through the mechanism of a learning community. Although this approach remains relatively untested with other evidence-based practices, its basic concepts are promising. Controlled studies comparing long-term learning communities with usual methods are needed before firm conclusions can be drawn.