Individual placement and support (IPS) is a model of supported employment for people with mental health conditions that has consistently been proven effective in helping them obtain and maintain competitive employment (1, 2). For many people with mental health conditions, employment is central to recovery. According to the IPS Employment Center, in 12 U.S. randomized controlled trials, IPS participants typically have been found to have employment rates (average=68%) that are twice as high as those of participants in comparison interventions when studied for 12–24 months (https://ipsworks.org/index.php/evidence-for-ips). Furthermore, hundreds of programs in the IPS learning community report quarterly employment figures that have averaged between 40% and 50% over 20 years. The IPS learning community figures are not comparable to data from controlled trials because they are based on quarterly administrative data, but they do reflect performance of active programs. Experts attribute the success of IPS to its eight core principles: zero exclusion, which means that anyone who wants to work is eligible; focus on competitive employment rather than sheltered or volunteer jobs; rapid job search without extended assessment or training; targeted job development to locate jobs that match the client's interests and strengths; attention to client preferences regarding job type and location, hours of work, disclosure, and types of support; individualized job supports as needed and desired; integration of vocational and clinical services; and expert benefits counseling (1). Research supports each principle (3). In addition, the overall score on the IPS Fidelity Scale has been consistently correlated with employment outcomes (4). However, research on the specific fidelity items has been inconsistent.
IPS is effective in part because research, together with humanistic and client-centered values and client input, has guided the development and refinement of the model (1). By contrast, expert opinion is an unreliable method for developing models. For example, experts in the 1980s recommended extensive preemployment counseling and a complete separation of vocational and clinical services (5), practices that were widely adopted but subsequently discredited by research.
IPS is a flexible intervention that has been implemented successfully, according to fidelity standards reflecting the eight core principles, in many national and international settings characterized by myriad differences in workforce rules and in health care, rehabilitation, insurance, and educational systems. Thirty-two controlled clinical trials, including studies in more than 20 U.S. states and more than a dozen other countries, support the effectiveness of IPS (2).
Modifications of evidence-based practices are common and often necessary (6, 7), and implementers of IPS continue to modify the model, especially in relation to new populations and new settings. Implementation researchers sometimes use the term adaptations to designate carefully designed, research-based alterations and to differentiate adaptations from the broader category of modifications (8). However, these researchers also acknowledge that adaptations are rarely designed, measured, and evaluated carefully. We therefore use the more generic term modifications here and discuss three types: minor alterations for context, omission of core principles, and augmentations.
The reasons for modifying evidence-based practices, including IPS, reflect numerous sociopolitical, organizational, provider, and recipient issues (8). Modifications often entail deliberate adjustments to the design or delivery of an intervention to improve contextual fit, such as for historically disadvantaged populations, without compromising core principles. Such modifications are often effective. For example, modifications of IPS to improve outreach to rural clients (9) and to Latino families (10) have proven to be effective.
Interventions that have omitted one or more of the eight core principles of IPS have generally yielded worse employment outcomes than interventions that maintained full fidelity to IPS. Studies that disregarded clients' interest in employment (11) or clients' preferences (12), or that omitted integrated vocational and clinical services (13), long-term supports (14), or systematic job development and follow-along support (15, 16), have produced weak results.
Planned augmentations of IPS are clearly needed: in most studies, one-third of IPS participants do not attain competitive employment, an obvious target for additional help. Thus far, however, most augmentation studies have been disappointing. For example, studies on the addition of motivational interviewing, family psychoeducation, or social skills training to IPS have not demonstrated improved outcomes. A notable exception is the addition of cognitive enhancement interventions for clients who have not benefited from IPS (17); some (but not all) studies of this type of augmentation have reported improved employment outcomes.
Less rigorous implementations are undoubtedly common in real-world settings and have usually yielded poorer outcomes (15, 18). Low IPS fidelity scores generally correlate with poor employment outcomes in demonstrations, consistent with the notion that adhering to core principles is necessary for effectiveness (4).
In sum, the evidence indicates that omitting or weakening core IPS principles generally reduces the model's effectiveness. On the other hand, minor alterations are often effective, and augmentations for nonresponders could be beneficial and deserve further study. Modifications are necessary: orthodoxy can limit innovation, implementers sometimes face unavoidable constraints in new settings or with new populations, economic and workforce issues are dynamic, and evidence-based practices must evolve in relation to societal changes. For example, recent changes emphasizing supported education for younger clients and addressing online job applications and remote services prompted by the COVID-19 pandemic led to an update of the IPS manual (19). Implementation researchers are actively considering changes to IPS for new settings, such as supportive housing, justice programs, addiction clinics, and low- and middle-income countries.
Researchers should hew to the scientific procedure of examining modifications and outcomes in small studies before changing the core principles of IPS. Large efficacy trials, effectiveness demonstrations, or policy changes should be conducted later in this logical sequence. Furthermore, the onus should be on researchers to carefully document IPS modifications (and their justifications) and fidelity. Without all these steps, policy changes are premature.
The fields of mental health and vocational services are replete with examples of expert opinions leading to ineffective or harmful interventions and misguided policies. In mental health, the examples of harm are legion: frontal lobotomies, lengthy hospitalizations for young patients with a diagnosis of schizophrenia, incarceration for minor offenses of people experiencing psychosis, placement of blame on families for their children's serious illnesses, and many more. In psychiatric rehabilitation, vocational interventions for people with serious mental health conditions have been developed and implemented solely on the basis of the recommendations of experts, who advocated lengthy career counseling, skills training, or transitional employment (5). For example, the vocational intervention called choose-get-keep was disseminated widely in the United States and internationally for years before a clinical trial showed it to be ineffective (20).
By contrast, researchers helped to develop IPS carefully from its inception, starting with small studies to test, refine, and add core principles and to develop fidelity measures and proceeding to a series of randomized controlled trials and cost-effectiveness studies in different settings before disseminating the model (1). The evidence-based practice movement espouses these fundamental procedures. Policy makers should also follow scientific procedures (21). The argument that experts know what works and that spending time on research is unnecessary rings hollow. Policy changes and government mandates in the absence of rigorous research tend to produce unintended side effects (21). The potential for ineffectiveness, unnecessary increased costs, or even harm is high. On the other hand, once an evidence-based practice garners a strong level of evidence and can meet the needs of millions of people, policy changes are practically, legally, and ethically important. In fact, failure to implement evidence-based practices such as IPS can violate the Americans With Disabilities Act and Olmstead v. L.C. (22).
The crux of the issue is that modifications of IPS (and other evidence-based practices) should be made on the basis of empirical research, by using scientific steps of intervention development rather than researchers’ or clinicians’ opinions. We therefore suggest the following guidelines.
First, implementers should deliver IPS with high fidelity to a new population and then study modifications in cases where standardized IPS is ineffective (23). When implementing IPS for new populations or in new settings, researchers have often made modifications to the model before evaluating high-fidelity IPS. A dilemma occurs when the level of evidence of the findings is not strong: Were the weak findings caused by the modifications or the new population? In the absence of careful implementation research, the parsimonious conclusion is that less rigorous implementation produces poorer outcomes (15, 18).
Second, when implementers decide that modifications to core IPS principles or augmentations are needed for a new population, they should listen carefully to clients about what they believe would help and try promising innovations. For example, enhancement of employment specialist competencies, development of stronger alliances with employers, and brief certificate training programs that provide specific skills to facilitate a client's career goals represent promising practices that deserve research (19, 24). Innovators could also develop and assess augmentations to help clients who do not respond to IPS but want to work (17). Researchers should follow this standard sequence: document the alterations in detail, pilot the intervention, conduct efficacy trials to compare the new intervention with standard IPS, and then transition to effectiveness research. The framework for reporting adaptations and modifications–enhanced model provides a comprehensive approach to documenting and measuring planned modifications (8). Each of these steps should precede policy changes.
Finally, IPS researchers and implementers should follow tenets of social science research such as consilience, parsimony, hierarchy of evidence, and beneficence. In other words, interventions should align with findings from related scientific investigations (consilience); have concise, simple explanations (parsimony); follow standardized research steps (hierarchy of evidence); and provide benefits for and avoid harms to clients (beneficence). These tenets should be applied carefully to untested models before interventions are disseminated (21). Interventions that do not incorporate scientific standards can easily go awry, waste resources, and cause harm to the public. Stated more positively: good research leads to good policy.