In the United States, approximately 80% of individuals with serious mental illness (such as schizophrenia) are unemployed (1, 2). Long-term unemployment and disability contribute to the prolonged poverty, housing instability, and food insecurity commonly experienced by these individuals (3). The gold standard for increasing employment in this population is the individual placement and support (IPS) model of supported employment (1), an evidence-based vocational rehabilitation service.
Specifically, IPS increases the employment rate for individuals with serious mental illness to approximately 55% via rapid job placement through job development with ongoing supports (4, 5). Technology-based adjunctive interventions have the potential to increase the employment rate of these individuals to an even greater extent (6, 7). For example, computer-based cognitive remediation and computerized job interview training increased employment among individuals with serious mental illness who had struggled to find employment through IPS as usual (8, 9). Although other technologies (e.g., smartphone apps or Minds@Work) have potential to improve employment (10, 11), they have not yet been evaluated in IPS.
Analysis of initial outcomes of implementing adjunctive interventions can elucidate whether the interventions may be appropriate for IPS, acceptable to IPS clients and staff, and feasible to deliver without reducing IPS fidelity. However, implementation of evidence-based practices typically occurs ≥10 years after an intervention is developed, which leads to substantial delay in the identification of effective implementation strategies (12).
To expedite implementation of an intervention, Curran and colleagues (12) formalized the use of hybrid effectiveness-implementation designs to concurrently evaluate an intervention's effectiveness and implementation. The hybrid type 1 (HT1) design primarily studies an intervention's effectiveness, with a secondary evaluation of initial implementation process outcomes (12). An HT1 randomized controlled trial (RCT; ClinicalTrials.gov identifier NCT03049813) (9) recently evaluated whether virtual reality job interview training (VR-JIT), an Internet-delivered job interview simulator with automated feedback (13, 14), could be an effective adjunct to IPS. In that trial, individuals with serious mental illness who were still unemployed after the first 90 days of IPS and were randomly assigned to receive IPS with VR-JIT had a significantly higher employment rate (52% vs. 19%) and a shorter time to employment (hazard ratio=3.2) by 9 months after randomization than comparable individuals who were randomly assigned to continue IPS as usual (9). The aforementioned HT1 trial included the present multilevel, multimethod process evaluation of facilitators of and barriers to future VR-JIT implementation in high-fidelity IPS and of IPS staff's and clients' perceptions of VR-JIT appropriateness, acceptability, feasibility, and usability. This initial implementation process evaluation could therefore serve as a template for future studies evaluating technology-based interventions within IPS.
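For readers less familiar with survival metrics, the following minimal Python sketch illustrates how a time-to-employment hazard ratio of this kind is typically estimated. It uses the lifelines package with invented toy data and hypothetical column names; it is not the trial's actual analysis code.

```python
# Minimal sketch of a Cox proportional hazards model for time to
# employment, the kind of analysis that yields a hazard ratio such as
# the trial's reported 3.2. Data and column names are invented.
import pandas as pd
from lifelines import CoxPHFitter

toy = pd.DataFrame({
    # months from randomization to employment (censored at 9 months)
    "months_to_employment": [2.0, 4.5, 9.0, 3.0, 6.0, 9.0, 5.0, 8.0],
    "employed": [1, 1, 0, 1, 1, 0, 1, 0],  # 1=employed, 0=censored
    "vr_jit":   [1, 1, 1, 1, 0, 0, 0, 0],  # 1=IPS+VR-JIT, 0=IPS as usual
})

cph = CoxPHFitter()
cph.fit(toy, duration_col="months_to_employment", event_col="employed")
# exp(coef) for vr_jit is the hazard ratio; values >1 indicate a
# shorter time to employment in the VR-JIT arm.
print(cph.summary["exp(coef)"])
```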
Methods
This initial process evaluation of VR-JIT implementation within IPS used a sequential, complementary mixed-methods design in which qualitative results provided additional depth of understanding of quantitative implementation results (15). Specifically, we collected quantitative data from IPS staff (i.e., employment specialists, team leaders, and program directors) on VR-JIT's preimplementation acceptability, appropriateness, and expected feasibility in the parent RCT (9). We also used RCT data on the acceptability and usability of VR-JIT from recipients of the intervention. Next, we conducted qualitative postimplementation focus groups and semistructured interviews with research staff implementing VR-JIT, IPS employment specialists and team leaders, and VR-JIT recipients. The study protocol was approved by the institutional review boards at the University of Michigan and Northwestern University. A data safety and monitoring board supervised study procedures. All study participants provided informed consent.
Trial Setting and Interventions
The study was conducted at Thresholds, a nonprofit community mental health agency in Chicago, from June 2017 to October 2020. Thresholds provides comprehensive mental health services (http://thresholds.org).
IPS.
IPS is a vocational support model based on eight principles: consumer choice (no exclusion criteria; only interest in work is required), a focus on competitive employment (i.e., nonsheltered work in settings employing people without disabilities, for competitive wages), integration of mental health and employment services, attention to client preferences, benefits planning, rapid job search, job development, and individualized job supports (16). Independent ratings by the state of Illinois, assessed with the IPS Fidelity Scale (17), indicated mean fidelity scores ranging from 111 to 117 during this study (possible scores range from 0 to 125, with higher scores indicating greater fidelity), reflecting good-to-exemplary IPS fidelity.
VR-JIT.
VR-JIT is an interactive, computerized job interview simulator, commercially licensed by SIMmersion (www.simmersion.com), that features a virtual hiring manager named "Molly Porter." Molly is governed by an algorithm that draws from >1,000 video clips of an actor playing her role and asking various job interview questions. Trainees respond verbally to Molly's questions by repeating aloud a selection from 10 to 15 scripted choices offered via the virtual job interview interface. Trainees receive automated feedback in real time and via transcripts and performance reviews, and their interview performance is scored from 0 to 100 after interview completion (18). Trainees apply for one of eight positions in the simulated interview, and an e-learning curriculum provides interview preparation advice. On the basis of the job interview literature and content expert review (19, 20), VR-JIT trains recipients in eight interview skills designed to convey an applicant's positive attributes (e.g., being a hard worker and being easy to work with).
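To make the simulator's interaction pattern concrete, below is a deliberately simplified Python sketch of the general mechanics the text describes: scripted response choices, immediate automated feedback, and a 0-100 performance score. All questions, responses, and scoring weights here are invented; the commercial SIMmersion system is proprietary and far more sophisticated.

```python
# Greatly simplified, hypothetical sketch of the VR-JIT interaction
# pattern: the trainee selects a scripted response, receives immediate
# feedback, and accrues a 0-100 score. All content here is invented.
QUESTIONS = [
    {"prompt": "Why do you want this job?",
     "choices": [("I'm a hard worker, and this role fits my skills.", 10),
                 ("I just need a paycheck.", 2)]},
    {"prompt": "How do you handle conflict with coworkers?",
     "choices": [("I stay calm and look for common ground.", 10),
                 ("I avoid them until it blows over.", 4)]},
]

def run_interview() -> float:
    earned = possible = 0
    for q in QUESTIONS:
        print(q["prompt"])
        for i, (text, _) in enumerate(q["choices"]):
            print(f"  [{i}] {text}")
        choice = int(input("Choose a response: "))
        _, points = q["choices"][choice]
        best = max(pts for _, pts in q["choices"])
        earned += points
        possible += best
        # Immediate feedback, mimicking the simulator's real-time cues
        print("Strong answer!" if points == best else "Consider a stronger answer.")
    return 100 * earned / possible  # scale total to 0-100

if __name__ == "__main__":
    print(f"Interview score: {run_interview():.0f}/100")
```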
Delivery model.
Employment specialists were initially trained to implement VR-JIT but did not do so because of unforeseen implementation barriers. Instead, Thresholds’ internal research staff (called VR-JIT implementers) delivered VR-JIT at community agencies. (See the online supplement to this article for staff training methods to implement VR-JIT with fidelity and for the delivery structure of VR-JIT.)
Participants
IPS staff participants included program directors, team leaders, and employment specialists who were trained to implement VR-JIT during the main RCT or who had VR-JIT recipients in their caseloads. VR-JIT recipients were RCT participants who were actively engaged in IPS at the time of enrollment (i.e., had at least one contact with an employment specialist in the past 30 days); were ≥18 years old; had a diagnosis of schizophrenia, schizoaffective disorder, major depressive disorder (any type), or bipolar disorder (type I or II, with or without psychotic features); and were currently unemployed or underemployed and planning to interview for a job within the next 4 weeks. (Other inclusion criteria are listed in the online supplement.)
Quantitative Methods
IPS employment specialists (N=8), team leaders (N=3), and program directors (N=2) were invited to complete a paper survey after completing an orientation on how to teach RCT participants to use VR-JIT. Overall, 42 of 54 VR-JIT recipients completed a paper survey assessing VR-JIT acceptability. Six participants randomly assigned to VR-JIT never used the tool and did not complete the acceptability and usability surveys. Twenty-eight VR-JIT recipients completed the usability items, which were added during the second year of enrollment.
VR-JIT preimplementation acceptability, appropriateness, and feasibility among IPS staff.
VR-JIT orientation acceptability, VR-JIT appropriateness, and the expected feasibility of delivering VR-JIT within IPS were assessed with items adapted from Proctor et al. (21) and Weiner et al. (22) that were used in previous evaluations of the implementation of virtual interviewing in special education settings (23–25).
Orientation acceptability consisted of seven items (e.g., "How satisfied are you with the training you received on VR-JIT?") that were summed for a total score. Possible scores ranged from 7 to 35, with higher scores indicating greater acceptability. Item responses were rated on a 5-point Likert scale (i.e., 1, not at all satisfied, to 5, very satisfied, or 1, not at all acceptable, to 5, very acceptable) (Cronbach's α=0.74).
Appropriateness consisted of five items (e.g., "How well do you think VR-JIT fits with clients' reasons and motivations for receiving IPS services?") that were summed for a total score. Possible scores ranged from 5 to 25, with higher scores indicating greater appropriateness. Item responses were rated on a 5-point Likert scale (i.e., 1, not at all effective, to 5, very effective, or 1, not at all well, to 5, very well) (Cronbach's α=0.82).
Expected feasibility consisted of nine items (e.g., "How confident are you that after training [participants], you will be able to support them as they implement VR-JIT?") that were summed for a total score. Possible scores ranged from 9 to 45, with higher scores indicating greater feasibility. Item responses were rated on a 5-point Likert scale (i.e., 1, not at all prepared, to 5, very prepared, or 1, not at all confident, to 5, very confident) (Cronbach's α=0.89).
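As a concrete illustration of this scoring approach (summing Likert items into a total score and checking internal consistency), here is a minimal Python sketch using invented ratings; Cronbach's α is computed from its standard formula, not from any study code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a respondents x items rating matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented ratings: 5 respondents x 7 orientation-acceptability items (1-5 scale)
ratings = np.array([
    [5, 4, 5, 5, 4, 5, 4],
    [4, 4, 4, 5, 4, 4, 4],
    [5, 5, 4, 4, 5, 5, 5],
    [3, 4, 4, 4, 3, 4, 4],
    [5, 5, 5, 5, 4, 5, 5],
])
totals = ratings.sum(axis=1)  # seven 1-5 items, so totals range from 7 to 35
print("Total scores:", totals)
print("Cronbach's alpha:", round(cronbach_alpha(ratings), 2))
```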
Postimplementation VR-JIT acceptability and usability among recipients.
VR-JIT recipients completed acceptability (N=42) and usability (N=28) surveys after their final VR-JIT session. The Training Experience Questionnaire (18) was used to evaluate VR-JIT acceptability. This survey included five items (e.g., "How helpful was this training in preparing you for a job interview?") that were summed for a total score. Possible scores ranged from 5 to 35, with higher scores indicating greater acceptability. Item responses were rated on a 7-point Likert scale (e.g., 1, extremely unhelpful, to 7, extremely helpful) (Cronbach's α=0.73).
Two items were adapted from the System Usability Scale (26) to assess VR-JIT usability (i.e., "How comfortable are you using the VR-JIT training on your own?" and "How much do you need staff there to help you use VR-JIT?"). The items were rated on a 4-point Likert scale (i.e., 1, not at all, to 4, very much). Individual item means are reported.
Data analysis.
Quantitative data were analyzed with SPSS, version 26.0. Descriptive analyses (i.e., mean, SD, and range) were generated for the IPS staff–level and VR-JIT recipient–level initial implementation process outcomes.
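The analysis itself was run in SPSS; for illustration only, an equivalent descriptive summary in Python with pandas (hypothetical variable name, invented values) might look like the following.

```python
import pandas as pd

# Hypothetical example: summed acceptability totals for a handful of
# respondents (values invented; the study's analysis used SPSS 26.0)
scores = pd.Series([32, 28, 35, 31, 33], name="acceptability_total")
print(scores.agg(["mean", "std", "min", "max"]))  # mean, SD, and range endpoints
```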
Qualitative Methods
Qualitative data for the initial process evaluation of VR-JIT implementation were collected through focus groups and, later, purposive semistructured interviews, with a data analysis phase between the two stages.
Data collection.
The IPS program director at Thresholds shared a study recruitment e-mail inviting employment specialists and team leaders who had VR-JIT recipients in their caseloads to participate in the qualitative study. Two focus groups were held with IPS staff (N=11) and three with VR-JIT recipients (N=13). Thirteen semistructured interviews were conducted with VR-JIT implementers (N=3), employment specialists (N=3), team leaders (N=3), and VR-JIT recipients (N=4). Focus groups and staff interviews were video recorded; VR-JIT recipients were recruited by telephone and interviewed by research staff, with digital audio recording. (See the online supplement for additional data collection methods.)
Analysis.
Interviews and focus group discussions were transcribed by research assistants and analyzed with Dedoose software (27). Three research team members (S.B., M.H., J.J.) analyzed interview and focus group transcripts by using line-by-line coding, assigning codes on the basis of a priori and emerging themes (28). A priori themes included coding for content relevant to the experience of implementing VR-JIT, whereas emerging themes uncovered new information about using VR-JIT within IPS. Each coder independently coded an unmarked transcript for themes. The three research team members then met to review codes and themes until categories were agreed on. Research team members (S.B., M.J.S., J.D.S.) organized the categorization of themes by using the Consolidated Framework for Implementation Research version 2.0 (CFIR 2.0) (29). CFIR 2.0 was selected because it provides evidence-based domains that help frame the implementation needs of mental health interventions (e.g., the outer setting and how aspects of the intervention itself influenced implementation at a specific site with a particular population) (29, 30).
Integration of Quantitative and Qualitative Analyses
Following a sequential, complementary mixed-methods approach (15), we compared the qualitative results with the quantitative results to expand the depth of understanding of the outcomes of VR-JIT implementation. We then integrated the qualitative themes and applied them to the quantitative implementation categories of acceptability, appropriateness, and feasibility. Qualitative themes that did not fit the quantitative categories were included in the results as emerging data within CFIR 2.0 domains that enhanced our understanding of VR-JIT implementation in a community mental health agency.
Results
Table 1 displays the demographic characteristics of the study participants.
Quantitative Analysis
IPS staff.
IPS staff (N=13) reported that VR-JIT orientation was acceptable (mean±SD acceptability score=32.1±2.7, range 28–35) and that VR-JIT was appropriate for interviewing practice within IPS (mean appropriateness score=22.2±2.4, range 19–25). Also, IPS staff expected VR-JIT implementation to be feasible within IPS (mean feasibility score=38.1±7.2, range 27–45).
VR-JIT recipients.
VR-JIT recipients (N=42) reported that VR-JIT was acceptable (mean acceptability score=30.4±4.0, range 16–35), that they were comfortable using VR-JIT independently (mean comfort score=3.3±1.0, range 1–4), and that they needed little staff support to use it (mean staff-support score=2.4±1.0, range 1–4).
Qualitative Analysis
Qualitative themes aligned with three broad CFIR 2.0 domains (Table 2): intervention characteristics (including relative advantages of the intervention and areas of adaptability), outer setting (including community characteristics), and individuals (including VR-JIT recipients). (See the online supplement for supporting quotations.)
Intervention characteristics.
The first theme suggested that VR-JIT provided a relative advantage to IPS: it offers direct, unapologetic feedback that IPS staff do not feel they can provide, enabling staff to shift their attention to job development. The second theme centered on potential VR-JIT adaptations to enhance uptake within IPS: ensuring that IPS staff have a working knowledge of how to troubleshoot technology issues, clearly conveying the "virtual" aspect of VR-JIT during orientation (e.g., indicating to recipients that the intervention is simulated and not a remotely held interview for a real job), and reducing the duration and frequency of sessions (Table 2).
Outer setting.
A single theme emerged regarding community characteristics, emphasizing challenges with public transportation as a barrier to attending multiple visits (Table 2).
Individuals.
A single theme emphasized psychiatric symptoms and long periods of disability as potential barriers to VR-JIT engagement for individuals with serious mental illness receiving this intervention (Table 2).
Mixed-Methods Analysis
Our mixed-methods approach helped complement the quantitative findings with the qualitative results obtained through the interviews and focus groups (31). Quantitative findings of acceptability, appropriateness, and feasibility among IPS staff were matched with quotations from the focus groups or interviews (Figure 1). For example, as reported above, IPS staff scored VR-JIT highly on appropriateness, and they noted that "[VR-JIT] has been a great tool for members who have been enrolled [in IPS] and were able to use it." Quantitative findings on acceptability and usability of VR-JIT among its recipients matched the tenor of quotations from the focus groups and interviews with recipients (Figure 2). For example, VR-JIT recipients gave VR-JIT high scores on acceptability (see above), and one said, "I didn't have a challenge, I enjoyed the Molly [VR-JIT] program. [I] made all of my appointments; of course I made them on time." (See the online supplement for additional supporting quotations.)
Discussion
This evaluation of the initial implementation of VR-JIT in IPS enhances our understanding of how to implement and potentially adapt this intervention in urban settings. Quantitative, qualitative, and mixed-methods results suggested that IPS staff found VR-JIT to be acceptable and appropriate and that the staff expected VR-JIT implementation to be mostly feasible. Notably, some employment specialists suggested that it would take time to strategize how VR-JIT would fit into their workflow. VR-JIT recipients reported that the intervention was highly acceptable and usable. The qualitative findings suggested that VR-JIT offered benefits for IPS employment specialists because of the addition of a focused and direct intervention for job interview training. Additionally, general technology skills were needed to ensure that the program would run smoothly in a community setting. Finally, a minority of recipients struggled to sustain VR-JIT engagement because of their psychiatric symptoms or challenges with focus and attention.
To overcome technology-related barriers, IPS staff and clients suggested adapting VR-JIT delivery for recipients with low computer literacy and training IPS staff to address recipients' challenges with using technology. IPS staff also recommended that the VR-JIT orientation more clearly explain that the intervention is a training activity. Furthermore, the results suggest that VR-JIT engagement and acceptability can be enhanced by aligning the duration and frequency of sessions with recipients' preferences. Finally, barriers related to public transportation stemmed from the need for multiple trips to the agency for VR-JIT sessions. Given that many low-income adults residing in urban settings rely on public transportation, this finding raises the need to consider implementing VR-JIT via self-guided, fully remote, or hybrid (e.g., in-person VR-JIT orientation with remote support) delivery strategies. However, the feasibility of these implementation strategies is unknown and was beyond the scope of this study, because we did not know the extent to which IPS clients had access to computing devices at home or in their neighborhoods or whether they could borrow such devices from their agency. Moreover, this scalability issue is agency specific and highlights the need for agency-level planning for VR-JIT implementation, which has associated labor costs (32).
Overall, our results suggest that technology-assisted interventions offer individuals with serious mental illness more opportunities to achieve their potential for recovery. Over the past decade, research has suggested that integrating technology-based interventions within the existing mental health service delivery system is practical, feasible, and acceptable (33–35). Consistent with these previous studies, the initial implementation outcomes in this study suggest that VR-JIT can add value and offer relative advantages to existing services, while requiring consideration of some potential adaptations to overcome initial barriers. Moreover, this study provides findings on an intervention focused on employment readiness, whereas most existing research on technology-assisted interventions within mental health services has focused on symptoms and treatment adherence (33).
Implications for Practice
We suggest some adaptations for future VR-JIT delivery. For instance, employment specialists could contextualize the importance of virtual interview practice before implementation, which may mitigate the frustration or disengagement observed among a minority of VR-JIT recipients. In addition, mental health agencies could consider training peer support specialists as VR-JIT implementers, because a recent study found that peer support specialists reported VR-JIT to be acceptable and perceived themselves as uniquely positioned to deliver the intervention (36).
Additional implications focus on how VR-JIT could enhance IPS. For instance, employment specialists reported that they appreciated the blunt responses from the virtual hiring manager (i.e., "Molly Porter") during the virtual interview. This approach gave their clients direct feedback from a third party, complementing the specialists' own supportive responses, which were often tempered by the intention to be socially desirable. Additionally, a recent qualitative study of IPS employment specialists and VR-JIT indicated that clients perceived that working with VR-JIT improved their confidence in job interviewing (37), consistent with earlier RCT results (9). Notably, employment specialists perceived their clients' improved confidence after VR-JIT as helping with IPS engagement and job search activities.
Limitations and Future Directions
Most VR-JIT sessions were delivered by trained research staff (i.e., VR-JIT implementers) rather than by employment specialists. Although employment specialists gave feedback on how VR-JIT assisted their clients and their own work, the results are limited for forecasting implementation by employment specialists. For VR-JIT recipients, we note a potential risk to internal validity due to maturation effects. Specifically, VR-JIT recipients completed their acceptability and usability measures and the initial focus groups before enrollment was paused because of the COVID-19 pandemic, whereas the individual interviews occurred after RCT enrollment was terminated because of the pandemic (although they were held during the originally planned follow-up period). Finally, the four VR-JIT recipients who participated in the interviews were a convenience sample drawn from the eight VR-JIT recipients who were still engaged with IPS services during the pandemic; therefore, their views represent those of recipients with greater VR-JIT engagement.
Conclusions
This sequential, complementary mixed-methods evaluation revealed initial outcomes of VR-JIT implementation within IPS. The survey-based quantitative findings suggested that IPS staff and VR-JIT recipients found VR-JIT to be acceptable, usable, and feasible for improving job interviewing skills. Qualitative results from focus groups and interviews with VR-JIT implementers, IPS staff, and VR-JIT recipients added depth and context to these findings; for example, participants noted the advantage of offering VR-JIT within IPS to help clients obtain employment more efficiently and recommended ways to improve VR-JIT delivery. Implications for future VR-JIT delivery within IPS include several adaptations, such as additional training for implementers and alternative delivery strategies, to help overcome initial implementation barriers.
Acknowledgments
The authors thank the members of Thresholds who participated in the study and acknowledge the more than 50 Thresholds administrators, research staff, and IPS staff who diligently supported various aspects of the study implementation.