In the United States, approximately one in two adults will experience a mental illness in their lifetime, often beginning during childhood or adolescence (1, 2). Because of a range of factors, including cost, stigma, limited availability of services, and a fragmented mental health care system (3, 4), fewer than half will receive adequate treatment (2). The combination of high prevalence and limited access to services has led to increasing efforts to train the lay public in first-line mental health response.
Modeled on the conventional approach to first aid, Mental Health First Aid (MHFA) is a course that trains the general public to recognize and respond to mental health issues in their communities (5). MHFA was founded in 2000 in Australia and has since expanded to 24 countries, including the United States (6). Besides the standard adult course, specialized MHFA curricula exist for law enforcement, firefighters and emergency medical services personnel, teens, higher education, rural settings, workplaces, and people who work with youths, older adults, and veterans (7, 8).
To provide basic first-line assistance and make referrals to professional care, most people who participate in the MHFA course (i.e., trainees) learn a five-step action plan known as ALGEE: Assess for risk for suicide or harm, Listen nonjudgmentally, Give reassurance and information, Encourage appropriate professional help, and Encourage self-help and other support strategies (9). Trainees of the teen MHFA course learn a modified action plan: Look for warning signs, Ask how they are, Listen up, Help them connect with an adult, and Your Friendship is important (10). Course length and content may vary by country, and changes to the program are often made in light of new evidence. For example, the MHFA Australia ALGEE action plan language was recently updated to the following: Approach the person, Assess and assist with any crisis, Listen and communicate nonjudgmentally, Give support and information, Encourage the person to get appropriate professional help, and Encourage other supports (11). MHFA USA continues to use the original ALGEE action plan. Additionally, in Australia, the standard adult course is 12 hours long, and the youth course is 14 hours long (8), whereas in the United States, both classes are 8 hours long (12). Several MHFA courses have been adapted for specific national, cultural, and linguistic contexts (6, 13).
Since it was adapted and introduced to the United States in 2008, >2 million people have been trained in MHFA (14). MHFA trainings are available to the general public and are typically voluntary. However, MHFA training is increasingly required by some police departments, fire departments, and schools (15), given the likelihood of encountering a person experiencing a mental health crisis in those settings. In an effort to support mental health programming and prevention efforts, MHFA has been explored in >87 studies (16) and is supported by policy makers (17, 18) as well as several health, education, and police departments (19). MHFA encourages mental health literacy among the general public and professionals (e.g., paramedics, law enforcement officers, and teachers) who are likely to be called on to support mental health needs in their communities. Significant funding for MHFA-related projects is awarded through Project AWARE (Advancing Wellness and Resiliency in Education) state agency grants (17, 20).
MHFA has consistently been shown to reduce stigma regarding mental health conditions and increase mental health knowledge, recognition of mental disorders, belief in effective treatments, and confidence and intent to help among its trainees, with mixed results for its effect on the amount of actual helping behavior performed by trainees (21–26). However, because most existing program evaluations (16) and systematic reviews (21, 22, 25, 26) have focused on evaluating direct training outcomes (e.g., changes in knowledge, attitudes, and behavioral intent), little is known about how effective MHFA is in addressing the mental health needs of those who receive helping behaviors of MHFA trainees. Two meta-analyses, with literature searches conducted in 2017 (22) and 2018 (24), have begun to examine this gap in the literature, finding no significant effects on the quality of helping behaviors provided by MHFA trainees (22) or the mental health of recipients of MHFA-guided helping behaviors (22, 24). Mei and McGorry’s recent commentary highlights the growing interest in and need to evaluate MHFA-related mental health outcomes among recipients (27).
This systematic review included only evaluations of MHFA actions taken outside of the classroom to provide an understanding of whether—and how—MHFA actions are helpful to those experiencing a mental health crisis. Although changing the knowledge, attitudes, and behavioral intent of MHFA trainees is an important and worthwhile goal, ultimately, it is crucial to understand how MHFA affects those it intends to help. Furthermore, because of the continued proliferation of programming (6, 15, 19) and evaluation studies (16), there is a need for an up-to-date synthesis of MHFA’s effects. Finally, although recent meta-analyses (22, 24) have examined the evidence for MHFA’s effects on trainee behavior and recipient mental health, they did so among other training outcomes, with limited attention to the particular challenges associated with evaluating posttraining, “real-world” outcomes. To our knowledge, no other evaluations, systematic reviews, or meta-analyses have selectively focused on trainee behavior and recipient mental health outcomes. We conclude by providing several recommendations for how to strengthen the evidence base of the MHFA program.
Methods
We conducted a systematic review by using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement (28). (A table showing a PRISMA checklist of items is available as an online supplement.)
Search Strategy
We searched the PubMed, PsycINFO, PTSDpubs, and EMBASE electronic databases for studies published on or before March 9, 2021. Owing to linguistic similarity in intervention names, we used the search terms “psychological first aid,” “mental health first aid,” “psychological crisis intervention,” and “mental health crisis intervention.” (The full search strategy and exact search terms are detailed in the online supplement.) The research protocol was developed prospectively and in adherence to the PRISMA Protocols guidelines (29).
Selection
We included peer-reviewed studies that evaluated behaviors taken by MHFA trainees to help people experiencing a mental health problem (i.e., recipients). We examined outcomes that assessed trainee behavior, ranging from whether the trainee approached someone in crisis and engaged in an MHFA-guided action (e.g., encouraged appropriate professional help) to the effect of MHFA-guided action on the recipients’ mental health. To isolate the impact of training outside of the classroom, we excluded outcomes evaluating changes in trainee mental health, knowledge, attitudes, or behavioral intent. Commentaries, book chapters, opinion pieces, protocols, reviews, and studies not published in English were also excluded. There were no restrictions on setting.
Two authors (S.F., K.S.) independently reviewed the database search results by title and abstract and selected studies on the basis of predetermined inclusion and exclusion criteria (see online supplement). In total, 15% of titles and 10% of abstracts were randomly selected for review by both authors and compared for quality control. A third author (S.H.) compared the selections and settled any disagreements. Two authors (S.F., K.S.) then independently reviewed the full texts of selected studies for final inclusion.
Data Analysis
Four authors (S.F., K.S., M.B., K.J.) independently extracted study-level data related to setting, design (including whether the study incorporated a control group with or without pre- and posttest results), participant characteristics, intervention details, and outcomes evaluated. Four studies were randomly selected to be extracted independently again by a different author as a quality check.
Behavioral outcomes were categorized by type (trainee use of MHFA skills, helpfulness of trainee’s actions, or recipient mental health) and by reporter (trainee or recipient). The main findings of each study were then summarized and identified as evidence of positive effect, partial positive effect, no effect, or negative effect, on the basis of a p<0.05 significance level. If studies did not compare pre- and posttraining outcomes or did not have a control group, their findings were considered to have insufficient information to assess MHFA efficacy. Additionally, to ensure comparability across courses, only studies evaluating a form of MHFA that explicitly taught the ALGEE or the “Look, Ask, Listen, Help Your Friend” action plan were used to evaluate efficacy. Last, we did not report on behavioral outcomes measured immediately posttraining, because trainees would not have had sufficient time to perform any real-world MHFA actions by then.
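The evidence-labeling rule described above can be sketched in Python (our rendering of the stated criteria, not the authors' actual analysis code; the function and argument names are ours):

```python
def classify_finding(has_control_group, compares_pre_post, p_value, direction):
    """Label a study's main finding per the review's stated rule.

    `direction` is the observed direction of effect when significant:
    'positive', 'partial', or 'negative'.
    """
    # Studies lacking a control group or a pre/post comparison cannot
    # support an efficacy conclusion.
    if not (has_control_group and compares_pre_post):
        return "insufficient information"
    # Findings are judged at the p < 0.05 significance level.
    if p_value >= 0.05:
        return "no effect"
    return {
        "positive": "positive effect",
        "partial": "partial positive effect",
        "negative": "negative effect",
    }[direction]
```

For example, a controlled pre/post study with p=0.01 in the expected direction would be labeled "positive effect," whereas the same result from an uncontrolled study would be labeled "insufficient information."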
Two authors (S.F., S.H.) independently assessed risk for bias using the Cochrane Risk of Bias tool, which rates studies as “low risk,” “high risk,” or “unclear risk” for bias in the following domains: random sequence generation, allocation concealment, blinding of participants and personnel, blinding of outcome assessment, incomplete outcome data, selective reporting, and other sources of bias (30).
Results
The search identified 9,855 records, of which 1,093 were duplicates, 7,827 were excluded after title review, and 786 were excluded after abstract review. Of the 149 articles reviewed in full, 119 were excluded. These articles were excluded because they evaluated an intervention that was not MHFA (e.g., psychological first aid, N=57 studies), they did not measure at least one trainee behavior or recipient mental health outcome (N=36), the full papers were unavailable or not in English (N=11), they were systematic reviews (N=8), or they were other nonevaluation studies (N=7). One additional study (31) was identified through an anonymous peer reviewer of this study, leaving 31 studies to be included in the synthesis.
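The screening flow above is internally consistent; the short sketch below simply re-derives the reported counts (variable names are ours):

```python
# Cross-check of the PRISMA screening flow reported in the text.
records_identified = 9855
duplicates = 1093
excluded_by_title = 7827
excluded_by_abstract = 786

full_text_reviewed = (records_identified - duplicates
                      - excluded_by_title - excluded_by_abstract)

# Full-text exclusion reasons, with counts as reported.
exclusions = {
    "not MHFA (e.g., psychological first aid)": 57,
    "no trainee behavior or recipient mental health outcome": 36,
    "unavailable or not in English": 11,
    "systematic reviews": 8,
    "other nonevaluation studies": 7,
}
excluded_full_text = sum(exclusions.values())

added_by_peer_reviewer = 1
total_included = full_text_reviewed - excluded_full_text + added_by_peer_reviewer

print(full_text_reviewed, excluded_full_text, total_included)  # 149 119 31
```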
Of these 31 studies, nine (31–39) were rigorous enough to be used to evaluate MHFA efficacy. Studies that did not meet minimum rigor criteria lacked a control group and reported posttraining outcomes only (40–47), lacked a control group but reported pre- and posttraining outcomes (9, 10, 48–57), were a cluster-randomized controlled trial (RCT) with relevant outcomes not measured in the control group (58), or were a cluster RCT that did not evaluate the ALGEE or the “Look, Ask, Listen, Help Your Friend” action plan (59) (see online supplement).
Description of Studies Used to Assess Efficacy
Of the nine studies we used to assess MHFA efficacy, three (31, 38, 39) had not been synthesized in previous systematic reviews. All were RCTs (including two cluster RCTs). Follow-up periods ranged from 4 months to 3 years after the training. (A summary of the studies used to assess efficacy is available in the online supplement.)
MHFA evaluations occurred in predominantly high-income countries, including six in Australia. Trainees came from diverse backgrounds and represented the general public, students, teachers, government employees, and parents. Nearly all MHFA courses were in person and taught by certified MHFA instructors or mental health professionals. Course formats varied from multiple, shorter sessions to long, 1-day sessions. Training totaled between 9 and 14 hours. Six studies evaluated the standard adult MHFA course, and three evaluated the youth course (for adults who work with youths).
Outcomes
Trainee use of MHFA skills, reported by trainee.
All of the reviewed studies measured trainee-reported use of MHFA skills to help a person experiencing a mental health condition (see online supplement), with nine (31–39) meeting our criteria to allow a conclusion. Studies asked trainees whether they used MHFA skills at all (31, 33, 34, 36–39) and, if they did, the frequency of use (33, 36, 37), with one study (32) not specifying the questionnaire wording. Five studies (31, 33, 37–39) additionally considered the fidelity of trainee actions to the ALGEE plan. Three studies found a statistically significant increase in use of MHFA skills after 4 (34), 6 (31), and 24 (37) months, whereas six (32, 33, 35, 36, 38, 39) found no change in such use (see online supplement). Four of the studies that found no change (32, 36, 38, 39) were underpowered at posttraining because of significant loss to follow-up.
Trainee use of MHFA skills, reported by recipient.
Four studies asked about receipt of MHFA help from the recipient’s perspective (see online supplement), one of which (35) had enough information to enable a conclusion. The study found that high school students reported receiving increased information about mental health conditions from their MHFA-trained teachers after 6 months but did not report receiving increased help from them. The study was adequately powered.
Helpfulness of trainee’s actions, reported by trainee.
Seven studies asked trainees whether they perceived the assistance they provided as helpful to recipients. However, none of the studies had enough information to allow a conclusion (see online supplement).
Helpfulness of trainee’s actions, reported by recipient.
Two studies (38, 39) reported on the same RCT at different follow-up periods and asked adolescents how well their parents who were trained in MHFA supported them when they experienced a mental health difficulty. The studies found no effect of the training at 12, 24 (39), or 36 months (38) (see online supplement). Both studies were underpowered.
Recipient mental health, reported by trainee.
Using the parent report version of the Strengths and Difficulties Questionnaire (SDQ), the aforementioned two studies (38, 39) also found no change in parent-reported adolescent mental health difficulties at 12, 24 (39), or 36 months (38) (see online supplement). Again, the two studies were underpowered.
Recipient mental health, reported by recipient.
Three studies (35, 38, 39) assessed recipient mental health as reported by the recipient. In one study (35), recipients were high school students whose teachers were trained, and in the other two studies (reporting on the same RCT at different follow-up periods) (38, 39), recipients were adolescents whose parents were trained in MHFA. None of the studies found significant changes in mental health difficulties assessed with child report versions of the SDQ (see online supplement). Two studies (38, 39) were underpowered.
Risk for Bias
Risk for bias, according to the Cochrane Risk of Bias tool, is summarized in Table 1 (40–59). Thirteen studies were identified as having overall high risk for bias (10, 40–44, 48–52, 54, 58), two as having medium-to-high risk (45, 57), seven as having medium risk (9, 31, 46, 47, 53, 56, 59), seven as having medium-to-low risk (32–37, 55), and two as having low risk (38, 39). Few studies used an allocation strategy that included random sequence generation (31–39) or allocation concealment (32–34, 36, 38, 39). Blinding of participants and personnel was impossible for all MHFA evaluations; this resulted in a high risk for bias only if study participants could become aware that they were participating in an intervention evaluation and, as a result, might respond to surveys in a systematically different way than control group participants did. Therefore, studies with no control group (9, 10, 40–57) or with a comparable intervention as control (38, 39) were classified as having a low risk for bias, whereas studies with as-usual or waitlist control groups were classified as having a high risk for bias. Very few studies blinded outcome assessors (32, 36–39). Studies that did not have control groups were again classified as having a low risk for bias. Results for incomplete outcome data were mixed and dependent on loss to follow-up rates. The main other source of bias identified was failure to control for participant-level characteristics, such as a trainee’s previous mental health response experience.
Discussion
We identified 31 studies that evaluated the behaviors of MHFA trainees and the mental health of recipients. All of the included studies asked trainees whether they used MHFA skills in real-life situations. However, few evaluated the helpfulness of their actions or their effects on recipient mental health. Only nine studies assessed MHFA efficacy in a rigorous manner. On the basis of these nine studies, we found mixed (positive and neutral) evidence of changes in trainees’ use of MHFA skills and no evidence of improvements in the helpfulness of trainees’ behaviors or recipient mental health.
Few of the included studies used the rigorous designs needed to establish effects of the MHFA program. Although several studies included pre- and posttraining outcomes (9, 31–33, 35–39, 48–59), fewer used a control group (31–39, 58, 59); randomly assigned participants to treatment (31–39, 58, 59); were adequately powered (31, 33–35, 37, 49); or accounted for participant-level characteristics that might influence MHFA helping behaviors (31–34, 36–39, 46, 47, 49, 55, 57–59), such as profession or previous mental health experience. Furthermore, most outcomes were reported by trainees. Although this form of assessment is an important first step and facilitates data collection, it is based on the subjective impression of the trainee and may be susceptible to social desirability and recall biases. Trainee reports are particularly undesirable for evaluating the impact of MHFA on recipients, because they involve making assumptions about recipients’ experiences. Some studies included both trainee and recipient reports for the same outcomes, which helped address reliability but did not address the aforementioned biases. Ideally, studies would use standardized surveys or professional assessments to evaluate mental health outcomes. Finally, although follow-up periods varied, none of the studies measured recipient mental health immediately after MHFA-guided trainee interventions. Because MHFA trainees are trained to provide a first-line response to a crisis situation, initial reductions in mental health difficulties of recipients may be more appropriate to measure than medium- and long-term effects. Overall, the risk for bias of the included studies was medium to high.
Previous systematic reviews of MHFA evaluations have addressed behavioral outcomes only minimally and alongside training outcomes. Systematic reviews and meta-analyses by Hadlaczky et al. (21) and Maslowski et al. (24) found moderate improvements in trainees’ use of MHFA skills, whereas Morgan et al. (22) found small improvements. Morgan et al. also considered the quality of helping behaviors offered, reporting no significant improvements on this measure. Neither Morgan et al. nor Maslowski et al. found significant improvements in recipients’ mental health. Ng et al.’s 2020 systematic review (25) focused on teen and youth MHFA and found that both training curricula generally resulted in more helping behavior of trainees. However, findings in a 2020 systematic review of youth MHFA for educators by Sánchez et al. (26) were inconclusive for this measure. Differences in results for trainee use of MHFA skills likely reflect differences in study inclusion criteria and search dates. Hadlaczky et al. (21), Ng et al. (25), and Sánchez et al. (26) did not restrict by study type, and Morgan et al. (22) and Maslowski et al. (24) included all controlled trials. Also of note, Maslowski et al.’s reporting of trainee use of MHFA skills included confidence measures, which increased the number of eligible studies used to assess this outcome. In our systematic review, we used only RCTs measuring actual helping behaviors and their effects on recipients to assess MHFA efficacy. Three of the nine studies we used to evaluate efficacy had not been synthesized in previous reviews. Finally, unlike other reviews, we focused solely on posttraining behavioral outcomes and on whether outcomes were reported by trainees or recipients, an essential element in program evaluations.
Given that there currently are few studies of MHFA with adequate rigor, and that findings from these studies are mixed, we conclude that there is insufficient evidence that MHFA achieves the desired impact on the helping behaviors of trainees and the mental health of recipients. MHFA implementers should take particular care when describing the intervention as evidence based and be specific about outcomes when evidence does exist, such as improving trainee knowledge, attitudes, and behavioral intent (21–26), and when evidence is insufficient, such as whether MHFA measurably affects trainee behavior or recipient mental health. Notably, some evidence suggests that MHFA trainees who rated themselves as having high intent to help were more likely to report at follow-up that they had actually provided help (60–62). Thus, it is possible that training outcomes such as behavioral intent mediate the relationship between MHFA training and trainee helping behavior, but this possibility requires further investigation. Future research could seek to isolate the specific training components and mechanisms that affect trainee behavior and recipient mental health to ultimately inform updates to the curricula.
A lack of good-quality evidence does not necessarily render MHFA an unhelpful intervention; in fact, it has been shown to positively affect trainees’ knowledge of mental health issues, attitudes toward mental illness, and intent to help (21–26). Rather, it highlights the gaps in our understanding of how it affects trainee behavior and recipient mental health. More—and more rigorous—evaluations of these outcomes are necessary. Researchers interested in building the evidence base for MHFA can draw on decades of development and evolution in program evaluation. Rigorous designs including MHFA trainee randomization, control groups, and longitudinal follow-up (beyond pre- and posttraining designs) are the minimum required to establish the efficacy of MHFA on trainee helping behavior and recipient mental health outcomes. To address the primary weaknesses in the existing literature, such as lack of power to detect statistically significant effects and bias introduced by a reliance on trainee reports, we consider it critical that all future study designs prepare for substantial loss to follow-up and measure recipient responses to MHFA-guided helping behaviors. Moreover, future studies should choose follow-up times that allow adequate time for trainees to encounter a situation requiring MHFA actions (22) but also capture the initial, short-term impact of these actions on recipients. To overcome challenges related to data collection and design, studies that use pre- and post-MHFA training assessments of trainee-recipient dyad outcomes (e.g., parents trained in MHFA and their children) (38, 39) can serve as exemplars for future work.
To facilitate data collection among potential recipients, studies may first be restricted to smaller populations in which recipients can be monitored more easily (e.g., families and schools) (33, 38, 39); then, as programmatic effect is established, surveillance can be expanded to larger populations of potential recipients. Using a validated rubric to observe and rate simulated role-play may also help to address self-reporting biases of trainee behavior (63). Several challenges are associated with evaluating posttraining outcomes, and, notably, evaluations of the even more ubiquitous physical first aid have been similarly limited (64). However, innovative research designs (33, 38, 39) and tools (63) have begun to address these challenges and should continue to be supported and further improved. To enable this essential research, it is crucial that MHFA-supporting institutions and funding mechanisms (20) allocate sufficient funds to evaluations of trainee behavior and recipient mental health that meet at least the aforementioned standards of rigor.
The main strengths of this review were its focus on trainee behavior and recipient mental health outcomes and conclusions that were based only on studies that met a standard of adequate rigor. This allowed us to highlight the current state of the evidence for outcomes that are infrequently studied yet crucial to understanding MHFA’s practical, real-world applications. The review was limited by the exclusion of non-English studies and of studies that were not available online, although there were relatively few such studies. Additionally, several of the included studies were underpowered, raising the possibility of type II error. Although pooling the studies into a meta-analysis would have addressed this, measurements for each outcome were too few and varied to do so in a meaningful way. Last, most evaluations were performed in high-income Western countries—Australia, in particular—thus limiting the generalizability of our findings. MHFA has been licensed and adapted in 24 countries (6) and will likely continue to expand. Rigorous evaluations should be conducted in every setting where MHFA is performed, if possible, and particularly in low- and middle-income countries.
Conclusions
Our review found insufficient evidence that MHFA improves the helping behaviors of trainees or the mental health of recipients of such behaviors. Our findings highlight a crucial research and evaluation gap whose closure must be prioritized as MHFA continues to become more popular. As a psychoeducational intervention, MHFA addresses critical barriers to improving community mental health, such as stigma and public education. Furthermore, the rapid proliferation of (5) and funding allocated to (20) MHFA indicate a growing desire to understand and address mental health issues in the United States, and this momentum should not be lost. However, as MHFA trainees are expected and encouraged to provide first-line support to people experiencing mental distress, it is just as, if not more, important to understand how their actions affect those they intend to help.