Brief Reports
Published Online: May 2013

A Review of School Mental Health Programs in SAMHSA’s National Registry of Evidence-Based Programs and Practices

Abstract

Objective

School programs provided by the Substance Abuse and Mental Health Services Administration’s National Registry of Evidence-Based Programs and Practices (NREPP) were reviewed to describe program characteristics, costs, and ratings of research and dissemination.

Methods

Data were gathered from the NREPP to identify mental health programs adaptable for schools. Program costs and quality and dissemination ratings were examined as a function of program characteristics.

Results

School mental health programs constituted 32% of the registry; 44% provided all materials, training, and support only at a cost, and 46% provided universal mental health promotion rather than intensive supports. Readiness for dissemination was poorer for programs providing only intensive supports, and quality of research increased as total costs of program implementation increased.

Conclusions

Mechanisms for tracking mental health promotion and treatment can be effective in disseminating information about evidence-based school programming. Assessing program transportability is necessary for decision making to match programs with the needs of particular schools and communities.
School mental health programs are reducing the gap between the number of youths who need mental health services and the number who receive them (1). This improvement is related to the many advantages the school context offers for prevention, early intervention, and intervention (1,2). The trend is also due to federal and state policies (3–6) that call for increased school mental health programming and the use of evidence-based programs and practices. However, evidence-based programs are often implemented in schools unsuccessfully or with poor quality, which underscores the need to understand barriers (7) and the capacities necessary for high-quality implementation (2) and to elucidate the connection between the delivery of effective evidence-based programs in schools and the achievement of valued outcomes (8).
In 2006, the Substance Abuse and Mental Health Services Administration (SAMHSA) launched the National Registry of Evidence-Based Programs and Practices (NREPP), an objective, scientifically based mechanism for tracking prevention and intervention programs for mental health problems and substance abuse and assisting users in evaluating their application. Programs are voluntarily submitted for consideration by their developers and evaluated by independent reviewers trained by SAMHSA. The registry documents empirical support and the resources and materials required for implementing these programs, and it identifies their readiness for dissemination. Particular programs are not endorsed, promoted, or approved by SAMHSA, leaving decision-making around program selection to the user. Few studies have examined the factors associated with use of NREPP in relation to the actual implementation of programs and practices in community and school settings.
A recent comprehensive review of mental health interventions in NREPP considered programs implemented for all ages and across all settings, including clinical, school, community, and home settings (9). Findings indicated that a greater proportion of youth programs were exclusively proprietary (that is, fees were charged by developers), in contrast to programs for adults or for both adults and youths. Although the review (9) did not focus specifically on the ability of schools to implement these programs, it identified potential barriers to program implementation, with 57% of intervention program materials and training found to be proprietary. Although exclusively proprietary programs achieved significantly lower NREPP ratings for quality of research compared with programs that provided some or only publicly available materials, programs that provided only publicly available materials yielded lower readiness-for-dissemination scores, creating challenges in decision making by potential consumers (9).
We used this prior review (9) as the basis for the current study and examined programs listed in NREPP as appropriate for schools. We describe program characteristics according to each program's level of intervention, area of focus, intended outcome, and proprietary status, and we calculated program costs to examine their associations with program characteristics and with the ratings of quality of research and readiness for dissemination provided on NREPP's Web site. Our specific aims were to describe differences among programs with descriptive statistics for program characteristics, costs, and ratings; to examine correlations between program costs and ratings; and to test, with analysis of variance (ANOVA), whether program costs varied as a function of program characteristics and ratings. Implications of this review are discussed, including NREPP's practical application, the feasibility of conducting and maintaining these programs within schools, and future directions for school mental health programs.

Methods

During the summer of 2011, we conducted a review of programs in NREPP's database that were classified as mental health promotion or treatment (or both) and were deliverable in school settings. The date of the review is important because the registry is continuously updated; interventions are added as developers elect to participate in a voluntary, self-nomination review process. We note, however, that SAMHSA's process does not review all programs and that NREPP is not an exhaustive registry.
Using information provided on the registry, we describe program characteristics, including intervention tier (that is, whether the intervention was universal, selective, or indicated), outcomes (substance use, academic achievement, violence rate, externalizing behaviors, family functioning, and suicide rate), the mechanism used to reach outcomes (such as social or emotional skills, parenting, system-level delivery, cognitive-behavioral approaches, psychoeducation, or a combination of these), and proprietary status (whether materials and services were available only for a fee, at no cost, or some combination). Program costs were calculated for a first year of use with ten students and for continuing the program a second year, on the basis of the expenses listed by the program developers (such as workbooks, training, software, and fidelity monitoring tools). Costs reflected the sum of the items listed by the developer; costs for two programs could not be determined from the information provided. We acknowledge that this approach may not capture the costs of hiring staff to deliver programs unless such costs were specified by the developer.
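
To make this costing approach concrete, the following sketch sums hypothetical developer-listed line items for a first year of use with ten students and for a continuation year. Every item name and price here is invented for illustration and does not come from any NREPP listing.

    # Minimal sketch of the costing approach described above; all item
    # names and prices are hypothetical, not taken from NREPP.
    N_STUDENTS = 10

    # Developer-listed expenses: (price, is_per_student, recurs_in_year_two)
    line_items = {
        "training workshop": (1500.00, False, False),       # one-time, first year only
        "implementation manual": (120.00, False, False),
        "student workbook": (15.00, True, True),            # repurchased each year
        "fidelity monitoring tool": (300.00, False, True),
    }

    first_year = sum(
        price * (N_STUDENTS if per_student else 1)
        for price, per_student, _ in line_items.values()
    )
    second_year = sum(
        price * (N_STUDENTS if per_student else 1)
        for price, per_student, recurs in line_items.values()
        if recurs
    )

    print(f"First-year cost: ${first_year:,.2f}")                  # $2,070.00
    print(f"Second-year continuation cost: ${second_year:,.2f}")   # $450.00
    print(f"Total across two years: ${first_year + second_year:,.2f}")

Under this rule, the second-year figure reflects only recurring items, which parallels how the review reports second-year costs as an additional increment over first-year start-up expenses.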
We considered program ratings, as scored by SAMHSA reviewers, on quality of research and readiness for dissemination. Quality was measured on a 5-point scale ranging from 0, not acceptably addressed, to 4, acceptably addressed. Readiness for dissemination was measured on a 5-point scale ranging from 0, not available, to 4, adequate materials available. The SAMHSA reviewer assessing quality of research assigns ratings according to reliability and validity of measures, intervention fidelity, missing data and attrition, and potential confounds accounting for intervention effects. Readiness-for-dissemination scores are assigned on the basis of the reviewer’s evaluation of implementation material, training and support resources, and availability of quality assurance procedures.
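
The analyses reported in the next section, Pearson correlations between ratings and costs and one-way ANOVAs of costs by program characteristics, follow a standard pattern, sketched below with toy data. The column names and records are assumptions made for illustration; they are not the review data set.

    # Sketch of the analytic approach: descriptive statistics, Pearson
    # correlations, and a one-way ANOVA of costs by program tier.
    # The records are toy values, not the actual NREPP review data.
    import pandas as pd
    from scipy import stats

    programs = pd.DataFrame({
        "quality":   [2.5, 3.0, 3.5, 2.0, 3.2, 2.8],
        "readiness": [3.5, 3.0, 4.0, 2.0, 3.8, 3.2],
        "cost_y1":   [2000, 5000, 12000, 800, 7500, 3000],
        "cost_y2":   [500, 2000, 6000, 0, 2500, 900],
        "tier":      ["single", "multi", "multi", "single", "multi", "single"],
    })
    programs["cost_total"] = programs["cost_y1"] + programs["cost_y2"]

    # Descriptive statistics (mean and SD), as reported in the text
    print(programs[["quality", "readiness", "cost_y1"]].agg(["mean", "std"]))

    # Pearson correlation between quality of research and total two-year cost
    r, p = stats.pearsonr(programs["quality"], programs["cost_total"])
    print(f"quality vs. total cost: r={r:.2f}, p={p:.3f}")

    # One-way ANOVA: first-year costs as a function of tier (single vs. multiple)
    groups = [g["cost_y1"].to_numpy() for _, g in programs.groupby("tier")]
    f, p = stats.f_oneway(*groups)
    print(f"cost by tier: F={f:.2f}, p={p:.3f}")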

Results

At the time of the review, the search yielded 67 (34%) of the 200 interventions on the registry that met the search criteria (specified by NREPP as applicable in schools and focused on mental health promotion or treatment); four were excluded from this review because the programs' target ages did not include youths, leaving 63 programs. In regard to program characteristics, almost half of the programs consisted of universal mental health promotion (N=29, 46%). Four (6%) programs were classified as selective and three (5%) as indicated, whereas others addressed a combination of tiers. Only six (10%) programs provided support across all tiers, and five (8%) developers indicated that the program was not classifiable in this way.
The most common intervention outcomes addressed were youth substance use (N=15, 24%), followed by academic achievement (N=12, 19%), violence prevention (N=10, 16%), externalizing behaviors (such as attention-deficit hyperactivity disorder, conduct disorder, and aggression; N=8, 13%), family functioning (N=8, 13%), and suicide prevention (N=5, 8%). Programs used different mechanisms to address mental health; 19 programs (30%) targeted youth social or emotional skills or both, seven (11%) programs addressed parenting, four (6%) aimed efforts at system-level delivery, four (6%) addressed cognitive-behavioral approaches, three (5%) were primarily psychoeducation, and 26 (41%) programs targeted more than one of these areas.
Twenty-eight (44%) of the 63 programs provided all materials, training, and support at a cost; no program items were listed as available for free. Only one program (2%), TeenScreen (10), was publicly available at no cost. TeenScreen's focus is to identify middle school and high school youths who need mental health services because of suicide risk or other mental health concerns. TeenScreen reported free training, manuals, survey tools, consultation, and technical assistance, although costs for staff to administer the screening tool were not itemized in the estimate. TeenScreen was unique in this regard; nearly all other programs identified expenses associated with program materials and manuals, training, measures, or technical support for adaptation or implementation. Program costs ranged from $0 to $50,000 per year of implementation. The mean±SD program cost was $4,338±$7,589 in the first year and an additional $1,856±$6,805 for continuing the program a second year.
Program scores for quality of research (2.92±.56) and readiness for dissemination (3.34±.66) and their association with program costs are shown in Table 1. Notably, quality-of-research scores were not correlated with readiness-for-dissemination scores (r=.07, ns). Although quality of research was not significantly correlated with first-year costs, as quality scores increased, costs associated with second-year implementation (r=.34, p<.01) and total costs across the two years (r=.28, p<.05) increased.
Table 1 Descriptive statistics and correlations for ratings and costs of 63 school-based mental health programs^a

Rating                        Readiness for dissemination   1st-year costs   2nd-year costs   Total costs
Quality of research           .07                           .21              .34**            .28*
Readiness for dissemination                                 .03              .04              .03
1st-year costs                                                               .87**            .97**
2nd-year costs                                                                                .96**

^a Registered in the National Registry of Evidence-Based Programs and Practices
*p<.05, **p<.01
ANOVAs examining differences in costs, quality-of-research scores, and readiness-for-dissemination scores as a function of program characteristics indicated that program costs differed as a function of program tier (F=4.88, df=1 and 55, p<.05). Programs that focused on a single tier (universal, selective, or indicated) had significantly lower first-year costs ($2,626±$3,284) than programs that addressed multiple tiers ($7,247±$11,699). Readiness-for-dissemination scores also differed as a function of tier addressed (F=8.05, df=3 and 54, p<.01). Post hoc analyses indicated that programs addressing only indicated supports yielded the poorest readiness-for-dissemination ratings (1.93±.15), compared with universal interventions (3.48±.55) and interventions targeting multiple tiers (3.45±.59).
ANOVAs did not indicate significant differences in costs, quality-of-research scores, or readiness-for-dissemination scores as a function of program outcome or of the mechanism targeted to address the outcome. Costs did differ as a function of proprietary status (first-year costs, F=4.78, df=1 and 58, p<.05; total costs across two years, F=4.07, df=1 and 58, p<.05). Programs that provided some materials, training, or support at a cost and others for free had significantly lower costs (first year, $2,521±$2,903; total, $3,081±$4,202) than programs that provided all resources at a cost (first year, $6,719±$10,570; total, $10,229±$19,863).

Discussion and conclusions

In summary, mental health treatment and promotion programs that can be delivered in schools accounted for 32% of the NREPP registry, and only 10% of those supported the provision of services across the continuum of prevention and selective and indicated intervention. Nearly all programs required substantial expenses, and developers' estimates generally did not include financial support and infrastructure for staffing and sustaining programs; almost half of the programs (44%) charged fees for all materials, training, and support. Although materials, training, and support cost more for programs that addressed multiple tiers than for programs addressing a single tier, adopters of single-tier programs must also consider the need to purchase additional programs to meet the needs of students in the other tiers. Readiness for dissemination was also low for programs addressing only tertiary or indicated supports in comparison with those addressing multiple tiers.
Consistent with the recent, more general review of NREPP (9), this review, which focused on school-based programs for youths, suggests that navigating evidence-based program registries in support of evidence-based practice is a challenging, multidimensional task. Effective integration of mental health intervention into the school environment often includes training school mental health professionals to work effectively in schools, including developing and maintaining relationships with school administrators, teachers, health staff, and others, and understanding relevant educational regulations and policies (11). Notably, the measurement and development of these relationships in school mental health continues to be explored empirically in relation to interdisciplinary collaboration and resource sharing (12) and quality assessment and improvement (11).
While resource sharing and quality of interventions implemented with manuals are potentially captured by NREPP’s readiness-for-dissemination rating, additional assessment of the resources and infrastructure capacities for delivering these programs should be considered, given that they are essential to successful implementation (13). In addition, considering modular approaches for tertiary interventions may prove useful in advancing understanding of transportability and implementation in school settings (14). Further research is needed to understand organizational qualities (such as infrastructure, financing, implementation support, and emphasis on quality) that influence successful use of such registries and transportability of evidence-based programs and practices into common practice settings.

Acknowledgments and disclosures

The authors report no competing interests.

References

1.
Stephan SH, Weist MD, Kataoka S, et al.: Transformation of children’s mental health services: the role of school mental health. Psychiatric Services 58:1330–1338, 2007
2.
Weist MD, Stephan SH, Lever N, et al.: Quality in school mental health; in Advances in School-Based Mental Health Interventions. Edited by Evans SW, Weist MD, Serpell Z. New York, Civic Research Institute, 2007
3.
New Freedom Commission on Mental Health: Achieving the Promise: Transforming Mental Health Care in America. Final Report. Pub no SMA-03-3832. Rockville, Md, US Department of Health and Human Services, 2003
4.
Mental Health: A Report of the Surgeon General. Rockville, Md, US Department of Health and Human Services, 1999
5.
Katulak NA, Brackett MA, Weissberg RP: School-based social and emotional learning (SEL) programming: current perspectives; in International Handbook of Educational Change, 2nd ed. Edited by Lieberman A, Fullan M, Hargreaves A, et al. New York, Springer, 2008
6.
No Child Left Behind (NCLB) Act of 2001, Pub L No 107-110, §115, Stat 1425, 2002
7.
Langley AK, Nadeem E, Kataoka SH, et al.: Evidence-based mental health programs in schools: barriers and facilitators of successful implementation. School Mental Health 2:105–113, 2010
8.
Forman S, Olin SS, Hoagwood KE, et al.: Evidence-based interventions in schools: developers’ view of implementation barriers and facilitators. School Mental Health 1:26–36, 2009
9.
Hennessy KD, Green-Hennessy S: A review of mental health interventions in SAMHSA’s National Registry of Evidence-Based Programs and Practices. Psychiatric Services 62:303–305, 2011
10.
McGuire LC, Flynn L: The Columbia TeenScreen Program: screening youth for mental illness and suicide. Trends in Evidence-Based Neuropsychiatry 5:56–62, 2003
11.
Stephan SH, Davis ET, Callan B, et al.: Supervision in school mental health; in Helping Others Help Children: Clinical Supervision of Child Psychotherapy. Edited by Neill T. Washington, DC, American Psychological Association, 2006
12.
Weist MD, Lever N, Stephan S, et al.: Formative evaluation of a framework for high quality, evidence-based services in school mental health. School Mental Health 1:196–211, 2009
13.
Mellin EA, Weist MD: Exploring school mental health collaboration in an urban community: a social capital perspective. School Mental Health 3:81–92, 2011
14.
Chorpita BF, Daleiden EL, Weisz JR: Modularity in the design and application of therapeutic interventions. Applied and Preventive Psychology 11:141–156, 2005


Psychiatric Services
Pages: 483–486
PubMed: 23632576

History

Published in print: May 2013
Published online: 15 October 2014

Authors

Melissa George, Ph.D., Leslie Taylor, Ph.D., Sara C. Schmidt, M.A., and Mark D. Weist, Ph.D.

The authors are affiliated with the Department of Psychology, University of South Carolina, 1512 Pendleton St., Columbia, SC 29208 (e-mail: [email protected]).
