
Abstract

Objective:

The aim of this review was to explore what is known about the effectiveness of strategies to increase the use of research in mental health policies.

Methods:

PsycINFO, MEDLINE, PubMed, and EMBASE were searched for peer-reviewed journal articles by using the terms (information dissemination OR knowledge OR diffusion of innovation OR knowledge transfer OR knowledge exchange OR evidence based OR evidence informed) AND (mental health policy OR decision makers). Searches were limited to articles pertaining to humans, written in English, and published from 1995 to 2013. Studies were excluded if they did not include a component related to mental health policy or to mental health decision makers or if they did not describe the development, implementation, or evaluation of an intervention that included a component aimed at increasing use of evidence. Reference lists were scanned to identify additional papers.

Results:

The search returned 2,677 citations. Fifty additional papers were identified via reference lists of relevant articles. Nine separate intervention studies were identified that included a component aimed at increasing use of evidence in mental health policy. All employed at least three strategies to increase evidence use, mostly in regard to implementation of a particular evidence-based policy. Methodologies of the identified studies did not enable estimation of the effectiveness of individual strategies to increase evidence use.

Conclusions:

Little research has examined how to increase the use of evidence in mental health policy. Available research suggests a number of potentially effective strategies for increasing the use of evidence that warrant further examination.
Mental health problems cause great distress for individuals and their loved ones and have major cost implications for the broader community. Although a number of effective treatments have been established for a range of mental health problems, there remains a large gap between the treatment that the evidence suggests is optimal for a given condition and the treatment actually received (1,2). In Australia, it has been estimated that if evidence-based treatments were delivered instead of usual practice across a range of disorders, an additional 15% of the total burden of disease (measured in years lived with disability) would be averted (28% under evidence-based care versus 13% under usual practice), with negligible change in total expenditure on mental health care (3).
This widely acknowledged evidence-practice gap in mental health (1) has come to the attention of governments internationally, motivating pledges to support evidence-informed mental health policy (4–7). The involvement of governments in leading the delivery of evidence-based services is vital because the mental health service system is shaped by incentives and disincentives to deliver particular treatments and services that are included in government policies (1,8). This is evident, for example, in Washington State’s success in expanding the use of evidence-based mental health interventions by diverting funds from the justice system (9) and in the Substance Abuse and Mental Health Services Administration’s efforts to encourage uptake of evidence-based practices (10). Changes in policy can drive mental health service reform efforts (11), and the decisions made by policy makers and other system leaders to endorse or mandate the use of particular evidence-based practices are often the first step in implementation.
For the purposes of this article, “policy” is defined as “a formal statement or action plan developed by a government agency or statutory body in response to an identified problem. This includes regional, state, or national legislation, policies, programs, directives, protocols, guidelines, and service models” (12). Internationally, there are several examples of mental health policy initiatives that are strongly evidence informed. One of the largest is the Improving Access to Psychological Therapies program. Launched in the United Kingdom in 2008, this program aims to substantially increase access to the psychological treatments judged by the National Institute for Health and Care Excellence to be evidence based, primarily cognitive-behavioral therapy for depression and anxiety (13,14). In the United States, the Veterans Health Administration has been a national leader in the implementation and dissemination of evidence-based psychological therapies since the development in 2004 of its Mental Health Strategic Plan. In addition, more than 33 states have initiated partnerships with universities and colleges to drive the implementation of specific evidence-based mental health practices (8,11,15,16). These have tended to focus on evaluation and training, and their impact has not always been maximized because of the comparatively low funds and efforts devoted to knowledge exchange (16). Evidence-based practice uptake by states has also been researcher driven; some treatment development teams (for example, teams developing multisystemic therapy [17]) have been successful in having their program adopted as policy in various locales. Taking a different approach, the Alberta Depression Initiative has designed an evidence-based knowledge transfer and exchange strategy to guide its work (18).
Despite the success of these programs, many believe that overall progress in establishing evidence-informed mental health policy has lagged behind that of evidence-based health policy more generally (19). Indeed, little is known about the factors that facilitate the creation and adoption of evidence-informed mental health policy (11,20), and few studies appear to have investigated the use of evidence in mental health policy development. A recent review of knowledge translation (which was defined by the authors as “a dynamic and iterative process that includes synthesis, dissemination, exchange and ethically sound application of knowledge to improve the health of Canadians, provide more effective health services and products and strengthen the health care system”) in mental health (21) found 187 relevant papers; however, in only eight of these papers were policy makers one of the stakeholder groups identified. Instead, most of the papers focused on service providers, and a significant number involved people with lived experience of mental ill health. In the United States in particular, there has been a clear move toward the adoption of evidence-based practices by many states (11); however, few papers have described how states came to endorse evidence-based practices in general or the particular practices implemented. This suggests that the role of policy and policy makers in mental health knowledge translation has not yet come to the fore as a key concern for the research community.
The theoretical and conceptual understanding of how research is used more broadly in policy development has increased greatly in recent years. Furthermore, in the health and public health field many strategies have been proposed and tested to increase the use of research evidence in policy development. The types of strategies that have been tested to date include training policy makers in critical appraisal skills (22) and providing policy makers with relevant evidence summaries (23). More complex programs have also been studied, such as providing policy makers with access to an online registry of research evidence; providing tailored evidence messaging; engaging the services of a knowledge broker (24); and offering policy agencies a facilitated program of evidence awareness, access to tailored research evidence, critical appraisal skills development, networking, and evidence summaries (25).
Despite the existence of research examining strategies to increase the use of evidence in health policy and the many papers describing how research is used in a policy context, there have been few attempts to develop predictive models, or action frameworks, to organize knowledge and enable a systematic approach to selecting and testing potential intervention strategies (26). Indeed, we were unable to identify any unifying theory of how to increase the use of research evidence in mental health policy. A model from the public health field was developed by the Centre for Informing Policy in Health with Evidence from Research (CIPHER) and is known as the SPIRIT (Supporting Policy In health with Research: an Intervention Trial) action framework (26; Redman S, Turner T, Davies H, et al., unpublished manuscript, 2014). This model was designed to facilitate the development and testing of interventions to improve the use of research evidence in policy and program development. It hypothesizes that a host of factors influences policy at any given time, including public opinion, media, the economic climate, political ideology and priorities, and stakeholder interests; research is viewed as only one of these factors. The SPIRIT action framework further hypothesizes that a catalyst is needed to stimulate the use of research; that responses to this catalyst are determined by the capacity of the organization, including the extent to which the organization values research, the tools and systems for research use that are in place, and the skills of the staff; that research engagement actions are then employed, including accessing and appraising research evidence, generating new research, and interacting with researchers; and that research use subsequently occurs in a variety of ways.
This article explores what is known about the extent to which strategies to increase the use of research in mental health policies are effective. Although there are other action frameworks available, we have elected to use the SPIRIT action framework to categorize the interventions that have been studied. Because research of this nature tends to be multifaceted, use of the framework will enable a clearer assessment of the types of strategies used to date, the areas of the action chain that have been the focus of most attention, and the areas that have not yet been explored.

Methods

Data Sources and Search Strategy

MOOSE (Meta-analysis Of Observational Studies in Epidemiology) guidelines were used to inform the methods of this review (27). PsycINFO, MEDLINE, PubMed, and EMBASE were searched for peer-reviewed journal articles focusing on strategies to increase the use of research in mental health policy. We combined subject headings (MeSH) with text words in order to avoid missing publications that had not been indexed at the time of searching. Search queries were combined by using the Boolean operators AND and OR, with the aim of increasing sensitivity, reducing false positives, and preventing duplicate citations. The search terms used were (information dissemination OR knowledge OR “diffusion of innovation” OR knowledge transfer OR knowledge exchange OR evidence based OR evidence informed OR decision makers) AND (mental health policy OR public mental health policy). Searches were limited to those pertaining to humans, written in English, and published from 1995 to 2013. The reference lists of obtained articles were searched for other relevant articles that may have been missed. Authors were contacted for additional information or to answer queries as required.
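As an illustration only, the query logic described above can be assembled programmatically as in the following sketch. The term lists are taken from the text; the grouping, quoting, and the build_query helper are our own assumptions rather than the authors' actual database syntax (each database has its own field tags and limit filters).

```python
# Illustrative sketch of the Boolean search described above; not the
# authors' actual database syntax.
topic_terms = [
    "information dissemination",
    "knowledge",
    "diffusion of innovation",
    "knowledge transfer",
    "knowledge exchange",
    "evidence based",
    "evidence informed",
    "decision makers",
]
policy_terms = ["mental health policy", "public mental health policy"]

def build_query(topics, policies):
    """OR terms within each concept block, then AND the blocks together."""
    topic_block = " OR ".join(f'"{t}"' for t in topics)
    policy_block = " OR ".join(f'"{p}"' for p in policies)
    return f"({topic_block}) AND ({policy_block})"

# Limits (humans, English, 1995-2013) would be applied separately in each
# database interface rather than in the query string itself.
print(build_query(topic_terms, policy_terms))
```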

Classification

All of the publications located through the search were included in the coding process. One researcher (AW) coded the publications, and another two researchers (SRM and CM) reviewed the coding by using the definitions and exclusions outlined below. Any disagreements were resolved through discussion.

Exclusion Criteria

Publications were excluded if they did not describe the development, implementation, or evaluation of an intervention that included a knowledge exchange component or if mental health decision makers (including policy makers at state, provincial, and national levels, as well as other leaders who are responsible for making large-scale decisions about the adoption or implementation of mental health recommendations in their locale) did not receive at least some part of the intervention. Unpublished articles and published conference abstracts for which an accompanying article could not be located were also excluded. Narrative accounts of interventions that did not employ standard intervention designs were included in the review. When multiple papers derived from a single study or program were identified, information regarding the methods and results of the intervention was collected from each publication and is grouped together in the relevant tables. Articles reporting on studies that included clinicians (either managing or practicing) in their samples were included if the studies also involved decision makers.

Establishment of Domains

Two of the authors (AW and SRM) independently read all of the studies that met the inclusion criteria and mapped the strategies used to increase the use of evidence in mental health policy to the strategies outlined in the SPIRIT action framework. The strategies identified were organized into the following three SPIRIT domains, each with several subdomains: policy influences, including media, public opinion, or stakeholder interests; capacity, which refers to increasing the extent to which the organization and staff value research, increasing the extent to which the organization has the tools and systems needed to support research engagement and use, and increasing the extent to which staff have the skills and knowledge to engage with and use research; and research engagement actions, which involve increasing access to research evidence, increasing skills to appraise research evidence, increasing the generation of new research or analyses by decision makers, and increasing the interaction between decision makers and researchers.
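For concreteness, the coding scheme just described can be expressed as a small lookup table, as in the minimal sketch below. The Python identifiers are our own shorthand for the SPIRIT domains and subdomains, not labels defined by the framework itself.

```python
# Minimal sketch of the SPIRIT-based coding scheme described above.
# Identifier names are our own shorthand, not part of the framework.
SPIRIT_DOMAINS = {
    "policy_influences": [
        "media",
        "public_opinion",
        "stakeholder_interests",
    ],
    "capacity": [
        "organization_and_staff_value_research",
        "tools_and_systems_for_research_use",
        "staff_skills_and_knowledge_to_use_research",
    ],
    "research_engagement_actions": [
        "increase_access_to_research_evidence",
        "increase_skills_to_appraise_research_evidence",
        "increase_generation_of_new_research_or_analyses",
        "increase_interaction_between_decision_makers_and_researchers",
    ],
}

def code_strategy(domain: str, subdomain: str) -> str:
    """Validate a domain/subdomain pair and return a combined code."""
    if subdomain not in SPIRIT_DOMAINS.get(domain, []):
        raise ValueError(f"unknown coding: {domain}/{subdomain}")
    return f"{domain}:{subdomain}"

# Example: coding a training strategy aimed at staff research skills.
print(code_strategy("capacity", "staff_skills_and_knowledge_to_use_research"))
```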

Intervention Studies

Following Moore and colleagues (28), we examined all the intervention studies in terms of the strategies tested, the results, and the methods employed. Information regarding each intervention strategy was categorized as follows: the strategy tested (coded as outlined above); target population (coded as decision makers, including policy makers at state, provincial, and national levels, as well as other leaders who are responsible for making large-scale decisions about the adoption or implementation of mental health recommendations in their locale; clinicians, encapsulating all professionals delivering a mental health–related service, such as psychiatrists, psychologists, social workers, general practitioners, and mental health nurses; consumers; researchers; analysts; foster parents; and the general public); level at which the intervention was administered (coded as group, individual, dyad, both individual and group, and both individual and dyad); policy level (coded as regional, including policies that affect U.S. counties and U.K. National Health Service Health Boards; state, including policies that have an impact on U.S. states and Canadian provinces; and national); number of times administered and frequency administered (coded as once, weekly, monthly, ongoing, or unclear); duration of the intervention (coded as time from baseline to final follow-up); funding source (coded as publicly funded research agency, government policy agency, and independent broker agency, as acknowledged in the publication or on the agency’s Web site); and whether decision makers were involved in the research (coded as at least one person from a policy or broker agency participating in one or more of the following, as judged through authorship or acknowledgments: designing, implementing, or interpreting the findings of an intervention).
Research methods employed in the intervention studies were categorized as follows: research design (coded as case study, cross-sectional, randomized controlled trial, cluster-randomized trial, multicase study, and rolling cohort); sample selection and size (coded as the number of individuals or groups participating in the study and how they were selected); data collection method (coded as one or more of the following: self-report, focus groups, questionnaires, document review, activity log, meeting transcripts, video [recording of sessions to monitor fidelity], or independent assessment); and when outcome measures were collected (coded as time from the conclusion of the intervention to final follow-up: immediately after the intervention, six months after the intervention, or 24 months after the intervention).
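As a sketch of how these categories fit together, one study record could be represented as below. The dataclass, its field names, and the example instance (populated from the values reported for Driedger and colleagues [32] in the tables) are ours, for illustration only.

```python
# Hypothetical record structure mirroring the coding categories described
# above; field names are our own, not the authors'.
from dataclasses import dataclass, field
from typing import List

@dataclass
class InterventionStudy:
    study: str
    design: str                     # e.g., "case study", "randomized controlled trial"
    strategies_tested: List[str]    # SPIRIT-coded strategies
    target_population: List[str]    # e.g., "decision makers", "clinicians"
    level_administered: str         # "group", "individual", "dyad", or combinations
    policy_level: str               # "regional", "state", or "national"
    frequency: str                  # "once", "weekly", "monthly", "ongoing", or "unclear"
    duration: str                   # time from baseline to final follow-up
    funding_source: List[str]
    decision_makers_involved: bool  # policy or broker agency role in the research
    data_collection: List[str] = field(default_factory=list)

# Example instance, populated from the values reported for Driedger et al. (32).
driedger_2010 = InterventionStudy(
    study="Driedger et al., 2010 (32)",
    design="case study",
    strategies_tested=["capacity:staff_skills_and_knowledge_to_use_research"],
    target_population=["decision makers", "analysts"],
    level_administered="both individual and dyad",
    policy_level="regional",
    frequency="ongoing",
    duration="approximately 2 years",
    funding_source=["publicly funded research agency"],
    decision_makers_involved=True,
    data_collection=["self-report", "focus groups", "qualitative interviews"],
)
```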

Results

The original search returned 2,677 citations, and an additional 50 papers were identified by scanning the reference lists of relevant articles. After duplicates and papers that did not meet selection criteria were excluded, 17 papers arising from nine separate intervention studies remained (either protocols, case study descriptions of interventions, or standard reporting of trial outcomes). It is important to note that most studies did not refer to policy makers per se. Therefore, we reviewed studies examining interventions that included mental health decision makers.

Methods of the Intervention Studies

Of the nine intervention studies located, two were randomized controlled trials (29,30), one study employed a rolling cohort design (31), and the remaining six utilized a case study approach (32–37) (Table 1). Most studies represented major pieces of work and involved large groups, such as entire counties or communities (eight to 56 counties) (29,37,38), 18 teams of local authorities and agencies (31), and ten community mental health service teams (35). A small group of nine data analyst–manager dyads was used in one study (32), and three management or service delivery teams within a mental health organization took part in another (36). The scale of several studies was unclear, but the participants involved were mental health organizations (number unspecified) and segments of the Veterans Health Administration (sample size unspecified) (33). The policies or decision makers relevant to the majority of identified studies (five of nine studies) operated at a regional level (29,32,35,36,38). Two studies addressed state-level policies or decision makers (34,37), and two addressed national-level policies or decision makers (31,33).
TABLE 1. Assessment of the quality of studies of interventions to increase the use of research in mental health policies^a

Chamberlain et al., 2008–2012 (30,31,38,48)
Design: Randomized controlled trial
Sample selection and size: 56 counties in California and Ohio
Method: County clusters were randomly assigned to 1 of 2 conditions and 3 timeframes (cohorts 1, 2, and 3). In the individual condition, counties implemented the evidence-based practice (MTFC) individually, in the absence of community development teams. In the community development team condition, groups of 4–8 counties were assembled into teams and helped to engage in peer-to-peer networking, as well as receiving technical assistance and support from consultants.
Data collection methods: Self-report surveys; document review; video data (to monitor fidelity); activity logs (for example, contact logs between the system leaders and study staff, number of foster homes available, number of foster parents recruited); independent assessment (for example, trainer impressions of foster parents or staff)
Timing of data collection: Data collected across the 10 intervention components
Outcome measures: Stages of Implementation Completion measure, which uses multiple data sources to assess completion of multiple stages of implementation; time taken for counties to complete multiple stages of MTFC implementation; time taken to place children in MTFC-based foster care

Glisson et al., 2005–2013 (17,29,43,44,49)
Design: Randomized controlled trial
Sample selection and size: 14 counties from the Appalachian region of Tennessee. To be eligible, a county could not fall within a metropolitan statistical area, had to contain primarily communities with <2,500 residents and no communities with >15,000, had to have a lower per capita income than the state average ($28,641), and had to have a greater proportion of children living in poverty than the state average (17.6%).
Method: 2 × 2 factorial design. The ARC intervention was randomized at the county level; 6 of 14 counties were allocated to receive ARC. Then young people referred to juvenile court for delinquency in all 14 participating counties were randomly assigned to receive a new evidence-based practice (multisystemic therapy) or usual care. ARC involves a series of activities guided by an ARC specialist to create support within the community for services for the target population (in this case, delinquent youths), help service organizations facilitate improvements in service delivery, and develop social networks among key individuals (opinion leaders, stakeholders, and service providers). About 30% of ARC specialist time is spent with community stakeholders. ARC addresses challenges to implementation with 3 strategies: providing organizational tools needed to identify and overcome service barriers (for example, teamwork and goal setting), introducing principles of effective service systems, and enhancing service provider behaviors and attitudes that are conducive to service improvement efforts.
Data collection methods: Therapist logs, caregiver reports
Timing of data collection: Therapist log completed and collected weekly; TAM-R completed monthly; multisystemic therapy audio coding (3 sessions per family from early, mid, and late treatment); CBCL completed at baseline and at 6, 12, and 18 months; number of youths in out-of-home placements identified via monthly phone calls with caregivers and in-depth interviews at baseline and at 6, 12, and 18 months
Outcome measures: Multisystemic therapist logs; TAM-R by phone; multisystemic therapy audio coding adherence system; caregiver report; CBCL; youths in out-of-home placements

Chamberlain et al., 2012 (31)
Design: Rolling cohort
Sample selection and size: 18 of 20 teams of local authorities and agencies in England that won funding to establish MTFC
Method: National implementation team assisted successful grantees in implementing MTFC. This involved providing guidance and assistance at all levels, including setting up multiagency teams, helping to make the systems-level changes needed to implement the program, providing training for staff and foster parents, and providing ongoing supervision and regular audit and feedback sessions.
Data collection methods: Details not provided
Timing of data collection: Annually for each of the 4 years of the grant
Outcome measures: Details not provided

Feinberg et al., 2002 (37)
Design: Cross-sectional
Sample selection and size: 203 community leader participants (for example, from government, human services, law enforcement, and schools) from 21 communities in Pennsylvania
Method: Community leaders received training in prevention science during the 1-year planning period for Communities That Care (CTC) program implementation. Three training sessions were held: key leader orientation, 2 days; risk and resource analysis, 3 days; promising approaches training, 3 days.
Data collection methods: Structured interview with the CTC program director collecting qualitative and quantitative data for each of the 21 local coalitions; then interviews with 9 key leaders from each of the 21 communities
Timing of data collection: Program directors, 2 years after the training intervention; key leaders, 4 years after; no pretraining measures
Outcome measures: Attendance at training (self-reported and from register); comprehensive, structured interview for key leaders and program directors covering the following domains: readiness, attitudes and knowledge, CTC external relations, CTC functioning, CTC efficacy

Driedger et al., 2010 (32)
Design: Case study
Sample selection and size: 9 data analyst–manager dyads, who were part of an earlier mapping project, selected from Ontario Early Years Centres in Ontario, Canada
Method: 2 training sessions for data analysts in the use of EYEMAP mapping software (and other mapping software if required), with ongoing support and advice; 4 training sessions for analyst–manager dyads on how to interpret spatial data and how to apply these data to policy and program development work
Data collection methods: Field notes, e-mail exchanges between the research team and participants, self-report, focus groups, qualitative interviews
Timing of data collection: Pre-, mid-, and postintervention
Outcome measures: Mapping skills assessment; qualitative data (focus groups, interviews, e-mails, field notes)

Luck et al., 2009 (33)
Design: Case study
Sample selection and size: Audience segmentation used to target segments on a national (U.S.) level, including decision makers, clinicians, frontline workers, and consumers; sample size not specified
Method: Social marketing approaches adapted to key segments. For national-level decision makers: research summaries, cost-benefit estimates, brochures, meetings, and teleconferences. For service-level decision makers: implementation plans, training, training materials, clinical evidence summaries, small group meetings, conference invitations, and meetings to discuss customization. For frontline workers and clinicians: evidence summaries, case studies and testimonials, training and procedures, and presentations by practitioners. For potential consumers: presentations by Veterans Service Organization representatives and information from other veterans who had engaged with the program.
Data collection methods: Not specified
Timing of data collection: Details not provided
Outcome measures: Details not provided

McGrath et al., 2009 (34)
Design: Case study
Sample selection and size: Key stakeholders in child mental health in Nova Scotia, Canada; sample size and method of selection not specified
Method: Integrated knowledge translation strategy. Researchers engaged in numerous dissemination strategies in 3 main phases of the project: design, research, and study completion. In the design phase, feedback on program development, design of materials, cost-effectiveness, and service delivery was sought from key stakeholders. In the research phase, key stakeholders were contacted regularly and provided with feedback regarding the trial (costs, progress, and risk management), and community dissemination strategies (such as mall visits, study launches, community exhibits, and TV campaigns) were undertaken to promote Family Help. In the study completion phase, researchers presented study findings at relevant conferences.
Data collection methods: A number of qualitative observations offered, but no details provided on how these data were collected
Timing of data collection: Details not provided
Outcome measures: Details not provided

Stark et al., 2013 (35)
Design: Case study
Sample selection and size: Community mental health service teams from the U.K. National Health Service (NHS) Highland were targeted because of a knowledge exchange partnership between the University of Stirling and NHS Highland; 10 of 13 teams participated; 13 family caregivers and 7 people with dementia were interviewed. Eligibility criteria were not described.
Method: Researchers from the University of Stirling and executives from NHS Highland partnered to facilitate knowledge translation and production of locally relevant dementia research. The steering group included NHS executives, researchers, service staff, and charity representatives. Monthly meetings covered decision making, meeting national dementia targets, and identifying local priorities. A systematic review of the rural dementia service literature was conducted. Observational, case study, and consultation data were collected to identify current practices of service staff and knowledge of policy makers. These data, plus the literature review findings, were used to develop and implement strategies to increase knowledge and skills related to dementia management.
Data collection methods: Data from surveys, observation, consultations, and interviews used to measure preliminary outcomes, practices, and knowledge; data then guided further initiatives; unclear how main outcomes were assessed
Timing of data collection: During and immediately after the 2-year project
Outcome measures: A survey of community mental health services to identify current practice; observation of practice in community, clinic, and ward settings; consultation with service users and caregivers

Ward et al., 2012 (36)
Design: Case study
Sample selection and size: 3 management or service delivery teams in a U.K. mental health organization
Method: A knowledge broker was engaged to help each team formulate a plan to address a particular service delivery or evaluation challenge. The broker’s work addressed 3 components of knowledge exchange: information management, linkage and exchange, and capacity building.
Data collection methods: Qualitative data collected from knowledge broker field notes and from narrative interviews conducted by an independent researcher with members of the teams
Timing of data collection: Knowledge broker field notes collected throughout the trial; narrative interviews conducted at the end of the trial
Outcome measures: Qualitative data; no formal outcome measures used

^a ARC, Availability, Responsiveness and Continuity model; CBCL, Child Behavior Checklist; MTFC, multidimensional treatment foster care; TAM-R, Therapist Adherence Measure–Revised
The data collection methods used for several of the identified interventions were not clearly stated (31,33,34). The remaining studies (17,32,35–38) all employed multiple methods of data collection, reflecting their complexity. The data collection methods included logs (of field work, of contact between relevant parties, and of implementation or intervention components completed and therapist logs); review of documents, transcripts, videos, and e-mail exchanges; qualitative interviews; and self-report or caregiver report surveys. The outcome measures used in several of the interventions were not specified (31,33,34,36). The two largest-scale interventions used a mixture of standardized and bespoke outcome measures. For example, Chamberlain and colleagues (39) used the Stages of Implementation Completion measure, a comprehensive tool that utilizes information from a variety of sources to track the completion of each stage of implementation. They also monitored the time taken to complete each stage and the time taken to place children in a foster home that used multidimensional treatment foster care (MTFC) (38). Glisson and colleagues (29,40) used the Therapist Adherence Measure–Revised (41) and the Child Behavior Checklist (42), as well as multisystemic therapy audio coding and monitoring of the number of youths in out-of-home placements. Other outcome measures used included mapping skills assessment (32), surveys of community mental health services to identify current practices (35), and observation of clinical practice in community, clinic, and ward settings (35).

Strategies to Increase the Use of Evidence

An overview of the nine intervention studies that met the criteria of this review, along with the strategies they tested that aligned with those in the SPIRIT action framework, is presented in Table 2. A wide range of strategies was described in the identified intervention studies, and all the studies employed multiple strategies. There was a high degree of overlap in the strategies tested in various studies. Seven of the nine studies examined the dissemination of particular evidence-based practices to decision makers, as opposed to examining methods of increasing the use of evidence by decision makers per se.
TABLE 2. Overview of studies of interventions to increase the use of research in mental health policies

Chamberlain et al., 2008–2012 (30,31,38,48)
Strategies tested: Stimulating better communication and relationships among stakeholders; increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing the extent to which the organization has the tools and systems needed to support research engagement and use; increasing access to research evidence; increasing the interaction between decision makers and researchers
Target group and policy level: Decision makers, clinicians, frontline workers, and consumers at the regional level
Level administered: Individual and group
Frequency of intervention: Ongoing
Duration of intervention: Not stated
Decision maker or broker involved: Decision makers
Funding source: Government policy agencies
Results: Trial still in progress

Chamberlain et al., 2012 (31)
Strategies tested: Stimulating better communication and relationships among stakeholders; increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing the extent to which the organization has the tools and systems needed to support research engagement and use; increasing the interaction between policy makers and researchers
Target group and policy level: Decision makers, frontline staff, and foster parents at the national level
Level administered: Individual and group
Frequency of intervention: Details not provided
Duration of intervention: 4 years
Decision maker or broker involved: Decision makers
Funding source: Government policy agency
Results: No data on formal study outcomes were provided. However, several sites received grants and established multidimensional treatment foster care, and positive system changes reportedly occurred.

Driedger et al., 2010 (32)
Strategies tested: Increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing the generation of new research or analyses by policy makers
Target group and policy level: Decision makers and analysts at the regional level
Level administered: Both individual and dyad
Frequency of intervention: Ongoing
Duration of intervention: Approximately 2 years
Decision maker or broker involved: Decision makers
Funding source: Publicly funded research agency
Results: Most analysts exhibited increased mapping skills. Qualitative data indicated some increase in the use of maps to support decision making.

Feinberg et al., 2002 (37)
Strategies tested: Increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing the extent to which the organization has the tools and systems needed to support research engagement and use; increasing access to research evidence; increasing the generation of new research or analyses by decision makers; increasing the interaction between decision makers and researchers
Target group and policy level: Decision makers at the state level
Level administered: Group
Frequency of intervention: Ongoing
Duration of intervention: 12 months
Decision maker or broker involved: Decision makers
Funding source: Government agency
Results: Key leader training was associated with leaders’ more positive perception of the internal and external functioning of the coalition. Some evidence was found that the intervention may improve individual attitudes and knowledge.

Glisson et al., 2005–2013 (17,29,43,44,49)
Strategies tested: Stimulating community support; stimulating better communication and relationships among stakeholders; increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing the extent to which the organization has the tools and systems needed to support research engagement and use
Target group and policy level: Decision makers, clinicians, frontline workers, and consumers at the regional level
Level administered: Individual and group
Frequency of intervention: Ongoing
Duration of intervention: Approximately 40 months
Decision maker or broker involved: Decision makers
Funding source: Publicly funded research agency
Results: At 6-month follow-up, youths’ total problem behavior scores were significantly lower in the group receiving multisystemic therapy plus the Availability, Responsiveness and Continuity (ARC) intervention, compared with youths in the other conditions. No difference was found between groups at 18-month follow-up in terms of problem behavior, but youths in the group receiving multisystemic therapy plus ARC were significantly less likely to have entered out-of-home placements.

Luck et al., 2009 (33)
Strategies tested: Stimulating community support; stimulating better communication and relationships among stakeholders; increasing the extent to which the organization and staff value research evidence; increasing access to research evidence; increasing the interaction between decision makers and researchers
Target group and policy level: Decision makers, clinicians, frontline workers, researchers, and consumers at the national level
Level administered: Group
Frequency of intervention: Details not provided
Duration of intervention: Details not provided
Decision maker or broker involved: Decision makers
Funding source: Government policy agency
Results: The depression collaborative care model promoted was adopted by the Veterans Health Administration as part of the new priority health initiative and associated policies.

McGrath et al., 2009 (34)
Strategies tested: Stimulating community support; stimulating better communication and relationships among stakeholders; increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing access to research evidence; increasing the interaction between decision makers and researchers
Target group and policy level: Decision makers, clinicians, frontline workers, researchers, and the general public at the state level
Level administered: Group and individual
Frequency of intervention: Ongoing
Duration of intervention: Approximately 7 years
Decision maker or broker involved: Decision makers
Funding source: Government policy agency, publicly funded research agency
Results: Outcome data were not reported. Anecdotally, it was reported that Family Help obtained wide acceptance from users and had an influx of referrals. Funding and services for Family Help expanded to additional districts, and there was increased interest in the program from other provinces.

Stark et al., 2013 (35)
Strategies tested: Increasing the extent to which the organization and staff value research evidence; increasing the extent to which staff have the knowledge and skills to use evidence; increasing access to research evidence; increasing the interaction between decision makers and researchers
Target group and policy level: Decision makers, clinicians, frontline workers, and researchers at the regional level
Level administered: Group
Frequency of intervention: Details not provided
Duration of intervention: 2 years
Decision maker or broker involved: Decision makers
Funding source: Government policy agency
Results: Outcome data were not reported. Anecdotal findings indicated positive policy changes, including increased prioritization of dementia care through development of diagnostic clinics, commissioning of a dementia training strategy, implementation of national care standards and diagnostic standards across all operational areas, and a recommendation that all community mental health teams implement an agreed-upon, integrated care pathway. Funding was also provided to dementia charities to support patients and families during early diagnosis.

Ward et al., 2012 (36)
Strategies tested: Increasing the extent to which the organization and staff value research; increasing the extent to which staff have the skills and knowledge to engage with and use research; increasing access to research evidence
Target group and policy level: Decision makers and clinicians at the regional level
Level administered: Group
Frequency of intervention: Ongoing
Duration of intervention: 10–15 months
Decision maker or broker involved: Decision makers and brokers
Funding source: Publicly funded research agency
Results: Use of all 5 components of knowledge exchange was found to increase over the study period. Components were not discrete and often co-occurred.

Policy influences.

Three studies (17,33,34) focused on policy influences as a key strategy for increasing the use of evidence in mental health policy. Broadly, these strategies involved the mobilization of community support for the adoption or implementation of a particular evidence-based practice. One of these studies evaluated the impact of social marketing strategies on the adoption of evidence-based practice in the Veterans Health Administration’s Quality Enhancement Research Initiative (33). The authors applied a social marketing approach designed to gain the support of the audience segments relevant to the Veterans Health Administration (for example, national and regional leadership, facility managers, frontline providers, and veterans) for the national adoption of a collaborative care model for depression. Various groups were provided with the information hypothesized as most likely to drive their behavior change. For example, leaders were given information about cost and quality impact, and frontline workers were given information about the impact of the depression care program on health and practitioner workload.
Stimulating better communication and relationships among stakeholders was a strategy employed by five of the intervention studies (17,31,33,34,38). For example, Chamberlain and colleagues (30,31,38) tested the impact of community development teams, which, among other activities, worked to assist counties to develop the peer networks needed to facilitate the multiagency collaborations that were vital to the successful countywide implementation of the MTFC program.

Capacity.

Increasing the extent to which the organization and staff value research evidence was not an explicit goal of the intervention studies identified but was nonetheless an indirect element of all of the studies considered (17,31–38). Attempts to increase the extent to which research evidence is valued generally took the form of efforts to demonstrate the strength and scope of the evidence for the evidence-based practice that was under consideration. Glisson and colleagues (17,29,40,43,44), for example, created and tested a multifaceted organizational and community-level intervention known as Availability, Responsiveness and Continuity (ARC), which was designed in part to support a shift toward a culture of evidence-based practice. The ARC model is cognizant of the social context of agencies (including the service providers, organization, and broader community). Specifically, this approach recognizes that the agency context—and its fit with the objectives of any new practice—plays a vital role in determining whether and how well a new practice is implemented within an organization.
Increasing the extent to which organizations have the tools and systems needed to support research engagement and use was an element of five of the interventions identified (17,31,32,37,38). The tools and systems involved were generally those required to implement an evidence-based program. In the case of Driedger and colleagues (32), however, the intervention program involved supplying Ontario Early Years Centres with Geographical Information System software. Data analyst–policy maker dyads were then trained in the use and interpretation of Geographical Information System data and in how to use these data to inform policy and planning decisions.
Increasing the extent to which staff have the skills and knowledge to engage with and use research was a strategy employed by all but one of the studies considered (17,31,32,34–38). Direct training in research use skills, such as that described in the study by Driedger and colleagues (32), was rare. Ward and colleagues (36), however, provided three management or service delivery teams with a knowledge broker to help them devise a plan to address a particular service delivery or evaluation challenge. The knowledge broker aimed to provide help and advice centered on information management, linkage, and exchange and to enhance participants’ capacity to participate in knowledge exchange as they worked through their problem. Most programs applied training to help participants use the research relevant to a particular evidence-based practice that they were seeking to have adopted as policy (for example, Stark and colleagues [35]) or implemented (for example, Glisson and colleagues [29] and Chamberlain and colleagues [38]). Feinberg and colleagues (37) went further, providing “key leaders” from 21 communities in Pennsylvania with three multiday training sessions. These included the provision of general information and skills related to program implementation, monitoring, and evaluation, as well as information specifically relevant to the Communities That Care program that was about to be rolled out.

Research engagement actions.

Increasing access to research evidence was a strategy employed by seven studies (31,33–38). Studies sought to improve access to evidence related to a particular evidence-based treatment, rather than to improve skills in accessing research in general. Increasing the generation of new research or analyses by decision makers was a strategy employed in the study by Driedger and colleagues (32), in which data analysts and managers were taught to use Geographical Information System software to analyze local data to inform policy and program planning. The study by Feinberg and colleagues (37) also included an element of this strategy, providing training in monitoring or evaluation. Increasing the interaction between decision makers and researchers was a component of seven studies (17,31,33–35,37,38), again primarily related to implementing a particular evidence-based practice.

Impact of Strategies to Increase the Use of Evidence

All of the studies identified used multiple strategies to increase the use of evidence in mental health decision making and did not report on them separately. Therefore, it is impossible to highlight any one strategy as being effective. Further, most of the studies identified either did not report outcome data (34–36) or are still in progress (30,31,38). Luck and colleagues (33) did not present formal outcome data; however, they achieved the aim of their social marketing intervention in having the TIDES (Translating Initiatives for Depression into Effective Solutions) collaborative care model adopted by the Veterans Health Administration. Driedger and colleagues (32) reported increased mapping skills among most analysts and some increase in the use of maps to support decision making. As discussed below, however, the design used in this study limited the strength of the conclusions that can be drawn.
Glisson and colleagues (29) conducted the largest and most complete intervention study involving decision makers at a county level. They found that young people receiving multisystemic therapy in counties that also received the ARC intervention moved into the nonclinical range of problem behaviors more quickly than young people who received the same therapy in non-ARC counties. Young people in the ARC counties were also significantly less likely to be in out-of-home placements at the 18-month follow-up. These findings suggest that the multifaceted ARC approach, which includes focus on the social context of organizations and the social process of adopting innovations (that link to stimulating community support, stimulating better communication and relationships among stakeholders, increasing the extent to which the organization and staff value research evidence, increasing the extent to which staff have the knowledge and skills to use evidence, and increasing the extent to which organizations have the tools and systems needed to support research engagement and use) may be of significant benefit in enhancing the use of evidence and evidence-based practices.
Feinberg and colleagues (37) found modest evidence that providing “key leader” training in prevention science prior to implementation of Communities That Care in Pennsylvania was associated with leaders’ expression of more positive perceptions of the internal and external functioning of their coalition. There was also some evidence of an impact of training on individual knowledge and attitudes, but no evidence was found of an impact on perceived efficacy of Communities That Care. This study was limited by the fact that outcome data were collected two to four years after training and by the researchers’ reliance on retrospective self-report data on community readiness. Nevertheless, the study provides some encouraging insights into the impact of wide-ranging preimplementation training on the functioning of coalitions.

Discussion

Even though the use of evidence in the development of mental health policy and programs is becoming increasingly important to governments and decision makers internationally (4–7), the results of this study suggest that only a small number of researchers in the mental health field have begun to systematically investigate strategies that may increase the use of evidence by decision makers, particularly in regard to mental health policy and planning. Nine intervention studies that touched on these issues were identified in our search. Most employed case study designs, and many did not specify outcome measures—or indeed the outcomes of the research—reflecting the exploratory nature of much of the work to date. Although a range of strategies to increase the use of evidence in mental health decision making was noted, it is not yet possible to draw strong conclusions about their efficacy. These approaches differ markedly from the knowledge translation strategies that have been adopted to increase the use of research evidence in public health policy (28,45,46).
Few studies that tested the efficacy of interventions to increase the use of evidence by decision makers were identified in this review. However, 55 papers that touched on the use of evidence in mental health policy were found, suggesting a growing interest in this area in the mental health community. Many of the articles that we identified were conceptual, whereas others provided descriptive accounts of evidence use in decision making. Some described the implementation of major policy changes in mental health (for example, the introduction of prescribing algorithms in Michigan state mental health policy [47]), but the reports did not address how or why evidence came to be used. Other papers, such as those documenting the implementation of the Improving Access to Psychological Therapies program in the United Kingdom (13,14), provided glimpses of how major evidence-based initiatives came into being; however, these reports were ineligible for inclusion in this review because they did not describe interventions that actively involved decision makers.
There is a growing body of literature focused on the implementation of evidence-based mental health programs and treatments (including seven of the nine interventions that we identified), but this interest has tended to be focused on what may be considered the second stage of implementation—getting clinicians to adopt a practice. Thus, as Wang and colleagues (48) pointed out, little evidence currently exists in regard to the factors that influence decision makers’ uptake of evidence-based practices, arguably the first stage of implementation. It is worth noting that most of the implementation interventions identified in this review involved collaborations between service delivery organizations or government departments that typically operate in silos (for example, multisystemic therapy, which requires multiagency teams to work together [38]). It is perhaps because of the extra layers of complexity inherent in engaging leaders from disparate areas that these study teams have taken the time to specify the strategies they used to help decision makers take up and implement evidence-based practices. Furthermore, it should be noted that much of the work of mental health decision makers is far broader in scope than decisions in regard to the implementation of particular therapies. The results of this review suggest that as yet few empirical attempts have been made to increase mental health decision makers’ capacity to engage with research evidence. The SPIRIT action framework (26; Redman S, Turner T, Davies H, et al., unpublished manuscript, 2014) may provide a useful starting point for researchers interested in pursuing this line of inquiry.
All of the interventions identified in this review employed multiple strategies outlined in the SPIRIT action framework to increase the use of evidence in mental health decision making. Most included an element focused on manipulating the external factors that influence policy decisions. Two focused on stimulating community support (33,44). Increasing communication and relationships among stakeholders was a strategy employed in most identified studies (17,31,33,34,38). Strategies to improve communication included establishing and facilitating steering committees and peer-to-peer networking and setting up multiagency teams. Improved communication was a key strategy that facilitated the adoption of evidence-based practices by decision makers, particularly in the interventions described by Chamberlain and colleagues (30,31,38) and Glisson and colleagues (17,29,40,43,44,49). As discussed above, implementation of evidence-based practices in these studies required active cooperation between various government departments and service agencies—and in the case of Chamberlain and colleagues (30,31,38), across teams of counties.
Increasing the extent to which staff value research evidence and increasing access to research evidence were strategies employed by most of the interventions identified. In contrast to the health–public health literature in which strategies in these arenas have included activities such as providing access to online repositories of systematic reviews and summaries of research evidence targeted to decision makers’ needs (23,24), the strategies identified in the mental health field were predominantly geared toward educating decision makers about the evidence for a particular evidence-based treatment. This is consistent with the noted focus in the mental health literature on implementing specific evidence-based practices, rather than on attempting to create a culture of greater evidence use among mental health decision makers more generally.
Increasing staff skills and knowledge was another commonly employed strategy. In most of the studies identified, this strategy focused on increasing skills and knowledge in regard to the implementation of a particular evidence-based practice. One study, however, aimed at developing general skills in the use of research (that is, using or interpreting spatial data) (32). This approach is more in line with public health research in this area, where improving decision makers’ critical appraisal skills in particular has been a commonly tested strategy (22,25). The work of Feinberg and colleagues (37) was also notable for including some general monitoring and evaluation training.
Most of the above-mentioned studies described multifaceted interventions to increase the use of evidence in mental health policy. The studies by Driedger and colleagues (32) and Luck and colleagues (33), however, tested the efficacy of a specific strategy on evidence use. The explanatory power of the study by Driedger and colleagues is limited by its case study design; however, it nonetheless provided modest evidence of an increase in spatial mapping skills among data analysts and of some increase in the use of maps to support decision making. The study by Luck and colleagues did not report formal outcome measures but provided some support for the effectiveness of a social marketing approach in having an evidence-based collaborative care model adopted by the Veterans Health Administration. This approach was geared to the relevant segments of the audience and encompassed a range of strategies, including stimulating community support, improving communication among stakeholders, increasing the extent to which research is valued, and increasing access to research evidence.
Two studies were large-scale randomized controlled trials designed to test different methods of implementing complex evidence-based treatments. Glisson and colleagues (17,29,43,44) tested the effectiveness of the ARC organizational intervention in enhancing the implementation of multisystemic therapy. They provided good evidence that the ARC’s focus on addressing the social context of agencies was beneficial in enhancing the effectiveness with which multisystemic therapy was implemented. The ARC intervention shows great promise as a model that could be applied to enhance the implementation of particular treatments. It is likely that it could also be effective in moving organizations toward a culture of evidence use more generally, although this has not yet been explored.
The community development team model developed by Chamberlain and colleagues (30,31,38,39,48) is currently being rigorously tested in a large randomized trial. The results of this study are likely to provide important new knowledge and directions for those seeking to improve the implementation of evidence-based practices, particularly in a cross-disciplinary, cross-jurisdictional setting. The development of the Stages of Implementation Completion measure (39) is also likely to advance knowledge and practice in implementation more broadly.
The following caveats should be noted in regard to this review. First, only articles published in English between 1995 and 2013 were within the scope of the review, and it is possible that relevant work was published before or after this period or in another language. Second, although we attempted to make our search criteria broad and also scanned the reference lists of relevant articles, we may have missed some relevant research, especially articles published in the gray literature. In particular, it is likely that some mental health intervention studies that included a component related to increasing evidence use among policy makers are not discoverable because this aspect of the study was not central enough to include in the keywords. This reflects the fact that increasing the use of evidence in mental health policy is still an emerging area of inquiry.

Conclusions

Few empirical studies designed to increase the use of research evidence in mental health policy were identified, and most of those we found focused on implementation. The major evidence-based practices and treatments that have been written into mental health policy by governments internationally suggest a growing appetite for evidence; however, a major gap between evidence and practice remains. More systematic, scientifically rigorous attempts to increase the use of evidence in mental health policy—or even to document the process by which evidence is taken up in policy development—are needed if substantial, lasting change is to be made. Encouraging mental health decision makers to implement specific evidence-based practices is important work and has clearly already begun. However, if major, overarching changes are to be made to the mental health system, we must go further. Developing and testing strategies to improve mental health decision makers’ capacity to use research evidence in policy making, and indeed their attitudes toward research in general, are vital next steps. The SPIRIT action framework provided a useful method for categorizing the strategies used in the identified interventions and may help guide future work in developing and implementing strategies to increase the use of research evidence in mental health policy.

References

1.
Goldman HH, Ganju V, Drake RE, et al: Policy implications for implementing evidence-based practices. Psychiatric Services 52:1591–1597, 2001
2.
Rosenheck RA: Organizational process: a missing link between research and practice. Psychiatric Services 52:1607–1612, 2001
3.
Andrews G, Issakidis C, Sanderson K, et al: Utilising survey data to inform public policy: comparison of the cost-effectiveness of treatment of ten mental disorders. British Journal of Psychiatry 184:526–533, 2004
4.
Out of the Shadows at Last: Transforming Mental Health, Mental Illness and Addiction Services in Canada. Final report of the Standing Committee on Social Affairs, Science and Technology. Ottawa, Ontario, Senate of Canada, 2006
5.
Satcher D: From the Surgeon General. Global mental health: its time has come. JAMA 285:1697, 2001
6.
Achieving the Promise: Transforming Mental Health Care in America. Pub no SMA-03-3832. Rockville, Md, Department of Health and Human Services, President’s New Freedom Commission on Mental Health, 2003
7.
National Service Framework for Mental Health: Modern Standards and Service Models. London, Department of Health, 1999
8.
Rapp CA, Goscha RJ, Carlson LS: Evidence-based practice implementation in Kansas. Community Mental Health Journal 46:461–465, 2010
9.
Outcome Evaluation of Washington State's Research-Based Programs for Juvenile Offenders. Olympia, Washington State Institute for Public Policy, 2004
10.
Identifying and Selecting Evidence-Based Interventions: Revised Guidance Document for Strategic Prevention Framework State Incentive Grant Program. Rockville, Md, Substance Abuse and Mental Health Services Administration, Center for Substance Abuse Treatment, 2009
11.
Bruns EJ, Hoagwood KE: State implementation of evidence-based practice for youths: part I: responses to the state of the evidence. Journal of the American Academy of Child and Adolescent Psychiatry 47:369–373, 2008
12.
Haynes A, Turner T, Redman S, et al: Developing definitions for a knowledge exchange intervention in health policy and program agencies: reflections on process and value. International Journal of Social Research Methodology (Epub ahead of print, June 6, 2014)
13.
Clark DM, Layard R, Smithies R, et al: Improving access to psychological therapy: initial evaluation of two UK demonstration sites. Behaviour Research and Therapy 47:910–920, 2009
14.
Radhakrishnan M, Hammond G, Jones PB, et al: Cost of Improving Access to Psychological Therapies (IAPT) programme: an analysis of cost of session, treatment and recovery in selected Primary Care Trusts in the East of England region. Behaviour Research and Therapy 51:37–45, 2013
15.
Bruns EJ, Hoagwood KE, Rivard JC, et al: State implementation of evidence-based practice for youths: part II: recommendations for research and policy. Journal of the American Academy of Child and Adolescent Psychiatry 47:499–504, 2008
16.
State Mental Health Agencies (SMHAs) Are Making Substantial Progress Towards Achieving the Major Goals of the Commission. Alexandria, Va, National Association of State Mental Health Program Directors Research Institute, 2006
17.
Glisson C, Schoenwald SK, Hemmelgarn A, et al: Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology 78:537–550, 2010
18.
Mitton C, Adair CE, McKenzie E, et al: Designing a knowledge transfer and exchange strategy for the Alberta Depression Initiative: contributions of qualitative research with key stakeholders. International Journal of Mental Health Systems 3:11, 2009
19.
Levesque P, Davidson S, Kidder K: Knowledge exchange for Attention Deficit Hyperactivity Disorder Research: an integrated evidence and knowledge exchange framework leading to more effective research dissemination practices. Journal of the Canadian Academy of Child and Adolescent Psychiatry 16:51–56, 2007
20.
Tanenbaum SJ: Evidence-based practice as mental health policy: three controversies and a caveat. Health Affairs 24:163–173, 2005
21.
Goldner EM, Jeffries V, Bilsker D, et al: Knowledge translation in mental health: a scoping review. Healthcare Policy 7:83–98, 2011
22.
Taylor RS, Reeves BC, Ewings PE, et al: Critical appraisal skills training for health care professionals: a randomized controlled trial [ISRCTN46272378]. BMC Medical Education 4:30, 2004
23.
Dobbins M, Cockerill R, Barnsley J, et al: Factors of the innovation, organization, environment, and individual that predict the influence five systematic reviews had on public health decisions. International Journal of Technology Assessment in Health Care 17:467–478, 2001
24.
Dobbins M, Hanna SE, Ciliska D, et al: A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implementation Science 4:61, 2009
25.
Waters E, Armstrong R, Swinburn B, et al: An exploratory cluster randomised controlled trial of knowledge translation strategies to support evidence-informed decision-making in local governments (the KT4LG study). BMC Public Health 11:34, 2011
26.
CIPHER Investigators: Supporting Policy In health with Research: an Intervention Trial (SPIRIT): protocol for a stepped wedge trial. BMJ Open 4:e005293, 2014
27.
Stroup DF, Berlin JA, Morton SC, et al: Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA 283:2008–2012, 2000
28.
Moore G, Redman S, Haines M, et al: What works to increase the use of research in population health policy and programmes: a review. Evidence and Policy 7:277–305, 2011
29.
Glisson C, Hemmelgarn A, Green P, et al: Randomized trial of the Availability, Responsiveness and Continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. Journal of the American Academy of Child and Adolescent Psychiatry 52:493–500, 2013
30.
Saldana L, Chamberlain P: Supporting implementation: the role of community development teams to build infrastructure. American Journal of Community Psychology 50:334–346, 2012
31.
Chamberlain P, Roberts R, Jones H, et al: Three collaborative models for scaling up evidence-based practices. Administration and Policy in Mental Health and Mental Health Services Research 39:278–290, 2012
32.
Driedger SM, Kothari A, Graham ID, et al: If you build it, they still may not come: outcomes and process of implementing a community-based integrated knowledge translation mapping innovation. Implementation Science 5:47, 2010
33.
Luck J, Hagigi F, Parker LE, et al: A social marketing approach to implementing evidence-based practice in VHA QUERI: the TIDES depression collaborative care model. Implementation Science 4:64, 2009
34.
McGrath PJ, Lingley-Pottie P, Emberly DJ, et al: Integrated knowledge translation in mental health: family help as an example. Journal of the Canadian Academy of Child and Adolescent Psychiatry 18:30–37, 2009
35.
Stark C, Innes A, Szymczynska P, et al: Dementia knowledge transfer project in a rural area. Rural and Remote Health 13:2060, 2013
36.
Ward V, Smith S, House A, et al: Exploring knowledge exchange: a useful framework for practice and policy. Social Science and Medicine 74:297–304, 2012
37.
Feinberg ME, Greenberg MT, Osgood DW, et al: The effects of training community leaders in prevention science: Communities That Care in Pennsylvania. Evaluation and Program Planning 25:245–259, 2002
38.
Chamberlain P, Brown CH, Saldana L, et al: Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration and Policy in Mental Health and Mental Health Services Research 35:250–260, 2008
39.
Chamberlain P, Brown CH, Saldana L: Observational measure of implementation progress in community based settings: the Stages of Implementation Completion (SIC). Implementation Science 6:116, 2011
40.
Glisson C, Green P, Williams NJ: Assessing the Organizational Social Context (OSC) of child welfare systems: implications for research and practice. Child Abuse and Neglect 36:621–632, 2012
41.
Henggeler SW, Borduin CM, Schoenwald SK, et al: Multisystemic Therapy Adherence Scale–Revised (TAM-R). Charleston, Medical University of South Carolina, Department of Psychiatry and Behavioral Sciences, 2006
42.
Achenbach TM: Manual for the Child Behavior Checklist/4-18 and 1991 Profile. Burlington, University of Vermont, Department of Psychiatry, 1991
43.
Glisson C, Landsverk J, Schoenwald S, et al: Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Administration and Policy in Mental Health and Mental Health Services Research 35:98–113, 2008
44.
Glisson C, Schoenwald SK: The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research 7:243–259, 2005
45.
LaRocca R, Yost J, Dobbins M, et al: The effectiveness of knowledge translation strategies used in public health: a systematic review. BMC Public Health 12:751, 2012
46.
El-Jardali F, Lavis J, Moat K, et al: Capturing lessons learned from evidence-to-policy initiatives through structured reflection. Health Research Policy and Systems 12:2, 2014
47.
Milner KK, Healy D, Barry KL, et al: Implementation of computerized medication prescribing algorithms in a community mental health system. Psychiatric Services 60:1010–1012, 2009
48.
Wang W, Saldana L, Brown CH, et al: Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial. Implementation Science 5:72, 2010
49.
Glisson C, Schoenwald SK, Kelleher K, et al: Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Administration and Policy in Mental Health and Mental Health Services Research 35:124–133, 2008

Information & Authors

Published In

Psychiatric Services
Pages: 783–797
PubMed: 25828881

History

Received: 23 July 2014
Revision received: 25 November 2014
Accepted: 22 December 2014
Published online: 31 March 2015
Published in print: August 01, 2015

Authors

Anna Williamson, B.Psych., Ph.D.
Steve R. Makkar, B.Psych., Ph.D.
Catherine McGrath, LL.B., M.Phil.
Sally Redman, B.A.(Psych.), Ph.D.

The authors are with the Sax Institute, Sydney, New South Wales, Australia (e-mail: [email protected]). Dr. Williamson is also with the Department of Public Health and Community Medicine, University of New South Wales, Sydney, New South Wales.

Funding Information

This work was funded by grant APP1001436 from the Centre for Informing Policy in Health with Evidence from Research (CIPHER), an Australian National Health and Medical Research Council (NHMRC) Centre for Research Excellence that is administered by the University of Western Sydney. Dr. Williamson holds an NHMRC postdoctoral fellowship (510 391). The authors report no financial relationships with commercial interests.
