
Effect of Pediatric Behavioral Health Screening and Colocated Services on Ambulatory and Inpatient Utilization

Abstract

Objective:

The study sought to determine the impact of a pediatric behavioral health screening and colocation model on utilization of behavioral health care.

Methods:

In 2003, Cambridge Health Alliance, a Massachusetts public health system, introduced behavioral health screening and colocation of social workers sequentially within its pediatric practices. An interrupted time-series study was conducted to determine the impact on behavioral health care utilization in the 30 months after model implementation compared with the 18 months prior. Specifically, the change in trends of ambulatory, emergency, and inpatient behavioral health utilization was examined. Utilization data for 11,223 children ages ≥4 years 9 months to <18 years 3 months seen from 2003 to 2008 contributed to the study.

Results:

In the 30 months after implementation of pediatric behavioral health screening and colocation, there was a 20.4% cumulative increase in specialty behavioral health visit rates (trend of .013% per month, p=.049) and a 67.7% cumulative increase in behavioral health primary care visit rates (trend of .019% per month, p<.001) compared with the expected rates predicted by the 18-month preintervention trend. In addition, behavioral health emergency department visit rates increased 245% compared with the expected rate (trend .01% per month, p=.002).

Conclusions:

After the implementation of a behavioral health screening and colocation model, more children received behavioral health treatment. Contrary to expectations, behavioral health emergency department visits also increased. Further study is needed to determine whether this is an effect of how care was organized for children newly engaged in behavioral health care or a reflection of secular trends in behavioral health utilization or both.
Child behavioral health issues arise frequently in primary care settings (1). Screening for these issues is promoted in national guidelines as a strategy for early identification and treatment of behavioral health conditions (1,2). These recommendations also describe a variety of possible mechanisms for increasing the capacity of primary care to respond to these issues, including task shifting, behavioral health screening, collaborative care, and colocation of behavioral health and general medical services. Trials of various integrated models (screening, colocation, and collaboration) demonstrate improved mental health outcomes (3–5), provider satisfaction, and identification rates.
In studies of screening alone, findings suggest that identification rates for behavioral health issues increase (6), as do mental health referrals (7,8). Unfortunately, referral completion rates remain low, with studies reporting rates ranging from 17% to 45% (7–10). In a recent study we conducted using Medicaid claims data, only 30% of newly identified children utilized behavioral health services (11). However, some studies have noted higher behavioral health initiation rates (>80%) among adults when colocated or collaborative care models were used (12–14). Colocation of behavioral health services with general medical services reduces the stigma associated with seeking behavioral health care and reduces logistical barriers for patients and for those involved in specialist–primary care collaboration (15). Although studies have examined the impact of colocation on behavioral health care initiation and clinical outcomes, few have examined the impact on primary care–related behavioral health visits, inpatient services, or emergency department mental health services. These are important areas to explore because models for financing integrated care frequently call for “offsets” or “shared savings” attributable to reductions in inpatient or urgent care, to “masked” mental health presentations, or to shifting of specialty services to lower-cost settings, such as primary care.
The goal of the study was to understand how a child behavioral health screening and colocation program affects health care utilization, including primary care, specialty behavioral health care, and behavioral health–related emergency and inpatient service utilization.

Methods

Conceptual Framework

Our underlying conceptual framework assumed that as screening increases, patients would be identified and enter either specialty behavioral health services or behavioral health services in primary care settings. This would result in increased ambulatory behavioral health services. If early mental health treatment proves efficacious, inappropriate emergency department and inpatient behavioral health admissions should decrease over time. To test our assumptions, we used data from the Cambridge Health Alliance (CHA) pediatric clinics to capitalize on a natural experiment. CHA clinics phased in the use of a validated screening tool during well-child visits from 2004 to 2007. Using data from the CHA data warehouse (16,17), we conducted an interrupted time-series (ITS) analysis of utilization rates in the months pre- and postimplementation of the behavioral health screening and colocation program among a rolling cohort of primary care pediatric patients receiving care. The CHA Institutional Review Board approved the study in 2011.

Context

CHA is a public hospital and clinic network in Cambridge, Massachusetts. At the time of this study, CHA operated three acute care hospitals with three emergency departments, two child mental health units, an inpatient pediatric medical unit, and multiple ambulatory health clinics, as well as a large division of child and adolescent psychiatry. All ambulatory clinics used the same inpatient and outpatient psychiatric resources located less than one mile away. In 2004, CHA began screening children ≥4 years 9 months (fifth-year well-child visit) to <18 years in three of its largest pediatric sites. The Pediatric Symptom Checklist (PSC [18]) was used for children under age 14, and the Youth-PSC (Y-PSC [18]) was used for those age 14 or older. Screening occurred at the annual well-child visit for children in the age range. These sites treated annually almost 16,000 children (from birth to 18 years old) during 2004–2008 (Cambridge Pediatrics, 6,672; Somerville Pediatrics, 7,267; and Union Square, 1,452). Twenty-four providers were available at these sites (seven each at Cambridge and Somerville and ten at Union Square), and none had used behavioral health screens or colocated social workers before project implementation. Sixty-four percent of children were from racial-ethnic minority groups (Cambridge, 61%; Somerville, 55%; and Union Square, 70%), and 43% spoke a language other than English (19). As noted elsewhere, a majority of mental health referrals for primary care patients were made to CHA providers (9).
During the study time frame, many changes occurred nationally in the delivery of child and adolescent mental health, including the black-box warning on the use of antidepressants (20), rising rates of specific disorders (specifically, bipolar disorder) (21), and higher use of antipsychotic medications (22). There was also heightened interest in identifying and treating behavioral health issues in pediatric offices (23).

Screening and Colocation Model

The behavioral health screening and colocation program (still in existence today) was implemented at CHA in phases: Cambridge began screening in December 2003, followed by Somerville in July 2005 and Union Square in April 2007. The PSC and Y-PSC were chosen as screening tools because they are well validated in diverse urban populations (6,24) and incur no cost to providers. PSC results have compared favorably to the Child Behavior Checklist and the Children’s Global Assessment Scale (25). These 35-item instruments are completed in the waiting room by the parent or teen prior to the well-child visit. The physician scores the screen during the visit, and a medical assistant later enters the information into the electronic medical record (EMR). This process also automates provider reporting of screening rates and is described in depth elsewhere (10). Screening was not reimbursable until January 2008, when the Rosie D. v. Patrick remedy mandated that primary care providers screen for behavioral health issues with validated tools at well-child visits for Medicaid-eligible children (26).
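Because the screen is scored by hand during the visit, the arithmetic is simple enough to illustrate. The sketch below is an illustration only, assuming the commonly cited conventions of 0/1/2 item scoring and positive-screen cutoffs of 28 for the parent-report PSC and 30 for the Y-PSC; these conventions are not specified in this article and should be checked against the instrument documentation.

```python
# Minimal sketch of PSC/Y-PSC scoring as it might be automated at EMR entry.
# Assumptions (not stated in this article): items are scored never=0,
# sometimes=1, often=2, and commonly cited positive-screen cutoffs are 28 for
# the parent-report PSC and 30 for the youth-report Y-PSC.

from typing import Optional, Sequence

PSC_CUTOFF = 28      # assumed cutoff for the 35-item parent-report PSC
Y_PSC_CUTOFF = 30    # assumed cutoff for the youth self-report Y-PSC

def score_psc(item_responses: Sequence[Optional[int]], youth_report: bool = False) -> dict:
    """Sum 35 item responses (0/1/2) and flag a positive screen.

    Missing items are counted as 0 here for simplicity; real handling of
    missing items should follow the instrument documentation.
    """
    if len(item_responses) != 35:
        raise ValueError("The PSC and Y-PSC each have 35 items")
    total = sum(r for r in item_responses if r is not None)
    cutoff = Y_PSC_CUTOFF if youth_report else PSC_CUTOFF
    return {"total": total, "positive": total >= cutoff}

# Example: a respondent answering "sometimes" to every item
print(score_psc([1] * 35))  # {'total': 35, 'positive': True}
```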
The colocation of behavioral health providers in the clinics happened simultaneously with screening implementation. At each site, a part-time licensed clinical social worker (four in total), supervised by the CHA Division of Child and Adolescent Psychiatry, provided on-site behavioral health services, including initial assessments, therapy, and consultation to pediatricians. This collaborative care model is similar to Kolko and Perrin’s (23) description of on-site interventions with aspects of care coordination. Providers made behavioral health referrals to child psychiatry, and, based on patient preference, services were delivered by the on-site social workers or at the child psychiatry main office.
Before program implementation, primary care providers and social workers were trained at each site in the use of the PSC and Y-PSC tools and in the relevant behavioral health diagnosis codes, which were already being reimbursed in Massachusetts. Regular reports on screening were generated and shared monthly with the clinical staff. Within six months of starting the program, each site had increased the share of age-eligible well-child visits that included a screen to more than 50% per month.
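The monthly screening-rate reports described above amount to a simple share of age-eligible well-child visits that included a completed screen. A minimal sketch of that calculation, with hypothetical column names:

```python
# Sketch of a monthly screening-rate report: the share of age-eligible
# well-child visits with a completed screen. Column names are hypothetical.
import pandas as pd

visits = pd.DataFrame({
    "visit_month": ["2004-01", "2004-01", "2004-02", "2004-02", "2004-02"],
    "well_child": [True, True, True, True, False],
    "age_eligible": [True, True, True, False, True],
    "screen_completed": [True, False, True, False, False],
})

eligible = visits[visits["well_child"] & visits["age_eligible"]]
monthly_rate = (
    eligible.groupby("visit_month")["screen_completed"].mean().mul(100).rename("pct_screened")
)
print(monthly_rate)  # 2004-01: 50.0, 2004-02: 100.0
```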

Study Population

We extracted clinical and demographic data for youths ages ≥4 years 9 months to <18 years 3 months seen between 2003 and 2008. This age range captured “screenable” youths and allowed sufficient time after screening to capture any subsequent utilization. Data were extracted from the CHA data warehouse, which contains all ambulatory encounters within CHA, emergency department encounters, inpatient hospitalizations at a CHA hospital, and demographic data, such as gender, race-ethnicity, and language preferences (17). We extracted all encounter and inpatient data, including location of service, provider department (pediatrics, family medicine), date of service, CPT-4 procedure codes, and ICD-9 diagnosis codes for each visit. To calculate monthly utilization rates, we identified patients for inclusion in the denominator (patient panel) on the basis of the location of their well-child and ambulatory care visits (defined by location and CPT code) and on age eligibility for screening. Any child without a primary care visit at the site in the prior year was excluded from the denominator for the given month. Children who had well-child care at multiple CHA sites were likely to have experienced the impact of the intervention before other sites began to screen. Because this would have contaminated our data, we excluded these children (N=1,900, 14.7%) from both the pre- and postindex date panels. This group differed significantly from the rest of the sample on all demographic variables (p<.001): they were more likely to be female (55% versus 49%), black (16% versus 12%), under age seven (19% versus 15%), or between 16 and 18 years old (18% versus 12%), and they were less likely to be Hispanic (18% versus 21%) or to speak languages in the “other” category (2% versus 5%). Demographic data for children included in the denominator of any study month were generated and aggregated by prepolicy and postpolicy periods for reporting purposes.
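For concreteness, the panel (denominator) rules just described can be expressed as a short routine. This is a hedged sketch under the stated rules, not the authors' actual extraction code; the data frame layout, field names, and the simplified handling of the multiple-site exclusion are assumptions.

```python
# Sketch of the monthly denominator (patient panel) construction described
# above: a child counts toward a site-month if age eligible and seen for
# primary care at that site in the prior 12 months; children seen at more
# than one CHA site are dropped (a simplification of the article's rule,
# which is based on well-child care at multiple sites). Field names are
# hypothetical.
import pandas as pd

def monthly_panel(visits: pd.DataFrame, month: pd.Period, site: str) -> pd.Index:
    """Return patient IDs in the denominator for one site-month.

    `visits` needs columns: patient_id, site, visit_date (datetime),
    birth_date (datetime), is_primary_care (bool).
    """
    # Exclude children with primary care at more than one site overall.
    sites_per_child = visits[visits["is_primary_care"]].groupby("patient_id")["site"].nunique()
    multi_site = sites_per_child[sites_per_child > 1].index

    month_start = month.to_timestamp()
    lookback = month_start - pd.DateOffset(years=1)
    recent = visits[
        (visits["site"] == site)
        & visits["is_primary_care"]
        & (visits["visit_date"] >= lookback)
        & (visits["visit_date"] < month_start)
        & (~visits["patient_id"].isin(multi_site))
    ]

    # Age eligibility at the month start: >=4 years 9 months (57 months) and
    # <18 years 3 months (219 months); approximate, ignoring day of month.
    age_months = (month_start.year - recent["birth_date"].dt.year) * 12 + (
        month_start.month - recent["birth_date"].dt.month
    )
    eligible = recent[(age_months >= 57) & (age_months < 219)]
    return pd.Index(eligible["patient_id"].unique())
```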
The period of patient panel identification and the calendar date of implementation (the index date) were different for each of the three sites. The denominator for the time series was the patient panel count in each month. The preperiod time frame (18 months) was limited by data availability but allowed for a stable baseline and control for seasonality. For the postperiod, the longest time frame possible (30 months) was used. We initially constructed stratified time-series analyses of monthly utilization using the pre and post time frames for each site. We also conducted a sensitivity analysis with and without data from Union Square, given that its observation period spanned the Rosie D remedy (January 2008). No significant differences in individual site analyses or in our sensitivity analysis were found, and thus data for the three sites were merged and centered on the index date.

Variables

We collected utilization data for each month of the observation period. The primary outcomes of interest were defined as follows: specialty behavioral health services were ambulatory psychiatric services delivered by behavioral health clinicians either colocated in the sites or in the department of child and adolescent psychiatry; behavioral health–related primary care visits were well-child visits or other ambulatory visits with an associated mental health–related ICD-9 code delivered by primary care providers; behavioral health–related hospitalizations were inpatient hospitalizations occurring in the CHA inpatient psychiatry unit or under the specialty of inpatient psychiatry; behavioral health–related emergency department visits were visits occurring in the CHA psychiatric emergency department or in any CHA emergency department with an associated mental health–related ICD-9 code.
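To make the outcome definitions concrete, the sketch below classifies a single encounter using the ICD-9 range (290–319) and the CPT ranges given in the footnotes to Table 2. The record layout and setting labels are hypothetical, and only a subset of the specialty behavioral health procedure codes is shown.

```python
# Sketch of classifying one encounter into the outcome categories defined
# above, using ICD-9 codes 290-319 and the CPT ranges from the Table 2
# footnotes. The dict layout and setting labels are hypothetical.

def is_bh_diagnosis(icd9: str) -> bool:
    """ICD-9 codes 290-319 are treated as behavioral health related."""
    try:
        return 290 <= int(str(icd9).split(".")[0]) <= 319
    except ValueError:
        return False

WELL_CHILD_CPT = set(range(99383, 99386)) | set(range(99393, 99396))
AMBULATORY_CPT = (set(range(99201, 99206)) | set(range(99211, 99216))
                  | set(range(99401, 99405)) | set(range(99241, 99246)))
SPECIALTY_BH_CPT = set(range(90801, 90900))  # psychiatric service CPT codes (subset shown)

def classify(encounter: dict) -> str:
    cpt, icd9, setting = encounter.get("cpt"), encounter.get("icd9", ""), encounter.get("setting")
    if setting == "psychiatric_emergency" or (setting == "emergency" and is_bh_diagnosis(icd9)):
        return "bh_emergency"
    if setting == "inpatient_psychiatry":
        return "bh_hospitalization"
    if cpt in SPECIALTY_BH_CPT:
        return "specialty_bh"
    if cpt in (WELL_CHILD_CPT | AMBULATORY_CPT) and is_bh_diagnosis(icd9):
        return "bh_primary_care"
    return "other"

print(classify({"cpt": 99393, "icd9": "314.01", "setting": "ambulatory"}))  # bh_primary_care
```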
Demographic variables included sex, age, race-ethnicity (white, black, Hispanic, Asian, other, or unknown), and preferred language of care (English, Portuguese, Spanish, Haitian, other, or unknown). Race and language are self-reported and entered into the medical record by CHA clerical staff. Age and race-ethnicity were dichotomized (under age 13 versus age 13 or older and white, non-Hispanic versus all other races) in order to standardize utilization rates in preparation for interrupted time-series analysis.
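Standardization here means reweighting each month's stratum-specific rates to the month-1 demographic mix, as described in the next subsection. A minimal sketch of that direct standardization, with hypothetical data structures:

```python
# Minimal sketch of direct standardization of one month's visit rate to the
# month-1 demographic mix, using the dichotomized age, sex, and race-ethnicity
# strata described above. Data structures and field names are hypothetical.
import pandas as pd

def standardize_month(counts: pd.DataFrame, reference_weights: pd.Series) -> float:
    """Directly standardized rate per 1,000 patients for one month.

    `counts` has one row per stratum with columns `visits` and `patients`;
    `reference_weights` is the month-1 share of patients in each stratum
    (same index, summing to 1).
    """
    stratum_rates = counts["visits"] / counts["patients"] * 1000
    return float((stratum_rates * reference_weights).sum())

strata = pd.MultiIndex.from_product(
    [["<13", "13+"], ["female", "male"], ["white_nh", "other"]],
    names=["age", "sex", "race"],
)
month_counts = pd.DataFrame({"visits": 8, "patients": 900}, index=strata)
ref_mix = pd.Series(1 / len(strata), index=strata)   # month-1 mix (uniform here, for illustration)
print(standardize_month(month_counts, ref_mix))      # about 8.9 visits per 1,000
```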

ITS Analysis

We used segmented regression models (27,28) to evaluate the effect of screening implemented together with colocation on population service utilization rates. ITS is the strongest quasi-experimental design for evaluating the effects of natural experiments (29). One threat to the validity of the ITS design is change in the composition of the population. To account for change in clinical demographic characteristics (specifically, age, sex, and race-ethnicity) over time, we standardized (30,31) the distribution of demographic characteristics in each month to that in month 1 at each site. After standardization, we adjusted for seasonality (32). The segmented models included terms for the intercept, the baseline (prescreening) trend, the postimplementation change in level, and the change in trend, as well as autoregressive terms for up to six-month lags in rates. Because clinics took approximately six months to ramp up screening to a routine procedure, we censored the six months immediately after the behavioral health intervention, as has been done elsewhere (33). All analyses were conducted with SAS version 9.3 (34).
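The sketch below shows the shape of such a segmented model on a monthly rate series. It is an illustration only: it uses ordinary least squares, omits the seasonal adjustment and autoregressive error terms used in the published analysis, and the variable names and toy data are hypothetical.

```python
# Minimal sketch of a segmented (interrupted time-series) regression on a
# monthly rate series: intercept, baseline trend, level change, and trend
# change, with the first months after implementation censored. OLS is used
# here; the published analysis also adjusted for seasonality and included
# autoregressive terms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_segmented(rates: pd.Series, index_month: int, censor_months: int = 6):
    """`rates` is indexed 0..T-1 by month; `index_month` is the first
    postimplementation month."""
    df = pd.DataFrame({"rate": rates})
    df["time"] = np.arange(len(df))                             # baseline trend
    df["post"] = (df.index >= index_month).astype(int)          # level change
    df["time_after"] = np.maximum(0, df.index - index_month)    # trend change
    ramp = (df.index >= index_month) & (df.index < index_month + censor_months)
    return smf.ols("rate ~ time + post + time_after", data=df.loc[~ramp]).fit()

# Toy series: 18 preimplementation months, 30 postimplementation months
t = np.arange(48)
toy = pd.Series(15 + 0.05 * t + 0.13 * np.maximum(0, t - 18) + np.random.normal(0, 0.3, 48))
print(fit_segmented(toy, index_month=18).params)
```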

Results

The total number of unique children included in the study was 11,223, and the number of children appearing in the denominator for any given month ranged from 6,833 to 7,281. Forty percent of children appeared in the data set at least once in each of the four years of the study period, and 59% appeared in both the pre- and postpolicy periods. In the postpolicy period, 51% of the population were male, 56% were members of racial-ethnic minority groups (most of whom were English speaking), and there was good distribution across age groups (Table 1).
TABLE 1. Characteristics of pediatric patients before and after implementation of screening and colocation of behavioral health services with general medical care^a

Pediatric primary care patients (N=11,223, for 3 sites)

Characteristic                                      Prepolicy (N=8,235)       Postpolicy (N=9,552)
                                                    N        %                N        %
Sex
 Female                                             4,032    49.0             4,641    48.6
 Male                                               4,203    51.0             4,911    51.4
Language
 English                                            6,475    78.6             7,392    77.4
 Portuguese                                         758      9.2              1,069    11.2
 Spanish                                            342      4.2              487      5.1
 Haitian                                            177      2.2              254      2.7
 Other                                              409      5.0              323      3.4
 Unknown                                            74       .9               27       .3
Race-ethnicity
 White, non-Hispanic                                3,795    46.1             4,175    43.7
 Black, non-Hispanic                                1,734    21.1             2,075    21.7
 Hispanic                                           1,014    12.3             1,257    13.2
 Asian                                              384      4.7              537      5.6
 Other, non-Hispanic                                1,095    13.3             1,315    13.8
 Unknown                                            213      2.6              193      2.0
Age (years) at start of period^b
 <4.8                                               603      7.3              1,243    13.0
 4.8–6.9                                            1,219    14.8             1,330    13.9
 7.0–9.9                                            1,756    21.3             1,771    18.5
 10.0–12.9                                          1,794    21.8             1,862    19.5
 13.0–15.9                                          1,717    20.9             2,083    21.8
 16.0–18.0                                          1,008    12.2             1,080    11.3
 >18.0                                              138      1.7              183      1.9
Insurance, at policy implementation^c
 Medicaid                                           2,451    29.8             3,587    37.6
 Free care or public                                1,218    14.8             1,736    18.2
 Private                                            3,035    36.9             3,579    37.5
 Other                                              41       .5               50       .5
 Not indicated                                      1,490    18.1             600      6.3
First behavioral diagnosis at specialty behavioral visits^d
 Episodic mood disorder (bipolar)                   849      25.1             1,249    20.6
 Hyperkinetic syndrome                              833      24.6             1,471    24.1
 Adjustment reaction                                631      18.7             1,691    27.7
 Anxiety, dissociative, or somatoform disorder      579      17.1             782      12.8

a Some youths (N=6,630, 59.1%) were patients in both pre- and postpolicy periods.
b Prepolicy age calculated as of the beginning of the prepolicy period; postpolicy age calculated as of the beginning of the postpolicy period.
c Insurance category was based on the type indicated at the encounter most proximal to policy implementation.
d N=3,382 prepolicy; N=6,123 postpolicy. These are the top four diagnoses based on the first diagnostic code at the first specialty behavioral health visit occurring within the study period.
Figures 1, 2, and 3 show rates of specialty behavioral health visits and of behavioral health–related primary care and emergency department visits before and after screening and colocation. Table 2 presents the results of the segmented regression analysis. There was a significant increase in rates of specialty behavioral health visits, behavioral health–related primary care visits, and behavioral health–related emergency department visits during the 30-month postimplementation period. The trend in specialty behavioral health visits increased .013% per month (slowly initially and accelerating over time), from approximately 1.7% per month to almost 2.5% per month (p=.049), a cumulative increase of 20.4% compared with expectations. In other words, by the end of the postimplementation observation period, the rate of behavioral health visits was 20% higher than expected based on preprogram trends. Similarly, the trend in behavioral health–related primary care visits increased by .019% per month, from approximately 1% to 1.5% per month (p<.001), a cumulative increase of 67.7% compared with expectations. Finally, the trend in behavioral health–related emergency department visits increased .01% per month, from 1.5 per 1,000 to almost 3 per 1,000 (p=.002), a cumulative increase of 245% compared with expectations. Behavioral health–related inpatient admissions at CHA remained stable (data not shown).
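The "cumulative increase compared with expectations" figures can be read as the relative difference, at the end of follow-up, between the fitted postimplementation rate and the counterfactual rate extrapolated from the preintervention trend. The sketch below illustrates that arithmetic with coefficients loosely in the range of Table 2; it is not the paper's exact calculation.

```python
# Sketch of deriving a "cumulative increase versus expected" figure from
# segmented-regression coefficients: compare the fitted rate at the end of
# follow-up with the counterfactual extrapolated from the baseline trend.
# Coefficient values are illustrative, not the published estimates.

def cumulative_increase(intercept, trend, level_change, trend_change,
                        index_month, last_month):
    expected = intercept + trend * last_month                         # baseline extrapolation
    fitted = expected + level_change + trend_change * (last_month - index_month)
    return 100 * (fitted - expected) / expected

print(round(cumulative_increase(15.0, 0.07, 0.0, 0.13,
                                index_month=18, last_month=47), 1))   # 20.6 (% above expected)
```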
FIGURE 1. Pediatric specialty behavioral health visits to a system with routine screening and colocated services^a
^a Rate is seasonally adjusted and standardized for age, sex, and race. Trend change estimate=.13 visits per 1,000 patients; 95% confidence interval=.001–.020, p=.049
FIGURE 2. Pediatric behavioral health–related primary care visits to a system with routine screening and colocated services^a
^a Rate is seasonally adjusted and standardized for age, sex, and race. Trend change estimate=.19 visits per 1,000 patients; 95% confidence interval=.15–.23, p<.001
FIGURE 3. Pediatric behavioral health–related emergency department visits to a system with routine screening and colocated services^a
^a Rate is seasonally adjusted and standardized for age, sex, and race. Trend change estimate=.10 visits per 1,000 patients; 95% confidence interval=.04–.16, p=.003
TABLE 2. Results of segmented regression analysis of change after implementation of behavioral health screening and colocation policies in a health care system

Variable                                            Estimate (visits per 1,000 patients)    SE      95% CI            p
Specialty behavioral health services^a
 Intercept                                          14.99                                   .51     13.96 to 16.02    <.001
 Trend change                                       .13                                     .06     .00 to .25        .049
 Trend                                              .07                                     .04     –.01 to .16       .099
Behavioral health–related primary care visits^b
 Intercept                                          8.48                                    .32     7.84 to 9.12      <.001
 Trend change                                       .19                                     .02     .15 to .23        <.001
Behavioral health–related emergency department visits^d
 Intercept                                          1.42                                    .26     .89 to 1.95       <.001
 Trend change                                       .10                                     .03     .04 to .16        .002
 Trend                                              –.03                                    .02     –.08 to .01       .144
Behavioral health–related inpatient hospitalizations^c
 Intercept                                          .32                                     .12     .08 to .57        .012
 Level change                                       –.47                                    .32     –1.12 to .18      .154
 Trend change                                       .04                                     .02     .01 to .07        .019
Behavioral health–related inpatient hospitalizations or emergency department visits^c,d
 Intercept                                          1.47                                    .29     .88 to 2.05       <.001
 Trend change                                       .11                                     .03     .04 to .18        .003
 Trend                                              –.03                                    .02     –.08 to .01       .157
Any behavioral health–related utilization^a–d
 Intercept                                          23.93                                   .32     23.29 to 24.57    <.001
 Level change                                       1.50                                    .90     –.30 to 3.31      .102
 Trend change                                       .36                                     .04     .28 to .45        <.001
Non–behavioral health–related utilization^e
 Intercept                                          83.70                                   1.00    81.68 to 85.71    <.001
 Trend                                              .06                                     .03     –.01 to .13       .072

a Psychiatric services such as diagnostic interviews, psychopharmacology management, and psychotherapy (CPT codes 90801–90899); health behavioral assessment and intervention services (CPT codes 96100–96103, 96105, 96111, 96115–96120, 96125, and 96150–96155); Healthcare Common Procedure Coding System (HCPCS) codes for other mental health professionals (all H codes); and the Massachusetts Behavioral Health Partnership's additional HCPCS codes introduced in Massachusetts to track Child Behavioral Health Initiative remedial services, including crisis intervention (S9484 and S9485) and family counseling and case management (T1027, T1017, and T2022)
b Well-child visits (CPT codes 99383–99385 and 99393–99395) or ambulatory visits (CPT codes 99201–99205, 99211–99215, 99401–99404, and 99241–99245) with an associated behavioral health–related ICD-9 code (290–319)
c Inpatient hospitalizations defined by specialty (psychiatry inpatient), location (inpatient psychiatry unit), or both
d Emergency department visits defined by specialty (psychiatric emergency department or emergency department) with an associated behavioral health–related ICD-9 code (290–319)
e Inpatient hospitalizations, emergency department visits, well-child visits, or ambulatory visits not meeting the above criteria

Discussion

Main Findings

In this local study of a pediatric behavioral health screening and colocation program, rates of behavioral health service utilization, in both specialty and primary care settings, increased significantly after implementation of the program. Although behavioral health–related inpatient admissions were stable, there was also a significant increase in rates of emergency department visits with behavioral health diagnoses. To our knowledge, this is the first study to report increases in emergency department use associated with screening and colocation for a pediatric population.
As noted in the literature, behavioral health screening in primary care leads to increased rates of referral by primary care providers (8), particularly when mental health services are colocated in the same practice (14). In this study, utilization of specialty behavioral health services increased significantly, suggesting that referrals were completed and more children with behavioral health needs were getting into treatment. However, our data did not include clinical outcome measures; thus we could not determine whether more care was actually better care. It is also important to note that prepolicy behavioral health care utilization was higher than reported elsewhere for Massachusetts statewide (15% versus 12%) (35). This may reflect ready access to child psychiatry within CHA. In addition, outpatient pediatric visits with psychiatric diagnoses increased nationally from 1999 to 2010, but we identified no national increase that coincided with CHA program implementation (36).
The greater increase in CHA’s behavioral health–related primary care visits, compared with the more limited increase in specialty behavioral health services, mimics national trends over the past decade (36). This is encouraging, given pediatricians’ limited knowledge of behavioral health codes, discomfort with behavioral health treatment, and concerns about reimbursement (37,38). It suggests that screening and colocation may encourage task shifting of behavioral health treatment from specialty to primary care. Unfortunately, our data do not reveal whether primary care providers were delivering behavioral health treatment themselves or were simply more comfortable with behavioral health coding than they were before the policy.
Perhaps the most surprising study finding is that emergency department visit rates increased after the intervention (the rate doubled, although the absolute number remained low). We suspect that many of these visits were avoidable (39). There are many possible reasons for this increase. For example, increased screening may identify issues that cannot be adequately triaged or managed on site, requiring emergency visits. Providers’ increasing awareness of emergent mental health issues may also increase emergency referrals. In a recent study of Oregon’s Medicaid expansion, both primary care and emergency department utilization increased initially (40). Further, children and families entering treatment may experience behavioral health crises, and therapists themselves could increase utilization by telling patients to go to the emergency department if crises occur in their absence. Unfortunately, we do not know whether crises increased after the intervention. We also recognize that the upward national trend in behavioral health–related emergency visits (41–43) might have contributed to our findings, but such a trend is unlikely to have coincided with the intervention at each of our sites. Evaluating these explanations will require additional provider-level data and a longer time frame to determine whether this phenomenon is merely a short-lived consequence of improved behavioral health identification.
Despite increased emergency utilization, we did not find a concomitant increase in behavioral health–related inpatient admissions, which is surprising because most such admissions originate in the emergency department. We recognize, however, that CHA patients are not necessarily admitted to CHA facilities for inpatient behavioral health care; children are sent to open beds throughout the region. Thus it is possible that inpatient utilization increased without being captured in our data.

Limitations

This study had a number of limitations. First, the study lacked information on behavioral health need and on the clinical outcomes required to assess clinical improvement. Also, because the study occurred at one delivery system and excluded children who used primary care at multiple CHA sites, it may not be representative of all CHA children or of nonurban populations. Although this may limit the generalizability of our findings, it would not explain the sudden slope changes in outcomes. Second, because data were collected from the CHA data warehouse, our data did not include visits that took place outside CHA, and the accuracy of diagnostic codes could not be validated. However, to the degree that outside utilization was not captured, utilization, and thus the impact of the program, would, if anything, have been underestimated. Third, because of a lack of statistical power, we were unable to identify any change in behavioral health–related hospitalization. A claims-based study that captures inpatient utilization more completely would help to clarify these findings. Fourth, the study relied on one validated screening tool (the PSC), and results might have differed with other instruments. Fifth, because the study examined the impact of a combined screening and colocation model, we were unable to estimate the independent impact of screening and of colocation. We also lacked a concurrent comparison group. Finally, other factors, such as staffing changes, EMR adoption, and coding awareness, may have influenced our findings. However, none took place concurrently with the implementation of screening and colocation at each CHA site and thus would not explain our findings. Further, our ITS design explicitly controlled for national, secular trends affecting child and adolescent mental health delivery (23,36,43,44).

Conclusions

This was the first study of its kind to use an ITS design to examine health care utilization rates after the initiation of a behavioral health screening and colocation model in a major health care system. There is strong evidence that the intervention led to increased use of behavioral health services and to greater provider identification of behavioral health issues, both of which are positive developments. However, the sharp twofold increase in behavioral health–related emergency department visit rates is concerning, given the assumption that early identification of mental health issues and appropriate outpatient care should decrease emergency department utilization. More work is required to understand this unanticipated outcome.

References

1.
Foy JM, American Academy of Pediatrics Task Force on Mental Health: Enhancing pediatric mental health care: report from the American Academy of Pediatrics Task Force on Mental Health: introduction. Pediatrics 125(suppl 3):S69–S74, 2010
2.
American Academy of Pediatrics Task Force on Mental Health: The case for routine mental health screening. Pediatrics 125:S133–S139, 2010
3.
Kolko DJ, Campo J, Kilbourne AM, et al: Collaborative care outcomes for pediatric behavioral health problems: a cluster randomized trial. Pediatrics 133:e981–e992, 2014
4.
Wissow LS, Gadomski A, Roter D, et al: Improving child and parent mental health in primary care: a cluster-randomized trial of communication skills training. Pediatrics 121:266–275, 2008
5.
Kolko DJ, Campo JV, Kilbourne AM, et al: Doctor-office collaborative care for pediatric behavioral problems: a preliminary clinical trial. Archives of Pediatrics and Adolescent Medicine 166:224–231, 2012
6.
Jellinek MS, Murphy JM, Little M, et al: Use of the Pediatric Symptom Checklist to screen for psychosocial problems in pediatric primary care: a national feasibility study. Archives of Pediatrics and Adolescent Medicine 153:254–260, 1999
7.
Chisolm DJ, Klima J, Gardner W, et al: Adolescent behavioral risk screening and use of health services. Administration and Policy in Mental Health and Mental Health Services Research 36:374–380, 2009
8.
Wissow LS, Brown J, Fothergill KE, et al.: Universal mental health screening in pediatric primary care: a systematic review. Journal of the American Academy of Child and Adolescent Psychiatry 52:1134–1147, 2013
9.
Hacker K, Arsenault L, Franco I, et al: Referral and follow-up after mental health screening in commercially insured adolescents. Journal of Adolescent Health 55:17–23, 2014
10.
Hacker KA, Myagmarjav E, Harris V, et al: Mental health screening in pediatric practice: factors related to positive screens and the contribution of parental/personal concern. Pediatrics 118:1896–1906, 2006
11.
Hacker KA, Penfold RB, Arsenault LN, et al: Behavioral health services following implementation of screening in Massachusetts Medicaid children. Pediatrics 134:737–746, 2014
12.
Auxier A, Runyan C, Mullin D, et al: Behavioral health referrals and treatment initiation rates in integrated primary care: a Collaborative Care Research Network study. Translational Behavioral Medicine 2:337–344, 2012
13.
Kessler R: Mental health care treatment initiation when mental health services are incorporated into primary care practice. Journal of the American Board of Family Medicine 25:255–259, 2012
14.
Calkins LE, Michelson IR, Corso AS: Provider proximity as a predictor of referral rate and success. Psychological Services 10:395–400, 2013
15.
Eapen V, Jairam R: Integration of child mental health services to primary care: challenges and opportunities. Mental Health in Family Medicine 6:43–48, 2009
16.
Hacker KA, Williams S, Myagmarjav E, et al: Persistence and change in Pediatric Symptom Checklist scores over 10 to 18 months. Academic Pediatrics 9:270–277, 2009
17.
Hacker KA, Arsenault LN, Williams S, et al: Mental and behavioral health screening at preventive visits: opportunities for follow-up of patients who are nonadherent with the next preventive visit. Journal of Pediatrics 158:666–671, 2011
18.
Pediatric Symptom Checklist. Boston, Massachusetts General Hospital, Department of Psychiatry, 2012. Accessed at www.massgeneral.org/psychiatry/services/psc_home.aspx
19.
Patient Demographic Distribution, 2012–2013. Cambridge, Mass, Cambridge Health Alliance, 2014
20.
Lu CY, Zhang F, Lakoma MD, et al: Changes in antidepressant use by young people and suicidal behavior after FDA warnings and media coverage: quasi-experimental study. BMJ (Clinical Research Ed) 348:g3596, 2014
21.
Moreno C, Laje G, Blanco C, et al: National trends in the outpatient diagnosis and treatment of bipolar disorder in youth. Archives of General Psychiatry 64:1032–1039, 2007
22.
Olfson M, Blanco C, Liu L, et al: National trends in the outpatient treatment of children and adolescents with antipsychotic drugs. Archives of General Psychiatry 63:679–685, 2006
23.
Kolko DJ, Perrin E: The integration of behavioral health interventions in children’s health care: services, science, and suggestions. Journal of Clinical Child and Adolescent Psychology 43:216–228, 2014
24.
Murphy JM, Arnett HL, Bishop SJ, et al: Screening for psychosocial dysfunction in pediatric practice: a naturalistic study of the Pediatric Symptom Checklist. Clinical Pediatrics 31:660–667, 1992
25.
Pediatric Symptom Checklist, Research. Boston, Massachusetts General Hospital, Department of Psychiatry, 2012. Accessed at www.massgeneral.org/psychiatry/services/psc_research.aspx
26.
Rosie D: Reforming the Health Care System in Massachusetts. The Remedy: The Pathway to Home-Based Services. Northampton, Mass, Center for Public Representation, 2007. Available at www.rosied.org/page-84564? Accessed Jan 2015
27.
Gillings D, Makuc D, Siegel E: Analysis of interrupted time series mortality trends: an example to evaluate regionalized perinatal care. American Journal of Public Health 71:38–46, 1981
28.
Cook TD, Campbell DT: Quasi-experimentation: Design and Analysis Issues for Field Settings. Boston, Houghton Mifflin, 1979
29.
Shadish W, Cook T, Campbell D: Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston, Houghton Mifflin, 2002
30.
Brownson RC, Petitti DB (eds): Applied Epidemiology: Theory to Practice. New York, Oxford University Press, 1998
31.
Kleinbaum DG, Kupper LL, Morgenstern H: Epidemiologic Research: Principles and Quantitative Methods. New York, Van Nostrand Reinhold, 1982
32.
Bobbit LG, Otto MC: Effects of Forecasts on the Revisions of Seasonally Adjusted Data Using the X-11 Adjustment Procedure. Proceedings of the Business and Economic Statistics Section of the American Statistical Association. Washington, DC, American Statistical Association, 1990
33.
Schneeweiss S, Maclure M, Soumerai SB, et al: Quasi-experimental longitudinal designs to evaluate drug benefit policy changes with low policy compliance. Journal of Clinical Epidemiology 55:833–841, 2002
34.
SAS OnlineDoc, Version 9.3. Cary, NC, SAS Institute, 2006
35.
Sturm R, Ringel JS, Andreyeva T: Geographic disparities in children’s mental health care. Pediatrics 112:e308, 2003
36.
Olfson M, Blanco C, Wang S, et al: National trends in the mental health care of children, adolescents, and adults by office-based physicians. JAMA Psychiatry 71:81–90, 2014
37.
Kelleher KJ, Campo JV, Gardner WP: Management of pediatric mental disorders in primary care: where are we now and where are we going? Current Opinion in Pediatrics 18:649–653, 2006
38.
Kelleher KJ, Stevens J: Evolution of child mental health services in primary care. Academic Pediatrics 9:7–14, 2009
39.
Billings J, Parikh N, Mijanovich T: Emergency Room Use: The New York Story. New York, Commonwealth Fund, 2000. Available at www.commonwealthfund.org/publications/issue-briefs/2000/nov/emergency-room-use--the-new-york-story
40.
Taubman SL, Allen HL, Wright BJ, et al: Medicaid increases emergency-department use: evidence from Oregon’s Health Insurance Experiment. Science 343:263–268, 2014
41.
Dolan MA, Fein JA: Pediatric and adolescent mental health emergencies in the emergency medical services system. Pediatrics 127:e1356–e1366, 2011
42.
Grupp-Phelan J, Harman JS, Kelleher KJ: Trends in mental health and chronic condition visits by children presenting for care at US emergency departments. Public Health Reports 122:55–61, 2007
43.
Pittsenbarger ZE, Mannix R: Trends in pediatric visits to the emergency department for psychiatric illnesses. Academic Emergency Medicine 21:25–30, 2014
44.
Olfson M, Kroenke K, Wang S, et al: Trends in office-based mental health care provided by psychiatrists and primary care physicians. Journal of Clinical Psychiatry 75:247–253, 2014

Information & Authors

Published In


Psychiatric Services
Pages: 1141 - 1148
PubMed: 26129994

History

Received: 13 July 2014
Revision received: 25 January 2015
Accepted: 13 February 2015
Published online: 1 July 2015
Published in print: November 01, 2015

Authors

Karen A. Hacker, M.D., M.P.H., Robert B. Penfold, Ph.D., Lisa N. Arsenault, Ph.D., Fang Zhang, Ph.D., Stephen B. Soumerai, Sc.D., and Lawrence S. Wissow, M.D., M.P.H.

Dr. Hacker is with the Allegheny County Health Department, Pittsburgh, Pennsylvania (e-mail: [email protected]). Dr. Penfold is with the Department of Health Services Research, Group Health Research Institute, Seattle. Dr. Arsenault is with the Institute for Community Health, Cambridge, Massachusetts. Dr. Zhang and Dr. Soumerai are with the Harvard Pilgrim Health Care Institute and the Department of Population Medicine, Harvard Medical School, both in Boston. Dr. Wissow is with the Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore.

Competing Interests

The authors report no financial relationships with commercial interests.

Funding Information

National Institute of Mental Health (10.13039/100000025): R21MH094942, U19MH092201
All phases of this study were supported by grants R21MH094942 and U19MH092201 from the National Institute of Mental Health.
