
Abstract

Objective:

The Certified Community Behavioral Health Clinic (CCBHC) demonstration is designed to increase access to comprehensive ambulatory care and crisis services, which may reduce emergency department (ED) visits and hospitalizations. This study examined whether the demonstration had an impact on ED visits and hospitalizations in Missouri, Oklahoma, and Pennsylvania.

Methods:

This difference-in-differences analysis used Medicaid claims data from 2015 to 2019 to examine service use during a 12-month baseline period and the first 24 months of the demonstration for beneficiaries who received care from CCBHCs and beneficiaries who received care from other behavioral health clinics in the same state, representing care as usual. Propensity score methods were used to develop treatment and comparison groups with similar characteristics.

Results:

In Pennsylvania and Oklahoma, beneficiaries who received care from CCBHCs had a statistically significant reduction in the average number of behavioral health ED visits, relative to the comparison group (13% and 11% reductions, respectively); no impact on ED visits in Missouri was observed. The demonstration was associated with a statistically significant reduction in all-cause hospitalizations in Oklahoma, when the analysis used a 2-year rather than a 1-year baseline period, and also in Pennsylvania, when hospitalizations were truncated at the 98th percentile to exclude beneficiaries with outlier hospitalization rates.

Conclusions:

The CCBHC demonstration reduced behavioral health ED visits in two states, and the study also revealed some evidence of reductions in hospitalizations.

HIGHLIGHTS

The Certified Community Behavioral Health Clinic (CCBHC) demonstration, which eight states began implementing in mid-2017, allows states to test a new strategy to deliver and reimburse services in behavioral health clinics.
In two of the three states included in the study, Medicaid beneficiaries who received care from CCBHCs had a statistically significant reduction in behavioral health emergency department visits, compared with beneficiaries who received care from other community behavioral health clinics.
Findings from sensitivity analyses suggested that CCBHCs could reduce hospitalizations in the same two states.
Community behavioral health clinics provide critical services and function as safety-net providers for people with mental and substance use disorders. States have increasingly relied on these clinics to help people live independently in the community and avoid institutional care (1, 2). There are nearly 2,600 community mental health centers and 5,600 specialty outpatient mental health clinics across the United States (3). In the absence of federal licensing or accreditation standards, these clinics vary in the types of services they provide and the populations they serve, which may also reflect the local workforce and the ability of these clinics to leverage different funding streams (4–6). Medicaid is an important funding source for these clinics, but Medicaid payment rates have not historically covered the full costs of the services that these clinics provide (7, 8). As a result, clinics have turned to a patchwork of state and local funding and philanthropy to supplement costs for Medicaid beneficiaries and people without insurance (9). Such variation in funding and the services available from these clinics could contribute to disparate access to care across communities (10, 11).
Section 223 of the Protecting Access to Medicare Act authorized the Certified Community Behavioral Health Clinic (CCBHC) demonstration, which allows states to test a new strategy to deliver and reimburse services in behavioral health clinics (12). Participating states certify that clinics provide a standard set of comprehensive ambulatory mental health and substance use services, crisis services, primary care screening and monitoring, and care coordination to adults and children (12). Participation in the demonstration also requires CCBHCs to maintain relationships with hospitals and a wide range of other providers to coordinate care. They must also implement other activities to increase access to care, such as offering same-day appointments and conducting outreach to underserved populations.
The CCBHC demonstration established a new Medicaid prospective payment system designed to cover the full costs of CCBHC services. In this system, state Medicaid programs elect to reimburse all CCBHCs in the state by using either a fixed daily or a monthly rate for each day or month, respectively, that a Medicaid beneficiary receives care from a CCBHC. The CCBHC receives the same daily or monthly payment regardless of the number or type of services provided during a visit or month. This reimbursement mechanism gives clinics some flexibility to tailor services to the beneficiary without being concerned about the financial impact of every encounter or procedure. CCBHCs also report a common set of quality measures, and states can award quality bonus payments based on measure performance. Congress initially authorized the demonstration for 2 years, and eight states began implementing the CCBHC model in mid-2017. Congress has extended the demonstration and allowed additional states to participate.
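The payment logic described above can be sketched in a few lines. This is an illustrative contrast between fee-for-service billing and a daily-rate prospective payment system, not the actual CCBHC rate-setting methodology; the rate is hypothetical.

```python
# Hypothetical contrast between fee-for-service billing and the demonstration's
# daily-rate prospective payment system (PPS). Under the PPS, payment depends
# only on the number of days on which a beneficiary received any CCBHC service,
# not on the number or type of services delivered on each day.
HYPOTHETICAL_DAILY_RATE = 250.0  # illustrative clinic-specific PPS rate

def pps_payment(visit_days):
    """Same payment per visit-day regardless of the service mix that day."""
    return HYPOTHETICAL_DAILY_RATE * visit_days

def ffs_payment(procedure_fees):
    """Fee-for-service: payment is the sum of fees for each billed procedure."""
    return sum(procedure_fees)
```

Because the PPS payment is invariant to the service mix, a clinic can add an unbillable-under-fee-for-service activity (e.g., outreach during a visit day) without changing its revenue for that day, which is the flexibility the text describes.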
In this study, we examined how the demonstration affected emergency department (ED) visits and hospitalizations during the first 2 years in Missouri, Oklahoma, and Pennsylvania. The CCBHC model could increase access to comprehensive services, thereby helping people avoid EDs and hospitalizations. We hypothesized that Medicaid beneficiaries who received care from CCBHCs would have fewer ED visits and hospitalizations, compared with Medicaid beneficiaries who received care from other community behavioral health clinics.

Methods

Data

We selected three of the original demonstration states to reflect different geographic areas and CCBHC payment models. Missouri and Pennsylvania used a daily rate to reimburse CCBHCs, whereas Oklahoma used a monthly rate (13). We limited the analysis to these states because of resource constraints and concerns about the usability of Medicaid data from some other states. At the beginning of the study period, there were 15 CCBHCs in Missouri, which served 78% of the counties; three CCBHCs in Oklahoma, serving 22% of the counties; and seven CCBHCs in Pennsylvania, serving 10% of the counties. We established a data use agreement with each state to obtain Medicaid fee-for-service claims and managed care encounter data covering calendar years 2015 through 2019. The data included inpatient, ED, and ambulatory claims for adults, children, and adolescents who received care from CCBHCs and other community behavioral health clinics for a 2-year period before the demonstration (mid-2015–mid-2017) and for the first 2 years of the demonstration (mid-2017–mid-2019). The analysis was exempt from institutional review board approval.

Treatment and Comparison Groups

We conducted analyses separately for each state rather than pooling data across states because of state differences in CCBHC payment models and Medicaid programs. In each state, the sample included beneficiaries who received care from a community behavioral health clinic (including those that became CCBHCs) at any time in the 24 months before the demonstration. Beneficiaries were assigned to the treatment group if their last visit to a behavioral health clinic before the demonstration was to a clinic that became a CCBHC. Beneficiaries were assigned to the comparison group if their last visit was to a clinic that did not become a CCBHC, thus representing usual care. We defined the samples on the basis of where beneficiaries received care before the demonstration to minimize bias that could be introduced by changes in CCBHC case mix during the demonstration. We excluded beneficiaries who died before the demonstration, who were dually enrolled in Medicare (because we did not have Medicare data), who were not eligible for full Medicaid benefits, or who were not continuously enrolled in Medicaid for at least 6 months during the demonstration (a table in the online supplement to this article provides additional details). The composition of the treatment and comparison groups was essentially the same whether we assigned beneficiaries on the basis of the clinic where they received most of their behavioral health services in the 2 years before the demonstration or on the basis of their last visit. The treatment and comparison groups had <5% crossover during the demonstration, likely because these clinics typically serve different catchment areas.
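The last-visit assignment rule can be sketched as follows. This is a minimal illustration with hypothetical column names and an assumed mid-2017 demonstration start date, not the study's actual data pipeline.

```python
from datetime import date

# Assumed demonstration start (the text says "mid-2017"; July 1 is illustrative)
DEMO_START = date(2017, 7, 1)

def assign_groups(visits, ccbhc_ids):
    """Assign each beneficiary to 'treatment' or 'comparison' based on the
    clinic of the LAST behavioral health clinic visit before the
    demonstration start, per the rule described in the text.

    visits: iterable of (beneficiary_id, clinic_id, visit_date) tuples.
    ccbhc_ids: set of clinic IDs that became CCBHCs.
    """
    last_visit = {}  # beneficiary_id -> (visit_date, clinic_id)
    for bene, clinic, when in visits:
        if when >= DEMO_START:
            continue  # only predemonstration visits determine assignment
        if bene not in last_visit or when > last_visit[bene][0]:
            last_visit[bene] = (when, clinic)
    return {
        bene: "treatment" if clinic in ccbhc_ids else "comparison"
        for bene, (_, clinic) in last_visit.items()
    }
```

Anchoring assignment on predemonstration visits, as the code does, is what protects the design from case-mix changes after the demonstration began.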

Outcome Measures and Variables Describing Baseline Characteristics

Outcome measures, expressed per 1,000 beneficiaries, included hospitalizations and ED visits that did not result in a hospitalization (so that a single episode was not counted as both an ED visit and a hospitalization). We categorized hospitalizations and ED visits as related to behavioral health if the claim included a principal behavioral health diagnosis (ICD-9 and ICD-10 codes for mental health and substance use diagnoses, excluding developmental and intellectual disabilities, autism, dementia, and Alzheimer’s disease). Claims without a principal behavioral health diagnosis were categorized as general medical health. The analysis included variables for demographic characteristics, Medicaid eligibility category, residence (urban, suburban, or rural), and health status in the 2-year predemonstration period (14, 15). For Missouri and Pennsylvania, we included variables for enrollment in specific managed care plans; Oklahoma did not enroll beneficiaries in managed care.
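A classification rule of this kind might look like the sketch below. The study's exact code lists are not published in the text; the prefixes here are assumptions based on standard ICD-10-CM groupings (chapter F covers mental and behavioral disorders; F01–F03 dementias, F70–F79 intellectual disabilities, and F84 autism spectrum are carved out per the exclusions named above, and Alzheimer's disease, G30, falls outside chapter F and is therefore already treated as general medical).

```python
# Assumed ICD-10-CM prefixes excluded from the behavioral health definition;
# illustrative only, not the study's actual code list.
EXCLUDED_PREFIXES = (
    "F01", "F02", "F03",                          # dementias
    "F70", "F71", "F72", "F73", "F78", "F79",     # intellectual disabilities
    "F84",                                        # autism spectrum disorders
)

def classify_claim(principal_dx):
    """Label a claim 'behavioral health' or 'general medical' by its
    principal diagnosis code (ICD-10-CM style; dots are stripped)."""
    dx = principal_dx.upper().replace(".", "")
    if dx.startswith("F") and not dx.startswith(EXCLUDED_PREFIXES):
        return "behavioral health"
    return "general medical"
```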

Propensity Score Methods

The sample for propensity score analyses included 21,453 beneficiaries in Missouri, 36,866 in Oklahoma, and 186,414 in Pennsylvania. For Missouri and Oklahoma, the comparison group was smaller than the treatment group. As a result, we used the variables in Table 1 to reweight the comparison group to resemble the treatment group on the basis of a beneficiary’s predicted probability of being assigned to treatment given the person’s observable predemonstration characteristics. In Missouri, the treatment and comparison groups generally had similar levels (e.g., rates of hospitalizations per 1,000 beneficiaries) and quarterly trends in outcomes in the baseline period before and after propensity score weighting. In Oklahoma, the groups had similar trends but levels differed for some outcomes. We prioritized parallel trends during the eight baseline quarters, which we examined visually in graphs after weighting, because the parallel-trends assumption is critical for the difference-in-differences design and helps protect against regression to the mean (16, 17).
TABLE 1. Baseline characteristics of Medicaid beneficiaries included in the analysis who received care from Certified Community Behavioral Health Clinics (treatment group) or another behavioral health clinic (comparison group), by state (a)

| Characteristic | MO treatment (N=18,545) | MO comparison (N=2,891) | Std. diff. | OK treatment (N=10,839) | OK comparison (N=25,836) | Std. diff. | PA treatment (N=6,620) | PA comparison (N=22,571) | Std. diff. |
|---|---|---|---|---|---|---|---|---|---|
| Demographic | | | | | | | | | |
| Age (M±SD years) | 30±18 | 30±17 | .02 | 24±15 | 24±16 | −.04 | 25±15 | 24±15 | .04 |
| Male (%) | 46 | 45 | .01 | 42 | 41 | .01 | 54 | 53 | .03 |
| Race (%) | | | | | | | | | |
| White | 79 | 82 | −.06 | 63 | 63 | <.01 | 49 | 48 | .03 |
| Black | 17 | 15 | .06 | 12 | 12 | −.01 | 38 | 41 | −.05 |
| All other races | 4 | 3 | .02 | 25 | 24 | .01 | 12 | 11 | .03 |
| Residence (%) | | | | | | | | | |
| Urban | 5 | 5 | .02 | 6 | 6 | −.01 | 23 | 22 | .04 |
| Suburban | 81 | 81 | <.01 | 76 | 77 | −.01 | 76 | 77 | −.03 |
| Rural | 14 | 15 | −.02 | 18 | 17 | .02 | 1 | 1 | −.02 |
| Medicaid eligibility category (%) | | | | | | | | | |
| Adult | 7 | 7 | <.01 | 24 | 25 | −.02 | 30 | 26 | .07 |
| Child | 38 | 38 | <.01 | 55 | 53 | .04 | 49 | 53 | −.06 |
| Disabled | 55 | 55 | <.01 | 21 | 23 | −.03 | 21 | 21 | <.01 |
| Ever enrolled in Medicaid managed care in baseline period (%) (b) | 45 | 46 | −.01 | 70 | 69 | .01 | 100 | 100 | |
| Health status and condition | | | | | | | | | |
| CDPS score (M±SD) (c) | 2.5±1.7 | 2.5±1.7 | <.01 | 2.0±1.5 | 2.1±1.7 | −.01 | 2.1±1.6 | 2.1±1.5 | .01 |
| Any mental health condition (not limited to those listed below) (%) | 83 | 84 | −.04 | 80 | 74 | .16 | 60 | 61 | −.03 |
| Anxiety disorder | 57 | 56 | .02 | 50 | 49 | <.01 | 39 | 41 | −.03 |
| Bipolar disorder | 42 | 41 | .01 | 26 | 27 | −.01 | 24 | 23 | .01 |
| Depressive disorder | 49 | 48 | .02 | 51 | 44 | .15 | 39 | 38 | .01 |
| Schizophrenia or other psychotic disorder | 21 | 21 | .02 | 18 | 17 | .04 | 10 | 10 | −.02 |
| Any substance use disorder (%) | 22 | 21 | .03 | 26 | 23 | .06 | 37 | 34 | .06 |
| Alcohol use disorder | 9 | 9 | .01 | 5 | 6 | −.01 | 15 | 13 | .05 |
| Any drug use disorder | 18 | 17 | .02 | 23 | 21 | .06 | 32 | 31 | .03 |
| Opioid use disorder | 5 | 5 | −.01 | 5 | 6 | −.01 | 16 | 15 | .04 |
| General medical condition (%) | | | | | | | | | |
| Hyperlipidemia | 13 | 13 | .01 | 8 | 8 | <.01 | 6 | 6 | .01 |
| Hypertension | 25 | 24 | .02 | 14 | 15 | −.01 | 11 | 11 | .01 |
| Obesity | 11 | 11 | .01 | 9 | 9 | <.01 | 14 | 15 | −.02 |
| Diabetes | 13 | 12 | .02 | 6 | 7 | −.01 | 5 | 5 | −.01 |
| Service use during 2-year baseline period | | | | | | | | | |
| All-cause hospitalizations per 1,000 beneficiaries per year (M±SD) | 526±1,110 | 542±1,272 | −.01 | 134±297 | 131±311 | .01 | 266±680 | 254±744 | .02 |
| Behavioral health hospitalizations per 1,000 beneficiaries per year (M±SD) | 327±869 | 342±969 | −.02 | 95±257 | 85±257 | .04 | 167±559 | 159±605 | .01 |
| Emergency department (ED) visits per 1,000 beneficiaries per year (M±SD) | 2,013±3,662 | 2,065±4,317 | −.01 | 1,681±2,905 | 1,523±3,426 | .05 | 1,686±3,099 | 1,660±2,924 | .01 |
| Behavioral health ED visits per 1,000 beneficiaries per year (M±SD) | 295±1,024 | 312±1,165 | −.02 | 175±654 | 148±862 | .04 | 213±756 | 185±698 | .04 |
| Any hospitalization (%) | 39 | 38 | .04 | 21 | 20 | .02 | 25 | 23 | .04 |
| Any ED visit (%) | 72 | 69 | .06 | 69 | 63 | .14 | 73 | 71 | .04 |
| Propensity score | .87 | .87 | .01 | .33 | .32 | .05 | .06 | .06 | .01 |
(a) Values are weighted percentages unless otherwise specified. Data reflect application of propensity score weighting (Missouri and Oklahoma) or matching (Pennsylvania) by using the variables listed in the table. Smaller standardized differences (the ratio of the treatment-comparison difference and the treatment group SD) indicate more closely matched or balanced groups; 0.25 is the typical threshold indicating a good match or balance between groups. The sample for propensity score analyses included 21,453 beneficiaries in Missouri, 36,866 in Oklahoma, and 186,414 in Pennsylvania. In Missouri and Oklahoma, 17 and 191 beneficiaries, respectively, were dropped from the propensity score models because of data issues. A total of 157,223 beneficiaries in Pennsylvania were dropped after propensity score matching because they did not form part of a matched set. The N listed for each column is the denominator for the percentages listed in the column.
(b) Missouri enrolled beneficiaries in plans that provided comprehensive health and behavioral health services or only behavioral health services (beneficiaries could be enrolled in both). Pennsylvania enrolled beneficiaries in separate plans for general medical health and behavioral health services. Oklahoma predominantly delivered Medicaid services through fee-for-service arrangements but enrolled beneficiaries in a primary care case management program, which would not affect the findings of this analysis.
(c) CDPS, Chronic Illness and Disability Payment System; a higher CDPS score signifies a higher expected risk profile and higher expected spending, whereas a lower CDPS score signifies a lower expected risk and lower expected spending. The scores are normalized so that the average spending in each population equals 1. A score >1 indicates higher than average expected spending, and a score <1 indicates lower than average spending.
For Pennsylvania, we used an optimal-matching algorithm to develop matched sets of treatment and comparison group beneficiaries, because the potential comparison group was much larger than the treatment group. We exact-matched beneficiaries who were enrolled in the same managed care plan at the beginning of the demonstration to ensure that the treatment and comparison groups were drawn from the same regions and had access to the same provider networks. We gave each treatment group beneficiary a weight of 1 and each comparison beneficiary a weight equal to the ratio of treatment beneficiaries to comparison beneficiaries in the matched set. We assessed the quality of the weighted or matched samples on the basis of standardized differences in means (calculated as the ratio of the treatment-comparison difference and the treatment group SD), percentage difference in means, equivalence tests, and trend plots (18).
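The two balance tools used above can be illustrated with a short sketch. It assumes propensity scores have already been estimated (e.g., from a logistic regression of treatment status on the Table 1 variables); the function names are illustrative.

```python
from statistics import mean, stdev

def att_weight(ps, treated):
    """Propensity 'odds' weight that reweights the comparison group toward
    the treatment group: treated beneficiaries get weight 1; comparison
    beneficiaries get p/(1-p), where p is the propensity score."""
    return 1.0 if treated else ps / (1.0 - ps)

def standardized_difference(treat_values, comp_values):
    """Treatment-comparison difference in means divided by the treatment-group
    SD, the balance statistic reported in Table 1 (|d| < 0.25 is the usual
    threshold for adequate balance)."""
    return (mean(treat_values) - mean(comp_values)) / stdev(treat_values)
```

Comparison beneficiaries with propensity scores near the treatment group's get weights near 1, while those unlike any treated beneficiary get weights near 0, which is how the weighted comparison group comes to resemble the treatment group.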

Impact Analyses

We fit ordinary least-squares (OLS) difference-in-differences regression models with beneficiary fixed effects to estimate the impact of the demonstration on the number of hospitalizations and ED visits over the full 24-month demonstration period and for each of the two 12-month periods after the start of the demonstration. In our main analyses, we limited the baseline period to the 12 months preceding the demonstration. Standard errors (SEs) were adjusted for multiple observations of the same beneficiary to allow for serial correlation of the outcomes within individual beneficiaries over time. We weighted each observation by using the weights from the propensity score models and an eligibility weight that accounted for the number of months the beneficiary was enrolled in Medicaid in each observation period. A key assumption of this design is that the change in outcomes observed among those in the comparison group is what would have been observed in the treatment group in the absence of the demonstration. Consistent with other studies that have used claims data to examine the impacts of new service delivery models (19, 20), we defined p≤0.10 as statistically significant at the outset of the study to avoid falsely concluding that the demonstration did not have effects. We also interpreted the findings in the context of the SEs and effect sizes.
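The core of this design can be illustrated for the simplest case. With one baseline and one follow-up period, demeaning within beneficiary absorbs all time-invariant characteristics, and the OLS coefficient on the treated-by-post interaction reduces to the difference in mean within-person changes. This sketch omits the study's propensity and eligibility weights and clustered SEs.

```python
from statistics import mean

def did_estimate(y_pre, y_post, treated):
    """Two-period difference-in-differences with beneficiary fixed effects.
    Because within-beneficiary demeaning removes every time-invariant
    characteristic, the fixed-effects OLS coefficient on treated*post equals
    (mean change among treated) - (mean change among comparison).
    Illustrative only: the study's model adds weights and adjusted SEs."""
    changes_t = [b - a for a, b, t in zip(y_pre, y_post, treated) if t]
    changes_c = [b - a for a, b, t in zip(y_pre, y_post, treated) if not t]
    return mean(changes_t) - mean(changes_c)
```

A negative estimate means the treatment group's outcome fell by more (or rose by less) than the comparison group's, which is the sign convention used in Tables 2 and 3.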
We conducted two sensitivity tests. First, to determine whether the results were sensitive to outliers (that is, to a small number of beneficiaries with high service use), we truncated outcomes at the 98th percentile. Second, we implemented the models by using 24 months (instead of 12 months) of baseline data to examine whether the impact estimates changed when we accounted for longer predemonstration trends.
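The outlier test can be sketched as a simple cap at the 98th percentile. This uses the nearest-rank percentile definition for illustration; the study does not specify which percentile convention it used.

```python
def truncate_at_percentile(values, pct=98):
    """Cap each value at the pct-th percentile (nearest-rank definition) to
    damp the influence of a small number of very high utilizers, mirroring
    the outlier sensitivity test described in the text."""
    ordered = sorted(values)
    # nearest-rank: smallest value with at least pct% of observations at or below it
    rank = max(0, -(-pct * len(ordered) // 100) - 1)  # ceil(pct*n/100) - 1
    cap = ordered[int(rank)]
    return [min(v, cap) for v in values]
```

Truncation leaves the bulk of the distribution untouched and only pulls in the extreme tail, so the impact estimate becomes less sensitive to a handful of beneficiaries with outlier utilization.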
We considered using zero-inflated negative binomial (ZINB) models, given that hospitalizations and ED visits are not normally distributed. However, ZINB models did not accommodate beneficiary fixed effects to adjust for time-invariant beneficiary characteristics. Previous studies have found that ZINB models produced point estimates similar to those of OLS models but with less conservative SEs (20). Our application of OLS is consistent with previous studies that measured impacts on hospitalizations and ED visits (1922). However, we cannot rule out that ZINB models could have produced different findings.

Results

The treatment and comparison groups were well balanced after propensity score adjustment, with some exceptions. In Oklahoma, the treatment group included a larger share of beneficiaries with a mental health condition during the baseline period, relative to the comparison group (80% vs. 74%, standardized difference=0.16), mostly driven by a larger share of beneficiaries with depression (Table 1). The treatment group in Oklahoma also had a larger share of beneficiaries with an ED visit during the baseline period, relative to the comparison group (69% vs. 63%, standardized difference=0.14).
We noted some differences across states in the characteristics of beneficiaries included in the final analytic samples after propensity score adjustment. For example, 55% of the overall sample in Missouri qualified for Medicaid on the basis of disability, compared with less than one-quarter of the samples in Oklahoma and Pennsylvania. The sample in Missouri was also, on average, slightly older (mean age=30 years), compared with the Oklahoma and Pennsylvania samples (mean age=24 years for both states). The racial composition of the samples also varied by state; approximately 80% of beneficiaries in Missouri were White, compared with 63% in Oklahoma and almost 50% in Pennsylvania. About 25% of beneficiaries in Oklahoma were in the “other” race category, compared with 4% in Missouri and 12% in Pennsylvania. Approximately 15% of beneficiaries in Pennsylvania had an opioid use disorder, compared with about 5% in the other two states. Across states, >80% of treatment and comparison group beneficiaries remained in the sample by month 19 of the 24-month demonstration period.
The demonstration had a statistically significant (p≤0.10) impact on the average number of behavioral health ED visits in Oklahoma and Pennsylvania but not in Missouri (Table 2). Over the 24-month demonstration period, reductions of 11% and 13% were observed in Oklahoma and Pennsylvania, respectively, in the average number of behavioral health ED visits among those who received care from CCBHCs, relative to the comparison group (p=0.08 and p=0.09, respectively). No impact was observed on all-cause ED visits, likely because behavioral health ED visits represented only about 10% of ED visits in each state.
TABLE 2. Emergency department (ED) visits of Medicaid beneficiaries in three states, by service receipt from a clinic in the Certified Community Behavioral Health Clinic (CCBHC) demonstration (treatment group) or another behavioral health clinic (comparison group) in that state (a)

| State and measure | Treatment group M | Comparison group M | Impact estimate ± SE | % impact | p |
|---|---|---|---|---|---|
| Missouri | | | | | |
| All-cause visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,975 | 2,032 | | | |
| Months 1–12 | 1,839 | 1,816 | 80±75 | 5 | .29 |
| Months 13–24 | 1,750 | 1,745 | 62±102 | 4 | .54 |
| Cumulative (months 1–24) | 1,796 | 1,781 | 72±82 | 4 | .38 |
| Behavioral health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 284 | 296 | | | |
| Months 1–12 | 260 | 246 | 26±21 | 11 | .22 |
| Months 13–24 | 255 | 237 | 31±25 | 14 | .22 |
| Cumulative (months 1–24) | 257 | 242 | 28±21 | 12 | .17 |
| General medical health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,691 | 1,736 | | | |
| Months 1–12 | 1,579 | 1,570 | 54±69 | 4 | .44 |
| Months 13–24 | 1,495 | 1,508 | 31±93 | 2 | .74 |
| Cumulative (months 1–24) | 1,539 | 1,540 | 44±75 | 3 | .56 |
| Oklahoma | | | | | |
| All-cause visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,672 | 1,505 | | | |
| Months 1–12 | 1,569 | 1,409 | −6±34 | <1 | .85 |
| Months 13–24 | 1,479 | 1,328 | −16±42 | −1 | .71 |
| Cumulative (months 1–24) | 1,532 | 1,375 | −10±33 | <1 | .75 |
| Behavioral health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 170 | 145 | | | |
| Months 1–12 | 134 | 129 | −20±10 (b) | −13 | .04 |
| Months 13–24 | 132 | 118 | −12±14 | −8 | .40 |
| Cumulative (months 1–24) | 134 | 124 | −16±9 (c) | −11 | .08 |
| General medical health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,502 | 1,361 | | | |
| Months 1–12 | 1,435 | 1,280 | 14±31 | <1 | .66 |
| Months 13–24 | 1,347 | 1,210 | −4±37 | <1 | .91 |
| Cumulative (months 1–24) | 1,398 | 1,251 | 6±29 | <1 | .84 |
| Pennsylvania | | | | | |
| All-cause visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,673 | 1,632 | | | |
| Months 1–12 | 1,491 | 1,527 | −78±51 | −5 | .12 |
| Months 13–24 | 1,404 | 1,421 | −59±51 | −4 | .25 |
| Cumulative (months 1–24) | 1,451 | 1,475 | −66±45 | −4 | .14 |
| Behavioral health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 215 | 183 | | | |
| Months 1–12 | 169 | 163 | −26±16 (c) | −14 | .10 |
| Months 13–24 | 157 | 147 | −22±15 | −13 | .14 |
| Cumulative (months 1–24) | 164 | 155 | −23±14 (c) | −13 | .09 |
| General medical health visit-days per 1,000 beneficiaries per year | | | | | |
| Baseline year | 1,459 | 1,449 | | | |
| Months 1–12 | 1,322 | 1,364 | −51±45 | −4 | .25 |
| Months 13–24 | 1,246 | 1,274 | −37±46 | −3 | .43 |
| Cumulative (months 1–24) | 1,288 | 1,321 | −42±40 | −3 | .29 |
(a) A negative impact estimate signifies a favorable impact for the CCBHC demonstration; that is, it indicates a greater decrease in ED visit-days for the treatment group relative to the comparison group.
(b) Significantly different from zero at the α=0.05 level (two-tailed test).
(c) Significantly different from zero at the α=0.10 level (two-tailed test).
The demonstration did not have a statistically significant impact on the average number of all-cause, general medical health, or behavioral health hospitalizations in any state in our main analysis (Table 3). However, findings from sensitivity analyses suggested that CCBHCs could have reduced hospitalizations. In Oklahoma, the demonstration was associated with an approximately 22% decrease in all-cause hospitalizations, behavioral health hospitalizations, and general medical health hospitalizations when we used a 2-year baseline period to account for longer predemonstration trends (p<0.05 for all analyses) (see table in online supplement). In Pennsylvania, beneficiaries who received care from CCBHCs had a 10% decrease in all-cause hospitalizations, relative to the comparison group (p=0.06), but only when we truncated the number of all-cause hospitalizations at the 98th percentile.
TABLE 3. Hospitalizations of Medicaid beneficiaries in three states, by whether they received services from a clinic in the Certified Community Behavioral Health Clinic (CCBHC) demonstration (treatment group) or another behavioral health clinic (comparison group) in that state (a)

| State and measure | Treatment group M | Comparison group M | Impact estimate ± SE | % impact | p |
|---|---|---|---|---|---|
| Missouri | | | | | |
| All-cause stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 519 | 507 | | | |
| Months 1–12 | 467 | 433 | 21±24 | 5 | .38 |
| Months 13–24 | 454 | 404 | 37±27 | 9 | .17 |
| Cumulative (months 1–24) | 460 | 418 | 30±23 | 7 | .19 |
| Behavioral health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 328 | 319 | | | |
| Months 1–12 | 273 | 252 | 12±19 | 5 | .51 |
| Months 13–24 | 257 | 231 | 17±21 | 7 | .42 |
| Cumulative (months 1–24) | 265 | 241 | 15±17 | 6 | .39 |
| General medical health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 191 | 187 | | | |
| Months 1–12 | 193 | 180 | 9±15 | 5 | .55 |
| Months 13–24 | 197 | 173 | 20±17 | 12 | .22 |
| Cumulative (months 1–24) | 195 | 176 | 15±14 | 8 | .30 |
| Oklahoma | | | | | |
| All-cause stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 104 | 116 | | | |
| Months 1–12 | 72 | 87 | −3±6 | −4 | .62 |
| Months 13–24 | 67 | 90 | −11±7 (b) | −14 | .08 |
| Cumulative (months 1–24) | 70 | 88 | −7±6 | −8 | .25 |
| Behavioral health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 77 | 75 | | | |
| Months 1–12 | 50 | 54 | −6±5 | −10 | .29 |
| Months 13–24 | 44 | 51 | −8±6 | −15 | .14 |
| Cumulative (months 1–24) | 47 | 52 | −7±5 | −12 | .17 |
| General medical health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 27 | 41 | | | |
| Months 1–12 | 22 | 33 | 3±3 | 13 | .42 |
| Months 13–24 | 23 | 40 | −3±4 | −12 | .36 |
| Cumulative (months 1–24) | 23 | 36 | <1±3 | <1 | .98 |
| Pennsylvania | | | | | |
| All-cause stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 257 | 248 | | | |
| Months 1–12 | 199 | 201 | −11±13 | −6 | .39 |
| Months 13–24 | 198 | 197 | −8±14 | −4 | .58 |
| Cumulative (months 1–24) | 198 | 199 | −10±12 | −5 | .39 |
| Behavioral health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 155 | 155 | | | |
| Months 1–12 | 108 | 112 | −4±11 | −3 | .73 |
| Months 13–24 | 106 | 104 | 2±12 | 2 | .90 |
| Cumulative (months 1–24) | 107 | 108 | −1±8 | −1 | .91 |
| General medical health stays per 1,000 beneficiaries per year | | | | | |
| Baseline year | 102 | 93 | | | |
| Months 1–12 | 91 | 90 | −8±8 | −8 | .33 |
| Months 13–24 | 92 | 93 | −10±8 | −10 | .26 |
| Cumulative (months 1–24) | 91 | 91 | −9±7 | −9 | .20 |
(a) A negative impact estimate signifies a favorable impact for the CCBHC demonstration; that is, it indicates a greater decrease in the number of hospitalizations for the treatment group relative to the comparison group.
(b) Significantly different from zero at the α=0.10 level (two-tailed test).

Discussion

In this study, we examined the impacts of the CCBHC demonstration on ED visits and hospitalizations in Missouri, Oklahoma, and Pennsylvania. In Oklahoma and Pennsylvania, Medicaid beneficiaries who received care from CCBHCs had a greater reduction in behavioral health ED visits, compared with those who received care from other behavioral health clinics in these two states. Findings from sensitivity analyses suggested that CCBHCs could have reduced hospitalizations in the two states. Several features of the CCBHC model could have contributed to these findings, including requirements for CCBHCs to provide access to comprehensive behavioral health and crisis services, peer support, and care coordination. CCBHCs also undertake other activities to increase access to care, such as offering same-day appointments and delivering care in locations beyond the clinic. The demonstration payment system does not require CCBHCs to bill for every procedure or type of service, which could have allowed them to tailor services to clients and help these individuals avoid seeking care from EDs and hospitals. However, daily and monthly billing processes impeded use of the claims data to determine whether the delivery of specific services contributed to these impacts. It is notable that the CCBHC demonstration in these two states achieved these impacts during the first demonstration year and despite early implementation challenges related to workforce shortages and changes in billing processes (23). The CCBHCs were able to provide the required scope of services at the launch of the demonstration, which could have had an immediate impact on ED visits and hospitalizations. Detecting impacts on hospitalizations for the full CCBHC population could require a longer evaluation period, given that psychiatric hospitalizations are less common than ED visits (24, 25).
Previous interventions to increase access to comprehensive care in community behavioral health clinics have typically been narrower in scope than the CCBHC demonstration and have yielded mixed findings. For example, some state Medicaid programs have implemented behavioral health home models that share some features of CCBHCs (26); some studies of these models have reported positive impacts on all-cause (but not behavioral health) ED visits and no impacts on hospitalizations (27), whereas other studies found positive impacts on hospitalizations (28), likely reflecting differences in interventions, implementation contexts, and study periods. Likewise, studies have produced mixed findings on whether integrating primary care services into behavioral health clinics has an impact on ED visits and hospitalizations (29, 30). These past efforts have generally focused on changing a limited number of care processes often for specific populations, supported by grant funding or modest changes in Medicaid payment rates (31, 32). In contrast, the CCBHC demonstration requires clinic-wide implementation of a new model supported by a full redesign of the Medicaid payment system. This study provides the first insights into how this new model and payment system function together to affect ED visits and hospitalizations.
Readers should interpret the findings within the context of each state’s population and service delivery system characteristics, which we could not fully account for by using the available data. We did not design the study to directly compare states or draw conclusions about the best approach to implementing CCBHCs. As noted, there were some differences in each state’s implementation approach (for example, Missouri implemented CCBHCs to serve most counties, whereas the other two states implemented within regions) and populations, which could explain some differences in the findings across states. However, there were also evaluation design constraints (for example, fewer comparison group beneficiaries in Missouri relative to other states) that could explain these different findings across states.
This study had limitations. Although the study design allowed us to attribute impacts to the demonstration, it required limiting the analytic population to Medicaid beneficiaries who received care from these clinics before the demonstration. Therefore, the findings reflect the impacts among beneficiaries who were already receiving care from these clinics. Findings could differ among those who newly entered services during the demonstration. Future studies could use alternative designs to compare their results with these findings. In addition, although the treatment and comparison groups were comparable on observable characteristics, they could have differed on characteristics that are not measurable with Medicaid data. This may have been particularly relevant in Missouri, where the areas not affected by the demonstration were more limited than in other states. Small residual imbalances in observed characteristics, such as the higher prevalence of depression and higher baseline rate of ED visits in the treatment group in Oklahoma relative to the comparison group, could also have affected impact estimates. Finally, the findings reflect the first 2 years of the demonstration because this was the period initially authorized by Congress, and the analysis did not include all demonstration states.

Conclusions

The CCBHC demonstration reduced behavioral health ED visits among Medicaid beneficiaries in two of the three states studied, and there was some evidence of reductions in hospitalizations. Because the demonstration has continued beyond the initial 2 years and has expanded to additional states, future research could examine impacts over a longer period and in other implementation contexts.

Acknowledgments

The authors thank Crystal Blyler, Ph.D., for providing feedback on earlier drafts of the manuscript; Rachel Hildrich Gross, B.S., Mark Lee, M.P.H., Sybil Pan, B.S., and McCayla Sica, B.S., for providing Medicaid data acquisition and management support; and Harold Alan Pincus, M.D., for consultation on the study design and interim findings. They also appreciate the assistance of the state Medicaid staff, who provided data for the study and guidance on the use of the data.

Footnote

Ms. DeWitt and Mr. Chapman completed the work while employed by Mathematica.

Supplementary Material

File (appi.ps.20220410.ds001.pdf)


Information & Authors

Published In
Psychiatric Services
Pages: 911–920
PubMed: 36916061

History

Received: 9 August 2022
Revision received: 5 October 2022
Revision received: 5 January 2023
Accepted: 12 January 2023
Published online: 14 March 2023
Published in print: September 01, 2023

Keywords

  1. Community mental health centers
  2. Hospitalization
  3. Service delivery systems
  4. Service delivery
  5. Reimbursement
  6. Medicaid

Authors

Jonathan D. Brown, Ph.D., M.H.S. [email protected]
Kate A. Stewart, Ph.D., S.M.
Rachel L. Miller, M.P.A.
Eric Dehus, M.S.
Tyler Rose, M.S.
Kathryn DeWitt, M.S.
Richard Chapman, M.S.
Allison Wishon, M.H.S.
Joshua Breslau, Ph.D., Sc.D.
Judith Dey, Ph.D.
Laura Jacobus-Kantor, Ph.D.

Mathematica, Washington, D.C. (Brown, Stewart, Miller, Dehus, Rose, Wishon); Verana Health, San Francisco (DeWitt, Chapman); RAND Corporation, Pittsburgh (Breslau); Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services, Washington, D.C. (Dey, Jacobus-Kantor).

Notes

Send correspondence to Dr. Brown ([email protected]).

Competing Interests

As an employee of Verana Health, Mr. Chapman has worked on projects funded by Amgen, Bausch + Lomb, Iveric, Janssen, Novartis, and Sight Sciences. The other authors report no financial relationships with commercial interests.

Funding Information

This study was sponsored by contract HHSP233201600017 from the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services (DHHS). The opinions and views in this article are those of the authors and do not necessarily reflect the opinions or views of the Office of the Assistant Secretary for Planning and Evaluation or the DHHS.
