HHS: 'Certified Community Behavioral Health Clinics Demonstration Program'
The report was written by
Here are the excerpts:
* * *
TABLE OF CONTENTS
ACRONYMS
EXECUTIVE SUMMARY
I. INTRODUCTION
A. Goals of the Certified Community Behavioral Health Clinic Demonstration
B. Certified Community Behavioral Health Clinic Demonstration Roll Out
C. Evaluation of the Certified Community Behavioral Health Clinic Demonstration
D.
II. DATA SOURCES AND METHODS
A. Evaluation Questions
B. Data Sources
III. DEMONSTRATION IMPLEMENTATION
A. State Context at the Beginning of the Demonstration
B. Scope of Services
C. Staffing and Training
D. Access to Care
E. Care Coordination
IV. QUALITY OF CARE PROVIDED TO CERTIFIED COMMUNITY BEHAVIORAL HEALTH CLINIC CLIENTS
A. Experience with Measures
B. Quality Measure Performance
C. Awarding of Quality Bonus Payments
V. CERTIFIED COMMUNITY BEHAVIORAL HEALTH CLINIC PAYMENT RATES AND COSTS
A. Process for Establishing Payment Rates
B. Certified Community Behavioral Health Clinic Payment Rates
C. Average Daily or Monthly Costs and Per Client Costs
D. Convergence of Rates and Costs Over Time
E. Distribution of Certified Community Behavioral Health Clinic Costs
F. Total State and Federal Medicaid and Children's Health Insurance Program Expenditure for
VI. IMPACTS ON MEDICAID SERVICE USE AND COSTS
A. Analytic Populations Included in Impact Analyses
B. Impacts on Medicaid Service Use
C. Impacts on Costs
D. Summary
VII. SUMMARY OF KEY FINDINGS AND RECOMMENDATIONS
A. Summary of Findings Relevant to the Protecting Access to Medicare Act
B. Recommendations
C. Conclusions
REFERENCES
APPENDICES
APPENDIX A. Implementation Findings
APPENDIX B. Quality of Care and Quality Bonus Payments
APPENDIX C. Certified Community Behavioral Health Clinic Payment Rates and Costs
APPENDIX D. Impact Study Methods and Findings
* * *
ACKNOWLEDGEMENTS
We appreciate the support and guidance of
* * *
EXECUTIVE SUMMARY
In
Effective evidence-based treatments for mental health conditions and SUDs are unavailable or difficult to access in many communities (Blyler et al. 2021). Pervasive behavioral health workforce shortages create long wait-times for appointments, and in some areas, emergency departments (EDs) and the criminal justice system are the only sources of care for people in crisis (Cama et al. 2017; Nordstrom et al. 2019; Bradley et al. 2020; SAMHSA 2021). Even when services are available, behavioral health providers often do not have the resources, staff, or data systems to monitor chronic conditions and coordinate care with external health and social service providers (Kilbourne et al. 2018; Pincus et al. 2016).
Community mental health centers (CMHCs) play an essential role in delivering ambulatory behavioral health care. Historically, the Federal Government has maintained a narrow definition of CMHCs (pertaining only to providers who participate in Medicare; CMS n.d.), but states and localities use the term more broadly to refer to ambulatory care facilities that specialize in the delivery of behavioral health care. Following the repeal of the Mental Health Systems Act and introduction of block grants in the 1980s, states were largely responsible for determining what services to provide through CMHCs and how to integrate them into systems of care (
Over the past several decades, Medicaid has become an increasingly important source of funding for CMHCs and behavioral health care more generally as funding has shifted toward community-based services and away from more restrictive institutional settings (Medicaid and
A. Goals of the Certified Community Behavioral Health Clinic Demonstration
Section 223 of the Protecting Access to Medicare Act (PAMA), enacted in 2014, authorized the CCBHC demonstration, under which participating states certify community behavioral health clinics that meet federal criteria and reimburse them through a Medicaid prospective payment system (PPS).
CCBHCs must offer nine types of services including: (1) crisis mental health services; (2) screening, assessment, and diagnosis; (3) patient-centered treatment planning; (4) outpatient mental health and substance use services; (5) outpatient clinic primary care screening and monitoring; (6) targeted case management (TCM); (7) psychiatric rehabilitation services; (8) peer support, counselor services, and family supports; and (9) intensive, community-based mental health care for members of the armed forces and veterans (SAMHSA 2016a). However, states have some flexibility to tailor these services to align with their state Medicaid Plans and other state regulations, and to meet the needs of communities. Services must be person and family-centered, trauma-informed, and recovery-oriented. In addition, CCBHCs are required to expand service hours, provide services beyond the walls of the clinic (for example, in clients' homes and elsewhere in the community), and maintain partnerships with a range of health and social service providers to facilitate referrals and care coordination (SAMHSA 2016a). CCBHCs can partner with Designated Collaborating Organizations (DCOs) to provide some of the required services. DCOs are entities that are not directly supervised by a CCBHC but have a formal relationship with a CCBHC to provide specified services. CCBHCs that engage DCOs maintain clinical responsibility for services the DCO provides to CCBHC clients.
The PPS in each state is designed to provide CCBHCs with the financial support and stability necessary to deliver these required services. States participating in the demonstration select one of the following PPS models to reimburse all CCBHCs in the state: a fixed daily payment (PPS-1) for each day a Medicaid beneficiary receives demonstration services or a fixed monthly payment (PPS-2) for each month in which a Medicaid beneficiary receives demonstration services. States set the payment rates, which can vary across CCBHCs within a state. PPS-1 states have the option to provide CCBHCs with quality bonus payments (QBPs) based on their performance on quality measures. PPS-2 states are required to provide QBPs based on quality measures.
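For illustration, the sketch below shows how the two models turn the same pattern of service use into payments. The rates and visit dates are hypothetical stand-ins, not actual demonstration rates:

```python
from datetime import date

# Hypothetical dates on which one Medicaid beneficiary received
# demonstration services from a CCBHC during one quarter.
visits = [date(2018, 1, 9), date(2018, 1, 9), date(2018, 1, 23),
          date(2018, 2, 6), date(2018, 3, 6), date(2018, 3, 20)]

PPS1_DAILY_RATE = 250.0    # assumed fixed payment per visit-day
PPS2_MONTHLY_RATE = 700.0  # assumed standard payment per visit-month

# PPS-1: one fixed payment for each distinct day with any service,
# regardless of how many services were delivered that day.
visit_days = set(visits)
pps1_payment = len(visit_days) * PPS1_DAILY_RATE

# PPS-2: one fixed payment for each month with any service.
visit_months = {(d.year, d.month) for d in visits}
pps2_payment = len(visit_months) * PPS2_MONTHLY_RATE

print(f"PPS-1: {len(visit_days)} visit-days   -> ${pps1_payment:,.2f}")
print(f"PPS-2: {len(visit_months)} visit-months -> ${pps2_payment:,.2f}")
```

Note that the two visits on January 9 collapse into a single visit-day: the pattern above yields five visit-days under PPS-1 but only three visit-months under PPS-2.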
States and CCBHCs are required to report 21 quality measures following each demonstration year (DY). These are calculated from Medicaid claims and managed care encounter data, electronic health records (EHRs), and surveys of CCBHC clients and their family members. These measures assess best practices in care delivery (for example, timely follow-up after discharge from a hospital), outcomes (for example, improvement in depression symptoms), and client and family member experiences with care. Quality measure reporting provides CCBHCs and state officials with standardized metrics to monitor the quality of care, inform quality improvement efforts, and award QBPs. CCBHCs also submit standardized cost reports to the state following each demonstration year. The cost reports include information on clinic operating costs and the number of daily (for PPS-1 states) or monthly (for PPS-2 states) visits to the clinic in each demonstration year.
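For example, a claims-based measure such as follow-up within 30 days after an ED visit reduces to date arithmetic over claims extracts. A minimal sketch with hypothetical data (the demonstration's actual measure specifications include exclusions and visit-type logic not shown here):

```python
import pandas as pd

# Hypothetical claims extracts (client IDs and dates are illustrative only).
ed_visits = pd.DataFrame({
    "client_id": [1, 2, 3],
    "ed_date": pd.to_datetime(["2018-03-01", "2018-05-15", "2018-07-04"]),
})
followups = pd.DataFrame({
    "client_id": [1, 2],
    "visit_date": pd.to_datetime(["2018-03-20", "2018-07-30"]),
})

# Flag ED visits that were followed by any visit within 30 days.
merged = ed_visits.merge(followups, on="client_id", how="left")
merged["followed_up"] = (
    (merged["visit_date"] > merged["ed_date"])
    & (merged["visit_date"] <= merged["ed_date"] + pd.Timedelta(days=30))
)
rate = merged.groupby("client_id")["followed_up"].any().mean()
print(f"30-day follow-up rate: {rate:.0%}")  # 1 of 3 clients -> 33%
```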
B. Certified Community Behavioral Health Clinic Demonstration Roll Out
In
* * *
Figure ES.1: Number of CCBHCs and Type of PPS Model for Initial Demonstration States
* * *
In
Among the initial eight demonstration states, the number of CCBHCs participating in the demonstration and the characteristics of the counties served by those CCBHCs varies across states (Table ES.1). For example,
* * *
Table ES.1: Characteristics of CCBHC Counties and Clients
* * *
Across seven of the eight initial demonstration states, CCBHCs served 304,988 clients in the first demonstration year (DY1) and 332,135 clients in the second demonstration year (DY2), representing a 9 percent aggregate increase across CCBHCs (Table ES.2).
* * *
Table ES.2: Number of Clients Served by CCBHCs in Each Demonstration Year
* * *
C. Evaluation of the Certified Community Behavioral Health Clinic Demonstration
Section 223 of PAMA mandates that HHS submit reports to Congress.
In
The evaluation was designed to answer several overarching questions that align with the PAMA requirements for HHS's reports to Congress (Figure ES.2).
The evaluation included interviews with state officials and consumer and family representatives at different stages of the demonstration to assess implementation over time; site visits to selected CCBHCs to interview clinic administrators and frontline clinical staff to understand their experiences implementing the model; analysis of progress reports that CCBHCs submitted in each demonstration year to report their staffing, training activities, accessibility of services, scope of services, EHR/health information technology (HIT) capabilities, care coordination activities, and relationships with other providers; and analysis of the cost reports and quality measures that states and CCBHCs submitted following each demonstration year.
* * *
Figure ES.2: Overarching CCBHC Evaluation Questions
* * *
The evaluation also assessed changes in Medicaid service use (including ambulatory visits, ED visits, and hospitalizations) and costs among beneficiaries who received care from CCBHCs relative to beneficiaries with similar demographic and diagnostic characteristics who received care from other (non-certified) community behavioral health clinics in the same state, representing care as usual. Although changes in service use do not necessarily reflect changes in access to care or the quality of care, the findings from these analyses are important to understand how the CCBHC model affects the broader health care system. Hospitalizations and ED visits are typically viewed as unfavorable outcomes from a health system perspective. CCBHCs' efforts to increase access to care and deliver new services could potentially result in the identification of untreated conditions and increase the use of services. Conversely, providing more comprehensive ambulatory care to CCBHC clients could decrease ED visits and hospitalization rates.
We compared pre-post changes in service use and costs for the treatment group (beneficiaries who received services from clinics that became CCBHCs in the year prior to the demonstration) with pre-post changes in service use and costs for a comparison group (beneficiaries who received services from clinics that did not become CCBHCs in the year prior to the demonstration) within the same state. This study design (commonly referred to as a difference-in-differences design) allowed us to identify changes in service use and costs attributable to the demonstration, as opposed to general historical trends. This component of the evaluation included only beneficiaries enrolled in Medicaid in the year prior to the demonstration. Due to data or study design limitations in some states, this component of the evaluation was limited to three states: Missouri, Oklahoma, and Pennsylvania.
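For readers less familiar with this design, the sketch below shows the core difference-in-differences regression. The data, variable names, and outcome are hypothetical stand-ins, not the evaluation's actual analytic specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-period data: one row per beneficiary per period.
# `treated`   = received care from a clinic that became a CCBHC
# `post`      = observation falls in the demonstration period
# `ed_visits` = count of ED visits in the period
df = pd.DataFrame({
    "treated":   [1, 1, 1, 1, 0, 0, 0, 0],
    "post":      [0, 1, 0, 1, 0, 1, 0, 1],
    "ed_visits": [2, 1, 3, 2, 2, 2, 3, 3],
})

# The coefficient on `treated:post` is the difference-in-differences
# estimate: the change among CCBHC clients beyond the change observed
# in the comparison group over the same period.
model = smf.ols("ed_visits ~ treated + post + treated:post", data=df).fit()
print(model.summary().tables[1])
```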
This final report summarizes key findings for each of the areas related to the PAMA requirements for HHS's reports to Congress.
D. Evaluation Findings
1. Access to community mental health services
The evaluation examined the changes that CCBHCs implemented to increase access to care, improvements in wait-times for initial evaluations at CCBHCs (an indicator of timely access to care), changes in the number of clients served by CCBHCs over time (which may reflect efforts to increase access to care), and the extent to which consumer and family stakeholder representatives reported that access to care changed as a result of the demonstration. We also examined changes in Medicaid service use to understand how the introduction of the CCBHC model affected where and how frequently Medicaid beneficiaries received care.
* * *
Figure ES.3: Proportion of CCBHCs that Provided Services Outside of Physical Clinic Space in the Past 12 Months
* * *
Activities to increase access to care. CCBHCs implemented a wide range of activities to increase access to care. These activities included, for example, expanding operating hours, accommodating same-day and walk-in appointments, conducting outreach to underserved populations, and moving service delivery beyond the clinic walls to reach people in their homes and communities (Figure ES.3). CCBHCs also established and sustained partnerships with external providers to facilitate referrals and coordinate care. In the first demonstration year, all or nearly all CCBHCs had formal or informal relationships with inpatient psychiatric facilities, residential treatment facilities for SUD, schools, child welfare agencies, adult criminal justice agencies and courts, juvenile justice agencies, primary care providers, and Federally Qualified Health Centers (FQHCs). According to state officials, these efforts to expand access to care were unique to CCBHCs relative to other community behavioral health clinics in the state.
All CCBHCs provided services to individuals regardless of their ability to pay. For comparison, 78 percent of non-CCBHC state-licensed or certified CMHCs in the eight original demonstration states offered treatment at no charge or for minimal payment in 2020, based on an analysis of National Mental Health Services Survey (N-MHSS) data.
As noted above (see Table ES.2), the number of clients served by CCBHCs increased by 9 percent from the first to the second demonstration year (ranging from 1 percent to 23 percent, depending on the state), suggesting that efforts to increase access to care may have been successful at attracting new clients.
Wait-times for initial evaluation. In all but one state, the proportion of new adult clients who received an evaluation within 10 business days of their first contact with the CCBHC improved from the first to the second demonstration year (Figure ES.4). On average, new adult clients received an initial evaluation nine days after first contact with the CCBHC in the first demonstration year, decreasing to 5.4 days in the second demonstration year (Figure ES.5). All states except
* * *
Figure ES.4: Proportion of New Adult Clients with Initial Evaluation Provided within 10 Business Days of Contact with CCBHC
Figure ES.5: Average Number of Days from Initial Contact with CCBHC to Evaluation for New Adult Clients
* * *
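The wait-time measure summarized in Figures ES.4 and ES.5 reduces to business-day arithmetic over intake records. A minimal sketch, assuming hypothetical field names and dates (the demonstration's exact measure specification may differ):

```python
import numpy as np

# Hypothetical (first contact, initial evaluation) date pairs for
# new adult clients at one CCBHC.
contacts = np.array(["2018-01-02", "2018-01-10", "2018-02-05"],
                    dtype="datetime64[D]")
evals = np.array(["2018-01-09", "2018-01-30", "2018-02-07"],
                 dtype="datetime64[D]")

# np.busday_count counts business days from the contact date
# (inclusive) to the evaluation date (exclusive), skipping weekends.
wait_days = np.busday_count(contacts, evals)

print("Mean wait (business days):", wait_days.mean())
print("Share evaluated within 10 business days:", (wait_days <= 10).mean())
```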
Consumer and family stakeholder perceptions of access to care. Consumer and family representatives interviewed in several states credited the demonstration with increasing access to care. These stakeholders praised efforts to accommodate same-day appointments and expand service hours and noted that consumers experienced much shorter wait-times for appointments. They also perceived that providing mental health and SUD services for both adults and children at the same location facilitated greater access to comprehensive services for whole families, noting that CCBHCs became more family-oriented environments that offer care to children and their parents. In addition, these representatives reported that the inclusion of peer support staff in the CCBHC model was critical to engaging clients and families in treatment. Across states, over 80 percent of adult clients had positive perceptions of access to care in both demonstration years, as reported in the quality measures.
Impact of CCBHCs on service use. Among the three states included in the difference-in-differences analyses, the introduction of the CCBHC model impacted the use of Medicaid services differently in each state (Table ES.3):
* In Missouri, the number of behavioral health-related ambulatory visits increased 5.7 percent among CCBHC clients relative to the comparison group. The demonstration did not impact hospitalization rates or ED visits.
* In Pennsylvania, there was a 7.4 percent reduction in the average number of physical health-related ambulatory visits and a 9.9 percent reduction in the average number of behavioral health-related ambulatory visits among CCBHC clients relative to the comparison group. CCBHC clients did not differ from the comparison group in their probability of having any ED visit during the demonstration, but there was a 13 percent reduction in the average number of behavioral health-related ED visits among CCBHC clients relative to the comparison group (in other words, the likelihood of any ED visit was not different between the two groups, but CCBHC clients had fewer behavioral health-related ED visits over time relative to the comparison group). The demonstration did not impact hospitalization rates.
* In Oklahoma, there was a 3 percent reduction in the number of physical health-related ambulatory visits among CCBHC clients relative to the comparison group, but there was no impact on ambulatory behavioral health-related visits. CCBHC clients had a higher probability of any ED visit during the demonstration relative to the comparison group. However, there was an 11 percent reduction in the average number of behavioral health-related ED visits among CCBHC clients relative to the comparison group (in other words, although the likelihood of any ED visit was higher among CCBHC clients, they had fewer behavioral health-related ED visits over time relative to the comparison group). Finally, CCBHC clients had a lower probability of hospitalization relative to the comparison group during the demonstration, but the demonstration did not impact the average number of hospitalizations. This could reflect, in part, relatively low hospitalization rates in this state, which could make it difficult to detect changes in averages.
* * *
Table ES.3: Summary of Impacts on Service Use Over the First 2 Demonstration Years
* * *
In sum, there was not a consistent pattern across states in how the introduction of the CCBHC model impacted hospitalization rates, ED visits, or ambulatory service use. Changes in ambulatory service use do not necessarily indicate better or worse access to care. In the context of this demonstration, in which CCBHCs are paid either a daily or monthly rate to provide comprehensive services, an increase in daily or monthly ambulatory visits among CCBHC clients could indicate that CCBHCs are providing needed services. A decrease in daily or monthly ambulatory visits among CCBHC clients could indicate that CCBHCs are able to provide the necessary services in fewer visits. Theoretically, the delivery of more comprehensive services (regardless of the number of visits) might correspond with a decrease in ED visits or hospitalization rates. However, across these states, there was no consistent pattern in the relationship between changes in ambulatory visits and ED visits. The demonstration also did not impact hospitalization rates in any state.
The variation in findings across states could reflect differences in how the model was implemented across states as well as other state contextual factors that are not directly measurable using Medicaid data. As noted above,
These impact findings are limited to the first two years of the demonstration. There are also several limitations to the analysis. Although the evaluation used the strongest design to avoid potentially misattributing impacts of the demonstration to changes over time in the case-mix of CCBHCs, it required limiting the analytic population to beneficiaries enrolled in Medicaid and receiving care from these clinics prior to the demonstration. This does not compromise the validity of the findings, but the results are best interpreted as the impacts among beneficiaries who were already engaged in care as opposed to those who newly entered services after the demonstration began. The introduction of the CCBHC model could impact clients who are not already engaged in care differently from those who have an existing relationship with a community behavioral health clinic. As the demonstration continues and expands to other states, there may be opportunities to implement alternative evaluation designs to capture impacts on clients newly seeking services at CCBHCs. Finally, although the treatment and comparison groups within each state were comparable on key characteristics, and there was adequate sample size to detect impacts, it is possible that the final population included in the comparison group differed from clients in the CCBHC group on characteristics that are not measurable using Medicaid data. This may have been particularly relevant to
2. Scope of services
The demonstration establishes a minimum scope of services for CCBHCs and requires states and CCBHCs to adopt evidence-based practices (EBPs). However, the demonstration allows states to select EBPs that address the needs of communities and align with Medicaid State Plans and other state regulations. The evaluation examined the types of staff and services that CCBHCs added to meet the certification requirements and the partnerships that CCBHCs developed to deliver the required services and coordinate care. CCBHCs varied widely in the types of services they provided and populations they served prior to the demonstration, and consequently required different changes to meet certification requirements. However, officials reported that, as a result of the certification process, CCBHCs provided a more comprehensive and broader range of services than other community behavioral health clinics in the state. CCBHCs were generally able to sustain the delivery of the required services throughout the demonstration.
Expansion of services to meet CCBHC criteria. Nearly all clinics expanded or added services to meet CCBHC certification requirements (Figure ES.6). CCBHCs most often added services within the categories of outpatient mental health and/or SUD services, psychiatric rehabilitation services, crisis services, and peer support.
* * *
Figure ES.6: Proportion of CCBHCs that Added Each Type of Service as a Result of Certification
* * *
CCBHCs offered a wide range of EBPs and rehabilitative services consistent with the certification criteria (Table ES.4). All or nearly all CCBHCs provided motivational interviewing, individual and group cognitive behavioral therapy (CBT), peer support for clients, emergency crisis intervention, 24-hour mobile crisis teams, crisis stabilization, primary care screening and monitoring, TCM, evidence-based medication evaluation and management, and medication-assisted treatment (MAT) for alcohol and opioid use. Most CCBHCs provided community wraparound services for youth/children, dialectical behavioral therapy, peer support for families, supported employment, supported housing, and supported education. The evaluation did not obtain data from all CMHCs or community behavioral health clinics across states to facilitate direct comparisons with CCBHCs, but 2020 N-MHSS data suggest that several of these services were less frequently available from other CMHCs in the demonstration states (Wishon et al. 2021); in 2020, only 49 percent of state-licensed or certified CMHCs (that were not CCBHCs) in demonstration states provided any type of SUD treatment, 59 percent had a crisis intervention team, 43 percent provided peer support, 40 percent offered on-site services for psychiatric emergencies, and 27 percent provided supported housing.
Many of these services were added as a result of the CCBHC certification process (Table ES.4). For example, 46 percent of CCBHCs added 24-hour mobile crisis teams, 46 percent added MAT, 43 percent added peer support for clients, 42 percent added primary care screening and monitoring, 40 percent added TCM, 34 percent added peer support for families, and 31 percent added emergency crisis intervention and crisis stabilization.
* * *
Table ES.4: Proportion of CCBHCs that Offer Select EBPs and Other Services
* * *
CCBHCs delivered most of the required services directly rather than engaging external providers in DCO relationships. This was true for many of the new services that CCBHCs added to meet the certification requirements. For example, only one of the 61 CCBHCs that provided MAT in the second demonstration year engaged a DCO to provide these services. CCBHCs cited concerns about their ability to maintain clinical responsibility for services provided through DCOs and uncertainty about how the PPS would work under DCO arrangements as reasons for preferring to provide services directly. Some CCBHCs also preferred to build their own internal service capacity through the demonstration. The exception was suicide/crisis hotlines or warmlines; 30 percent of CCBHCs developed DCO relationships with these types of providers, most often formalizing their existing relationships with these providers.
Staff hiring and training. States and CCBHCs reported that the PPS model allowed clinics flexibility to hire different types of staff and form treatment teams that were tailored to the needs of their clients. CCBHCs employed a wide range of staff before the demonstration. CCBHCs most often hired case managers, peer specialists/recovery coaches, psychiatrists, and family support workers during the CCBHC certification process, and most CCBHCs were able to retain these staff over the first two years of the demonstration (Table ES.5).
In the first demonstration year, 93 percent of CCBHCs provided training in risk assessment, suicide prevention, and suicide response; 91 percent provided training in evidence-based and trauma-informed care; 88 percent provided training in cultural competency; and 76 percent provided training in family-centered care, recovery-oriented care, and primary and behavioral health care integration. A similar proportion of CCBHCs provided these trainings in the second demonstration year.
* * *
Table ES.5: Proportion of CCBHCs that Employed Select Types of Staff
* * *
Composition of treatment teams. In the first demonstration year, 76 percent of CCBHCs reported a change in the membership of their treatment teams as a result of the certification process and 58 percent of CCBHCs continued to make some changes to their treatment teams in the second demonstration year. In both years, nearly all CCBHCs reported including case managers and consumers/clients on treatment teams in addition to mental health providers, SUD providers, and psychiatrists; 78 percent of CCBHCs included clients' family members on treatment teams. Only 48 percent of CCBHCs included primary care physicians (PCPs) on treatment teams in the second demonstration year.
Primary care services. Ninety-one percent of CCBHCs reported offering primary care screening and monitoring in the second demonstration year and 99 percent reported some type of partnership with a primary care provider. Although not required in the certification criteria, 55 percent of CCBHCs provided on-site primary care. However, 84 percent of those clinics were already providing on-site primary care before the demonstration (only six CCBHCs added on-site primary care during or after the CCBHC certification process). Some states established primary care requirements for CCBHCs that went beyond the certification requirements. For example,
CCBHCs varied, however, in their ability to capture physical health information and coordinate physical health care with other providers. Only 56 percent of CCBHCs had EHRs that included primary care records and only 45 percent reported that their EHRs allowed electronic exchange of clinical information with any external provider.
Fifty-eight percent of CCBHCs reported receiving a notification (electronic or otherwise) when a hospital treated a CCBHC client for a physical health condition and 53 percent reported receiving a notification when an ED treated a CCBHC client for a physical health condition.
In sum, CCBHCs expanded their scope of services, which included the adoption of various EBPs, rehabilitative services, and primary care screening and monitoring. They also hired and trained staff to support the delivery of these services. Data were not available to facilitate a direct comparison between all the services provided by CCBHCs with other clinics in the state or with clinics in other states. However, state officials reported that CCBHCs provided a more comprehensive and broader range of services relative to other community behavioral health clinics in the state, and the findings from N-MHSS described above support their observations.
* * *
Table ES.6: Quality of Care Provided to CCBHC Clients Compared to Medicaid Benchmarks in DY1 and DY2
* * *
3. Quality of care
The delivery of comprehensive services and care coordination, the addition of staff, and provision of additional training could lead to improvements in the quality of care. Conversely, quality of care could suffer if the PPS incentivizes CCBHCs to deliver fewer services while still collecting the daily or monthly payment. The evaluation examined performance on the 21 quality measures (representing eight domains of quality) that states and CCBHCs reported. The analysis assessed how the quality of care delivered to CCBHC clients compared to state Medicaid benchmarks and assessed changes in the quality of care over time. Since these measures were not reported by similar clinics in regions of the state that did not participate in the demonstration, direct comparisons between CCBHCs and comparable non-CCBHC behavioral health clinics were not possible. However, comparing the quality of care provided to CCBHC clients with state Medicaid benchmarks for the same measures provides context for understanding whether CCBHC clients received higher-quality care than the broader group of Medicaid beneficiaries in the state. Interpretation of CCBHC performance relative to these benchmarks should consider that the populations treated in CCBHCs are likely to be more severely ill and disadvantaged than the broader Medicaid population with these conditions.
According to state officials, most CCBHCs did not have previous experience reporting the quality measures required for the demonstration and CCBHCs' data systems did not always facilitate reporting the measures before the demonstration. As a result, 97 percent of CCBHCs enhanced their EHRs and/or other HIT to capture the information they were required to report; 33 percent of CCBHCs adopted a new EHR or HIT system (most often in addition to making changes to their existing systems). Modifying data systems required considerable resources and staff time. State agencies played a critical role in providing technical assistance to help CCBHCs make these changes and, in some states, helped clinics link to external data systems. In contrast, calculating the state-reported measures generally did not require major changes to state data systems.
Quality measure performance among CCBHC clients relative to Medicaid benchmarks. Several of the quality measures used in the demonstration align with measures that state Medicaid programs voluntarily report to the Centers for Medicare & Medicaid Services (CMS).
For several measures, the quality of care provided to CCBHC clients most often met or exceeded the quality of care provided to the broader Medicaid population in states where data were available to make these comparisons (Table ES.6):
* The proportion of adult CCBHC clients with major depression who received antidepressants and continued those antidepressants for at least six months was similar or better than the state Medicaid average in four of the five states where comparisons were possible.
* The proportion of adult CCBHC clients who initiated treatment for alcohol or other drug (AOD) use within 14 days of their initial AOD diagnosis and the proportion who remained engaged in care (defined as having at least two other AOD visits within 30 days of the initial AOD visit) was similar or better than the state Medicaid average in four of the five states where comparisons were possible.
* The proportion of all CCBHC clients who received follow-up care within 30 days after an ED visit for a mental health condition or AOD use was similar or better than the state Medicaid average in five of the six states where comparisons were possible.
* The proportion of adult and child/adolescent CCBHC clients who received follow-up care within 30 days of discharge from a hospital for a mental health condition was also similar or better than the state Medicaid average in five of the six states where comparisons were possible.
* The proportion of adult CCBHC clients who were readmitted to a hospital within 30 days of discharge was lower than the state Medicaid average in four of the six states where comparisons were possible.
* The proportion of children/adolescents receiving care from CCBHCs prescribed medication for attention-deficit/hyperactivity disorder (ADHD) who had a visit with a provider with prescribing authority within 30 days after starting the ADHD medication was better than the state Medicaid average in all three states where comparisons were possible.
There was one measure for which the quality of care provided to CCBHC clients never exceeded the state Medicaid average: Adherence to antipsychotic medications (defined as receiving antipsychotic medications for at least 80 percent of the days enrolled in Medicaid during the year) among adults with schizophrenia who received care from CCBHCs was similar to the Medicaid state average in one state but worse in two states. However, this comparison was only possible in three states. As shown in Table ES.6, performance on some measures was worse among CCBHC clients relative to the state Medicaid average, indicating room for improvement. In addition, some states without benchmarks for a particular measure demonstrated high performance. For example, 93 percent of adults discharged from a hospital for a mental health condition in
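The antipsychotic adherence definition above is a proportion-of-days-covered (PDC) calculation. A minimal sketch of that arithmetic, with hypothetical fill dates and one simplification (overlapping fills are unioned rather than shifted forward, as some formal specifications require):

```python
from datetime import date, timedelta

# PDC for one hypothetical beneficiary: days in the measurement year
# with an antipsychotic supply on hand, divided by days enrolled,
# with adherence defined as a PDC of at least 80 percent.
enrollment_start, enrollment_end = date(2018, 1, 1), date(2018, 12, 31)
fills = [(date(2018, 1, 5), 30), (date(2018, 2, 10), 30),
         (date(2018, 4, 1), 90), (date(2018, 8, 1), 90)]  # (fill date, days supplied)

covered = set()
for fill_date, days_supplied in fills:
    for offset in range(days_supplied):
        day = fill_date + timedelta(days=offset)
        if enrollment_start <= day <= enrollment_end:
            covered.add(day)  # union of covered days within enrollment

days_enrolled = (enrollment_end - enrollment_start).days + 1
pdc = len(covered) / days_enrolled
print(f"PDC = {pdc:.2f}; adherent: {pdc >= 0.80}")
```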
Change in quality of care during the demonstration. Performance on several of the measures that assessed process of care within CCBHCs (such as those focused on timely access to care and screening and assessment for specific conditions) improved from the first to the second demonstration year (Table ES.7). For example, the proportion of adult CCBHC clients with a new episode of depression who received a suicide risk assessment (SRA) increased in all but one state; however, there was room for improvement in some states. Likewise, rates of screening and follow-up care for tobacco use, unhealthy alcohol use, and body mass index (BMI) also generally improved from the first to the second demonstration year. These improvements may reflect changes that CCBHCs made in response to first year performance on the measures, such as implementing new screening processes. Some CCBHCs also made changes to how the data for the quality measures were collected in the second demonstration year, including continuing enhancements to EHRs and other data systems, which could have influenced changes in performance rates.
* * *
Table ES.7: Change in Quality of Care for CCBHC Clients During Demonstration
* * *
There was less improvement on measures that assessed transitions between settings of care (for example, follow-up after discharge from a hospital) and medication management and adherence, which were reported using Medicaid claims data. This could reflect that improving performance on some of these measures requires CCBHCs to have partnerships with hospitals or other entities, which may take more time to put in place. There were some indications from the progress reports that the strength of such partnerships varied across CCBHCs. For example, in the second demonstration year, about one-quarter of CCBHCs reported that they did not receive notifications from EDs or hospitals when a client in their care was treated for behavioral health conditions in those settings. Improving performance on measures of antidepressant medication management or adherence to antipsychotics might also require more time to establish processes for monitoring and following up with clients.
Awarding of QBPs. CCBHCs in states with QBPs were required to achieve state-defined performance thresholds on six measures (Table ES.8). States could also require CCBHCs to meet performance thresholds on additional measures included in the PPS guidance or other measures with approval from CMS. States set the amount of the QBPs and had the option to modify the parameters of the QBPs from the first to the second demonstration year.
States varied in the performance thresholds used to award QBPs. For example, some states awarded QBPs if performance on the measures met or exceeded state or national averages. Other states specified targets for particular measures (for example, at least a 10 percent improvement toward a specified goal) or required CCBHCs to improve from year to year without a specified target. Some states used data from the first six months or year of the demonstration to establish performance thresholds.
* * *
Table ES.8: Quality Measures Used in QBP Systems
* * *
States also varied in how they tied measure performance to the amount of the QBPs. For example, some states created a sliding scale in which the lowest-scoring CCBHC received no payment and the highest-scoring CCBHC received the maximum payment for a particular measure. Some states also tied the amount of QBPs to the magnitude of improvement on a measure: for example, a 1 percent improvement above a specified performance threshold earned 10 percent of the QBP, whereas a 10 percent improvement above the threshold earned 100 percent. In sum, no two states had an identical QBP structure even though they used the same required measures.
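A worked example of the sliding-scale arithmetic described above; the dollar amount and full-credit threshold are hypothetical, and actual QBP parameters varied by state:

```python
def qbp_award(improvement_pct: float, max_payment: float,
              full_credit_pct: float = 10.0) -> float:
    """Scale a quality bonus payment linearly with improvement above a
    threshold: 1% improvement earns 10% of the payment, and 10% or more
    earns 100%, mirroring the example in the text."""
    share = min(max(improvement_pct, 0.0) / full_credit_pct, 1.0)
    return share * max_payment

print(qbp_award(1.0, 50_000))   # -> 5000.0  (10% of the maximum QBP)
print(qbp_award(10.0, 50_000))  # -> 50000.0 (full QBP)
```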
Across states, 54 of the 67 participating CCBHCs were eligible for QBPs during the first and second demonstration years: 33 received QBPs in the first demonstration year and 27 received QBPs in the second demonstration year.
There were also some indications that the functioning of the QBP systems varied from states' expectations. Some states substantially overestimated or underestimated the anticipated amount of the QBPs at the beginning of the demonstration relative to the amount awarded. For example,
In sum, for most measures where comparisons were possible, the quality of care provided to CCBHC clients was comparable to or better than state Medicaid benchmarks. Performance on several quality measures improved from the first to the second demonstration year, depending on the state. Quality of care worsened on only a few measures during the demonstration, though there was room for improvement in some states depending on the measure. More evidence is needed to compare the quality of care provided to CCBHC clients relative to beneficiaries served by other community behavioral health clinics and to understand whether the QBPs incentivized better care.
4. Payment rates and costs
Historically, Medicaid has reimbursed community behavioral health clinics through fee-for-service (FFS) or managed care rates, and some evidence suggests that these rates did not cover the full cost of clinic services (Scharf et al. 2015). The demonstration addresses this problem by allowing states to develop a PPS that reimburses CCBHCs based on the projected cost of providing comprehensive services to all individuals who seek care. PAMA does not require that the demonstration achieve cost neutrality. Rather, the demonstration was designed to provide CCBHCs with more financial resources. As described above, states chose between two PPS models developed by CMS (although states were allowed some flexibility in operationalizing the models):
* PPS-1 provides CCBHCs with a fixed payment for each day that a Medicaid beneficiary receives demonstration services from the clinic (known as a visit-day). This payment model resembles how FQHCs are paid. As described above, the PPS-1 model includes a state option to provide QBPs to CCBHCs that meet performance thresholds on the six measures required by CMS and any additional state-specified quality measure requirements.
* PPS-2 provides CCBHCs with a fixed payment for each month in which a Medicaid beneficiary receives demonstration services from the clinic (known as a visit-month). PPS-2 rates have multiple categories: a standard rate and separate rates for special populations that the state defines. As described above, the PPS-2 model requires states to award QBPs based on meeting performance thresholds on the six measures required by CMS and to make outlier payments for costs above a specified threshold.
These payment models enable CCBHCs to exercise considerable flexibility in tailoring services to the needs of their clients without being concerned about the financial impact of each service decision or procedure. Ideally, in contrast with FFS systems, where each additional service brings an additional payment, the PPS should not incentivize providing high volumes of care. Rather, the amount that clinics are paid is determined by the average cost of care, regardless of the quantity of services provided on a given day or month. While there is an incentive for clinics to have more frequent visits with clients, particularly under PPS-1, this incentive only operates over the short term because states have the option to adjust the payment rates based on the cost data from the previous year (a process known as re-basing). If a clinic has many visit-days or visit-months in a year, it will collect more reimbursement during that year, but the state can adjust the rates for the next year to bring them in alignment with actual costs. In this context, cost-reporting provided critical information for states to set and adjust payment rates over time.
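The re-basing logic described above reduces to dividing allowable costs by visit-days (or visit-months). A minimal sketch for a PPS-1 clinic, with hypothetical cost-report values; actual re-basing rules varied by state:

```python
# PPS-1 re-basing: next year's daily rate is derived from this year's
# cost report (allowable costs and visit-days below are hypothetical).
allowable_costs = 3_200_000.0  # clinic operating costs from the cost report
visit_days = 14_000            # distinct beneficiary visit-days in the year

current_rate = 250.0
rebased_rate = allowable_costs / visit_days  # average cost per visit-day

print(f"Current daily rate:  ${current_rate:.2f}")
print(f"Re-based daily rate: ${rebased_rate:.2f} "
      f"({rebased_rate / current_rate - 1:+.1%} vs. current)")
```

In this example the clinic collected more than its average cost per visit-day during the year, so the state would adjust the rate downward (about 9 percent) to bring payments back in line with actual costs.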
The evaluation used all available cost data from the demonstration to assess the extent to which payment rates covered the costs of CCBHC services in each demonstration year and describe variation in the average costs of CCBHC services per client and per visit-day (for PPS-1 states) or visit-month (for PPS-2 states). The evaluation also examined how the introduction of the CCBHC model in
Payment rates. States initially struggled to set rates that reflected CCBHC costs, in part because they did not always have good data to inform cost projections. The rate-setting process required accurate data for calculating the allowable costs and number of visit-days or visit-months. It also required clinics to forecast anticipated changes in costs as a result of implementing the CCBHC certification criteria. Since the clinics would be broadening their scope of services to meet the criteria, they would generally be increasing their total operating costs. However, because there was a lack of historical data on the actual costs of providing the enhanced scope of services, clinics had to estimate these future costs, which included staffing, spending on training or infrastructure, and other anticipated costs approved by the state.
CCBHC payment rates varied within and across states. The average daily rate across the 56 CCBHCs in PPS-1 states was
Average costs for CCBHC services. States also varied in the average daily or monthly costs of CCBHC services and in the average cost per client over the full demonstration year.
* Among PPS-1 states, the average cost per visit-day in DY1 ranged from
* In Oklahoma (the only PPS-2 state for which we could analyze the cost reports), the average monthly cost was
Sufficiency of rates to cover CCBHC costs. During the first demonstration year, average CCBHC payment rates were higher than CCBHC costs in five states and lower than costs in
Impacts on Medicaid costs. As described above, the Medicaid data available for the evaluation did not include the costs of services delivered through managed care arrangements in
* * *
Figure ES.7: Average Payment Rates as Percentage Above or Below Average Costs Per Visit-Day or Visit-Month, by State and Demonstration Year
* * *
The limitations of the Medicaid claims analysis described above also apply to the cost impact analyses for
In sum, on average, payment rates covered the costs of CCBHC services in all but one state (
* * *
The report is posted at: https://aspe.hhs.gov/sites/default/files/documents/a3f46569fb39cb41c5f2610ca17f49cf/ecs-catalog-and-crosswalk-methods.pdf