The clinic reports encounter data, clinical outcomes data, quality data, and such other data as the Secretary requires.
The criteria through which CCBHCs become certified include 21 quality measures. CCBHCs are required to report on nine of these measures; the respective state entity reports on the other 12. (See Figure H.) The measures reported by CCBHCs rely on data typically derived from electronic health records (EHRs) or other electronic administrative sources. The measures reported by the states use data from Medicaid claims and encounter data, among other sources. Appendix B presents, for each demonstration state, the state entity responsible for monitoring compliance with the criteria; the monitoring instrument, method, and frequency; and the additional safeguards put in place to promote consistent, accessible, high-quality treatment and support services, in fidelity to the CCBHC model as intended by Congress.
FIGURE H. Required Quality Measures (9 clinic-reported and 12 state-reported CCBHC quality measures)
Changes Made to Data Collection Systems
States and CCBHCs invested considerable time ensuring that participating CCBHCs had data systems in place to meet the reporting needs of the demonstration. Extensive training and technical assistance were provided on preparing EHR/health information technology (HIT) systems, facilitating information exchange between disparate data systems, and streamlining the reporting process.
As part of the CCBHC certification process, 97 percent of CCBHCs changed their EHR/HIT systems, and 33 percent adopted a new EHR/HIT system. Among CCBHCs that reported implementing new EHR features, the most commonly added features were the ability to exchange clinical information electronically with DCOs and other external providers, as well as quality measure reporting capabilities, reflecting the importance of these features to the CCBHC care model, as shown in Table 9. All CCBHCs reported that their EHRs include mental health, SUD, and case management or care coordination records. (For most CCBHCs, these features were in place before CCBHC certification.) Quality measure reporting capability, generation of electronic care plans, and electronic prescribing were also available in more than 90 percent of CCBHCs.
TABLE 9. Number and Proportion of CCBHCs that Reported Functions of CCBHC EHR and HIT Systems

| EHR/HIT Function | CCBHCs Reporting Function (n) | CCBHCs Reporting Function (%) | New Due to CCBHC Certification (n) | New Due to CCBHC Certification (%) |
|---|---|---|---|---|
| EHR contains mental health records | 67 | 100 | 6 | 9 |
| EHR contains SUD records | 67 | 100 | 8 | 12 |
| EHR contains case management or care coordination records | 67 | 100 | 16 | 24 |
| EHR has quality measure reporting capabilities | 63 | 94 | 34 | 54 |
| EHR generates electronic care plan | 62 | 93 | 8 | 13 |
| Use any form of electronic prescribing | 61 | 91 | NA | NA |
| EHR incorporates laboratory results into health record | 55 | 82 | 12 | 22 |
| EHR provides clinical decision support | 52 | 79 | 15 | 29 |
| EHR contains primary care records | 41 | 61 | 11 | 27 |
| EHR communicates with laboratory to request tests or receive results | 38 | 57 | 8 | 21 |
| EHR allows electronic exchange of clinical information with other external providers | 31 | 46 | 16 | 52 |
| EHR allows electronic exchange of clinical information with DCOs | 26 | 39 | 16 | 62 |

NOTE: Columns are not mutually exclusive. Percentages in the last column are calculated among the CCBHCs reporting the function.
Challenges associated with CCBHCs' lack of familiarity with the required measures and difficulty in obtaining certain variables (such as new service codes or new population subgroups) from clinic EHRs were typically resolved through training webinars and direct technical assistance, provided by phone, online, or in person, to:

- Explain the measures and the information the CCBHCs needed to produce each metric.
- Provide examples of how to extract information and create measures from the EHR data (e.g., what queries to run; the numerators and denominators to use).
- Explain how to complete the reporting template.
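The numerator/denominator logic described above can be sketched in a few lines. The following is a hypothetical illustration only: the measure, field names, and follow-up window are assumptions for the sketch, not the demonstration's actual measure specifications.

```python
# Hypothetical sketch of clinic-reported quality measure logic:
# denominator = clients screened; numerator = screened clients with
# timely follow-up. Field names and data are illustrative.
from datetime import date

# Illustrative EHR extract: one record per client.
encounters = [
    {"client_id": "A", "date": date(2018, 1, 5), "screened": True,  "followup_within_30d": True},
    {"client_id": "B", "date": date(2018, 1, 9), "screened": True,  "followup_within_30d": False},
    {"client_id": "C", "date": date(2018, 2, 2), "screened": False, "followup_within_30d": False},
]

def measure_rate(records):
    """Return (numerator, denominator, rate) for the illustrative measure."""
    denominator = [r for r in records if r["screened"]]
    numerator = [r for r in denominator if r["followup_within_30d"]]
    rate = len(numerator) / len(denominator) if denominator else 0.0
    return len(numerator), len(denominator), rate

num, den, rate = measure_rate(encounters)
print(f"{num}/{den} = {rate:.0%}")  # prints: 1/2 = 50%
```

In practice the same logic would run as a query against the EHR database rather than over in-memory records; the sketch only shows how a numerator and denominator combine into a reportable rate.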
States continued to provide support as the year progressed and the process unfolded. For example, they worked with state Medicaid agencies to conduct "test" data collections with the CCBHCs to identify missing or inaccurate data for CCBHC-reported measures. As of March 2018, several states were reviewing preliminary submissions of test data from CCBHCs to conduct validation and quality assurance prior to the formal reporting deadline.
CCBHCs Pursued Quality Improvement Initiatives
CCBHCs are measuring the degree to which clients are receiving required services; tracking clients' appointment wait time and symptoms over time; developing alerts for screening or follow-up service; and standardizing treatment protocols. Nearly 80 percent of the CCBHCs reported using the quality measures as an opportunity to change and improve the services they offer.
All but one of the CCBHCs described the continuous quality improvement (CQI) projects they are conducting. The focus of these projects frequently aligns with the quality measures the CCBHCs are required to report for the demonstration. Specific examples include:
- Twenty-six percent of these CCBHCs used suicide risk assessment and prevention measures, such as the Columbia-Suicide Severity Rating Scale (C-SSRS), in routine practice and provided staff training on the measures.
- Nineteen percent more consistently used measures to screen for depression (especially the Patient Health Questionnaire-9) and more regularly conducted follow-up with those served.
- Seventeen percent used measures related to reducing the time between intake and assessment to ensure timely care.
Other CQI initiatives focus on safety planning and risk assessment, increasing use of MAT, and preventing unnecessary emergency department visits and hospitalizations. CCBHCs also implemented new internal performance improvement processes to ensure better CQI, such as conducting more frequent measurement and reporting of data, holding regular review meetings, hiring dedicated quality assurance staff, providing staff trainings, and aligning demonstration-focused CQI initiatives with state and other reporting requirements and programs.
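A timeliness metric like the intake-to-assessment interval mentioned above can be computed directly from two EHR date fields. The records, field names, and 10-day flag threshold below are illustrative assumptions, not the demonstration's actual standard.

```python
# Illustrative sketch: average intake-to-assessment wait and flagged
# clients exceeding a (hypothetical) 10-day threshold.
from datetime import date

clients = [
    {"id": "A", "intake": date(2018, 3, 1), "assessment": date(2018, 3, 8)},
    {"id": "B", "intake": date(2018, 3, 5), "assessment": date(2018, 3, 25)},
]

def days_to_assessment(client):
    """Elapsed days between intake and first assessment."""
    return (client["assessment"] - client["intake"]).days

waits = [days_to_assessment(c) for c in clients]
avg_wait = sum(waits) / len(waits)
flagged = [c["id"] for c in clients if days_to_assessment(c) > 10]  # threshold is illustrative
print(avg_wait, flagged)  # prints: 13.5 ['B']
```

Tracking such a metric over repeated measurement cycles is what lets a CQI initiative verify that process changes actually shorten waits.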
CCBHCs used diverse methods to act on the measures, including hiring more providers, improving intake and assessment processes, and engaging external consultants to help implement changes.
CCBHCs have informally shared information regarding quality measures during collaborative meetings and have expressed interest in reviewing aggregate quality measure data across CCBHCs within a state to better understand their performance relative to the larger group. Several demonstration states are planning to implement systems for sharing aggregate quality measure data with the CCBHCs, which will provide benchmarks for different measures and help CCBHCs identify specific quality improvement and technical assistance needs.
State CCBHC leaders collaborated closely with the CCBHCs to develop and support CQI initiatives by providing clinics with tools to facilitate internal performance monitoring and staff development. They also engaged with the CCBHCs in regularly scheduled webinars, conference calls, site visits, and in-person work groups or training workshops on targeted subjects, such as data system buildout, metric development, and reporting processes. At the time of the interviews, Minnesota officials were planning a CCBHC learning collaborative to assist CCBHCs in sharing "lessons learned" and "best practices" in direct response to requests from the CCBHCs for this type of resource.