Implementation Findings from the National Evaluation of the Certified Community Behavioral Health Clinic Demonstration. II. METHODS


The findings in this report are based on: (1) responses to progress reports each clinic completed in spring 2018 and 2019; (2) three rounds of interviews with state Medicaid and behavioral health officials; and (3) site visits to clinics in four demonstration states.

CCBHC progress reports. In spring 2018 (Demonstration Year 1 [DY1]), clinics submitted an online progress report that included information about their staffing, training, accessibility of services, scope of services, electronic health record (EHR)/health information technology (HIT) capabilities, care coordination activities, and relationships with other providers. Clinics submitted a second progress report in spring 2019 to report on Demonstration Year 2 (DY2) activities (the 2018 and 2019 progress report templates appear in Appendix B). Questions in the DY2 progress report were almost identical to those in the DY1 progress report, with a few minor changes to streamline data collection for clinics and to update the timeframes referenced by the questions. In collaboration with the CCBHC demonstration program leadership in each state, we conducted extensive outreach to clinic leaders by telephone and email before and during collection of the progress reports to encourage clinics' participation and answer any questions. All participating CCBHCs completed the progress reports at both time points--all 67 clinics in 2018 and the 66 clinics remaining in the demonstration in 2019--for a 100 percent response rate each year.[8] Unless otherwise noted, the 2018 and 2019 findings in this report are based on the number of clinics participating in the demonstration at the time of data collection each year (67 CCBHCs in 2018 and 66 CCBHCs in 2019).

We computed descriptive statistics (for example, means and percentages) from the clinic progress report data using Excel and SAS. We summarize findings across all clinics and within each state. However, readers should interpret state-level variation in the findings cautiously: some states, such as Nevada and Oklahoma, account for a small number of clinics participating in the demonstration (n = 3 each), whereas others, such as New York and Missouri, have over a dozen clinics. In addition, the service systems and policy context in which clinics operate vary considerably across states, posing a challenge to direct cross-state comparisons. Finally, although we compare similar items across the first and second demonstration years, we focus in this report on the status of implementation as of March 2019 (three months before the end of DY2), when the clinics submitted their second progress reports. CCBHCs have continued to make changes and implement new programs and procedures since completing the progress reports as they approach the end of the demonstration period; thus, the progress report findings reported here do not capture the most recent developments.

Telephone interviews. We conducted three rounds of telephone interviews with state behavioral health and Medicaid officials involved in leading implementation of the CCBHC demonstration in each state. We conducted the first round of interviews early in DY1--September and October 2017. We conducted the second round from February to March 2018 and the third round from February to April 2019. The first round of interview questions gathered information about early implementation, decisions made during the demonstration planning phase, early successes and challenges in fulfilling the certification requirements and following the data collection and monitoring procedures, and projected challenges or barriers to successful implementation. The second round of interviews gathered information on interim successes and challenges since the initial interview; successes in implementing demonstration cost-reporting procedures and quality measures; and early experiences with the prospective payment system (PPS). The third round of interviews collected information on implementation successes and challenges in the second demonstration year. The interview guides for each round appear in Appendix C.

We conducted 29 interviews with state officials (ten in each of the first two rounds and nine in the third). In seven states, the behavioral health and Medicaid officials asked to participate in the interviews together to reduce scheduling burden and to provide comprehensive answers.[9] Each state interview lasted approximately 90 minutes. In the third round, we also interviewed consumer and family representative organizations in four states to gather the perspectives of consumers and families on the demonstration.

Two researchers conducted each interview, with one leading the discussion and the other taking notes. We asked interviewees' permission to audio record the discussions to confirm the accuracy and completeness of the interview notes. Following the interviews, to expedite analysis, we organized the interview information into categories defined by the CCBHC certification criteria. We summarized interviewees' responses about implementation experiences within each domain of the certification criteria covered by this report (that is, staffing; access to care; scope of services; and care coordination) separately for each state and then identified cross-state themes in the findings.

Site visits. We conducted site visits to two clinics in each of four demonstration states in February and March 2018. In collaboration with ASPE, we selected the four states to visit: Missouri, Oklahoma, Oregon, and Pennsylvania.[10] Using information from clinic responses to the progress report and interview transcripts, we selected two clinics in each state that varied in the following characteristics: urban-rural designation, location and proximity to other CCBHCs, size and number of CCBHC service locations, implementation of intensive team-based supports, Assertive Community Treatment (ACT), medication-assisted treatment (MAT), and any innovative engagement strategies or mobile/community-based supports that clinics reported in their progress reports or that we learned about during interviews with state officials. During the site visits, we conducted in-depth discussions with clinic administrators and frontline clinical staff about how care had changed following implementation of the demonstration. Interview topics included successes and barriers related to CCBHC staffing, steps clinics had taken to improve access to care and expand their scope of services, CCBHCs' experience with payments and the PPS, and quality reporting practices. The interview guides for each staff type appear in Appendix D. We asked interviewees' permission to audio record the discussions to facilitate our analysis. Following the interviews, we organized the interview information into categories defined by the CCBHC certification criteria to facilitate analysis and to develop the clinic profiles in Chapter III.