DISCLAIMER: The opinions and views expressed in this report are those of the authors. They do not necessarily reflect the views of the Department of Health and Human Services, the contractor, or any other funding organization.
In the fall of 2010, the Substance Abuse and Mental Health Services Administration (SAMHSA) launched the Community Resilience and Recovery Initiative (CRRI). CRRI was a multi-level, place-based demonstration project aimed at helping grantee communities cope with the ongoing behavioral health effects of the Great Recession. SAMHSA funded three grants based on applications submitted in response to its Request for Applications (RFA): Union City, New Jersey; Fall River, Massachusetts; and Lorain, Ohio. Each grantee was awarded $1.4 million a year for 2 years to improve the coordination and availability of behavioral health services in its community. The RFA anticipated that funding would be available for up to 4 years, but ultimately only 2 years of funds were available. Grantees then operated for up to 1 additional year on carryover monies. Required activities included social marketing efforts, community-wide screenings, provision of brief interventions (such as motivational interviewing), and referrals to more intensive services, as needed. The initiative also required grantees to work in collaboration with various social service agencies in their communities, including employment and job training agencies, mental health service providers, and agencies and organizations that provide services to combat substance use disorders.
To assess the implementation and potential success of these grants, the Office of the Assistant Secretary for Planning and Evaluation within the U.S. Department of Health and Human Services awarded Westat a contract to evaluate the initiative throughout the program's duration. The objectives of the evaluation were threefold. First, to describe the characteristics of grantee implementation processes. To achieve this objective, Westat conducted two-person site visits to each grantee community in the fall of 2011, 2012, and 2013. At each site the evaluation team conducted in-depth interviews with key project staff, staff from partner agencies, and service recipients. An important finding from the site visits was that each grantee made significant adjustments to the original program design in order to meet its community's unique needs. In Union City, for example, the program emphasized providing in-school substance use services to ensure that young people caught using drugs or alcohol would receive appropriate treatment and be able to complete their high school education on time. In Lorain, the director of the employment program paid particular attention to the city's African American community, which had been hit hard by the Great Recession and had also been disproportionately affected by previous economic downturns. Finally, Fall River used a case management approach to meet its clients' economic and behavioral health needs. This service delivery model allowed clients to establish 6-month relationships with their case managers, which resulted in excellent recordkeeping and strong outcomes. However, the model was much more intensive than that envisioned in the original RFA.
The second objective of the Westat evaluation was to report on the client outcomes achieved by each of the grantees. Client information was recorded by grantees in the Services Accountability Improvement System, the data system developed by SAMHSA to meet the Government Performance and Results Act requirements. Each year of the evaluation, SAMHSA sent Westat a set of de-identified, client-level data for all three grantees. Westat staff then analyzed the dataset for descriptive information about enrolled clients, overall client outcomes, and the effectiveness of several grantee programs. Overall, employment and behavioral health data indicated that clients were doing much better 6 months after enrolling in the program than they were at the point of intake. In each community, more clients were employed at follow-up than at intake; substance use and abuse had decreased 6 months after program enrollment; and clients reported fewer symptoms of depression and anxiety at follow-up than when they enrolled in the programs. Although the study design does not allow us to claim that the programs were responsible for these improvements,1 the data are encouraging.
The third and final objective of the evaluation was to assess the extent to which this place-based initiative was able to improve community-level resilience in the face of adverse economic circumstances. Grantees were to conduct surveys in their communities each year to measure residents' sense of well-being, as well as collect key community indicators (e.g., number of domestic violence incidents, number of alcohol-related or drug-related hospitalizations) that would allow Westat to assess community-level change over time. With the exception of Union City, grantees struggled to implement community surveys and obtain consistent and reliable community-level indicators. As a consequence, we were unable to meet the third objective of the evaluation.
This project resulted in several important lessons learned. First, all three grantees noted the value of linking behavioral health and employment services in their communities. Interviewees reported that for many of their clients, seeking employment assistance or job training support is less stigmatizing than asking for help for depression, anxiety, or a substance use disorder. Employment services thus functioned as a safe gateway for clients in need of additional assistance. In addition, interviewees said the CRRI initiative alerted service providers to the potential behavioral health sequelae of losing a job. Providers in all three sites reportedly had not given this connection much thought, and the projects opened their eyes to the emotional distress that can result from economic difficulties. The work-behavioral health connection thus appears to have significant potential to make a difference both for help-seekers and for those providing the assistance, and perhaps merits additional exploration by SAMHSA in other grants.
Second, these projects pointed out the value of building sufficient flexibility into the grants so that programs can be adapted to a community's specific needs. Each of the three CRRI project directors understood the intentions of the grants and incorporated the fundamentals of the RFA (e.g., screening, brief interventions, referrals) into their programs. But each also understood the idiosyncrasies of their communities, which required that they take a "theme-and-variations" approach to the projects rather than a "cookie cutter" one. Local knowledge allows these and similar initiatives to make a difference in the community and reinforces SAMHSA's philosophy that local communities, rather than outside entities, are best suited to develop solutions to local challenges. Unfortunately, this local variation made it difficult to evaluate the program as an intervention across sites.
Finally, the CRRI initiative was fast-moving and required the grantees to undertake several new activities (e.g., developing a media campaign, creating community partnerships for screening) and begin enrolling clients in their programs within 4 months of the contracts being awarded. Despite the steep learning curve, each of the three grantees did a remarkable job bringing their programs online within or near the required timeframe. There were some small missteps during that run-up (e.g., enrolling clients at the point of referral, rather than when the client arrived for services), but project directors provided excellent leadership and problem-solving to move the programs past these hurdles. One lasting challenge, however, was having the grantees conduct the data collection for the community evaluation. Even though each hired an outside evaluator, grantees were oriented towards service delivery, not the evaluation of those services. Having the grantees start from ground zero to develop a community survey instrument was perhaps one requirement too many in an already ambitious initiative. It may be worth considering both the cost and data quality associated with having site-based evaluators and assessing whether hiring an outside evaluation firm would be less costly -- or at least cost neutral -- and would result in a higher-quality assessment of the programs.
1 In order to draw this inference, there would have had to be a randomized control group in each community, or a stronger quasi-experimental design (i.e., a group of individuals whose demographics and baseline characteristics paralleled those of CRRI participants, but who did not participate in any of the CRRI-funded programs). Only a comparison of the outcomes of CRRI clients with non-CRRI clients would have allowed us to make more definitive claims that the programs themselves were responsible for individuals' improvements.