Key Implementation Considerations for Executing Evidence-Based Programs: Project Overview. Highlights from the Forum

02/01/2013

Each Forum session focused on one of four aspects related to evidence-based programs (EBPs): selecting EBPs for replication; replicating EBPs; scaling up EBPs; and implementing evidence-informed and innovative strategies.

During the first session, panelists and participants considered the selection process and criteria for choosing EBPs for replication. Lessons learned from the Maternal, Infant, and Early Childhood Home Visiting Program (managed by the Health Resources and Services Administration and the Administration for Children and Families), as well as from state intermediaries working to encourage the implementation of EBPs, were shared. The panelists discussed several principal considerations in selecting an EBP. For example, needs assessments were noted as an important tool when selecting evidence-based programs: they can help staff determine which problems (such as teen pregnancy or high rates of repeat pregnancy) are most pressing and ensure that the program selected aligns with the problems the community seeks to solve. Participants also noted the critically important role that leadership (such as that provided by program directors or advocates) plays in guiding the EBP selection process and building political support for the program selected. The conversation then turned to assessing readiness, that is, whether a program can be implemented in the current environment with adequate staffing, training and materials, community support, and funding. If a program is not ready for replication, or conditions are not suitable for it, implementation may not succeed.

The next session focused on the need to maintain the fidelity of EBPs during replication while adapting programs to fit the situation at hand. Panelists and participants discussed the factors that should be considered when identifying appropriate adaptations for diverse populations, with examples ranging from Federal efforts to replicate evidence-based teen pregnancy prevention programs to the work of the Boys & Girls Clubs. Participants discussed the relationship between fidelity and adaptation in program implementation, noting that the two need not be in tension and that each is critically important to successful replication.

The first day concluded with a discussion of how EBPs can be “scaled up” and the specific steps for doing so. Panelists presented Federal and program perspectives on taking EBPs to scale and stressed the importance of ensuring quality and accountability as programs expand. The panelists discussed the range of effect sizes found for EBPs, and the discussion turned to program quality and the implementation strategy needed to expand an EBP. To expand a program effectively, an initiative needs critical infrastructure: leadership within and across agencies to implement the program, a target population of potential consumers, adequate resources, and revised policies and practices that support the implementation strategy. Participants also identified numerous challenges organizations face in replicating programs or taking them to scale. They cited staff turnover, which not only requires training new staff but may also require persuading new leadership to support the program. Resource constraints may also limit the fidelity with which a program can be replicated; resources can affect decisions ranging from how many staff can be supported to whether some components must be dropped, and such decisions can greatly undermine successful replication.

The second day of the Forum focused on how to use research and evidence to develop innovative interventions for children and youth in the absence of rigorous, gold-standard evaluations. Several presentations from the child welfare community set the stage for the discussion and were followed by a technical presentation on innovation and creating solutions for new settings, and a talk on developing evidence-informed strategies. Evidence-based programs can substantially improve the potential effectiveness of social programs, but the number of EBPs available for replication is small relative to the need for social service programs. In some areas, such as the child welfare system, there are very few EBPs to replicate; in other cases, existing programs may not be suitable for the population that needs services. Participants at the Forum suggested several strategies for incorporating evidence into program development when evidence-based programs are not available. For example, meta-analysis can help identify the individual-level factors, such as past behaviors, psychological traits, and family characteristics, most strongly associated with negative or positive outcomes; this knowledge can inform which aspects of previous interventions to incorporate into a new initiative. Participants also noted that researcher-practitioner partnerships can be very beneficial: researchers can synthesize information from the literature about what types of behaviors should be targeted and how, while practitioners can share their experience and insights from working with individuals and families in the community, which researchers can then investigate and communicate more broadly.

View full report: "rb_keyimplement.pdf" (PDF, 130.64 KB)