Assessing the Field of Post-Adoption Services: Family Needs, Program Models, and Evaluation Issues

4.2 Strategies To Facilitate Evaluation

Promote evaluation as a tool for program improvement. Program staff are more likely to support and use evaluation if they believe it will inform their practice (Patton, 1997; Gibbs, Napp, Jolly, Westover, and Uhl, 2002). Patton's utilization-focused evaluation approach stresses the importance of engaging the primary users of evaluation in every step of the process. These stakeholders include not only representatives of funding agencies and program coordinators, but also the frontline staff who implement the program. Focusing the evaluation on the questions they consider critical will improve both its relevance and its implementation.

This approach may suggest strategies that elaborate on the more standardized evaluation approaches recommended by the expert panel above. For example, program staff may identify questions about how to tailor interventions to specific types of clients that can be addressed through qualitative studies. Such studies can complement, rather than conflict with, the use of standardized measures and clinical instruments.

Structure evaluation processes so that they are useful to programs. A related recommendation is to ensure that evaluation processes provide useful feedback to participants. A major barrier to evaluation among PAS program staff was the belief that families were being asked to spend time completing instruments without receiving any direct benefit in return. The choice of instruments should favor those that can provide useful feedback to program staff and families. This will also help mitigate the sense among program staff that evaluations compete with program activities for scarce resources.

The Case Study Report from this project (Gibbs, Siebenaler, Harris, and Barth, 2002) found that families were frustrated by the difficulty of obtaining assessments for their children and wanted detailed discussion of their children's needs and strengths with someone who could interpret clinical data for them. Although evaluation instruments would not substitute for a comprehensive assessment, feedback on the information collected is likely to be perceived as valuable by many families. Susan Smith of Illinois State University's Center for Adoption Studies reports that this aspect of the Adoption/Guardianship Preservation Program evaluation has received a favorable response from participating families.

Earmark funds for evaluation. PAS programs need funding that is specifically designated for evaluation and related activities. Without separate evaluation funds, many program leaders will choose to use all, or nearly all, of their resources for services to families and children. Earmarking funds for evaluation will convey that funding agencies (at both the federal and state levels) view evaluation as essential. Designating funds will also help mitigate concerns among program coordinators that evaluation takes resources away from needed services. Program leaders would then be held accountable for allocating those resources to evaluation.

Fund programs for multiple years. Short funding cycles make it difficult to plan, implement, and evaluate programs in the time allotted, and managers are therefore unlikely to invest in evaluation staff and activities. Funding programs for four years or longer gives them sufficient time to develop and implement their evaluations, learn from them, and incorporate those lessons into ongoing practice. Extended funding also provides opportunities for PAS programs to conduct follow-up activities, producing more substantive evaluations and facilitating the assessment of outcomes.

Provide evaluation technical assistance. Accessible, culturally appropriate technical assistance can be used to supplement PAS programs' evaluation skills, or to build long-term evaluation capacity within the organization. Depending on the program's needs, technical assistance may emphasize support (where the provider conducts some of the evaluation activities with input from the program) or capacity building (where the provider trains and coaches program staff who carry out the evaluation). Technical assistance should be tailored to the particular needs and interests of the program, and it may include evaluation design, development or selection of data collection tools, data management and analysis, and application of findings to program development.
