The evaluation of the WtW grants program has four core components:
- A descriptive assessment of all WtW grantees, based on two surveys of all WtW grantees nationwide to document program planning and early operations (Perez-Johnson and Hershey 1999; Perez-Johnson et al. 2000)
- A process and implementation analysis, based on exploratory visits to 22 local WtW grant-funded programs (Nightingale et al. 2000), and more detailed analysis of programs in a subset of those sites, referred to as the "in-depth" study sites (Nightingale et al. 2002)
- A program cost analysis in the in-depth study sites, documenting the total program costs and participant costs by service category and grantee site (Perez-Johnson et al. 2002)
- A participant outcomes analysis in the in-depth study sites, based on analysis of longitudinal data on individual participants, integrating information from two follow-up surveys with administrative data on welfare receipt, employment, and earnings; this report is the first of three on the outcomes analysis
In addition to the four-part core evaluation, a special process and implementation study focuses on tribal programs. It documents welfare and employment systems operated by American Indian and Alaska Native WtW grantees, the supportive services they provide, and how these tribal grantees integrate funds from various sources to move members from welfare to work (Hillabrandt and Rhoades 2000; Hillabrandt et al. 2001).
Originally, this evaluation was to estimate the net impacts of the WtW grants program on participants, based on an experimental design, and then use those estimates to analyze the program's costs and benefits. However, enrollment in the local programs funded by the WtW grants proceeded much more slowly than expected (Nightingale et al. 2002). Because service providers were struggling to meet their enrollment goals, they were uniformly unwilling to allow enrollees to be randomly assigned to treatment and control groups, as an experimental evaluation design would require.(8)
Because a rigorous experimental approach to estimating program impacts was not possible, DHHS consulted with its partners in the evaluation's interagency work group (DOL, the Department of Housing and Urban Development, and the Office of Management and Budget [OMB]) and with Mathematica Policy Research to develop an alternate evaluation design. The resulting design replaced the infeasible random-assignment impact analysis with an outcomes analysis. The alternate design and data collection instruments for all components of the evaluation were submitted to OMB and received formal clearance. A critical implication of this change in the evaluation design is that none of the findings presented in this report on the outcomes analysis should be interpreted as estimates of the net impacts of the local WtW grant-funded programs that participated in the evaluation.