How Effective Are Different Welfare-to-Work Approaches? Five-Year Adult and Child Impacts for Eleven Programs. Data Sources

12/01/2001

The outcomes and impacts presented in this report are drawn from four primary data sources: unemployment insurance, welfare, and Food Stamp administrative records; surveys of sample members conducted at the two-year and five-year follow-up points (the Two-Year Client Survey and the Five-Year Client Survey); a survey of sample members focused on outcomes for children (the Child Outcomes Study, or COS, survey); and a teacher survey.

Client characteristic data. Standard personal data, such as educational background and welfare history, were collected by welfare staff during routine interviews at the time of random assignment and are available for all 41,715 heads of the single-parent families in the full impact sample.

Private Opinion Survey. Data on attitudes and opinions about welfare-to-work programs and employment prospects were collected through the Private Opinion Survey (POS), a brief, self-administered survey that was completed at program orientation in four of the sites (Atlanta, Grand Rapids, Riverside, and Portland), and are available for 18,461 respondents in these sites. These sample members represent 93 percent of those randomly assigned in the four sites during the periods when the POS was being administered.

Reading and math tests. Reading and math achievement tests were administered in four sites (Atlanta, Grand Rapids, Riverside, and Portland) at random assignment. Test scores are available for 20,577 sample members. These sample members represent about 93 percent of those randomly assigned in the four sites during the period when the tests were administered.(23)

Field research. MDRC staff observed all 11 programs in operation and interviewed enrollees, case managers, service providers, and program administrators in each site. Information was collected about a range of issues, such as management philosophy and structure, the degree to which the participation mandate was enforced, the nature of interactions between caseworkers and program participants, the extent to which the program was able to work with all those mandated to participate in it, the availability of services, and the relationships that program staff had established with outside service providers and income maintenance staff in the sites.

Unemployment insurance, welfare, and Food Stamp administrative records data. Most employment, earnings, and public assistance impacts were computed using automated county and state unemployment insurance (UI), welfare, and Food Stamp administrative records data. Five years of follow-up data from the UI system are available for all members of the full impact sample; five years of follow-up data from welfare and Food Stamp administrative records are available for all sample members in all sites except Oklahoma.

UI earnings, which are recorded statewide, provide unbiased measures of program impacts on employment and earnings. These data, however, do not include earnings from out of state; from jobs not usually covered by the UI system, such as self-employment, federal employment, or informal child care (all types of work that may have been "off the books"); or from employers who do not report earnings. Some of the earnings missed by the UI system may be captured by earnings and employment data collected through the two-year and five-year surveys.

In all sites except Riverside, welfare and Food Stamp payments were also recorded statewide, and payments are captured for all sample members except those who moved out of state. In Riverside (as everywhere in California), welfare and Food Stamp payments were recorded only within each county, which means that payments received by sample members who moved outside the county were not included in the analysis. Although this could lead to an underestimate of the payments received by members of the Riverside sample, it should not bias the impact estimates because there is no reason to expect the program and control groups to show different patterns of moving between counties.
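The logic can be seen in the deliberately simplified arithmetic sketch below (written in Python, with made-up dollar amounts rather than evaluation data): when county-only records miss roughly the same amount of payments per person in the program and control groups, both group averages are understated by a similar amount, and the difference between them, which is the impact estimate, is essentially unchanged.

```python
# Simplified illustration with hypothetical numbers (not evaluation data).
true_control_mean = 1200.0   # average quarterly welfare payment, all payments counted
true_program_mean = 1050.0   # program group receives somewhat less welfare
true_impact = true_program_mean - true_control_mean                      # -150.0

# Assume county-only records miss a similar average amount per group,
# because both groups move out of the county at similar rates.
missed_per_control_member = 60.0
missed_per_program_member = 60.0

observed_control_mean = true_control_mean - missed_per_control_member   # 1140.0
observed_program_mean = true_program_mean - missed_per_program_member   #  990.0
observed_impact = observed_program_mean - observed_control_mean         # -150.0

print(true_impact, observed_impact)   # levels are understated; the difference is not
```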

UI earnings data are collected by calendar quarter: January through March, April through June, and so forth. For purposes of the evaluation, these data were reorganized so that the quarter during which a sample member was randomly assigned is always designated quarter 1, followed by quarter 2, and so forth. These quarters are then grouped into "years." Quarter 1 is not included in year 1 because it includes some income earned before random assignment, especially for sample members randomly assigned near the end of a calendar quarter. Thus, year 1 covers quarters 2 through 5, year 2 covers quarters 6 through 9, and so forth. Welfare and Food Stamp payments were recorded monthly but were grouped into quarters and years to align with the earnings data.
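As a rough illustration of this realignment, the sketch below (Python, using a hypothetical random assignment date and simplified helper functions that are not the evaluation's actual processing code) numbers calendar quarters relative to the quarter of random assignment and groups them into follow-up years.

```python
from datetime import date

def relative_quarter(ra_date: date, cal_year: int, cal_quarter: int) -> int:
    """Number calendar quarters relative to random assignment:
    the calendar quarter containing random assignment is quarter 1."""
    ra_quarter = (ra_date.month - 1) // 3 + 1
    return 4 * (cal_year - ra_date.year) + (cal_quarter - ra_quarter) + 1

def follow_up_year(rel_quarter: int) -> int | None:
    """Group relative quarters into follow-up years, dropping quarter 1
    because it mixes in earnings from before random assignment:
    year 1 = quarters 2-5, year 2 = quarters 6-9, and so on."""
    if rel_quarter < 2:
        return None
    return (rel_quarter - 2) // 4 + 1

# Example: a sample member randomly assigned in August 1993 (hypothetical date).
ra = date(1993, 8, 15)
print(relative_quarter(ra, 1993, 3))          # 1 -> the quarter of random assignment
print(relative_quarter(ra, 1994, 2))          # 4 -> third full follow-up quarter
print(follow_up_year(5), follow_up_year(6))   # 1 2 -> year 1 ends, year 2 begins
```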

Two-Year Client Survey and Five-Year Client Survey. As noted in a previous section, this report includes the results of a survey administered at the five-year follow-up point and some results of a survey administered at the two-year follow-up point. Both the two-year and five-year surveys provide information about sample members' participation in training and education activities, attainment of education credentials, views of work and welfare, employment history, income, receipt of noncash benefits such as health coverage, child care use, living situations, and children's well-being.

Survey responses are the only source of information about many key outcomes, such as participation patterns for control group members, work hours and wages, income from other people in the household, and outcomes for children. For some outcomes, such as employment, respondents provided information that is also captured in administrative records, and the two sources can differ. Because the five-year survey respondents are a subsample of the full impact sample, drawn from a narrower range of random assignment months, the survey and full impact samples may differ with respect to observed characteristics (such as educational attainment or prior work history) or unmeasured characteristics (such as assertiveness or learning style) that might have affected sample members' ability to find and retain employment. (For more information on survey response bias and the degree to which the survey sample and the full impact sample differ, see Appendix G.)

In some cases, administrative records data may be more accurate than the survey data. The client survey depends on respondents' ability to recall events that occurred, or jobs they held, as long as five years before they were interviewed, and lapses of memory can produce discrepancies between the employment dates or earnings amounts reported in the survey and those reflected in administrative records. In addition, some respondents may have been reluctant to report employment and income that could be found in administrative records or, alternatively, may have exaggerated their earnings and income. In other cases, however, the survey data may be more accurate, such as when respondents worked off the books or in short-term jobs; the survey may also have captured earnings that employers failed to report, or reported inaccurately, to the UI system. (For more information on the differences between UI-reported and survey-based measures of earnings, see Appendix H.)

Additional COS survey data. COS respondents provided information on focal children's academic functioning, social skills, and health and safety. In addition, both the mothers and the focal children themselves completed a Self-Administered Questionnaire (SAQ); the mothers' SAQ included questions about domestic abuse, and the children's SAQ included questions about academic functioning and social skills.

Teacher survey. Current teachers of the focal children in the COS were asked to assess the children's academic standing, academic progress, school engagement, behaviors requiring disciplinary action, and social skills. The teacher survey complements the data collected from mothers and from the children themselves. Reports from teachers and mothers sometimes differ; possible explanations include that the children behaved differently with their mothers than with their teachers, that mothers and teachers perceived the same behavior differently, or that mothers and teachers based their reports on different criteria.

Cost data. The cost analysis used data drawn from state, county, and local fiscal records, supportive service payment records, administrative records, the Two-Year Client Survey, the Five-Year Client Survey, and case file participation records.

Benefit-cost data. The benefit-cost analysis is based on administrative records data (UI-reported earnings, welfare, and Food Stamp payments), Two-Year Client Survey data, Five-Year Client Survey data, and published data.

Published data and agency reports. Published data and reports from government agencies were used to gather additional information about the environments in each of the sites, including unemployment rates, welfare caseloads, and welfare grant levels.