The National Evaluation of Welfare-to-Work Strategies
Evaluating Two Welfare-to-Work Program Approaches:
Two-Year Findings on the Labor Force Attachment and
Human Capital Development Programs in Three Sites
U.S. Department of Health and Human Services
Administration for Children and Families
Office of the Assistant Secretary for Planning and Evaluation
U.S. Department of Education
Office of the Under Secretary
Office of Vocational and Adult Education
Manpower Demonstration Research Corporation
The Manpower Demonstration Research Corporation is conducting the National Evaluation of Welfare-to-Work Strategies under a contract with the U.S. Department of Health and Human Services (HHS), funded by HHS under a competitive award, Contract No. HHS-100-89-0030. HHS is also receiving funding for the evaluation from the U.S. Department of Education. The study of one of the sites in the evaluation, Riverside County (California), is also conducted under a contract from the California Department of Social Services (CDSS). CDSS, in turn, is receiving funding from the California State Job Training Coordinating Council, the California Department of Education, HHS, and the Ford Foundation. Additional funding to support the Child Outcomes portion of the study is provided by the following foundations: the Foundation for Child Development, the William T. Grant Foundation, and an anonymous funder.
The findings and conclusions presented herein do not necessarily represent the official positions or policies of the funders.
Chapter 2: Research Design, Samples, and Data Sources
Utilizing an unusually strong research design and multiple data sources, this report examines and compares the experiences of single-parent AFDC recipients enrolled in welfare-to-work programs with two different approaches. In addition, the report compares these two different experiences with those of a control group who received no program services. Recipients in each of the three evaluation sites analyzed in this report were placed in one of three treatment, or research, groups through random assignment. The use of a random assignment research design had the advantage of creating, within each evaluation site, a situation in which individuals in each research group had similar background characteristics and faced identical labor market conditions, financial incentives to leave welfare for work, and community services. It assured that any measured differences between the research groups during a follow-up period (for example, in participation patterns in job search, education, or training activities and the concomitant costs of providing these employment-related services, or in individuals' levels of GED attainment, employment, earnings, and AFDC receipt) were due solely to the program approach to which individuals were randomly assigned.
This chapter describes the methodological underpinnings of the analyses presented in the rest of the report. It begins with a discussion of how AFDC recipients became enrolled in JOBS in the three evaluation sites, since it was at this point that individuals were randomly assigned to the research groups analyzed here. Included is an explanation of why differing proportions of the entire AFDC caseloads in the three sites eventually enrolled in JOBS. The second section discusses how random assignment was conducted in each site and the implications of Riverside's pre-existing program regulations on the definition of the research groups in that site. The third section presents the baseline characteristics of JOBS enrollees in the sites: individuals' ages, welfare histories, reading and math achievement levels, number and ages of children, and other descriptive characteristics. The fourth section discusses program enrollees' perceived barriers to employment or program participation, expectations of the JOBS program, and views on employment as of random assignment. The chapter concludes with a discussion of the report's data sources and attendant sample sizes.
The JOBS Enrollment Process and Its Effect on Eligibility for Random Assignment and Sample Composition
As noted in Chapter 1, until August 1996 the JOBS program was the government's vehicle for moving families from welfare to work. However, individuals first had to enroll in JOBS in order to avail themselves of the program's services. In the three sites analyzed in this report, JOBS program enrollment occurred at JOBS orientations; this was also the point at which enrollees were randomly assigned to one of three research groups. As a result, the research samples analyzed in this report consist of those who attended a JOBS orientation, and the impacts presented in the report represent the effects of the 'treatment' provided after orientation.(1) If a random sample of the entire AFDC caseload in each of the three evaluation sites were enrolled in JOBS, then research findings would be generalizable to the entire AFDC caseload. In actuality, some programmatic practices, such as federally and state-defined exemption criteria, referral practices to JOBS, and waiting lists for JOBS orientations, resulted in certain AFDC recipients never attending a JOBS orientation. Thus, it is important to understand the process by which AFDC recipients were identified as JOBS-mandatory, referred to JOBS, and scheduled for orientations, since it sheds light on the types of AFDC recipients who were likely to have attended a JOBS orientation. With this knowledge, it is possible to examine the extent to which the research sample analyzed in this report is representative of the entire AFDC caseload.
A number of steps were taken before an AFDC recipient attended a JOBS orientation and was randomly assigned to a research group. Figure 2.1 depicts the process in Atlanta and Grand Rapids; Figure 2.2 depicts the process in Riverside. The first step toward JOBS enrollment was a routine meeting between the AFDC recipient and her income maintenance (IM) worker, who was responsible for the financial aspects of each case, including AFDC, food stamps, and Medicaid (box 1 in Figures 2.1 and 2.2). At this meeting, which occurred either when the individual first applied for welfare or when continuing eligibility for AFDC was being determined, the IM worker was responsible for assessing whether the individual was required to enroll in JOBS (box 2 in Figures 2.1 and 2.2).
The Family Support Act established the criteria by which to determine whether an individual was JOBS-mandatory.(2) According to the FSA, any single-parent AFDC recipient whose youngest child was age 3 (or 1, at state option) or over and who did not meet certain exemption criteria was mandated to participate in the state's JOBS program. Exemption reasons included having a disabling illness, being employed full time (30 hours or more per week), living in a remote area that made program activities inaccessible, or being in at least the second trimester of pregnancy. While JOBS-exempt individuals could volunteer for the JOBS program, they were not randomly assigned and are not included in the samples evaluated in this report. Michigan (Grand Rapids) added a number of state-specific exemption reasons: if a recipient had three or more children under age 10,(3) had been a resident of a mental institution within the past five years, had been using prescribed medication for mental illness, or had been enrolled in a rehabilitation program for at least 15 hours per week. There were no such state exemptions in California (Riverside) or Georgia (Atlanta).
Once the IM worker determined that an individual was indeed mandated to participate in the JOBS program in that site, the worker was responsible for referring her to the JOBS program. Typically, this was done by sending a form, either on paper or via computer, from the IM office to the JOBS office. At this point, JOBS staff took over. (In contrast to IM workers, JOBS workers were responsible for recipients' participation in JOBS training, education, and employment-related activities.)(4) Once received, JOBS referrals were placed on a list, to be called in for a JOBS orientation on a first-in, first-out basis. In Grand Rapids, there was effectively no wait for this call-in. In Riverside, there was a short waiting list in the early months of random assignment, but for most of the random assignment period there were no waiting lists. In Atlanta, however, it was not unusual for an individual to remain on a waiting list for as long as six months before being called in to attend a JOBS orientation. These waiting lists were the result of the Atlanta program's adoption of a more stringent participation mandate, in combination with resource constraints. At the start of the evaluation, Atlanta staff began to refer to JOBS many individuals who previously were not served in their program. However, the county had a limited budget for hiring additional case managers and wanted to keep caseloads at what it considered a manageable and effective level. Thus, only a certain number of individuals could be scheduled for program orientations each week, and it took some time to enroll all mandatory individuals in the site's JOBS program.
Orientation waiting lists have important ramifications for the characteristics of individuals enrolling in welfare-to-work programs. When a waiting list is in place, some welfare recipients find jobs and leave welfare before they are scheduled for an orientation. In this case, those who end up attending orientations may be more disadvantaged (for example, they are less likely to have prior work experience or more likely to have lower education levels) than is the case when all individuals are immediately scheduled for a program orientation. In the National Evaluation of Welfare-to-Work Strategies, waiting lists are not of concern when making within-site comparisons between research groups, as the random assignment process (which occurred at orientation) draws upon the same pool of AFDC recipients for all research groups. However, the possible effect of waiting lists on sample characteristics is an important consideration when making comparisons between sites.
A related consideration is the length of time during which a site had been working with the entire JOBS-mandatory population. The sample for this report represents an early cohort of the entire sample of individuals randomly assigned at JOBS orientation in these three sites. In the first 6 to 12 months of random assignment in each site, welfare recipients who may have been JOBS-mandatory for some time were scheduled for JOBS orientations and, once they attended them, were randomly assigned to a research group. Those randomly assigned in the later months of the evaluation in each site tended to be more recent AFDC applicants or individuals who were newly JOBS-mandatory, most commonly because their youngest child had just turned age 3 or, in Grand Rapids, age 1. As a result, individuals included in this report who were randomly assigned during roughly the first two-thirds of the random assignment period are somewhat more disadvantaged (for example, in terms of length of adult lifetime AFDC receipt) than those randomly assigned toward the end of the random assignment period.
In the JOBS enrollment process, once an AFDC recipient's name appeared at the top of a JOBS-referral list, a letter was sent directing the individual to attend a specific JOBS orientation and stating that a sanction could be imposed for nonattendance (box 3 in Figures 2.1 and 2.2). Welfare recipients who did not show up after as many as four call-in letters may have had their AFDC grants reduced. After a sanction or the threat of a sanction, some individuals may have tried to comply; others may have accepted the reduced grant level as the cost of nonparticipation in JOBS; still others may have found employment or left welfare.
There were likely considerable differences between the characteristics of AFDC recipients who attended welfare-to-work program orientations and the characteristics of those who never attended. As mentioned above, some recipients left welfare before being scheduled for an orientation, and a portion of this group may even have left because they did not want to participate in a welfare-to-work program. Still others may have been willing to take a sanction so as to avoid participation. Others may have 'fallen through the cracks,' that is, may have become lost in the bureaucratic maze as caseworkers tried to keep track of hundreds of schedulings and re-schedulings, and may never have been sanctioned for their nonparticipation. Given these different situations, which imply that the characteristics of orientation attenders may have been different from those of nonattenders, this report's findings are generalizable to those who attended JOBS orientations but may not be generalizable to the entire JOBS-mandatory AFDC caseloads in the three sites.(5)
Recipients who attended JOBS orientations (box 4 in Figures 2.1 and 2.2) heard a presentation about the evaluation (including its random assignment design), were tested to determine their basic reading and math skills levels, provided information on many of the basic demographic characteristics presented in this chapter, and were randomly assigned (box 5 in Figures 2.1 and 2.2). Riverside was the first JOBS site to begin random assignment, in June 1991, and random assignment concluded there in June 1993. In Grand Rapids, random assignment began in September 1991 and ended in January 1994. In Atlanta, random assignment began in January 1992 and ended two years later.
The Random Assignment Process and Resulting Research Groups
As noted in Chapter 1, a fundamental question of the National Evaluation of Welfare-to-Work Strategies is the relative effectiveness of two different program approaches for promoting self-sufficiency. Many evaluations rely on cross-site comparisons of alternative program designs but must overcome the difficulty of isolating the effects of these approaches from other factors, such as local economic conditions and welfare grant levels. To avoid these difficulties, this evaluation took an innovative approach to comparing program strategies. In the three sites examined in this report (Atlanta, Grand Rapids, and Riverside), a three-way random assignment design was used, and two different types of welfare-to-work JOBS programs were operated side by side in each site.(6)
In each of the three sites, JOBS orientation attenders were randomly assigned to one of three groups: a Labor Force Attachment (LFA) group, a Human Capital Development (HCD) group, or a control group. Control group members were free to seek out, on their own initiative, training and education programs available in their communities. In addition, since the Family Support Act created a guarantee that child care would be available to welfare recipients participating in JOBS-approved activities, a decision was made by HHS early in the evaluation that control group members, as long as they were participating in an approved activity, should be eligible for this assistance. Finally, in Grand Rapids, control group members in approved activities were eligible for transportation assistance as well.
Using the three-way random assignment design, three sets of comparisons can be made in each site. First, comparisons can be made between outcomes for individuals assigned to each of the program groups and outcomes for those assigned to the control group (LFA versus control; HCD versus control), enabling one to estimate the added benefit of either of these approaches above what the individuals would achieve in the absence of a welfare-to-work program. Additionally, a direct comparison can be made between outcomes for participants in the two program groups (LFA versus HCD), to assess the relative effectiveness of each of these approaches. Thus, impacts (for example, on participation in employment-related activities or on employment, earnings, or welfare receipt) and net costs presented for each of the two program groups represent the difference between outcomes for control group members, that is, what people would do without a welfare-to-work program, and the outcomes for those assigned to each of the two program approaches. Similarly, impacts and net costs presented in the last chapter of the report on a direct comparison of the LFA and HCD approaches represent the added benefit of one approach vis-à-vis the other.
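The arithmetic underlying each of these comparisons is, at its simplest, a difference in mean outcomes between two research groups, which random assignment renders an unbiased estimate. The sketch below illustrates that arithmetic with invented earnings figures; it is not MDRC's estimation code, and the function name and data are purely illustrative.

```python
# Difference-in-means impact estimate for one research-group comparison.
# All data below are invented for illustration.

def impact(program_outcomes, control_outcomes):
    """Mean outcome for the program group minus mean outcome for the controls."""
    mean = lambda values: sum(values) / len(values)
    return mean(program_outcomes) - mean(control_outcomes)

# Hypothetical two-year earnings for a handful of sample members.
lfa_earnings = [1200.0, 0.0, 800.0, 2500.0]
control_earnings = [900.0, 0.0, 400.0, 1500.0]

lfa_impact = impact(lfa_earnings, control_earnings)  # LFA versus control
```

The same function applied to HCD and control outcomes yields the HCD-versus-control impact, and applied to LFA and HCD outcomes yields the head-to-head comparison of the two approaches.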
In Atlanta and Grand Rapids, JOBS orientation attenders were equally likely to have been assigned to one of the two program groups or to the control group, as shown on the left side of Figure 2.3. Riverside, however, had pre-existing program regulations governing participation in adult basic education, following regulations in California's Greater Avenues for Independence (GAIN) program, the state's JOBS program. As a result, there were, in effect, two different random assignment evaluations in Riverside. Prior to the National Evaluation of Welfare-to-Work Strategies, GAIN program regulations dictated that only individuals determined to be in need of basic education would be assigned to educational activities as a first step toward self-sufficiency. Thus, all JOBS enrollees were evaluated at orientation to determine whether, according to program regulations, they required basic education: Those who had a high school diploma or GED, scored 215 or above on both the math and the literacy sections of the GAIN Appraisal test,(7) and were proficient in English were determined not to need basic education. As seen on the right side of the Riverside part of Figure 2.3, this group could be randomly assigned only to the LFA or control group. Those without a high school diploma or GED, who scored below 215 on either section of the GAIN Appraisal test, or who required English remediation were determined by the program to be in need of basic education and, according to program regulations, were eligible for assignment to an education activity. As a result, individuals with these characteristics were eligible to be randomly assigned to any of the three evaluation research groups, including the HCD group.
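The Riverside determination rule just described can be summarized as a small decision function. The sketch below simply restates the rule from the text; it is not GAIN's actual appraisal software, and the function and variable names are invented.

```python
# GAIN basic-education determination, as described in the text: an enrollee
# needs basic education if she lacks a high school diploma or GED, scored
# below 215 on either section of the GAIN Appraisal test, or requires
# English remediation. Names are illustrative, not from GAIN itself.

def needs_basic_education(has_diploma_or_ged, math_score, literacy_score,
                          english_proficient):
    return (not has_diploma_or_ged
            or math_score < 215
            or literacy_score < 215
            or not english_proficient)

# An enrollee for whom this returns False could be randomly assigned only
# to the LFA or control group; anyone else was eligible for all three groups.
```

Note that a diploma or GED holder with a score below 215 on either section is still classified as in need of basic education, which is why some credential holders in Riverside fall into the basic-education subgroup.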
The situation in Riverside has several implications for the research group comparisons made in this report. First, since only those without a high school diploma or with low reading and math skills were eligible for random assignment to the HCD group in Riverside, any comparisons between the LFA and HCD groups in Riverside must include only those individuals determined to be in need of basic education as of random assignment. In other words, when the effects, in Riverside, of the LFA approach vis-à-vis the HCD approach are examined, individuals in the LFA group who were not in need of basic education must be dropped from the analysis. Second, Riverside's design also affects the comparability of the HCD research groups across the three evaluation sites: HCDs in Riverside have lower education levels than HCDs in Atlanta and Grand Rapids.
In order to present information that can be accurately and easily used to make within-site LFA-HCD comparisons in Riverside and to make cross-site HCD comparisons, subgroup (as well as full-sample) participation, cost, and impact estimates are presented throughout the report. The subgroup estimates always divide the full LFA and HCD samples in each site into those determined to be not in need and in need of basic education in Riverside and into those with and without a high school diploma or GED in Atlanta and Grand Rapids. Those determined to be not in need of basic education in Riverside and those with high school diplomas or GEDs in the other two sites appear under the high school diploma/GED subgroup heading throughout the report; those determined to be in need of basic education in Riverside and those without high school diplomas or GEDs in the other two sites appear under the no high school diploma/GED subgroup heading.
While those determined to be in need of basic education in Riverside appear under the no high school diploma/GED heading, GAIN regulations also take scores on the GAIN Appraisal test into account in determining whether an AFDC recipient in California is in need of basic education. As a result, 23 percent of the HCDs in Riverside who appear under the no high school diploma/GED heading actually did have such a credential but scored low on either the reading or math portion of the GAIN Appraisal test.(8)
Baseline Characteristics of the Research Sample
At JOBS orientation, immediately prior to random assignment, case managers recorded standard background characteristics of attenders, such as educational levels, AFDC history, and information about their family settings. (This data source and the data source used in the next section are described more fully in Section V of this chapter.) Table 2.1 presents selected baseline characteristics of sample members included in this report, by site. Following are some highlights.
All sample members included in this report were single-parent heads of AFDC cases when they were randomly assigned.(9) The vast majority of individuals were female, ranging from 90 percent in Riverside to 98 percent in Atlanta. Sample members were, on average, about 31 years old as of JOBS orientation. The sites vary widely in the ethnic composition of their JOBS enrollees. In Atlanta, virtually all sample members, 95 percent, were African-American. In Grand Rapids, 50 percent were white and 40 percent were African-American. In Riverside, 50 percent were white, 29 percent were Hispanic, and 17 percent were African-American.
| Characteristic | Atlanta | Grand Rapids | Riverside |
| --- | --- | --- | --- |
| Age of youngest child (%) | | | |
| Living in public housing (%) | 40.5 | 2.6 | 2.5 |
| Living in subsidized housing (%) | 26.1 | 13.9 | 7.3 |
| Education and basic skills levels | | | |
| No high school diploma or GED (%) | 44.3 | 41.9 | 43.4 |
| Enrolled in education or training in past 12 months (%) | 13.2 | 38.7 | 19.4 |
| Scored at level 1 or 2 on the TALS document literacy test (a) (%) | 60.6 | 39.4 | 36.9 |
| Scored below 215 on the GAIN Appraisal math test (%) | 67.4 | 37.2 | 34.6 |
| Labor force status | | | |
| Never worked full time for six months or more for one employer (%) | 31.4 | 36.4 | 29.0 |
| Any earnings in past 12 months (%) | 18.2 | 43.9 | 41.8 |
| Currently employed less than 30 hours per week (%) | 5.2 | 12.0 | 10.8 |
| Public assistance status | | | |
| On welfare two years or more (cumulatively) prior to random assignment (%) | 78.4 | 63.5 | 54.1 |
| Raised as a child in a household receiving AFDC (%) | 27.6 | 32.9 | 19.8 |
| First spell of AFDC receipt (%) | 4.6 | 28.0 | 22.0 |

SOURCE: MDRC calculations from information routinely collected by welfare staff and from test data.
a. TALS (Test of Applied Literacy Skills) scores for Riverside are based on scores earned on the GAIN Appraisal literacy test and are converted to their TALS equivalent.
The proportion of sample members with a preschool-age child varied widely by site, based upon whether the site was in a state that mandated JOBS participation by single parents with children as young as age 3 or in a state that had exercised the FSA option to mandate JOBS participation of single parents with children as young as age 1. The State of Michigan exercised this option and, consequently, in Grand Rapids, 44 percent of JOBS enrollees had a youngest child aged 2 or under; 22 percent had one aged 3 to 5. In Atlanta and Riverside, which are in states that did not exercise this option, these proportions were smaller. In Riverside, 6 percent of JOBS enrollees had a youngest child aged 2 or under;(10) 49 percent had one aged 3 to 5. In Atlanta, these same figures were 0.5 percent and 35 percent, respectively. The proportion of JOBS enrollees residing in public or subsidized housing as of random assignment, for whom increases in income could affect housing status as well as rent, was large only in Atlanta. In this site, two-thirds of the sample members were living in such housing; this was the case for less than one-sixth of the sample members in the other two sites.
Between 56 and 58 percent of enrollees in the three sites had earned a high school diploma or GED. Few enrollees had earned a college degree (either an A.A. or a B.A.): 2 percent or less in any site. A substantial proportion of enrollees in the Grand Rapids sample (39 percent) reported having been enrolled in an education or training program in the 12 months prior to random assignment.
Achievement tests were administered to determine the basic skills levels of JOBS enrollees in each site. In all three sites, the GAIN Appraisal math test, developed by the Comprehensive Adult Student Assessment System (CASAS), was used to determine basic math skills. In Atlanta and Grand Rapids, the Test of Applied Literacy Skills (TALS) document literacy test was administered. In Riverside, however, the state-mandated GAIN Appraisal literacy test was used to gauge reading skills. These GAIN Appraisal scores have been converted to corresponding TALS scores to facilitate comparisons of test scores across the three sites.(11)
Sixty-one percent of JOBS enrollees in Atlanta, 39 percent in Grand Rapids, and 37 percent in Riverside had TALS document literacy scores (or a TALS equivalent score) placing them in the lowest two levels (of five levels). According to the test developers, these individuals are likely to experience considerable difficulty integrating or synthesizing information in complex or lengthy text. (They would have difficulty, for example, using a hospital campus map and its legend to identify a building that houses a specified medical department.) Similarly, 67 percent of JOBS enrollees in Atlanta, 37 percent in Grand Rapids, and 35 percent in Riverside scored in the lowest levels on the GAIN Appraisal math test, that is, below a score of 215. According to the test developers, these individuals are likely to have extremely limited employment choices and would have difficulty calculating gas mileage or writing a letter or service order. While the lower reading and math achievement levels of Atlanta's JOBS sample may have been due, in part, to the effect of the waiting list on the characteristics of those who eventually attended JOBS orientation (discussed earlier), the AFDC caseload in Atlanta was generally more disadvantaged than the caseloads in the other two sites, and the orientation waiting list did not account for all of these differences.
JOBS enrollees in the three sites had varying levels of prior work experience, a valuable asset when attempting to secure future employment. Across the three sites, about one-third of enrollees had never worked for six months or longer for the same employer, ranging from 29 percent in Riverside to 36 percent in Grand Rapids.
About one JOBS enrollee in ten was employed less than 30 hours per week as of orientation, with only modest variation between sites on this measure. Atlanta, the site with the lowest AFDC grant level and the most disadvantaged sample, was on the low end of this rate, with 5 percent of enrollees employed. In Riverside, the site with the highest grant level, 11 percent of enrollees were employed. It should be kept in mind, however, that California's AFDC grant level allowed some individuals to work at least 30 hours per week and remain eligible for AFDC. Once an individual was employed 30 hours or more per week, however, federal regulations specified that she was no longer JOBS-mandatory and, as a result, would not be included in the National Evaluation of Welfare-to-Work Strategies research sample. Thus, many more AFDC recipients in Riverside were working while receiving AFDC than are shown in Table 2.1, but they were not JOBS-mandatory and, consequently, were not eligible for random assignment.
At least half of the JOBS enrollees in each site had received AFDC, on their own or spouse's case, for at least two years (cumulatively) during their adult life, though not necessarily for two years continuously prior to random assignment. This figure was highest in Atlanta, where 78 percent of the enrollees had received welfare for at least two years; 64 percent of enrollees in Grand Rapids and 54 percent in Riverside met this criterion. Atlanta also had the greatest proportion of welfare recipients for whom this was not a first spell of welfare receipt; only 5 percent of JOBS enrollees in this site were in the midst of their first spell on AFDC compared with about 25 percent in the other two sites. These figures indicate that the vast majority of sample members were AFDC recidivists: that is, individuals who, at least once, had previously received AFDC, left AFDC (because of employment or another reason), and then had returned to AFDC at some point.
Less than one-third of JOBS enrollees recalled living as a child in a household receiving AFDC. Enrollees in Riverside were the least likely to be 'second-generation' welfare recipients; only 20 percent reported receiving AFDC as a child.
Clients' Expectations for and Perceptions of JOBS
At orientation, prior to hearing about the services offered by the JOBS program and the results of the random assignment process, sample members were asked to complete a survey on barriers to and expectations for employment and participation in the JOBS program. Selected measures from this survey appear in Table 2.2.
| Attitude or Opinion | Atlanta | Grand Rapids | Riverside |
| --- | --- | --- | --- |
| Client-reported barriers to welfare-to-work program participation | | | |
| Percent who agreed or agreed a lot that they could not go to a school or job training program right now for the following reasons: | | | |
| Client-reported barriers to employment | | | |
| Percent who agreed or agreed a lot that they could not get a job right now for the following reasons: | | | |
| Client-reported preferred welfare-to-work program components and expectations regarding the effectiveness of the components | | | |
| Given the choices of going to school to study basic reading and math, going to a program to get help looking for a job, or going to school to learn a job skill, percent who would prefer to: | | | |
| Percent who agreed or agreed a lot that the following would help them get a good job: | | | |
| Client-reported expectations regarding employment | | | |
| Percent who agreed or agreed a lot that it will probably take more than a year to get a full-time job and get off welfare | 47.3 | 56.3 | 47.9 |
| Percent who would probably take a full-time job today if the job paid less than or the same as welfare | 51.2 | 53.3 | 52.9 |
| If someone offered client a full-time job with full medical benefits, minimum amount per hour at which the client would take the job | | | |
| If someone offered client a full-time job with no medical benefits, minimum amount per hour at which the client would take the job | | | |

SOURCE: MDRC calculations from Private Opinion Survey data.
Approximately 77 percent of JOBS enrollees anticipated at least one obstacle to welfare-to-work participation, with between 58 and 70 percent reporting that the cost of child care would prevent them from attending program activities. Lack of transportation was another commonly perceived barrier to participation, with 37 to 41 percent of enrollees reporting that this was a barrier. Health and emotional problems were also perceived as barriers to participation; between 19 and 21 percent reported that they could not participate in a welfare-to-work program because they themselves suffered from a health or emotional problem. Furthermore, 18 to 20 percent reported that they could not participate because a family member was suffering from a health or emotional problem.
Over 80 percent of JOBS enrollees in each site reported a barrier to employment. The two most commonly reported reasons why individuals felt that they could not get a job at the time were that they preferred to take care of their family full time (reported by 20 to 31 percent of the sample members) and that they had no available trusted person to take care of their children (reported by 20 to 28 percent of the sample members).
Respondents were also asked in which of three types of welfare-to-work activities they would prefer to participate. Of the three choices provided, job training was the preference of the largest number of individuals, ranging from 42 percent in Atlanta to 61 percent in Riverside. Respondents' second choice was a program to get help looking for a job, with 23 to 41 percent of respondents preferring such a program. Least favored was basic education, with 6 to 10 percent of respondents choosing school (to learn basic reading and math) as the preferred activity.
Many sample members, however, felt that these types of program activities would help them get a good job, even if some of the activities were not their first preference. Across the three sites, 79 to 88 percent of respondents reported that a job training program would help them find a good job; between 57 and 73 percent of respondents thought that a program to help them look for a job would be helpful; and, while lowest in popularity, over half of the respondents thought a basic education program would help them secure a good job, ranging from 55 percent in Grand Rapids to 68 percent in Atlanta.
About half of the respondents believed that it would probably take them over a year to find full-time employment and leave welfare, ranging from 47 percent of enrollees in Atlanta to 56 percent in Grand Rapids. Half agreed that they would take a full-time job if the job paid the same as (or, in some cases, less than) welfare. When asked the minimum wage at which they would take a full-time job with medical benefits, the median response was $6 per hour in Atlanta and Grand Rapids and $7 per hour in Riverside. When asked the minimum acceptable wage for a full-time job that did not offer medical benefits, the median response was $7 per hour in Atlanta, $8 per hour in Grand Rapids, and $10 per hour in Riverside. Across the three sites, the provision of full medical benefits was thus worth approximately $2.25 per hour, on average, in JOBS enrollees' reservation wages.
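The implied value of medical benefits can be computed directly from responses of this kind: each respondent's gap between the minimum acceptable wage without benefits and the minimum acceptable wage with benefits is that person's implicit hourly valuation of coverage, and averaging those gaps yields a figure like the approximately $2.25 cited above. A minimal sketch, using hypothetical wage pairs (the individual-level Private Opinion Survey responses are not reproduced here):

```python
# Illustrative computation of the implied hourly value of medical benefits.
# The wage pairs below are hypothetical, not actual survey responses.

def median(xs):
    """Median of a list of numbers."""
    xs = sorted(xs)
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

# (minimum acceptable wage with benefits, without benefits), per respondent
responses = [(6.0, 7.0), (6.0, 8.0), (7.0, 10.0), (5.0, 7.5)]

# Each respondent's implicit hourly valuation of medical coverage
premiums = [no_ben - with_ben for with_ben, no_ben in responses]
avg_premium = sum(premiums) / len(premiums)

print(median([w for w, _ in responses]))  # median reservation wage, with benefits
print(avg_premium)                        # average implied value of benefits
```

Note that the report's site-level figures are medians of the two wage questions, while the $2.25 benefit premium is an average across enrollees; the two summaries need not agree exactly.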
Sample Sizes and Data Sources
The findings in this report on participation in employment-related activities, program costs, and employment, earnings, and welfare impacts for single-parent AFDC recipients cover a two-year follow-up period. At this writing, two years of follow-up data are available only for those individuals randomly assigned to a research group through December 1992, while random assignment continued for an additional 6 to 13 months in the three sites examined in the report. The site samples thus represent 50 to 63 percent (depending on the site) of the eventual single-parent AFDC recipient samples that will be analyzed as part of the evaluation.(12)
The following paragraphs describe the data sources and the sizes of the samples examined for each type of analysis in the report. Appendix Table A.1 presents a complete breakdown of the sample sizes, by data source, site, and research group.
AFDC and Unemployment Insurance Administrative Records Data
Employment, earnings, and welfare impacts were computed using automated county and state AFDC administrative records and state unemployment insurance (UI) records data. AFDC and UI records were available for all 11,977 sample members for whom two years of follow-up were available.(13) The administrative records sample is depicted in Figure 2.4 by the largest circle and includes all sample members in this report.
Two-Year Client Surveys
Some client opinions and participation rates examined throughout the report are based on results compiled from a survey administered to a sample of individuals in all three research groups approximately two years after random assignment.(14) In Figure 2.4, the client survey sample is represented by the circle with horizontal lines. The survey sample was randomly selected from the larger report sample, but it intentionally oversampled certain subgroups to produce a large enough sample for special analyses to appear in later reports. The survey sample was thus a stratified, random sample. For this report, the survey sample was weighted to replicate the demographic characteristics of the entire report sample.
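The reweighting step works by giving each survey respondent a weight equal to the inverse of his or her stratum's sampling rate, so that oversampled subgroups count for proportionally less and weighted estimates reproduce the composition of the full report sample. A minimal sketch, with hypothetical strata and counts (the evaluation's actual sampling rates are not published in this chapter):

```python
# Illustrative inverse-probability weighting for a stratified survey sample.
# Strata and counts are hypothetical, for demonstration only.

def stratum_weights(report_counts, survey_counts):
    """Weight per stratum = (stratum size in full report sample) /
    (stratum size in survey sample), i.e. the inverse sampling rate."""
    return {s: report_counts[s] / survey_counts[s] for s in report_counts}

def weighted_mean(values, strata, weights):
    """Weighted mean of a survey outcome, weighting each respondent
    by his or her stratum's weight."""
    num = sum(v * weights[s] for v, s in zip(values, strata))
    den = sum(weights[s] for s in strata)
    return num / den

# Hypothetical: one subgroup deliberately oversampled relative to the rest.
report_counts = {"oversampled": 1000, "other": 9000}
survey_counts = {"oversampled": 400, "other": 1800}
w = stratum_weights(report_counts, survey_counts)

# Oversampled respondents receive half the weight of other respondents,
# restoring the full sample's demographic mix in weighted estimates.
print(w["oversampled"] / w["other"])  # 0.5
```

The same weights would then be applied to every survey-based opinion and participation measure reported for the full sample.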
[Figure 2.4: overlap of the report samples, by data source. SOURCE: See Appendix Table A.1.]
Survey respondents were asked about issues such as their participation in training and education activities, whether they had received a GED or high school diploma in the past two years, their perceptions of the JOBS program, and their expectations for the future. Interviews included in this report were conducted with individuals randomly assigned between March 1992 and December 1992 in Atlanta and Grand Rapids and between September 1991 and December 1992 in Riverside. The responses of 1,389 sample members in Atlanta, 832 sample members in Grand Rapids, and 1,586 sample members in Riverside are included in this report.(15) Ninety-one percent of fielded surveys in Atlanta, 90 percent in Grand Rapids, and 75 percent in Riverside were completed.
JOBS and Income Maintenance Case File Data
Findings on the LFA and HCD patterns of participation in program activities presented in Chapters 5 and 6 are based on material collected from the review of the JOBS and income maintenance case files of 1,093 single-parent AFDC recipients randomly assigned to the two program groups in the three sites. Case file data were collected for a stratified, random subsample that was demographically representative of the entire report sample. As displayed in Figure 2.4, the case file sample (the circle with vertical lines) is, by and large, a subsample of the above Two-Year Client Survey sample. This overlapping group is represented in Figure 2.4 by the area with both vertical and horizontal lines.
In reviews of case files, MDRC staff recorded sample members' enrollment in activities, length of stay in JOBS, changes in JOBS-mandatory status, sanctions, and deferrals over a 24-month period(16) using standard coding procedures, so that welfare recipients' actions and statuses could be compared across the sites and research groups. Case file documents consulted included standard program forms, case notes, and correspondence between the individuals, their caseworkers, and JOBS activity providers. Note that because individuals in the control group were not eligible for services through JOBS, no case file reviews were conducted for control group members.(17)
Cost Data Sources
The cost analysis used data drawn from state, county, and local fiscal records, program participation records, supportive service payment records, administrative records, Two-Year Client Survey responses, and case file participation records. Sample sizes varied by data source and included individuals assigned to the LFA, HCD, and control groups.
MDRC staff observed the JOBS programs and interviewed enrollees, case managers, service providers, and program administrators in each of the three sites. Information was collected about a range of issues, such as management philosophies and structure, the degree to which a participation mandate was enforced, the nature of interactions between caseworkers and program participants, the extent to which the program was able to work with all JOBS-mandatory individuals in the site, the availability of services, and the relationships JOBS staff had established with outside service providers and the sites' IM staff. Materials gathered in these visits are used throughout the report, but particularly in Chapters 3 and 4.
JOBS and Income Maintenance Staff Surveys
JOBS case managers and income maintenance (IM) workers and their immediate supervisors were surveyed about their opinions of JOBS, experiences administering the program, and attitudes toward their clients. These surveys were administered in November 1993 in Atlanta and covered all of the 27 JOBS workers employed at the time and 113 IM workers and supervisors selected at random. In Riverside, surveys were administered in October 1993 and covered all of the 71 JOBS workers and 105 IM workers and supervisors selected at random. Survey administration in Grand Rapids occurred in September 1993 and covered all of the 23 JOBS and 120 IM staff members and supervisors. Completion rates ranged from 90 to 100 percent for JOBS staff and from 94 to 100 percent for IM staff.
Adult Basic Education Teacher Surveys and Administrator Interviews
Basic education teachers were surveyed in the three JOBS sites discussed in this report during the fall-winter of 1993. MDRC targeted programs that offered basic education instruction and had enrolled a large number of JOBS participants in the site. All of the full-time teachers in those programs were asked for a description of their program and about issues such as linkages with JOBS, instructional styles, measures of student progress, and class size. The responses of 24 teachers in Atlanta, 79 teachers in Grand Rapids, and 45 teachers in Riverside are included in this report. In addition, while visiting each of the adult basic education institutions included in the teacher survey, MDRC staff conducted an in-person interview with the program's administrator.
JOBS Enrollees' Characteristics, Attitudes, and Opinions as of Random Assignment
Standard client characteristic data, such as educational background and AFDC histories, were collected by welfare staff during routine interviews with individuals at JOBS orientation, and are available for all individuals in the report sample. Reading and math achievement test scores are also available for 9,060 individuals, representing about 76 percent of the report sample randomly assigned during the time period when the tests were administered.(18) Data on attitudes and opinions about welfare-to-work programs and employment prospects were collected through a brief, client-completed Private Opinion Survey (POS) administered at JOBS orientation, and are available for 6,953 individuals in the three sites, representing a response rate of 91 percent during the period when this instrument was used.
1. In two of the three sites -- Grand Rapids and Riverside -- random assignment for research purposes also occurred at a point earlier than JOBS enrollment: when individuals were identified as JOBS-mandatory by income maintenance staff. In these two sites, the research groups analyzed in this report, that is, the ones generated at JOBS enrollment-orientation, are 'nested' within one of the previously created research groups. Analyses using the samples randomly created at an earlier point in the path toward JOBS will measure JOBS's deterrence effects prior to orientation (impacts in addition to the ones discussed in this report) and will examine such issues as clients' reasons for not attending JOBS orientations. Those results will be reported in a separate, forthcoming publication.
2. Family Support Act of 1988.
3. This exemption reason was eliminated in December 1992.
4. In two of the National Evaluation of Welfare-to-Work Strategies sites not included in this report, Oklahoma City and Columbus, some sample members did not have IM workers. Instead, an integrated case manager handled all case management functions, including financial and JOBS monitoring functions. The effectiveness of this approach compared with the traditional approach of separate case workers for different functions will be examined in a forthcoming document.
5. A future National Evaluation of Welfare-to-Work Strategies publication will more closely examine the process by which AFDC recipients came to attend JOBS orientations and, as noted earlier, will estimate the impacts of being referred to JOBS and obligated to enroll in the program by attending a JOBS orientation.
6. While a number of earlier studies, such as the San Diego Job Search and Work Experience Demonstration and the Virginia Employment Services Program Demonstration, utilized side-by-side tests of two different program strategies (see, for example, Goldman, Friedlander, and Long, 1986; and Riccio et al., 1986), no prior evaluation has conducted a side-by-side test of comprehensive program models. These earlier studies restricted access to some program services, such as work experience or basic education activities, to one program group, while permitting the other program group to access these services. Thus, these earlier evaluations were tests of individual service components. In contrast, the National Evaluation of Welfare-to-Work Strategies is a study of two comprehensive program philosophies in three sites. In these three sites, sample members in the two program groups received very different messages about the goals of the program and were offered a range of services compatible with that message. Chapter 3 discusses the implementation differences between these two program models.
7. The GAIN Appraisal test, an instrument developed by the Comprehensive Adult Student Assessment System (CASAS) specifically for use by the California GAIN program, was given to sample members at orientation. According to the designers of the test, individuals who score below 215 have difficulty completing tasks that require more than minimal literacy or computation skills.
8. Restricting the Riverside HCD sample to those who did not have a high school diploma or GED, regardless of how they performed on the GAIN Appraisal test, would have further complicated Riverside within-site comparisons as well as full-sample cross-site comparisons. In addition, this would have created a group with no operational policy relevance to California or Riverside welfare administrators.
9. Case heads receiving AFDC for unemployed parents (AFDC-UP) were also randomly assigned in Riverside as part of this evaluation. These individuals, who were primary wage earners (typically male) in two-parent households, were also required to participate in JOBS. AFDC-UP sample members are not included in this report; the effects of the LFA and HCD approaches on Riverside AFDC-UPs will be analyzed in a future publication.
10. The Riverside GAIN program generally did not mandate participation for women with children aged 2 or under, but it did require participation of two groups of single parents regardless of the age of their youngest child: teen parents, on their own or their parents' AFDC case, who did not have a high school diploma or GED; and individuals who worked more than 15 hours per week while receiving AFDC. The first group, teens, was randomly assigned but is not included in the sample for this report. The second group was included in the random assignment process and in the sample for this report.
11. In order to facilitate comparisons between the reading achievement test scores of research sample members in Riverside and the other sites, the National Evaluation of Welfare-to-Work Strategies commissioned a team led by Walter Haney, Senior Research Associate at the Center for the Study of Testing, Evaluation, and Education Policy at Boston College, to conduct a calibration study of research sample members' scores on the GAIN Appraisal reading test Form 2 and the TALS document literacy tests. The findings of this study, which are discussed in detail in Haney et al., 1996, were used to estimate the TALS document literacy test score that best corresponds to the GAIN Appraisal score received by each research sample member in Riverside.
12. Note that some individuals were randomly assigned to a research group as part of the evaluation during this time period, but are not analyzed in this report: JOBS-mandatory individuals randomly assigned prior to JOBS enrollment-orientation in Grand Rapids and Riverside as part of a special study of possible deterrence effects of JOBS; AFDC-UPs and teens randomly assigned at JOBS orientation in Riverside; and individuals in Riverside who were randomly assigned at JOBS orientation and were also part of a six-county random assignment evaluation of GAIN services in California, conducted in the late 1980s. (Individuals who were randomly assigned to the GAIN study program group less than three years prior to attending a JOBS orientation were randomly assigned to an evaluation program group, either the HCD or LFA group, but were not eligible to be assigned to the control group.)
13. In Atlanta and Grand Rapids, AFDC records were not available for sample members who moved out of state during the follow-up period; in Riverside, AFDC records were not available for sample members who moved out of Riverside County during the follow-up period. UI records were not available for sample members who moved out of state during the follow-up period. In addition, UI records often underrepresented certain types of employment, such as domestic service, which may have been 'off the books.' Finally, while Georgia and California employers were required to provide wage information, employers in Michigan were requested to provide this information.
14. There were no large differences in response rates across research groups. The presence of large differences would have been a potential source of bias in research group comparisons.
15. These survey sample sizes apply to regression-adjusted measures, which include all impact measures and some participation and cost measures. For a few surveyed individuals, missing data prevented their inclusion in the regression model. The responses of these individuals were included in measures that were not regression-adjusted, that is, some participation and cost measures. For measures that were not regression-adjusted, sample sizes are 1,391 sample members in Atlanta, 836 sample members in Grand Rapids, and 1,588 sample members in Riverside.
16. The length of follow-up in the case file reviews varied by individual, ranging from a 24-month period to a 37-month period. For this analysis, activities that occurred more than 24 months after random assignment have been disregarded.
17. Periodically, MDRC staff reviewed the case file records of control group members to confirm that these individuals were not receiving JOBS services. These reviews found that no members of the control groups included in this report received JOBS services while residing in their county of random assignment.
18. Among those who did not take the tests, about one-third did not speak English; others were unable to remain for the testing, spoke English but were unable to read or write it, or had other reasons for not taking the tests.