
Evaluating Two Approaches to Case Management: Implementation, Participation Patterns, Costs, and Three-Year Impacts of the Columbus Welfare-to-Work Program

Submitted to:
U.S. Department of Health and Human Services
Administration for Children and Families
Office of the Assistant Secretary for Planning and Evaluation

U.S. Department of Education
Office of the Under Secretary
Office of Vocational and Adult Education
June 2001

Prepared by:
Susan Scrivener and Johanna Walter with Thomas Brock and Gayle Hamilton
Manpower Demonstration Research Corporation

"

Acknowledgments

This evaluation could not have been conducted without the dedication and cooperation of administrators and staff from the Ohio Department of Human Services and the Franklin County Department of Human Services (now part of the Department of Job and Family Services). State and county administrators embraced the idea of testing two case management approaches and remained committed to operating two separate welfare-to-work programs and subjecting them to intensive study. Staff were willing to comply with the rigorous requirements of the complex research design, including keeping the two programs distinct. In addition, they facilitated access to research sample members' case files, created the automated AFDC and Food Stamp payment files and unemployment insurance files used in this and other reports, and candidly discussed their experiences during field research visits.

The following key staff deserve special thanks: in the Ohio department, Michael Haas, Joel Rabb, Richard Deppe, Michael Koss, Scott Kozlowski, Nancy Mead, and Brenda Newsome; and in the Franklin County department, John Hahn, Mary Lou Langenhop, Leila Hardaway, Annette Mizelle, Toni Smith, and Georgianna Hayes.

In addition, gratitude is owed the research sample members, who went through the random assignment process, granted researchers access to confidential information about themselves, and participated in surveys. Without them, the research would not have been possible.

How to Obtain a Printed Copy

To obtain a printed copy of this report, fax or mail the title and your name and mailing address to:

Human Services Policy, Room 404E
Assistant Secretary for Planning and Evaluation
U.S. Department of Health & Human Services
200 Independence Ave, SW
Washington, DC 20201

Fax: 202-690-6562

1. Introduction

Welfare program case management is usually organized in one of two ways. Under traditional case management, welfare recipients interact with two separate workers: one who handles welfare eligibility and payment issues, often called income maintenance, and one who handles employment and training issues. Under integrated case management, welfare recipients work with a single staff member who handles both the income maintenance and the employment and training aspects of their case. Although both strategies have certain advantages (for example, the traditional structure allows staff members to specialize in one particular role, while the integrated structure allows staff members to quickly emphasize the importance of employment and eliminates failures in communication between staff members), little information exists on the effects of the two approaches.

This report presents results of a study designed to evaluate the two case management approaches. The study was conducted in Columbus, Ohio,(1) as part of the National Evaluation of Welfare-to-Work Strategies (NEWWS Evaluation), a large-scale study of welfare-to-work programs in seven sites across the nation. The evaluation is being conducted by the Manpower Demonstration Research Corporation (MDRC), under contract to the U.S. Department of Health and Human Services (HHS), with support from the U.S. Department of Education.(2) For the study, Columbus ran two separate welfare-to-work programs: one that used integrated case management, referred to in this report as the integrated program, and one that used traditional case management, referred to as the traditional program. Apart from the case management difference, the welfare-to-work programs were the same: They required welfare recipients to participate in activities designed to enhance their skills before looking for work, provided child care and other services to support this participation, and penalized those who did not follow program rules by decreasing their cash grant.

This report provides information on how the integrated and traditional programs were implemented, how the programs affected participation in employment-related activities, and the costs of providing employment-related services in the two programs. It also discusses program effects, measured three years after people entered the programs, on employment, earnings, and welfare receipt.(3) (A future NEWWS Evaluation report will follow Columbus sample members for five years and present a comparison of the programs' benefits and costs.)

The evaluation's rigorous research design (discussed later in this chapter) allows researchers to determine the effects of each program, as well as the relative effects of the two programs. In other words, the report provides two types of information. First, it describes and evaluates the effects of two mandatory, education-focused welfare-to-work programs relative to the effects of no special welfare-to-work program. The Columbus programs differ from many previously studied programs that emphasized skills-building: most of those programs engaged participants chiefly in basic education classes, whereas the Columbus programs engaged many people in basic education but also engaged others in post-secondary education, primarily at two-year colleges. (After the follow-up period covered in this report, Columbus's welfare-to-work program changed its focus to quick entry into the labor market. The next section briefly describes the reformed program.)

Second, this report compares the effectiveness of a welfare-to-work program that used integrated case management with the effectiveness of one that used traditional case management. For example, the report discusses which program engaged more recipients in program activities, which produced larger welfare receipt reductions, and which generated larger earnings increases. Because all features of the two programs were identical, except for their case management approach, these comparisons indicate the relative effectiveness of the two case management approaches.

Policy Context of the Columbus Programs

Columbus's integrated and traditional programs were operated under the Family Support Act (FSA) of 1988. The FSA required states to provide education, employment, and support services to Aid to Families with Dependent Children (AFDC) recipients, who were, in turn, required to participate in the Job Opportunities and Basic Skills Training (JOBS) program created by the act to equip them for work. (The abbreviations AFDC and JOBS are used throughout this report.)

In 1996, Congress passed the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA), which replaced AFDC with a block grant program, Temporary Assistance for Needy Families (TANF). The law limits most families to five years of federal assistance and creates financial incentives for states to run mandatory, work-focused welfare-to-work programs. States must meet work program participation rates that are higher than those under the FSA or face reductions in their TANF block grant amount. Although the PRWORA substantially reformed the nation's welfare policies, its overarching goal is similar to the goal of the FSA: to foster the economic self-sufficiency of welfare recipients through increased employment and decreased welfare receipt.

Columbus (and the rest of Ohio) began operating its TANF program in October 1997, after the follow-up period covered in this report. Ohio Works First, as the name suggests, shifted the focus from building welfare recipients' skills through education to quickly engaging them in jobs. Among other changes, the program limits recipients to three years of cash benefits (with up to two additional years of benefits available under certain circumstances) and increases the severity of penalties for noncompliance with program requirements. At the same time, Columbus began to use integrated case management for all TANF recipients.

Historical Context of Integrated and Traditional Case Management

The idea of administering income maintenance together with employment services and other social services is not new.(4) After the Social Security Amendments of 1962, which increased federal compensation to state welfare agencies for administrative costs related to social services, most states adopted what was called a casework model. Welfare departments hired caseworkers to review applications for welfare and to attempt to "rehabilitate" recipients so that they would become self-supporting.(5) Proponents of the casework model believed that it would let welfare staff show concern for recipients during the course of income maintenance discussions and respond to problems, and make it easier for recipients to request services.

Critics of the model questioned its effectiveness and philosophical underpinnings.(6) In many states, staff members hired to perform casework were not professionally trained and did not know what to look for or how to confront recipients about the problems they observed. Few "hard" services, such as job training, placement assistance, or substance abuse treatment, were provided. Professional social workers argued that "the money function disables or overwhelms the social services."(7) Conservative lawmakers in Congress feared that liberal caseworkers authorized benefits to which individuals were not entitled. Welfare rights and civil rights groups objected to the assumption that welfare recipients needed rehabilitation and attacked the home visits as an invasion of privacy. Responding to these criticisms, in 1967 the U.S. Department of Health, Education, and Welfare (HEW) issued a directive that urged states to reorganize the administration of their welfare programs by creating separate line agencies to determine welfare eligibility and provide social services.(8)

The 1967 Work Incentive (WIN) program directed some AFDC recipients to participate in employment-related activities and provided funding for services including job counseling and placement, work experience, and on-the-job training. The WIN program fostered the separation of income maintenance and employment services because of its joint administration by HEW and the U.S. Department of Labor. This administrative structure was replicated at the state and local levels in all but two states, resulting in a system in which welfare staff generally referred recipients outside the income maintenance office to employment security agencies for WIN assessment and services.(9)

By the 1970s, the separation of social services and employment services from income maintenance left most welfare offices focused on determining eligibility, authorizing welfare grants, and distributing welfare checks. Many agencies that once recruited college graduates to do casework downgraded the income maintenance role to a clerical level. A goal of minimizing AFDC payment errors replaced the previous decade's goal of rehabilitating welfare recipients. The federal and state governments invested in automated systems that could calculate grant amounts, approve benefits, and send out checks and other notices. Although many welfare agencies became efficient at these tasks, the welfare system remained unpopular with most recipients, taxpayers, and politicians.(10) The WIN program was also criticized for failing to provide effective employment-related services and to enforce a meaningful participation requirement.(11)

During the 1980s, Congress and HHS attempted to increase work among welfare recipients and reduce welfare receipt but did not try to change or redefine the role of the line worker in local welfare agencies.(12) To the contrary, the federal government gave state welfare agencies more authority to determine how their welfare-to-work programs would be administered. The Omnibus Budget Reconciliation Act of 1981 gave state welfare agencies the option of running their WIN programs themselves rather than in cooperation with employment security agencies.(13) The FSA of 1988, which created the JOBS program, made state welfare agencies directly accountable for enrolling welfare recipients in education and work-related services. Under the JOBS program, states were required to develop a coordinated system of service delivery that would involve many public institutions working together on behalf of welfare recipients: welfare agencies, employment security offices, Job Training Partnership Act systems, public schools, and community or state colleges.

Because JOBS involved a brokered model of service delivery and states had to meet minimum participation rates among JOBS-mandatory welfare recipients, case management emerged as a central feature of most states' programs. Typical case management responsibilities included the following: assessing welfare recipients' employability, placing recipients in appropriate services, arranging support services such as child care, overseeing participation in JOBS activities, and initiating financial sanctions for those who did not follow JOBS rules.(14) Most states continued to separate income maintenance and employment-related services, although a 1992 survey found that 17 states operated programs in which JOBS case managers performed an integrated income maintenance and JOBS role.(15) (As this report was being prepared, information was not yet available on the number of states that have elected to combine or separate income maintenance and employment services under the PRWORA.)

Both case management approaches can be argued to have certain advantages and disadvantages. The separation of income maintenance from employment and training tasks allows each staff member to specialize in a particular role. It can also allow the employment services case managers to develop a distinct and often more prestigious professional identity. Common criticisms of this model are that a lack of coordination between income maintenance and employment and training services may prevent the quick enrollment of welfare recipients in work activities or may hinder the imposition of penalties on individuals who do not comply with work participation requirements.

By combining the income maintenance and employment and training roles in one position, the integrated approach eliminates failures in communication between staff members. Integration also allows staff to quickly emphasize the importance of employment. Two prominent welfare scholars have suggested that integration may change the "eligibility-compliance culture" of the average welfare office to a "self-sufficiency culture," that is, one that structures "interactions and expectations around work and preparation for work, with most of the attention of clients and workers devoted to moving off welfare rather than to validating the credentials for staying on it."(16) A common criticism of integrated case management, however, is that the two functions may overwhelm staff members, and, because they must deal with welfare payments each month, this may lead them to pay less attention to employment and training.

Integrated and Traditional Programs in Columbus

MDRC researchers, Ohio and Franklin County welfare officials, and HHS officials developed the integrated and traditional programs in Columbus. As noted earlier, the integrated program relied on one type of staff member, integrated case managers, to perform both the income maintenance and employment and training tasks for welfare recipients. The traditional program, in contrast, employed two types: income maintenance (IM) workers and JOBS case managers.

The program developers planned that integrated case managers would carry relatively small caseloads, so they could work closely with all of their clients. In practice, as discussed in more detail in Chapter 2, integrated caseloads were somewhat larger than the designers had intended. In fact, the overall recipient-to-staff ratio in the integrated program was similar to that in the traditional program (every two integrated case managers worked with approximately the same number of welfare recipients as every two staff members in the traditional program). Because the evaluation in Columbus is comparing the effectiveness of integrated and traditional case management approaches with the same recipient-to-staff ratios, any differences that exist between the programs' outcomes can be attributed to the case management approach.

Beyond the case management difference, the major facets of the integrated and traditional programs were identical. They both required welfare recipients to participate in program activities or face a reduction in their cash grant. Both programs aimed to build sample members' skills before requiring them to look for work and engaged people in a wide array of services, including basic education, post-secondary education (primarily classes at two-year colleges), work experience, and job search activities. They provided child care assistance, transportation assistance, and other services to support participation in these activities, and both programs benefited from an unusually rich array of administrative supports. (Chapter 2 provides more detail on the implementation of the programs.) Furthermore, participants in the programs were subject to the same public assistance eligibility and payment system.

Evaluation of the Programs in Columbus

Research Questions and Hypotheses

This report answers several research questions about the integrated and traditional programs in Columbus:

  • How did the integrated and traditional programs operate and what were the differences between them?
  • How did the programs affect involvement in employment and training activities and how did they deal with people who did not comply with program requirements? Did one program engage more recipients in program activities than the other?
  • What were the costs of employment-related services in the programs and how do those costs compare?
  • What were the effects of the programs, relative to the experience of a control group, on employment, earnings, and welfare receipt and payments? How do the effects of the integrated and traditional programs compare?

As the questions above indicate, one of the main purposes of the evaluation in Columbus is to compare the integrated and traditional case management approaches. Following are the four key hypotheses about the differences between the programs that the evaluation designers developed at the beginning of the study.

  1. The integrated case management structure would allow case managers to deliver program services more efficiently and effectively and to more closely monitor welfare recipients' situations than the traditional structure.

    The evaluation designers assumed that integrated case management would operate more efficiently than traditional case management because each recipient would work with only one staff member. This would reduce time spent on communication among staff, as well as reduce delays between case events. They also thought that integrated staff would have closer relationships with recipients. Because integrated case managers handle both eligibility and employment services, they would see recipients more often and have a more complete picture of their situation.

  2. The integrated approach would engage more people in the program than the traditional approach.

    The evaluation designers hypothesized that the integrated program would lead to a higher attendance rate at JOBS orientation and subsequently to a higher participation rate in JOBS activities, and thus would better enforce the "social contract" idea that people who receive welfare should be engaged in employment-focused services. This hypothesis was based primarily on the belief that welfare recipients would take the threat of financial sanction more seriously from an integrated case manager who could impose the sanction herself than from a traditional case manager who had to rely on another staff member, an IM worker, to impose the sanction. In addition, the evaluation designers thought that recipients might have more difficulty avoiding participation requirements if they had to deal with one worker who knew their whole situation, rather than two workers who each had limited information about their JOBS and welfare statuses.

  3. The integrated program would produce larger increases in employment and earnings than the traditional program.

    The architects of the study believed that if, as suggested above, the integrated program exposed more people to the program messages and services (by engaging more people in the program), and more efficiently and effectively delivered services, it also would produce larger effects on employment and earnings.

  4. The integrated program would produce larger decreases in welfare receipt and payments than the traditional program.

    This hypothesis was based on two factors. First, if the integrated program increased employment and earnings more than the traditional program, as discussed above, that, in turn, would likely have resulted in larger welfare reductions. Second, evaluation designers expected that the integrated structure would engender more effective eligibility case management than the traditional structure. Integrated case managers might find out about employment and welfare status changes more quickly than traditional staff because they would see their clients more often. They would also be able to respond more quickly to status changes because they could reduce a grant amount or close a grant themselves rather than requesting another staff member to do so. It is also possible that the closer contact between integrated case managers and recipients could help these staff members learn about eligibility changes that traditional staff might not.

Research Design

The study in Columbus uses an unusually strong research design, a random assignment experiment, to estimate program effects. In this design, welfare applicants and recipients were assigned, at random, to one of three groups:

  • the integrated group, whose members were required to participate in the integrated program or face a reduction in their cash grant (a financial sanction);
  • the traditional group, whose members were required to participate in the traditional program or face a financial sanction; or
  • the control group, whose members were neither required nor eligible to participate in any special welfare-to-work program. (Control group members could seek out employment-related services available in the community, and if they did, they could receive child care assistance from the welfare department.)

Because people were assigned to the three groups through a random process, any differences that emerge over time between the groups' outcomes, such as average earnings or welfare payments, can reliably be attributed to the programs.

The three-way design allows researchers to make two types of rigorous comparisons. First, estimates of the net effects of each program can be made by comparing outcomes of the integrated group with outcomes of the control group and by comparing outcomes of the traditional group with outcomes of the control group. (The integrated and traditional groups are also referred to as program groups in this report.) Second, estimates of the differential effects of the programs can be made by comparing outcomes of the integrated group with outcomes of the traditional group. All of these differences in outcomes are referred to as impacts.
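
To illustrate the mechanics of this design, the sketch below randomly assigns a hypothetical sample to the three groups and computes the two kinds of comparisons described above. The group proportions follow endnote 17, but the outcome data, variable names, and the simple difference-in-means estimator are illustrative assumptions, not the evaluation's actual estimation code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical research sample: 7,242 members assigned at random to the
# three groups (proportions from endnote 17: 35/35/30 percent).
n = 7242
group = rng.choice(["integrated", "traditional", "control"],
                   size=n, p=[0.35, 0.35, 0.30])

# Placeholder outcome (e.g., average earnings); the real outcomes come from
# UI wage records and AFDC administrative data.
earnings = rng.normal(3000, 1500, size=n).clip(min=0)

def mean_outcome(g):
    """Average outcome for one research group."""
    return earnings[group == g].mean()

# Net impacts: each program group versus the control group.
net_integrated = mean_outcome("integrated") - mean_outcome("control")
net_traditional = mean_outcome("traditional") - mean_outcome("control")

# Differential impact: the integrated group versus the traditional group.
differential = mean_outcome("integrated") - mean_outcome("traditional")

print(net_integrated, net_traditional, differential)
```

Because assignment is random, each of these differences in group means is an unbiased estimate of the corresponding program impact.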

In Columbus, 7,242 single-parent welfare applicants and recipients who were determined to be JOBS-mandatory were randomly assigned for the evaluation. During the period studied, Columbus required participation in the JOBS program of all recipients whose youngest child was at least 3 years old and who did not meet federal exemption criteria. The exemption criteria included working 30 hours or more per week, being ill or incapacitated or caring for an ill or incapacitated household member, being of advanced age, being in at least the second trimester of pregnancy, or living in a remote area that made program activities inaccessible.

The steps leading to random assignment are depicted in Figure 1.1. Between September 1992 and July 1994, IM workers identified new AFDC applicants and ongoing AFDC recipients (who were single parents aged 21 or over) who were JOBS-mandatory. Once an individual was approved to receive welfare, she was randomly assigned to the integrated, traditional, or control group.(17) People in the integrated group were assigned to an integrated case manager; people in the traditional group were assigned to an IM worker and a traditional JOBS case manager; and people in the control group were assigned to an IM worker (but not to a JOBS case manager). Then, the integrated and traditional JOBS case managers were responsible for sending a letter to each person scheduling her for an orientation session.(18) (See Chapter 3 for more information on the programs' orientation process.)

The fact that random assignment occurred at the welfare office when individuals were referred to JOBS affects how the results in this report should be interpreted. In this design, sample members who did not show up for a JOBS orientation are included in the research sample, and their outcomes, such as earnings and AFDC payments, are averaged together with those of orientation attenders. Since orientation is the gateway to program services, people who did not attend an orientation session could not receive any services. In most of the other sites in the NEWWS Evaluation, random assignment occurred during a JOBS orientation, and thus orientation nonattenders were excluded from the sample. Whereas the evaluations in these other sites test only the effects of the program services and mandates, the evaluation in Columbus captures these program effects plus the effects of a referral to a mandatory welfare-to-work program and any follow-up relating to this referral, such as sanctioning for failing to attend an orientation session.(19)

Figure 1.1
Steps Leading From Income Maintenance to Random Assignment

Key Characteristics of the Program Environment and the Sample Members

Program Environment

Table 1.1 summarizes some key aspects of the Columbus environment. Between 1992 and 1997, the period covered in this report, Columbus was a growing metropolitan area with a population of close to 1 million. The labor market was robust: the unemployment rate was low and decreased throughout the follow-up period, and employment grew by 8 percent.

Table 1.1
Characteristics of the Program Environment

Characteristic                                               Total
Population, 1990                                             961,437
Population growth, 1990-1995 (%)                             5.2
Unemployment rate (%)
    1992                                                     4.6
    1993                                                     4.5
    1994                                                     3.7
    1995                                                     2.9
    1996                                                     2.9
    1997                                                     2.7
Employment growth, 1992-1997 (%) (a)                         8.1
AFDC caseload (b)
    1992                                                     24,583
    1993                                                     24,904
    1994                                                     24,393
    1995                                                     21,786
    1996                                                     19,474
    1997                                                     16,886
AFDC grant level for a family of three, 1993 ($)             341
Food stamp benefit level for a family of three, 1993 ($) (c) 292

Source: Hall and Gaquin, eds., 1997; Freedman et al., 2000; Hamilton and Brock, 1994; U.S. Department of Labor, Bureau of Labor Statistics; site contacts.
Notes: Data are for Franklin County. (a) Employment growth figures were calculated using data from the U.S. Department of Labor, Bureau of Labor Statistics. (b) Annual average single-parent monthly caseloads as reported by the state. (c) Assumes the receipt of the maximum AFDC payment.

The number of people receiving cash assistance was relatively stable at the start of the study period and began to decrease after 1994.(20) Over the follow-up period the caseload decreased by 31 percent. In 1993, a family of three could receive up to $341 per month through the AFDC program, slightly less than the national median of $367, and up to $292 worth of Food Stamps.

Under the FSA, all states were required to disregard some earned income when calculating a family's AFDC grant. During the first four months of employment, $120 and an additional one-third of the remainder of monthly earnings were disregarded. During the next eight months of employment, a flat $120 was disregarded. After one year of work, the disregard fell to $90. In addition, recipients could disregard child care expenditures, up to $175 per child aged 2 and over and $200 per child under age 2.(21) In July 1996, Ohio increased the amount of earned income that would be disregarded to $250 and 50 percent of the remainder of monthly earnings for 12 months. This increased disregard was expanded to 18 months with the implementation of the state's TANF plan in October 1997.(22)
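
As a concrete check on the disregard arithmetic just described, here is a minimal sketch of the FSA-era calculation. The function name is hypothetical, and the schedule is simplified; real AFDC budgeting involved additional rules not modeled here.

```python
def countable_earnings(gross_monthly_earnings, months_employed,
                       child_care_costs=0.0):
    """Approximate the FSA earned income disregard schedule (simplified)."""
    if months_employed <= 4:
        # First four months: disregard $120 plus one-third of the remainder.
        remainder = max(gross_monthly_earnings - 120, 0)
        countable = remainder - remainder / 3
    elif months_employed <= 12:
        # Next eight months: flat $120 disregard.
        countable = max(gross_monthly_earnings - 120, 0)
    else:
        # After one year of work: the disregard falls to $90.
        countable = max(gross_monthly_earnings - 90, 0)
    # Child care expenditures (up to the per-child caps noted in the text)
    # could also be disregarded.
    return max(countable - child_care_costs, 0)

# Example: $600/month in the second month of work, no child care costs:
# (600 - 120) * 2/3 = $320 counted against the grant.
print(countable_earnings(600, 2))  # 320.0
```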

Sample Members

Table 1.2 shows some characteristics of the research sample in Columbus, measured immediately prior to random assignment.(23) Most people in the sample were women; their average age was 32; roughly half were white and half were black; and they had two children on average.

Typical sample members had limited experience in the labor market: Fewer than half reported that they had ever worked full time for six months for one employer, and fewer than one-third reported that they had worked for pay in the year before random assignment. Nearly three-fifths of the sample had received a high school diploma or GED certificate. Almost three-fourths of the sample had received AFDC for at least two years, and a substantial proportion were living in public, subsidized, emergency, or temporary housing. The Columbus sample is among the most disadvantaged of the samples in the NEWWS Evaluation sites.(24)

Table 1.2
Selected Characteristics of Sample Members

Characteristic                                       Full Sample   High School Diploma or GED   No High School Diploma or GED

Demographic characteristics
Sex (%)
    Male                                             6.5           6.6            6.4
    Female                                           93.5          93.5           93.6
Average age (years)                                  31.8          31.9           31.8
Ethnicity (%)
    White                                            46.5          41.3           53.5
    Black                                            52.0          57.6           44.4
    Hispanic                                         0.4           0.4            0.5
    Other                                            1.2           0.8            1.7

Family status
Youngest child's age (%)
    2 or under                                       1.8           2.0            1.4
    3 to 5                                           45.1          46.0           43.9
    6 or over                                        53.1          52.0           54.7
Average number of children                           2.0           1.9            2.2

Labor force status
Ever worked full time for six months or more
for one employer (%)                                 42.5          50.1           32.3
Any earnings in past 12 months (%)                   28.2          34.6           19.5

Education status
Received high school diploma or GED (%)              57.4          100.0          0.0
Highest grade completed (average)                    11.2          12.0           10.0
Currently enrolled in education or training (%)      7.8           7.7            8.1

Public assistance status
Received AFDC for two years or more prior to
random assignment (%) (a)                            72.7          66.7           80.7

Housing status
Current housing status (%)
    Public housing                                   15.2          15.3           15.2
    Subsidized housing                               24.7          25.3           23.9
    Emergency or temporary housing                   1.4           1.3            1.6
    None of the above                                58.7          58.1           59.3

Sample size (b)                                      7,242         4,135          3,073

Source: MDRC calculations from information routinely collected by welfare staff.
Notes: (a) This refers to the total number of months accumulated from at least one spell on an individual's own or spouse's AFDC case. It does not include AFDC receipt under a parent's name. (b) Thirty-four individuals in the full sample who did not indicate whether they had a high school diploma or GED at random assignment were excluded from the subgroup analysis.

Data Sources and Sample Sizes

This report presents implementation, participation, cost, and impact results for individuals who were randomly assigned between September 1992 and July 1994. Results and their data sources include:

  • Demographic and other characteristics as of random assignment, collected by staff during the application or redetermination for assistance at the welfare office. These data are available for all 7,242 sample members included in this report.
  • Welfare department staff members' attitudes and opinions about the programs, recorded in a survey administered in October 1993.
  • Interviews with staff members and observations of program activities, completed as part of field research conducted in November and December 1993 and August 1994.
  • Data on JOBS activity participation rates and patterns, collected from a review of JOBS case files using standard coding procedures. Case files were reviewed for a random subsample of program group members who were randomly assigned between October 1992 and March 1993.
  • Participation impacts, computed using results from a survey administered approximately two years after random assignment. Surveys were administered to a subsample of individuals who were randomly assigned between January and December 1993. These data are available for 1,094 individuals in the program and control groups.
  • The cost of the integrated and traditional programs, calculated using state and county fiscal reports, support service payment records, administrative records, client survey responses, case file participation records, education provider fiscal reports, and published data.
  • Employment, earnings, and welfare impacts, computed using automated state unemployment insurance records and AFDC administrative records data. These data are available for all 7,242 sample members.
  • Comparisons with other programs in the NEWWS Evaluation, made using similar data from nine other welfare-to-work programs.

Sample Sizes, by Data Source and Research Group

Data Source                              Full Sample    Integrated Group   Traditional Group   Control Group

Standard client characteristics
    Sample size                          7,242          2,513              2,570               2,159
    Period of random assignment          9/92 - 7/94    9/92 - 7/94        9/92 - 7/94         9/92 - 7/94

AFDC administrative records and UI-reported earnings
    Sample size                          7,242          2,513              2,570               2,159
    Period of random assignment          9/92 - 7/94    9/92 - 7/94        9/92 - 7/94         9/92 - 7/94

Two-Year Client Survey
    Sample size                          1,094          371                366                 357
    Period of random assignment          1/93 - 12/93   1/93 - 12/93       1/93 - 12/93        1/93 - 12/93

Case file participation data
    Sample size                          443            225                218                 n/a
    Period of random assignment          10/92 - 3/93   10/92 - 3/93       10/92 - 3/93        n/a

Staff surveys
    Integrated case managers             22             n/a                n/a                 n/a
    JOBS case managers                   39             n/a                n/a                 n/a
    IM workers                           114            n/a                n/a                 n/a

About This Report

Data Sources, Samples, and Time Frame for the Analysis

This report draws data from several sources. The accompanying text box describes the data sources used for each analysis and presents the sample sizes that correspond to each source. To facilitate cross-program comparisons, the participation and cost analyses for Columbus cover a two-year period, as did the analogous analyses of other programs that were presented in earlier NEWWS reports. Analyses of the integrated and traditional programs' impacts on employment, earnings, and welfare receipt, however, cover three years of follow-up (because three years of employment and welfare data were available when the analysis was conducted).(25)

As mentioned earlier, the follow-up period covered in this report preceded the implementation of the state's TANF program. It is worth noting, however, that when Ohio Works First began in October 1997, the "embargo" on control services was lifted. In other words, control group members who were receiving welfare or who reapplied for assistance could then be mandated to participate in the state's welfare-to-work program. In addition, at this time all sample members (in the control, integrated, and traditional groups) began receiving integrated case management. (Future reports whose follow-up period extends past October 1997 will address this issue.)

Organization of the Report

The report is divided into five chapters. After this introductory chapter, Chapter 2 describes the implementation of the integrated and traditional programs in Columbus, focusing on issues such as the employment-preparation strategy used and various case management practices. Chapter 3 presents findings on integrated and traditional group members' involvement in employment-related activities as part of their respective programs. The chapter also compares the activity levels of integrated and traditional group members with those of their control group counterparts to determine the net effect of the two programs on participation. Chapter 4 provides estimates of the cost of employment-related services in each program, and Chapter 5 presents the programs' effects on employment, earnings, and cash assistance receipt.

Endnotes

1.  This study draws its sample and data from Franklin County, Ohio. For ease of reference, the name of the county's largest city, Columbus, will be used throughout this report.

2.  Child Trends, as a subcontractor, is conducting analyses of outcomes for young children in three of the sites. Columbus is not included in this substudy.

3.  This report draws on an earlier paper prepared as part of the NEWWS Evaluation (Brock and Harknett, 1998b); a revised version of this paper was published in Social Service Review (Brock and Harknett, 1998a). An earlier NEWWS report (Hamilton and Brock, 1994) discusses the early implementation of the Columbus programs and the other programs in the evaluation, and a recent report (Freedman et al., 2000) presents two-year impacts for all the programs.

4.  This section is slightly modified from Brock and Harknett, 1998a.

5.  Bell, 1983; Bane and Ellwood, 1994.

6.  Bell, 1983; Bane and Ellwood, 1994.

7.  Hamilton, 1962, p. 128.

8.  Bane and Ellwood, 1994.

9.  Nightingale and Burbridge, 1987.

10.  Bell, 1983; Bane and Ellwood, 1994.

11.  Rein, 1982; Mead, 1986.

12.  See, for example, Gueron and Pauly, 1991; Rein, 1982.

13.  Nightingale and Burbridge, 1987.

14.  Hagen and Lurie, 1994.

15.  American Public Welfare Association, 1992.

16. Bane and Ellwood, 1994, p. 7.

17.  Thirty-five percent of the individuals were assigned to the integrated group, 35 percent to the traditional group, and 30 percent to the control group.

18.  This sequence of staff contact differs from what normally occurs in a program using integrated case management. To accommodate the random assignment process in Columbus, all applicants and recipients first met with a worker at the IM office; integrated group members did not see an integrated case manager until a later date when they attended a JOBS orientation. Normally, in a program using integrated case management, the integrated case manager is the first and only person to see a recipient, which allows the staff member to immediately address employment issues.

19.  To allow a separate study of the deterrence effects of a participation mandate and of reasons for not attending program orientation sessions, random assignment in the Grand Rapids and Riverside sites occurred at two points: at the point of referral to JOBS and at JOBS orientation. See Knab et al., 2001, for more details.

20.  Some of the annual caseload counts differ somewhat from those presented in Freedman et al., 2000, because a different data source was used.

21.  Greenberg, 1992.

22.  Gallagher et al., 1998.

23.  The table presents characteristics for the entire Columbus sample: the integrated, traditional, and control groups.

24.  See Freedman et al., 2000, for baseline characteristics for the samples in the other six NEWWS Evaluation sites.

25.  The follow-up period covers different dates for each sample member, depending on the date she or he was randomly assigned. As noted above, random assignment occurred between September 1992 and July 1994. Thus, the inclusive dates for the two-year follow-up period are September 1992 to July 1996, and the dates for the three-year follow-up period are September 1992 to July 1997.

2. Implementation of the Integrated and Traditional Programs

This chapter describes how the integrated and traditional programs were implemented in Columbus, with a particular focus on comparisons between the programs. The data for this chapter are primarily from the staff and client surveys, MDRC field research completed in 1993 and 1994, and numerous other site visits and discussions with program staff. (See the text box in Chapter 1 for more detail about the data sources.)

Summary of Program Implementation

The integrated and traditional programs both emphasized skills-building prior to entry into the labor market. The programs especially stressed the importance of recipients getting a GED certificate, and they placed only the most employable in job search activities.

Recipient-to-staff ratios were similar in the two programs. Although caseloads for the integrated staff were larger than had been planned (limiting the amount of time that staff members could spend with each recipient and generating low morale), they were not so large that they prevented integrated case managers from successfully performing both their income maintenance and employment and training duties. This was facilitated, in part, by the extensive administrative support available to staff. Integrated case managers provided more personalized attention than traditional case managers and more closely monitored participation in program activities. Both programs strongly enforced the participation mandate.

Analysis Issues

Some facets of the integrated and traditional programs, such as the resources, facilities, and strategy used to prepare recipients for employment, were not expected to vary by program approach. The designers of the Columbus evaluation, however, expected that the two case management approaches would differ along one important dimension: service delivery. Specifically, as mentioned in Chapter 1, they hypothesized that the integrated structure would allow case managers to more efficiently and effectively deliver program services and monitor welfare recipients' situations than the traditional structure.

The evaluation designers assumed that integrated case management would operate more efficiently than traditional case management because each recipient would work with only one staff member. This would reduce time spent on communication among staff, as well as reduce delays between case events. Efficiency is difficult to gauge, however, and was not directly measured in this evaluation.

They also thought that integrated staff would have closer relationships with the recipients they worked with and would know more about them. Because integrated case managers handle both eligibility and employment services, they would see recipients more often and would have a more complete picture of their situation. This chapter explores this assumption by discussing the level of personalized attention in the programs and the degree to which staff monitored participation in program activities.

Program Resources and Facilities

The administrators of the Columbus welfare program placed a high priority on the JOBS program; they considered it the centerpiece of an agency-wide mission to make welfare temporary and employment-focused. During the first few years of the evaluation, program administrators focused on increasing the JOBS program's capacity, with the goal of never turning someone away for lack of appropriate services. They largely succeeded: Unavailability of services was rarely, if ever, a problem.

Field researchers rated the Columbus JOBS facilities as "outstanding" compared with those of other welfare-to-work programs. The JOBS center, physically separate from the welfare office, housed the employment and training staff for the integrated and traditional programs. The center, which was extensively renovated prior to the evaluation, also provided spacious classrooms for basic education and job search instruction; offices for state employment services staff and for county alcohol, substance abuse, and mental health workers; and a child care facility for children between ages 2 1/2 and 5.

The programs also benefited from an unusual level of administrative support. Columbus had a child care unit that connected parents with child care providers and a resource unit that collected JOBS activity attendance information and provided it to case managers. Columbus used an automated case record information system, called CRIS-E, which contained information on individuals' past public assistance benefits, JOBS activity assignments, and sanctions for noncompliance. The system guided staff through the welfare eligibility determination process and the JOBS assessment. Although some staff complained about using CRIS-E, it was a powerful system that enabled case management to be fully automated.

Employment Preparation Strategy

Welfare-to-work programs have used different strategies to foster recipients' economic self-sufficiency. Employment-focused, or labor force attachment (LFA), programs have aimed to quickly move people into jobs by requiring and helping them to look for work, reflecting the belief that people can most effectively build employability through work experience. Education-focused, or human capital development (HCD), programs, in contrast, have emphasized building skills through education and training as a precursor to employment, reflecting the belief that an initial investment in the skills levels of welfare recipients will allow them to eventually obtain higher-paying and more secure jobs.

The integrated and traditional programs in Columbus were both education-focused (and were not designed or expected to differ in terms of this program dimension). The programs did not have a specific prescribed activity sequence, but staff strongly encouraged people who did not have a high school diploma or GED certificate to earn one by attending basic education classes. They encouraged many of those who already had a diploma or GED to attend vocational training or post-secondary education classes or to participate in work experience before actively seeking a job. (The accompanying text box describes the various activities offered in Columbus.) The following remarks made by an integrated case manager typify the comments made by many Columbus staff members during field research:

My opinion is that clients should get an education. They should work toward a job that will get them off welfare. If they take a job flipping hamburgers, they will end up right back on welfare.

Program Activities and Services

The Columbus JOBS program supported participation in a wide variety of activities, including:

  • Job search. Job clubs were run at the Columbus JOBS center and the local Goodwill agency. They combined classroom instruction on searching for a job with actual job search. Columbus also required some people  typically those who did not need training on writing a résumé or interviewing  to search for a job on their own, with frequent check-ins with their case manager.
  • Basic education. The welfare department contracted with the public school system to offer basic education classes at the JOBS center. Classes offered included General Educational Development (GED) certificate preparation courses, Adult Basic Education courses that provided reading and mathematics instruction for people whose achievement levels were too low for entry into the GED course (usually at the 8th grade level and below), and English as a Second Language classes that provided non-English speakers with instruction in spoken and written English. During the evaluation, Columbus developed specialized classes for recipients with very low literacy levels.
  • Post-secondary education. Columbus allowed people to take courses for credit toward a college degree at two-year and four-year colleges.
  • Vocational training. Offered primarily through public vocational schools and private proprietary schools, these classes provided occupational training, for example, for nurse's assistants, and in areas such as office computer applications.
  • Work experience. Participants were placed in unpaid positions (they continued to receive their welfare grant) with employers to develop job skills. Most participants were placed in clerical positions, but program staff were willing to match placements to recipients' career interests.
  • Life skills workshops. Columbus offered a pre-education retention program, operated by the local community college, that included career exploration, self-esteem-building activities, and advice on time management and study skills.

Columbus offered support services, including:

  • Child care. The JOBS program paid providers for child care costs incurred as a result of participation for program and control group members who enrolled in employment and training activities. The Columbus JOBS center also provided on-site child care for children aged 2 1/2 to 5. If eligible, sample members could be reimbursed through the Transitional Child Care program for child care expenses incurred while they were employed and no longer receiving cash assistance.
  • Work allowances. The program paid program participants work allowances to cover transportation costs and other incidental costs.

Staff referred only the most employable recipients to job search services (typically those who had at least a high school diploma or GED, some work experience, and no serious problems, such as substance abuse, that might interfere with working). In fact, program participants were sometimes given the impression that a GED was "mandatory" for employment, and staff operating the job clubs and other placement activities preferred that people have a diploma or GED before starting these activities.

Field researchers observed that many Columbus staff members perceived their purpose to be helping recipients overcome barriers, not finding specific job openings for them; one case manager said, "This is not an employment agency." Also, although the programs had full-time job developers, information on job leads was not communicated effectively to case managers or to recipients, at least early in the follow-up period. During the evaluation, Columbus developed a placement specialist position to connect job developers with case managers; staff disagreed about whether this improved the situation.

Scales created from a survey of staff in all of the NEWWS Evaluation programs confirm that Columbus staff strongly favored the human capital development approach. The first set of bars in Figure 2.1 shows the percentage of integrated and traditional JOBS case managers who leaned toward either the labor force attachment or the human capital development approach as the better way to move recipients into jobs and off welfare.(1) Over 65 percent of Columbus staff leaned toward the HCD approach, and only 5 percent leaned toward the LFA approach. Staff who did not express a strong preference were not placed in either group. The percentage in Columbus favoring HCD is among the highest of the NEWWS Evaluation programs. (See the accompanying text box for a brief description of the other programs in the evaluation.)

The Other Programs in the National Evaluation of Welfare-to-Work Strategies
The National Evaluation of Welfare-to-Work Strategies is assessing the effectiveness of 11 welfare-to-work programs in seven sites, including the integrated and traditional programs in Columbus. Three sites in the evaluation (Atlanta, Georgia; Grand Rapids, Michigan; and Riverside, California) ran two different programs: an employment-focused, or labor force attachment (LFA), program and an education-focused, or human capital development (HCD), program. The employment-focused programs aimed to quickly get people into jobs, even at low wages, by requiring and helping them to look for work. In these programs, job search was the prescribed first activity for virtually the entire caseload. The education-focused programs emphasized education and training prior to entry into the labor market. In these programs, basic education was the most common first activity because of the generally low educational attainment of the enrollees at program entry. The research design in these three sites, as in Columbus, allows the evaluation to determine the effectiveness of the two different programs relative to no welfare-to-work program (represented by the outcomes of a control group whose members were not required or allowed to participate in either program), as well as the effectiveness of the programs relative to each other.

In the other three sites (Detroit, Michigan; Oklahoma City, Oklahoma; and Portland, Oregon), the evaluation is testing the net effects of the sites' welfare-to-work programs. The Detroit and Oklahoma City programs were primarily education-focused. The Portland program can be considered strongly employment-focused and moderately education-focused.

In total, the 11 evaluation programs range from strongly employment-focused to strongly education-focused and from somewhat voluntary to highly mandatory. The program sites offer diverse geographic locations, caseload demographics, labor markets, and welfare grant levels. However, because of NEWWS Evaluation selection criteria, the programs were all mature welfare-to-work programs, relatively free of the transitional problems associated with the start-up of a complex, multi-component welfare-to-work program. These programs, while not representing all welfare-to-work programs in the nation, represent a wide range of welfare-to-work options.

According to field research, many Columbus staff members encouraged recipients to look for and take jobs that paid more than minimum wage. Survey responses indicate that this varied somewhat by case management approach. As the second set of bars in Figure 2.1 shows, 32 percent of traditional staff (the highest percentage of any program) and 14 percent of integrated staff said that they encouraged recipients to be selective in taking a job. Program participants corroborated this difference: As the third set of bars shows, more recipients in the integrated program than the traditional program said on a survey that they felt pushed by their case manager to take a job before they were ready or before a good job came along (43 percent compared with 29 percent). This difference probably also reflects the fact that because the integrated structure facilitated more frequent contact between recipients and case managers, integrated case managers had more opportunities to reinforce the employment message.

Figure 2.1
Employment Preparation Strategy

Case Management

Staff Duties

Table 2.1 summarizes the primary duties of income maintenance (IM) workers, traditional JOBS case managers, and integrated case managers in Columbus. In the traditional program, IM workers determined eligibility for and authorized public assistance benefits provided by the welfare department, including cash assistance, Food Stamps, and Medicaid. They reevaluated recipients' eligibility for benefits every six months (or sooner if they became aware of a change in a recipient's status), changed benefit amounts as family composition changed or recipients found work, and imposed sanctions (AFDC grant reductions) at the request of JOBS case managers. Traditional JOBS case managers were responsible for the employment and training aspects of cases. They conducted JOBS orientation sessions, assessed recipients' skills and support service needs, assigned them to program activities, monitored their attendance and progress, and initiated sanctions for those who were noncompliant with program requirements. In the integrated program, integrated case managers performed all these duties.

Table 2.1
Description of Staff Duties

                                                   Traditional Program                Integrated Program
Duty                                               IM Workers   JOBS Case Managers   Integrated Case Managers
Handled all public assistance benefits             X                                 X
Authorized payments for JOBS-related expenses      X                                 X
Conducted JOBS orientation and assessment                       X                    X
Assigned recipients to JOBS activities                          X                    X
Monitored JOBS attendance and progress                          X                    X
Initiated sanctions for noncompliance                           X                    X
Imposed sanctions for noncompliance                X                                 X
Worked with recipients' entire household           X                                 X
Location of staff                                  IM office    JOBS office          JOBS office
Average caseload size                              265          258                  140

Sources: JOBS, income maintenance, and integrated staff activities and attitudes surveys; and MDRC field research.

The manner in which welfare cases were defined in Ohio affected the work of IM workers and integrated case managers. Welfare case numbers were assigned according to address. As a result, everyone receiving welfare at an address had the same case number and either the same IM worker or the same integrated case manager; thus, the staff member knew how expenses in that dwelling were covered. In addition, integrated case managers knew whether other public assistance recipients living at the address had jobs, participated in JOBS activities, or posed a barrier to a client's employment. Integrated staff could refer any welfare recipient at the address to JOBS. In the traditional program, JOBS case managers, in contrast, did not have access to this information and confined their intervention to individual clients.

Traditional JOBS case managers worked in one of two units: one that worked with people in education and vocational training activities and one that worked with people in job search and work experience activities (the "job-ready unit"). Staff reported that this division sometimes led to delays when someone who moved from an education or training activity to a job search activity had to wait until a case manager in the job-ready unit had time to meet with her. In contrast, integrated case managers worked with all types of people, who remained with the same case manager regardless of the activity they were involved in.

Caseload Sizes

Overall recipient-to-staff ratios were approximately the same in the two programs. As Table 2.1 shows, in the traditional program caseloads averaged about 260 for both IM workers and JOBS case managers; in the integrated program caseloads averaged 140. In other words, each pair of staff members in the traditional program (one IM worker and one JOBS case manager) worked with about 260 recipients, while each pair of staff members in the integrated program worked with about 280 recipients.
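To make the ratio explicit, here is a back-of-the-envelope restatement of the Table 2.1 averages (each traditional-program recipient appeared on two caseloads, an IM worker's and a JOBS case manager's, whereas each integrated-program recipient appeared on only one):

\[
\text{traditional: } \frac{(265 + 258)/2}{2} \approx 130 \text{ recipients per staff member}, \qquad \text{integrated: } \frac{140}{1} = 140 \text{ recipients per staff member}.
\]

On a per-worker basis, then, the two structures implied similar, though not identical, workloads.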

Columbus caseloads were at the high end of the range of those in other welfare-to-work programs.(2) However, caseloads were defined differently in different places. In Columbus, caseload tallies for JOBS and integrated case managers included some people who were not participating in JOBS. Also, as mentioned earlier, the Columbus JOBS program provided substantial support to help staff manage their large caseloads; in fact, the level of automated and administrative support for Columbus staff was among the highest of the programs in the NEWWS Evaluation.

Caseloads for integrated case managers (140) were larger than planned.(3) When designing the program, welfare administrators and MDRC researchers intended that integrated staff work with about 100 clients, including about 65 active JOBS participants.(4) Integrated caseloads were not so large, however, that they prevented staff from successfully performing their duties.

Evidence from Oklahoma City, another site in the NEWWS Evaluation, showed that when caseloads in an integrated approach are too large, the income maintenance role may overshadow the employment and training function, particularly if management emphasizes the income maintenance role. In Oklahoma City, large caseloads, coupled with the administrators' focus on income maintenance, limited the time that staff spent on employment and training.(5) In contrast, the Columbus program emphasized the importance of the employment and training aspects of cases. Thus, although caseloads were larger than planned and larger than may be ideal, integrated staff still spent a substantial amount of time focused on JOBS duties. On the staff survey, Columbus integrated staff indicated that they spent, on average, about one-third of their day on JOBS-related duties and two-thirds on income maintenance-related duties. In contrast, integrated staff in Oklahoma City said they spent only about one-fifth of their day on JOBS-related tasks and four-fifths on income maintenance tasks.

When surveyed, half of the integrated case managers in Columbus reported that they felt equally like IM workers and JOBS workers, and most of the rest felt more like IM workers. Almost all integrated staff in Oklahoma City, in contrast, said that they felt like IM workers. In Portland, Oregon, the third site in the NEWWS Evaluation using an integrated approach, most integrated staff viewed themselves primarily as JOBS workers or as both equally, and they said their workday was evenly split between JOBS and income maintenance duties. Average caseloads for integrated case managers in Portland were relatively small (95), and program administrators strongly emphasized the importance of the employment and training duties.(6)

Staff Characteristics and Attitudes Toward the Case Management Approaches

Before the evaluation began, the Columbus JOBS program used a traditional case management approach. The creation of the integrated model coincided with an expansion of the JOBS program and thus with an increase in staffing. IM workers and traditional JOBS case managers were invited to apply for the integrated positions, and new employees were recruited.

Table 2.2 shows that staff members who were hired for the integrated case management positions had somewhat less experience working for the Columbus welfare agency than traditional JOBS case managers and had less prior experience in an employment-related field. More integrated case managers, however, had at least a bachelor's degree. Integrated case managers, on average, were somewhat younger than other staff. All three types of staff were somewhat older, on average, than the recipients; the average ages of staff ranged from 34 to 42, whereas the average age of sample members was 32. The majority of staff were women, but the proportion of men in the staff was higher than the proportion of men in the study sample. The staff's racial-ethnic make-up was similar to that of the sample members, roughly half white and half black.

Table 2.2
Characteristics of Program Staff

Characteristic                                          Integrated Case    Traditional JOBS    IM Workers
                                                           Managers          Case Managers

Average number of years employed with agency                  5.1                7.3              11.1
Average number of years in current position                   1.1                1.9               5.4
Percent with prior experience in an
  employment-related field                                   22.7               41.0              23.7
Percent with prior experience as a(n):
  Caseworker in a WIN or other employment
    and training program (a)                                  0.0               10.3               7.0
  JTPA caseworker (a)                                         0.0                7.7               5.3
  Employment counselor, trainer, or job developer (a)        22.7               33.3              17.5
Percent with prior experience as an IM worker (a)             n/a               54.2               n/a
Highest degree/diploma earned (%):
  High school diploma/GED                                     9.1                2.6               8.9
  Some college                                               13.6               28.2              46.4
  Associate's degree                                          4.6               10.3              11.6
  Bachelor's degree or higher                                72.7               59.0              33.3
Average age (years)                                          34.2               41.5              41.0
Sex (%):
  Male                                                       18.2               31.6              14.3
  Female                                                     81.8               68.4              85.7
Race/ethnicity (%):
  White                                                      50.0               44.4              37.3
  Black                                                      45.0               44.4              53.6
  Hispanic                                                    0.0                2.8               0.0
  Native American/Alaskan Native                              0.0                0.0               0.0
  Asian/Pacific Islander                                      0.0                2.8               0.9
  Other                                                       5.0                5.6               8.2
Sample size                                                    22                 39               114

Sources: JOBS, Income Maintenance, and Integrated Staff Activities and Attitudes Surveys.
Notes: Sample sizes for individual measures may vary because of missing values. N/a = not applicable (workers were not asked this question). (a) Missing responses to these questions were recoded as negative responses (i.e., no experience).

In general, staff members were committed to their program's case management approach but acknowledged its limitations. Most traditional JOBS case managers said that they preferred to spend all of their time working with recipients on employment-related issues and were not interested in learning income maintenance procedures. They noted, though, that because they could not impose sanctions themselves, it was sometimes difficult to persuade recipients to comply with program participation requirements.

Integrated case managers thought that consolidating income maintenance and JOBS functions was a more efficient approach; they particularly appreciated that they did not need to coordinate with an IM worker to impose or remove sanctions. On the down side, integrated staff noted that completing all of their duties was very demanding. They often had to have separate meetings with recipients to review income maintenance issues and to review JOBS progress because there was too much to cover in one sitting. Many integrated staff members said they wished that they could spend more time on each case.

Partnership Between Income Maintenance and JOBS

Overall, the partnership between income maintenance and JOBS was strong in the integrated program and more limited in the traditional program. Both integrated and JOBS case managers complained about the JOBS referral process (recall from Chapter 1 that recipients in both the integrated and traditional programs were initially referred to the JOBS program at the welfare office). They felt that IM workers inappropriately referred some people who were clearly exempt from JOBS, such as people eligible for Supplemental Security Income (SSI) benefits, and that they did not refer all of the people who should have been referred.

After the initial referral, there was by definition a full partnership between income maintenance and JOBS in the integrated program, since one worker performed both duties. In the traditional program, the relationship was more complicated. JOBS case managers felt some lack of control over the sanctioning process, and both JOBS case managers and IM workers thought communication between the two departments was poor. IM workers also expressed a desire to learn more about JOBS. During the follow-up period, Columbus management responded to this concern by providing additional training on the JOBS program for IM staff. Some staff members thought that the relationship improved throughout the follow-up period.

Staff Training, Evaluation, and Job Satisfaction

Before starting work, newly hired integrated case managers received four weeks of training on income maintenance procedures and the automated case management system, CRIS-E, and one week of training on JOBS procedures; newly hired JOBS case managers also received one week of JOBS training. As Figure 2.2 illustrates, however, the percentage of JOBS and integrated staff who reported that they received helpful training on how to be an effective JOBS case manager is lower than the median for the NEWWS Evaluation programs.(7) The staff survey was administered in Columbus at the end of 1993; over time, as part of an agency-wide effort to improve staff performance, training was provided on topics ranging from automated case management procedures to recognizing and confronting substance abuse.

Almost all integrated case managers said that their supervisors paid close attention to case manager performance, compared with about four-fifths of traditional JOBS case managers. In addition, more integrated case managers said that good performance in general was recognized. Columbus did not use performance standards to evaluate individual staff members.

As Figure 2.2 shows, very few integrated staff reported high job satisfaction. When speaking with researchers, their main complaint was that their large caseloads limited the amount of "social work" they could do with recipients. Field researchers observed that the integrated case managers did perform more social work than most JOBS workers in other programs and concluded that their dissatisfaction was largely a product of their and the program administrators' high expectations for the integrated case management approach. Some traditional staff members also complained about large caseloads and noted their concern about the limited relationship between JOBS and income maintenance. Compared with staff in the other NEWWS Evaluation programs, however, the traditional JOBS case managers ranked as relatively satisfied with their jobs.

Figure 2.2
Staff training, supervision, and evaluation

Case Manager and Recipient Interactions

Personalized Attention and Encouragement

Data from the staff and client surveys and field research indicate that overall, as expected, the integrated program provided more personalized attention and encouragement to recipients than the traditional program. Administrators and researchers designed the integrated program to facilitate close interaction between case managers and recipients, and they communicated this intention to staff. As noted, integrated caseloads were larger than planned. Integrated staff felt they could not spend as much time as they wanted getting to know recipients, exploring their situation, and helping them, but field researchers concluded that, despite staff frustration, the integrated staff did provide more personalized attention than many welfare-to-work program case managers.

Figure 2.3 shows that although the percentage of staff who tried to identify and remove barriers to participation was similar in the two programs, a higher percentage of integrated staff than traditional staff tried to learn in depth about recipients during program intake and provided positive reinforcement to them. Recipients' survey responses corroborated this difference: More recipients in the integrated program than in the traditional program said they felt their case manager knew a lot about them and their family, and more said they believed program staff would help them resolve problems that affected their participation in activities.

Figure 2.3
Personalized attention and encouragement

Program Mandatoriness

The degree to which a welfare-to-work program is "mandatory" can be considered a product of three factors: (1) how wide a cross section of the eligible caseload is enrolled in the program, (2) how closely the program monitors participation, and (3) how swiftly and consistently the program imposes sanctions (AFDC grant reductions).(8) By these measures, both programs in Columbus were strongly mandatory, but the integrated program was somewhat more so.

As discussed in more detail in Chapter 3, the integrated program enrolled a wider cross section of the eligible caseload than the traditional program. As expected, the integrated program also provided closer monitoring of recipients' progress through the program. Integrated staff reported receiving attendance information from service providers and contacting participants about attendance problems more quickly than traditional staff (see Figure 2.4). Fewer integrated staff members reported receiving a lot of information on participants' progress from service providers, but field research indicated that they may have received somewhat more information than traditional staff; the survey responses may reflect integrated staff's high expectations about monitoring and their frustration with higher-than-anticipated caseloads. In addition, integrated case managers typically saw their clients more often because they were responsible for income maintenance functions as well.

Figure 2.4
Participation monitoring

Staff in both programs believed that those who receive welfare should be obligated to take part in JOBS activities, and they strongly emphasized the program participation mandate. According to survey responses (presented in Figure 2.5), a higher proportion of integrated case managers (86 percent) than traditional JOBS case managers (71 percent) strongly emphasized penalties for noncompliance to new clients. Field research uncovered no evidence that the traditional program communicated a less mandatory message, and client survey responses (illustrated in the figure) indicate that the integrated and traditional group members heard similar messages about penalties for noncompliance. Perhaps traditional staff felt less compelled to communicate the possible penalties since all the recipients they met with had seen an orientation video that stressed the mandatory nature of the program (the integrated program did not use the video).

Figure 2.5
Rule enforcement and sanctioning

Although staff in both programs said that they usually gave recipients a few chances before sanctioning them for program noncompliance, they did not tolerate persistent attendance problems. Thirty-eight percent of traditional JOBS case managers reported never delaying sanction requests, a relatively small percentage compared with most of the other programs in the NEWWS Evaluation. Seventy-one percent of IM workers and integrated staff said that they never delayed imposing sanctions, also toward the low end in relation to the other programs. The field research offers some evidence that the IM staff sometimes did not immediately impose sanctions because they prioritized other duties, especially processing welfare benefits, ahead of sanctioning.

Perceptions of Program Effectiveness

Figure 2.6 shows that most Columbus staff thought that the JOBS program would help welfare recipients become self-supporting, but slightly more integrated case managers than traditional JOBS case managers expressed confidence in the program. A slightly smaller proportion of recipients in the traditional program than in the integrated program said they thought that the program improved their chances of getting or keeping a job.

Figure 2.6
Perceptions of the Effectiveness of JOBS

During field visits, researchers heard contradictory opinions about the effectiveness of the JOBS program: Some staff believed that they were making a real impact, whereas others were doubtful or felt that the effects would take many years to show up. Staff in both programs thought that although caseloads were higher than ideal, integrated case management was more effective than the traditional model. Some staff thought that requiring a high school diploma or GED for entry into job search activities unnecessarily restricted the number of people who were helped to find employment.

Endnotes

1.  See Scrivener et al., 1998, Appendix B, for a description of this scale and the others used in this chapter.

2.  See Hamilton and Brock, 1994, for caseload sizes in all the NEWWS Evaluation programs.

3.  This average of 140 is from the staff survey administered in October 1993. Caseloads fluctuated over the evaluation period but generally were larger than planned.

4.  Many variables influence caseload sizes in a welfare-to-work program, including factors outside the program, such as the availability of jobs in the community, making caseload predictions difficult.

5.  The average caseload was 175 in Oklahoma City in the middle of the follow-up period covered in the report (see Storto et al., 2000).

6.  See Scrivener et al., 1998.

7.  In this figure, as in others in the chapter, Columbus staff survey responses are depicted along with the range of responses of staff in other NEWWS Evaluation programs, indicated by the low, median, and high points. For example, the "low" point on the first item in Figure 2.2 refers to the NEWWS program with the lowest percentage of staff who said that they received helpful training on how to be an effective JOBS case manager. The "med" point refers to the program with the median percentage among all programs, and the "high" point refers to the program with the highest percentage of staff who said they received helpful training. These ranges include the Columbus staff in the calculation. See Appendix Tables A.1 and A.2 for each program's value on the survey scales presented in this chapter. (Some later figures also show survey responses of Columbus sample members, depicted along with the range of responses of other sample members in the evaluation. Appendix Table A.3 shows each program's value on the client survey question used in the figures.)

8.  Freedman et al., 2000.

3. Participation Patterns in the Integrated and Traditional Programs

This chapter presents findings on the integrated and traditional group members' involvement in employment-related activities in the JOBS program. These findings help describe the "treatment" that people in the two programs received. The chapter also compares the activity levels of the integrated and traditional group members with those of their control group counterparts to determine the net effect of the two programs on participation.

Summary of Participation in the Programs

The integrated and traditional programs were both education-focused, with more people participating in education than in other activities. For those who entered the programs without a high school diploma or GED (nongraduates), the Columbus programs produced large increases in participation in basic education. For high school graduates, the programs substantially increased participation in post-secondary education (primarily classes at a two-year college), job search activities, and unpaid work experience. The traditional program increased the proportion of nongraduate sample members who received a high school diploma or GED during the two years after entering the evaluation.

As expected, the integrated program was more successful than the traditional program in getting people to attend a JOBS orientation, the gateway to program activities, and in engaging them in program activities. These differences probably reflect integrated case managers' closer monitoring of participation and quicker follow-up regarding attendance problems (as reported in Chapter 2). Integrated group members may also have taken the threat of financial sanction for program noncompliance more seriously than traditional group members because integrated case managers could impose sanctions themselves, rather than relying on another staff member to do so. The orientation attendance rate may also have been higher because integrated case managers called people in to orientation (and followed up) more quickly than traditional case managers.

Sanctioning rates were similar in the two programs and very high. The rate of initiating a sanction, however, was higher in the traditional program than in the integrated program; thus, a smaller proportion of those for whom a sanction was initiated were actually sanctioned in the traditional program. This probably resulted from the traditional program's split in duties: Traditional JOBS case managers could request that a person be sanctioned (sanction initiated), but had to rely on an income maintenance (IM) worker to impose the sanction. In addition, because they did not deal with the eligibility aspects of recipients' cases, they probably initiated a sanction for some people who had not attended a program activity because they were no longer receiving cash assistance or were no longer mandatory for JOBS.

Data Sources and Analysis Issues

Two data sources were used for the analyses presented in this chapter:

  • Case file data. MDRC staff reviewed JOBS case files for a random subsample of integrated and traditional group members.(1) These case file data provide information on participation in activities that occurred as part of the JOBS program.
  • Survey data. A survey administered to a random subsample of integrated, traditional, and control group members asked a series of questions about sample members' involvement in employment-related activities during the two-year period following their entry into the evaluation.(2) These survey data provide information on participation in employment-related activities, both inside and outside the JOBS program, and were used to estimate the difference between participation rates for the integrated and control groups and between the traditional and control groups  or in other words, the impact of the programs on participation.

The two data sources do not yield identical results. Most important, the case file data show substantially higher participation rates in the integrated program than in the traditional program, whereas the survey data show only a small difference. This discrepancy may be partly explained by the fact that the two data sources cover different cohorts of the Columbus evaluation sample: Case files were reviewed for sample members randomly assigned between October 1992 and March 1993, and the survey was administered to sample members randomly assigned between January and December 1993. Analysis for the early cohort of the survey sample (those assigned from January through March 1993) revealed larger differences in participation levels between the integrated and traditional programs than were found for the full survey sample. This, along with field research evidence that the traditional program strengthened its participation monitoring and enforcement procedures over time, suggests that there were larger participation differences between the two programs earlier in the follow-up period. Therefore, the results presented in this chapter based on case file data may somewhat overestimate the differences between the two programs.

The researchers are confident, however, that the general finding from the case file data  that the integrated program generated more participation than the traditional one  is valid. This confidence is based on three factors. First, case file data are considered the best source for participation in activities within a program. Second, the difference between participation levels in the two programs indicated by the case file data is very substantial; even if the traditional program succeeded in generating more participation over time, it is almost impossible that the difference between the programs was erased. Third, a higher participation rate in the integrated program is in line with some of the key results from the implementation analysis, namely, that the integrated case managers tracked participants more closely and provided more personalized attention.

It is not known why the survey does not show a larger difference between the participation levels of the integrated and traditional groups. Several possible explanations were explored, but none could be substantiated. Survey data are used in this chapter to measure whether the two programs increased participation above the control group level; as the last section of the chapter shows, the impacts are large enough that imprecision in the program groups' measured participation levels is not crucial.

One of the major reasons for conducting the test of integrated and traditional case management was to determine whether one approach was more effective in maximizing participation in welfare-to-work activities and in enforcing the "social contract" idea that people who receive welfare should be engaged in employment-focused services. The evaluation designers hypothesized that the integrated program would lead to a higher show-up rate to JOBS orientation and subsequently to a higher participation rate in JOBS activities, and thus would more effectively enforce the social contract. This hypothesis was based primarily on the belief that welfare recipients would take the threat of financial sanction more seriously when it came from a case manager who could impose the sanction herself. In fact, as reported in Chapter 2, traditional JOBS case managers told MDRC staff that it was sometimes difficult to persuade recipients to comply with program requirements because they could not impose sanctions themselves. In addition, the evaluation designers thought that recipients might have more difficulty avoiding participation requirements if they had to deal with one worker who knew their whole situation rather than two workers who each had limited information about their JOBS and AFDC statuses.

Program Participation and Sanctioning Rates

JOBS Orientation Attendance Rates

As in many welfare-to-work programs, a program orientation session was the gateway to program services in Columbus; a person had to attend a JOBS orientation in order to be assigned to and participate in program activities. As expected, the integrated program was more successful than the traditional program at getting people to attend an orientation session. Table 3.1 shows that among sample members whose case file was reviewed, 86 percent in the integrated program and 63 percent in the traditional program attended orientation in the two years following random assignment, a statistically significant difference.(3) (For this report, differences are considered statistically significant if there is less than a 10 percent probability that they occurred by chance.) In the integrated program, people attended an orientation session an average of 11 weeks after random assignment; in the traditional program, this lag was 16 weeks (not shown in the table).
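As a rough illustration of what such a significance test involves, the sketch below applies a standard two-proportion z-test to the published orientation rates and case file sample sizes. The report's own tests may have been computed differently (for example, with regression adjustment), so this shows the logic, not a replication.

    import math

    # Published figures from Table 3.1 (case file subsample)
    p1, n1 = 0.858, 225   # integrated group orientation rate, sample size
    p2, n2 = 0.628, 218   # traditional group orientation rate, sample size

    # Pooled proportion and standard error for a two-proportion z-test
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se

    # Two-sided p-value from the normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    print(round(z, 2), p_value)   # z is about 5.5; p is far below 0.10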

Table 3.1:
Rates of Participation Within a Two-Year Follow-up Period

Measure                                    Integrated Group (%)    Traditional Group (%)

For all sample members for whom case files were reviewed (a)
Attended JOBS orientation                          85.8                  62.8 ***
Participated in:
  Any activity                                     52.9                  33.5 ***
  Job search                                       14.7                   8.3 **
  Any education or training                        33.8                  24.3 **
  Basic education                                  24.4                  15.1 **
  Post-secondary education (b)                      5.8                   6.0
  Vocational training                               7.6                   5.1
  Life skills workshops                             9.8                   0.9 ***
  Work experience                                  11.6                   5.1 **
Sample size                                         225                   218

For all sample members who attended a JOBS orientation (c)
Participated in:
  Any activity                                     63.6                  53.5
  Job search                                       16.2                  14.0
  Any education or training                        40.9                  41.9
  Basic education                                  29.2                  25.6
  Post-secondary education (b)                      7.1                   9.3
  Vocational training                              10.4                   9.3
  Life skills workshops                            13.0                   0.0
  Work experience                                  16.2                   9.3
Sample size                                         154                    86

Sources: MDRC calculations based on MDRC-collected JOBS case file data.
Notes: (a) For this sample, the follow-up period began on the day the individual was randomly assigned. Tests of statistical significance were calculated for differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. (b) Courses for college credit at a two-year or four-year college. (c) For this sample, the follow-up period began on the day of JOBS orientation. Only orientation attenders for whom there are two full years of post-orientation data are included. Differences between the integrated and traditional group outcomes are not true experimental comparisons; statistical significance tests were not calculated.

As mentioned in Chapter 1, at the point of random assignment, staff in the income maintenance office told all integrated and traditional group members that they had to participate in the JOBS program. Then it was up to the integrated case managers and traditional JOBS case managers to send a letter to each person scheduling her for a specific orientation session. A few factors help explain why the integrated program was more successful in getting people to attend orientation.

First, integrated case managers scheduled orientation sessions more quickly than traditional JOBS case managers.(4) Delays in contacting people can reduce orientation attendance rates because some people leave welfare or become exempt from the program prior to being contacted. Delays also dilute the mandatory program message. Second, integrated staff followed up more quickly and more often on those who missed a scheduled orientation session than traditional staff.(5) These two factors probably reflect the fact that integrated case managers had smaller caseloads and thus fewer individuals to call in and monitor at one time.

Third, as suggested earlier, people may have paid more attention to a call-in notice sent by someone who had direct control over their welfare benefits than to one sent by someone who only indirectly influenced their benefits.

Participation in Post-Orientation Program Activities

The pattern of activities that people are initially assigned to after attending orientation helps illustrate a program's employment-preparation strategy. Assignment patterns in Columbus confirm that the programs were education-focused: As Figure 3.1 shows, the most common first assignment in both programs was basic education.

Figure 3.1
Assignment Patterns Within a Two-Year Follow-Up Period
Activities to which individuals were assigned or in which they were allowed to continue

More people in the traditional program than in the integrated program were never assigned to an activity in the two-year follow-up period (one-half in the traditional program compared with just under one-third in the integrated program). This difference, however, merely reflects the difference in orientation attendance rates: In both programs 20 percent of those who attended orientation were never assigned to an activity (not shown in Figure 3.1). Both programs had a formal upfront deferral policy at the time of orientation in which they temporarily excused from the program people with possible barriers to participation. It is likely that some of those not assigned to an activity were formally deferred, while others "fell through the cracks."

The integrated program engaged a higher proportion of people in program activities. As Table 3.1 shows, just over one-half of the integrated group members participated in a JOBS activity for at least one day during the follow-up period compared with about one-third of the traditional group. This statistically significant difference is partly explained by the higher attendance rate at program orientation, the gateway to program activities, in the integrated program.

If only those who attended orientation are considered  a nonexperimental comparison because orientation attenders in the integrated program may have different characteristics than attenders in the traditional program  the participation rate is higher in the integrated program than in the traditional program (64 percent compared with 54 percent, shown in the lower panel of the table).(6) As noted, the same proportion of orientation attenders in each program were assigned to an activity; however, a smaller proportion of those assigned actually attended an activity in the traditional program. In other words, integrated case managers were more successful than traditional JOBS case managers in impelling people to attend activities. This probably reflects the integrated case managers' closer monitoring of participation and quicker follow-up regarding attendance problems, as reported in Chapter 2. Integrated group members may also have taken the sanction threat more seriously than traditional group members. The two-year participation rates for orientation attenders in the Columbus programs are in the range of previously studied programs.(7)

Participation patterns confirm that the Columbus programs were education-focused. In both programs people most commonly took part in education activities. Some people in each program participated in job search activities, unpaid work experience, and life skills workshops. Not surprisingly, for people who entered the programs without a high school diploma or GED (nongraduates), basic education was by far the most common activity. (See Appendix Table B.1.) Participation among graduates was more varied, but education and training  including basic education, post-secondary education (primarily courses for college credit at a two-year college), and vocational training  were more common than job search or other activities.

Length of Stay in Program Activities

Integrated group members participated in program activities for more time during the two-year follow-up period than traditional group members (3.3 months compared with 1.9 months; see Table 3.2). If only those who participated in a program activity are considered, however, the average length of participation was roughly similar (6.5 months in the integrated program and 5.9 months in the traditional program). The length of stay for participants in Columbus falls between the averages for the NEWWS Evaluation labor force attachment (LFA) and human capital development (HCD) programs.(8) For nongraduates, as was found for the full sample, length of stay in program activities was longer in the integrated program than in the traditional program. (See Appendix Table B.2.) For graduates, however, length of stay was similar in the two programs.

Table 3.2
Length of Participation Within a Two-Year Follow-up Period

Measure                                                          Integrated Group    Traditional Group

For all sample members for whom case files were reviewed (a)
Average number of months receiving AFDC                                16.9                17.6
Average number of months in which individuals were JOBS-mandatory      14.4                15.1
Average number of months in which individuals participated
  in a JOBS activity                                                    3.3                 1.9 ***
Sample size                                                             225                 218

For participants only (b)
Average number of months in which individuals participated
  in a JOBS activity                                                    6.5                 5.9
Number of months in which there was participation (%):
  1                                                                    13.8                11.3
  2                                                                    20.7                22.5
  3                                                                    12.9                 8.5
  4-6                                                                  12.9                22.5
  7-12                                                                 24.1                23.9
  13-18                                                                11.2                 4.2
  19 or more                                                            4.3                 7.0
In any activity at the end of the follow-up period (%)                 12.9                11.3
Sample size                                                             116                  71

Sources: MDRC calculations from MDRC-collected JOBS case file data and Ohio AFDC records.
Notes: (a) Tests of statistical significance were calculated for the differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. (b) Differences between the integrated and traditional group outcomes are not true experimental comparisons; statistical significance tests were not calculated.

Sanctioning Rates

Both Columbus programs freely used financial sanctions as a response to individuals' noncompliance with program requirements. (A sanction in Columbus removed the JOBS-mandatory adult from the AFDC grant.)(9) As shown in Table 3.3, roughly one-third of those in each program were sanctioned at some point during the two years following random assignment.(10)

Table 3.3
Sanction Activity Within a Two-Year Follow-up Period

Measure                                                     Integrated Group    Traditional Group

For all sample members for whom case files were reviewed (a)
Sanction initiated (b) (%)                                        45.3                61.5 ***
Sanction imposed (%)                                              36.4                34.9
In sanction at the end of follow-up period (%)                     4.4                 6.0
Sample size                                                        225                 218

For sanctioned individuals only (c)
Average number of months in which sanction was in effect           4.0                 5.0
Number of months in sanction (%):
  1                                                               26.8                19.7
  2                                                               19.5                 9.2
  3                                                               12.2                21.1
  4-6                                                             20.7                26.3
  7-12                                                            15.9                17.1
  13-18                                                            4.9                 4.0
  19 or more                                                       0.0                 2.6
In sanction at the end of the follow-up period (%)                12.2                17.1
Sample size                                                         82                  76

Sources: MDRC calculations from MDRC-collected JOBS case file data.
Notes: (a) Tests of statistical significance were calculated for the differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. (b) "Sanction initiated" indicates that the integrated case manager or the traditional JOBS case manager decided that a sanction should be implemented. (c) Differences between the integrated and traditional group outcomes are not true experimental comparisons; statistical significance tests were not calculated.

Importantly, although the sanctioning rates in the two programs were similar, a larger proportion of the sample in the traditional program had a sanction initiated: 62 percent, compared with 45 percent in the integrated program. ("Sanction initiated" indicates that the integrated case manager or traditional JOBS case manager decided that a sanction should be imposed.) This means that a smaller proportion of those who had a sanction initiated were actually sanctioned in the traditional program than in the integrated program. This probably resulted from the traditional program's split in duties. Traditional JOBS case managers could request that a person be sanctioned, but had to rely on an IM worker to impose the sanction. As noted in Chapter 2, communication between the traditional JOBS case managers and IM workers was poor; during interviews both types of staff in the traditional program said that sanctioning was a particularly problematic area. Also, since traditional case managers did not deal with the income maintenance aspects of their clients' cases, they initiated sanctions for some people who had not attended a program activity because they were no longer receiving AFDC or were no longer mandatory (and thus should not and could not be sanctioned). In both programs, some people for whom sanctions were initiated demonstrated good cause for their nonparticipation and thus were not sanctioned.

Sanctions were somewhat longer for those in the traditional program than in the integrated program (5 months compared with 4 months), probably because the integrated case manager coordinated the interaction between the JOBS program and the AFDC grant  in other words, she could end the sanction herself as soon as the person complied with JOBS program requirements. Sanction length may also reflect the fact that people in the integrated program, on average, received welfare for less time than people in the traditional program (see Chapter 5).

The sanctioning patterns that were found for the full sample were also found for high school graduate and nongraduate subgroups: For each subgroup, sanctioning rates were similar in the two programs, but more people had a sanction initiated in the traditional program than in the integrated program. (See Appendix Table B.3.) In both programs, a higher percentage of nongraduates than graduates had a sanction initiated and were sanctioned, a pattern found in most of the other NEWWS Evaluation programs.

Participation and Other Statuses Over Time

Activity Sequences

Figure 3.2 depicts various "paths" that people took through the Columbus programs. Reflecting the differences in participation rates presented earlier, a higher proportion of people in the integrated program than in the traditional program followed Paths A or B through the program (participated and exited from AFDC or participated and did not exit from AFDC). The most common path in both programs was Path C (did not participate and exited from AFDC). As noted earlier, a substantial number of people did not attend JOBS orientation and thus had no chance to participate in a program activity. The traditional program had more people in Path D: did not participate and did not exit from AFDC. This suggests that at least some of the people who were never oriented to the traditional program remained on welfare for the entire follow-up period.

Figure 3.2
Distribution of Sample Members by Descriptive - Not Causal - Activity Sequences Within a Two-Year Follow-Up Period, by Case Management Approach

Monthly AFDC and JOBS Statuses and Program Coverage

Figure 3.3 shows the proportion of sample members in each program who were in various statuses during selected months of follow-up.(11) Most notably, the proportion of people in the "JOBS mandatory, other" status is larger in the traditional program than in the integrated program. This status includes people who were receiving welfare and were officially required to participate in the program, but were not participating, employed, or sanctioned. In other words, this status indicates that the program was not "covering" a sample member. This reflects both the lower orientation attendance rate and the lower participation rate for orientation attenders in the traditional program.

Figure 3.3
AFDC and JOBS Statuses Within a Two-Year Follow-Up Period, by Follow-Up Month

This lower degree of program coverage in the traditional program is also illustrated in Figure 3.4, which depicts the length of time that people were participating in a program activity, employed, or sanctioned as a proportion of the time they were considered to be mandatory for the program (required to participate). As the figure shows, both programs left a large proportion of mandatory time "uncovered," but the proportion of time that was uncovered was larger in the traditional program. Program coverage in Columbus was among the lowest of NEWWS Evaluation programs.(12)

Figure 3.4
Proportion of JOBS-Mandatory Months in Various JOBS Statuses Within a Two-Year Follow-Up Period
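The coverage measure in Figure 3.4 can be summarized schematically. The sketch below is a minimal illustration with hypothetical variable names; the report's actual construction from case file and AFDC records is more detailed (for example, in how overlapping statuses are handled).

    # Minimal sketch of the "program coverage" idea: the share of a person's
    # JOBS-mandatory months spent participating, employed, or in sanction.
    def uncovered_share(covered_months: int, mandatory_months: int) -> float:
        """Fraction of JOBS-mandatory time NOT covered by the program."""
        if mandatory_months == 0:
            return 0.0
        return 1.0 - covered_months / mandatory_months

    # Endnote 12 reports these shares directly: about 0.70 for the
    # integrated program and 0.78 for the traditional program.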

Impacts on Participation in Activities and Receipt of Education Credentials

Participation in Activities

Many welfare recipients take part in education or training activities without the intervention of a welfare-to-work program. For a program to make a difference, it must engage more people than would have volunteered to participate in activities available in the community. In this evaluation, the participation level of the control group represents what happened in the absence of the mandatory welfare-to-work programs. As noted earlier, the participation findings presented in previous sections of this chapter were based on data collected from case files of integrated and traditional group members. This section presents estimates of participation levels based on data collected using a survey that was administered to integrated, traditional, and control group members.

The survey data show that many people participated in employment-related activities on their own, without the intervention of the welfare-to-work programs, but the integrated and traditional programs substantially increased participation levels. As Table 3.4 shows, 11 percent of the control group in Columbus participated in basic education, 10 percent in post-secondary education, and 10 percent in vocational training, all without prompting from a welfare-to-work program.(13) The table also shows the participation levels for the integrated and traditional group members, and the difference in participation between these two groups and the control group. Overall, the table shows that both the integrated and traditional programs increased participation in job search, basic education, post-secondary education, and work experience or on-the-job training. The programs also increased the number of hours spent in activities.

Table 3.4
Two-Year Impacts on Participation in Job Search, Education, Training, and Work Experience

Outcome                              Integrated   Traditional   Control   Integrated-Control    Traditional-Control
                                       Group         Group       Group    Difference (Impact)   Difference (Impact)

Participated in (%):
  Job search (a)                        17.3          16.6         3.7           13.6                  12.9
  Basic education                       28.7          27.2        10.7           17.9                  16.5
  Post-secondary education (b)          21.8          18.5        10.2           11.7                   8.3
  Vocational training                   10.9           9.8         9.5            1.3                   0.3
  Work experience or
    on-the-job training                 14.1          13.1         2.2           12.0                  10.9
Hours of participation in:
  Job search (a)                        16.1          26.3         3.1           13.0                  23.2
  Basic education                      104.9         140.9        19.5           85.4                 121.4
  Post-secondary education (b)         131.9         153.4        42.4           89.5                 111.0
  Vocational training                   55.3          79.9        32.9           22.4                  47.0
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Hours of participation among participants in:
  Job search (a)                        93.0         158.7        83.6            9.4                  75.0
  Basic education                      365.9         517.7       181.4          184.5                 336.3
  Post-secondary education (b)         603.7         830.5       417.4          186.3                 413.2
  Vocational training                  508.9         814.1       345.9          163.0                 468.2
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Sample size (c)                          371           366         357

Source: MDRC calculations from the Two-Year Client Survey, adjusted using MDRC-collected case file data.
Notes: Tests of statistical significance were not performed. Estimates are regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. Numbers may not add up to 100 percent because of rounding. N/a = not available or not applicable. The "hours of participation among participants" measures were calculated only for participants; sample sizes for these measures vary. (a) For integrated and traditional group members, this measure includes participation in life skills workshops. (b) Courses for college credit at a two-year or four-year college. (c) Sample sizes for individual measures vary because of missing values.
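The regression adjustment described in the notes can be sketched as follows. This is a minimal illustration under assumed variable names (the group dummies and baseline covariates are hypothetical), not MDRC's actual specification.

    import pandas as pd
    import statsmodels.api as sm

    def adjusted_impacts(df: pd.DataFrame):
        # Group indicators; the control group is the omitted reference category.
        covariates = ["integrated", "traditional", "age", "has_diploma", "prior_earnings"]
        X = sm.add_constant(df[covariates])
        y = df["participated_basic_ed"]   # e.g., 1 if participated in basic education
        fit = sm.OLS(y, X).fit()
        # Coefficients on the group dummies estimate the regression-adjusted
        # integrated-control and traditional-control differences (impacts).
        return fit.params["integrated"], fit.params["traditional"]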

Table 3.5 presents the programs' effects on participation for high school graduates and nongraduates. As the table shows, both programs substantially increased participation for graduates in job search, post-secondary education, and work experience. The increases in post-secondary education  primarily courses for college credit at a two-year college  are large compared with increases for other programs.(14) For nongraduates, the Columbus programs produced large increases in participation in basic education(15) and small increases in the use of job search services.

Table 3.5
Two-Year Impacts on Participation in Job Search, Education, Training, and Work Experience, by High School Diploma/GED Status

Outcome                              Integrated   Traditional   Control   Integrated-Control    Traditional-Control
                                       Group         Group       Group    Difference (Impact)   Difference (Impact)

For those with a high school diploma or GED:
Participated in (%):
  Job search (a)                        19.9          22.5         6.0           13.8                  16.5
  Basic education                        7.9           5.3         3.3            4.7                   2.0
  Post-secondary education (b)          30.5          27.4        14.4           16.1                  13.0
  Vocational training                   12.1          11.4        13.9           -1.8                  -2.4
  Work experience or
    on-the-job training                 19.6          17.8         1.5           18.0                  16.3
Hours of participation in:
  Job search (a)                        17.3          36.2         4.6           12.8                  31.6
  Basic education                       22.9          24.3         3.9           19.0                  20.4
  Post-secondary education (b)         181.0         255.4        63.3          117.7                 192.1
  Vocational training                   52.6         120.8        53.0           -0.4                  67.8
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Hours of participation among participants in:
  Job search (a)                        87.1         160.4        75.7           11.5                  84.7
  Basic education                      288.9         458.7       119.8          169.1                 338.9
  Post-secondary education (b)         594.0         933.3       440.1          153.9                 493.2
  Vocational training                  434.6        1056.4       381.9           52.8                 674.5
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Sample size (c)                          214           219         211

For those without a high school diploma or GED:
Participated in (%):
  Job search (a)                         9.5           7.7         0.4            9.2                   7.3
  Basic education                       64.6          63.0        22.7           41.9                  40.2
  Post-secondary education (b)           6.7           6.4         4.4            2.3                   2.0
  Vocational training                    7.1           6.4         4.0            3.1                   2.4
  Work experience or
    on-the-job training                 15.2           6.2         3.8           11.4                   2.4
Hours of participation in:
  Job search (a)                        10.2          11.8         0.3            9.9                  11.5
  Basic education                      245.4         339.4        27.2          218.1                 312.2
  Post-secondary education (b)          51.5          26.9        19.4           32.1                   7.5
  Vocational training                   50.8          28.2        15.4           35.4                  12.8
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Hours of participation among participants in:
  Job search (a)                       106.4         153.6        75.7           30.7                  77.9
  Basic education                      379.7         539.1       119.8          259.9                 419.3
  Post-secondary education (b)         766.1         419.0       440.1          326.0                 -21.1
  Vocational training                  713.0         437.1       381.9          331.1                  55.3
  Work experience or
    on-the-job training                  n/a           n/a         n/a            n/a                   n/a
Sample size (c)                          155           146         146

Sources: MDRC calculations from the Two-Year Client Survey, adjusted using MDRC-collected case file data.
Notes: Tests of statistical significance were not performed. Estimates are regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. Numbers may not add up to 100 percent because of rounding. N/a = not available or not applicable. The "hours of participation among participants" measures were calculated only for participants; sample sizes for these measures vary. (a) For integrated and traditional group members, this measure includes participation in life skills workshops. (b) Courses for college credit at a two-year or four-year college. (c) Sample sizes for individual measures vary because of missing values. In addition, three individuals in the full sample did not indicate whether they had a high school diploma or GED at random assignment. These individuals are excluded from the subgroup analysis.

Receipt of Education Credentials

The survey asked sample members whether they had received any education credentials during the two years since they entered the evaluation. (Results for this question are not presented in a table.) About 4 percent of control group members without a high school diploma or GED certificate at study entry reported that they had received a diploma or GED during the two years; 13 percent of the traditional group nongraduates reported that they received a diploma or GED after entering the evaluation. (Nine percent of integrated group nongraduates reported receiving such a credential, but the 5 percentage-point impact was not statistically significant.)(16) Like most welfare-to-work programs studied, neither program in Columbus increased receipt of a trade certificate, an associate's degree, or a bachelor's degree.

The client survey may not capture the full effect of the programs on receipt of educational credentials. Some sample members may not have received a credential until the third year following random assignment or later. These later effects will not be measured in the evaluation.

Endnotes

1.  Case files were reviewed for 225 integrated group members and 218 traditional group members. Initially, 225 traditional cases were randomly selected for review but seven cases had to be dropped because of missing documentation or because they had become employed by the county and their files were marked confidential.

2.  The survey sample includes 1,094 sample members (371 integrated group members, 366 traditional group members, and 357 control group members).

3.  In the integrated program, 81 percent attended orientation within six months of random assignment; in the traditional program, 50 percent attended within six months. Six-month orientation attendance rates in other programs that MDRC has studied range from 63 to 71 percent. (Some of the participation numbers presented in this chapter differ slightly from those presented in Brock and Harknett, 1998a and 1998b, owing to small differences in data analysis decisions.)

4.  Sample members in the integrated program were sent an orientation scheduling letter an average of 24 days after being referred from the income maintenance office compared with an average of 64 days in the traditional program.

5.  Integrated case managers contacted those who missed an orientation session, on average, 1.4 weeks after the session compared with 2.2 weeks for traditional JOBS case managers. Integrated staff contacted people who did not attend orientation within six months of the initial referral an average of three times compared with two times for traditional staff. Moreover, 16 percent of traditional JOBS case managers reported that they would never follow up with a client who had not attended an orientation session compared with only 5 percent of integrated case managers.

6.  The two panels of Table 3.1 present findings for different samples and follow-up periods. The upper panel presents findings for the full case file sample in each program and tracks activity for two years following random assignment. The lower panel presents findings for a subgroup of the full case file sample: those who attended orientation for whom at least two years of data following orientation were available. The lower panel represents 80 percent of the orientation attenders in the integrated program and 63 percent of the orientation attenders in the traditional program.

7.  For example, two-year participation rates in the other NEWWS Evaluation programs range from 44 to 74 percent. (The participation rate in the Oklahoma City program is not included in this range because the sample is not comparable.) See the following reports for findings based on case file data for the other NEWWS programs: Hamilton et al., 1997; Scrivener et al., 1998; Storto et al., 2000.

8.  The average length of stay was 4.6 months in the LFA programs and 7.8 months in the HCD programs (see Hamilton et al., 1997).

9.  For example, for a family of three, a sanction resulted in a $62, or 18 percent, reduction in a monthly grant of $341. The first time someone was noncompliant, the sanction would remain in effect until she participated as required; the second time, for a minimum of three months; and the third time, for a minimum of six months.

10.  The rates include sanctions imposed for failure to attend JOBS orientation and for failure to attend post-orientation activities; thus, they are not directly comparable to rates that capture only sanctions imposed for failure to attend post-orientation activities.

11.  Since month 1 represents the month of random assignment and thus a partial JOBS month, the figure starts with month 2.

12.  The integrated program left 70 percent of mandatory time uncovered, and the traditional program left 78 percent uncovered. The other programs in the evaluation left between 32 and 71 percent of sample members' mandatory months uncovered.

13. Some statistical adjustments were made in Table 3.4 (and Table 3.5), based on information found in the JOBS case files, to take into account recall error in the client survey data. Similar analyses were conducted for the other NEWWS programs (see Hamilton et al., 1997; Scrivener et al., 1998; Storto et al., 2000). Appendix Table B.4 presents the differences between the integrated, traditional, and control group participation levels using survey data without adjusting for recall error. Some numbers in Appendix Table B.4 differ slightly from those presented in Freedman et al., 2000, because the present analysis considers only sample members for whom the length of participation could be calculated (survey respondents were excluded from the present analysis if they reported an activity end date that preceded the reported activity start date).

14.  The integrated program increased participation in post-secondary education by 16 percentage points, and the traditional program by 13 percentage points; the largest increase in post-secondary education participation for high school graduates in the other NEWWS Evaluation programs was only 8 percentage points.

15.  Increases in participation in basic education for nongraduates in the three HCD programs studied in the NEWWS Evaluation ranged from 43 to 57 percentage points.

16.  For more detail on the programs' impacts on educational attainment, see Chapter 4 in Freedman et al., 2000.

4. Cost of Employment-Related Services in the Integrated and Traditional Programs

The preceding chapter examined participation in employment-related activities within the two years following study entry by sample members assigned to the integrated and traditional programs and to the control group. Participation patterns are important indicators of the level of investment made in each individual required to participate in JOBS. This chapter presents information on total expenditures for employment-related services over the two-year period and shows how these costs varied across program activities and support services.

In addition, this chapter presents estimates of the cost that the government incurred for employment-related services for integrated and traditional group members over and above what was spent on the control group. This is referred to as the net cost per program group member: the difference between the total cost per program group member (integrated or traditional) and the total cost per control group member of all program-related and non-program-related employment services used during the two-year follow-up period.

It is important to emphasize that to match the methodological approach used in the cost analyses of the other programs studied in the NEWWS Evaluation, the cost estimates in this chapter reflect only expenditures on employment-related services. That is, the cost estimates presented here reflect the costs of the JOBS program and do not include expenditures on income maintenance services. Income maintenance costs will be included in a benefit-cost analysis in the final report in the evaluation. That analysis will determine whether the programs' net benefits are greater than their net costs after five years. It would be premature to present a two-year benefit-cost analysis here because the total return on the JOBS program may be evident only after several years. The later report also will compare the benefits and costs of the two Columbus programs with those of the other programs studied in the NEWWS Evaluation.

The cost figures presented here include JOBS and non-JOBS activities, and they are calculated per program group member rather than per participant. Non-JOBS costs are included in the total cost because they represent additional investments of resources that have the potential to affect program group members' future earnings and welfare receipt, just as they do for control group members. Thus, they are included in the gross cost estimates used to compare the cost per program group member with the cost per control group member.

Similarly, it is necessary to report costs per program group member, not just per participant in JOBS. The requirement to participate may have affected sample members' behavior (some individuals may have chosen to avoid the mandate by finding employment on their own or by leaving welfare). As is true for controls, program group members who did not participate in JOBS could have participated in education and training services on their own. In addition, to exclude nonparticipants could introduce bias because the program may have influenced sample members' behavior.

As described in Chapter 1, unlike random assignment in most of the other NEWWS Evaluation programs, in Columbus random assignment occurred in the welfare office at the point clients were determined to be JOBS-mandatory and were referred to the JOBS program. Therefore, the Columbus sample includes some people who were informed about the program by an income maintenance (IM) worker but never received further information about the program from a JOBS case manager (either a traditional JOBS case manager or an integrated case manager), and who never participated in the JOBS program. Sample members in the program groups who remained JOBS-mandatory may have been sanctioned for noncompliance. Therefore, even though these sample members may not have participated in the JOBS program, costs were incurred on their behalf, and other investments may have been made through nonwelfare sources. Although including nonparticipants in the calculations yields correct cost estimates, because the sample is systematically different from those in the other programs in the evaluation, the cost figures presented here are not directly comparable to those in the other sites. To facilitate cross-site comparisons, nonexperimental estimates (namely, estimates that include in the base only participants in a specific activity) are used in some places.

Under the traditional case management design, clients were assigned a separate IM worker and a JOBS case manager. Sample members in the integrated group were assigned to a single worker who performed both income maintenance and JOBS functions. Case managers in both groups served clients in all programs (for example, the Food Stamps Employment and Training program and the General Assistance [GA] work program), not only the JOBS program.(1) As noted above, welfare department costs in this chapter represent expenditures to provide services for the JOBS program only; costs for income maintenance functions will be accounted for in a future analysis.

To summarize the main findings presented in this chapter: The estimated total cost (in 1993 dollars) per program group member for employment-related services within two years after random assignment was $3,018 for the integrated program and $2,589 for the traditional program. The net costs of the integrated and traditional programs were $2,149 and $1,720, respectively. The integrated program had somewhat higher employment-service costs because it had higher unit costs, particularly for case management and vocational training. However, given that the case management experiment in Columbus was as much about reforming income maintenance as it was about JOBS, without the income maintenance costs it is not possible to draw final conclusions about the comparative total costs of the two program approaches. As mentioned, a future benefit-cost analysis will include these costs.

The original design for the integrated program included reduced caseloads of about 100 cases each, which might have made integrated case management a higher-cost approach. However, as implemented, integrated case managers were assigned about 140 cases for whom they provided both income maintenance and JOBS services. Average caseloads for IM workers and traditional JOBS case managers were approximately 260 cases each.(2) Thus, the relatively small cost difference between the two programs is not surprising.

Components of the Cost Analysis

Figure 4.1 illustrates the elements of the cost analysis. For the two program groups, costs were calculated for two categories: (1) activities and services provided to meet JOBS requirements or to support JOBS participation and (2) non-JOBS services and activities. Within each category, costs are further broken down by whether they were paid for by the welfare department or by non-welfare agencies. As represented in Figure 4.1, the total cost per program group member for employment-related services (box 3) is the sum of the welfare department's operating expenses (for example, for case management, job search services, and associated overhead) and support service costs (box 1) and the expenses incurred by non-welfare agencies (for example, local adult education providers, community colleges, and vocational training institutes) to provide educational and training activities that met JOBS requirements (box 2). Non-JOBS costs (box 6) include child care expenditures paid by the welfare department for participation in programs other than JOBS (for example, transitional and at-risk child care) (box 4) and the costs of services that program group members received outside the JOBS program (box 5). Total JOBS and non-JOBS costs per program group member make up the total gross cost per program group member (box 7).

The sections of this chapter follow the flow of the diagram in Figure 4.1, beginning with the JOBS-related expenditures and ending with the net cost per program group member (box 11), which is calculated by subtracting the total cost for employment-related services per control group member from the total cost per program group member. The control group estimate represents costs that the government would have incurred in the absence of the JOBS program, and the net cost represents the cost of the JOBS program over and above control group costs.
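
To make the accounting concrete, the sketch below (a minimal illustration in Python; the variable names are ours, not the report's) traces the flow from Figure 4.1's component boxes to the gross and net cost per program group member, using the integrated-program figures reported later in Tables 4.2 and 4.4.

    # Figure 4.1 cost accounting, illustrated with the integrated-program
    # figures from Tables 4.2 and 4.4 (1993 dollars, per program group member).
    jobs_welfare_dept = 1182        # box 1: welfare department operating and
                                    #        support service costs for JOBS
    jobs_non_welfare = 1141         # box 2: non-welfare agency costs for
                                    #        activities meeting JOBS requirements
    jobs_total = jobs_welfare_dept + jobs_non_welfare              # box 3: 2,323

    non_jobs_welfare_dept = 176     # box 4: non-JOBS child care paid by the
                                    #        welfare department
    non_jobs_non_welfare = 519      # box 5: services received outside JOBS
    non_jobs_total = non_jobs_welfare_dept + non_jobs_non_welfare  # box 6: 695

    gross_cost_program = jobs_total + non_jobs_total               # box 7: 3,018

    gross_cost_control = 869        # gross cost per control group member
    net_cost = gross_cost_program - gross_cost_control             # box 11: 2,149
    print(gross_cost_program, net_cost)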

Figure 4.1

Major Components of Gross and Net Costs for Employment-Related Services

Gross Cost per Program Group Member

This section examines expenditures made by the welfare department and by non-welfare agencies for employment and training activities and support services provided to sample members in the integrated and traditional programs. Costs are broken down into the following categories of activities: orientation and appraisal, job search, basic education, post-secondary education, vocational training, and work experience.(3)

In order to determine how much was spent per program group member on each JOBS activity, its unit cost was first calculated using data for the "steady-state" period of calendar year 1993. This year was chosen as a period of relatively stable program operations when many of the sample members were receiving services. The unit cost of an activity is an estimate of the average cost of serving one person in a specified activity for a specified unit of time: for example, one month of participation. For each activity, the unit cost was calculated by dividing expenditures during the steady-state period by the measure of participation.(4) Once the unit cost of an activity was determined, it was multiplied by the average number of units spent in the activity to determine the average cost incurred per program or control group member during the follow-up period.(5)
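
A minimal sketch of this two-step calculation follows; all inputs are hypothetical, since the report's underlying fiscal and participation data are not reproduced here.

    # Step 1: unit cost from steady-state (calendar 1993) data.
    # All inputs below are hypothetical.
    steady_state_expenditures = 93_600.0   # total 1993 spending on the activity ($)
    monthly_participant_counts = [95, 100, 110, 105, 90, 100,
                                  100, 95, 105, 100, 100, 100]
    participant_months = sum(monthly_participant_counts)        # 1,200
    unit_cost = steady_state_expenditures / participant_months  # $78 per month

    # Step 2: cost per group member over the follow-up period. The average
    # number of months in the activity is taken over all group members,
    # including zeros for those who never participated.
    avg_months_in_activity = 1.1
    cost_per_group_member = unit_cost * avg_months_in_activity  # about $86
    print(round(unit_cost), round(cost_per_group_member, 2))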

Table 4.1 (column 1) shows the welfare department unit costs by category. For job search and basic education, the welfare department cost reflects the cost of related case management (assigning recipients to the activity, monitoring attendance, and so on), as well as the cost of providing the activity itself (for example, the cost of classroom instruction, job search facilitation, and classroom space). The welfare department paid for the costs of providing these activities through contracts with local providers who were given on-site space to provide services to JOBS participants. The unit costs for the other activities reflect the cost of case management.

The differences between the two programs in the welfare department's cost of monitoring post-secondary education and vocational training may reflect efficiencies due to economies of scale. The larger traditional JOBS case management unit was divided into two smaller units: one served recipients who were considered "job-ready" and the other served those who were "not job-ready."(6) Having more homogeneous caseloads (in terms of service needs) may have streamlined the case management effort by reducing the number of different types of activities each worker had to monitor. The higher unit cost for the integrated group for vocational training provided by non-welfare agencies (column 4) reflects the use of more expensive services: sample members in this group were more likely to attend proprietary schools (particularly cosmetology programs).

Table 4.1
Estimated Unit Costs for Employment-Related Activities, by Program and Agency
(in 1993 Dollars)
  Welfare Department Unit Cost  Non-Welfare Agency Unit Cost
Program and Activity  Average per Month of Participation ($)  Average per Hour ($)  Average per Month of Participation ($)  Average per Participant ($)
Integrated program
Orientation and appraisal (a) 17 n/a n/a n/a
Job search (b) 198 n/a 90 n/a
Basic education 78 3 n/a n/a
Post-secondary education (c) 202 6 n/a n/a
Vocational training 202 n/a n/a 4,523
Work experience 96 n/a 68 n/a
Traditional program
Orientation and appraisal (a) 12 n/a n/a n/a
Job search (b) 184 n/a 90 n/a
Basic education 48 4 n/a n/a
Post-secondary education (c) 49 6 n/a n/a
Vocational training 49 n/a n/a 2,491
Work experience 41 n/a 68 n/a
Control group
Orientation and appraisal (a) n/a n/a n/a n/a
Job search n/a n/a 90 n/a
Basic education n/a 4 n/a n/a
Post-secondary education (c) n/a 6 n/a n/a
Vocational training n/a n/a n/a 2,135
Work experience n/a n/a 68 n/a
Sources: MDRC calculations based on fiscal and participation data from the following: Franklin County Department of Human Services; Ohio Department of Education, Office of Vocational and Adult Education; Ohio Board of Regents; National Center for Education Statistics; and information from MDRC-collected case file data and the Two-Year Client Survey.
Notes: N/a = not applicable.
(a) Orientation cost is applied per session. Orientation is generally a one-day, one-time activity.
(b) For program group members, this measure includes participation in life skills workshops.
(c) Courses for college credit at a two-year or four-year college.

JOBS-Related Expenditures by the Welfare Department

The Columbus welfare department incurred costs to operate the JOBS program, as well as for child care and other support services.

  1. Operating costs. Welfare department operating costs were determined using expenditure data that captured the costs of JOBS-related activities, starting at the point that sample members were randomly assigned. Total expenditures (salaries and overhead costs) were allocated to JOBS activities using time studies completed by case managers. In addition, payments made by the welfare department to outside organizations that were contracted to provide JOBS services (primarily job search and basic education) are included in the total welfare department costs. Costs incurred by the welfare department to accommodate MDRC research requirements and requests were excluded from total expenditures. As shown in Table 4.2 (column 1), these costs were $631 per integrated program group member and $318 per traditional program group member.
  2.  Support service costs. Other JOBS-related costs include payments for child care and participation allowances that participants were eligible to receive. Table 4.2 (column 1) shows that the average JOBS child care cost was $341 per integrated program group member and $349 per traditional program group member.(7) Table 4.3 provides more detailed information about patterns of support service receipt. Average monthly payments of over $370 were high compared with the other programs studied in the NEWWS Evaluation, but low rates of receipt (less than 15 percent) and relatively short durations (about 6.5 months) put the cost per program member near the average of the other programs. The higher participation allowance payments to integrated program group members ($199 versus $133) mainly reflect a higher rate of receipt among sample members in that program: 63 percent compared with 48 percent. (The arithmetic linking these payment levels, durations, and receipt rates is sketched just after this list.)
  3. Total JOBS-related costs incurred by the welfare department. Table 4.2 shows the combined costs of providing the services described above: the welfare department spent $1,182 per integrated group member for employment-related services and $810 per traditional group member. A large part of this difference is due to the higher cost in the integrated program for monitoring participants in post-secondary education and vocational training activities.
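
As noted in item 2, the support service figures in Table 4.3 decompose arithmetically into payment level, duration, and rate of receipt. The sketch below reproduces that decomposition for JOBS child care in the integrated program; small gaps from the table's figures reflect rounding of the published averages.

    # Decomposition of the Table 4.3 JOBS child care figures for the
    # integrated program (1993 dollars).
    avg_monthly_payment = 375        # average payment per month, per recipient
    avg_months_of_payments = 6.2
    share_receiving = 0.147          # 14.7 percent of program group members

    cost_per_recipient = avg_monthly_payment * avg_months_of_payments
    cost_per_group_member = cost_per_recipient * share_receiving

    # Table 4.3 reports $2,322 and $341; the computed values of roughly
    # $2,325 and $342 differ only because the published averages are rounded.
    print(round(cost_per_recipient), round(cost_per_group_member))
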
Table 4.2
Estimated Cost per Program Group Member for Employment-Related Services
Within a Two-Year Follow-Up Period, by Program and Agency
(in 1993 Dollars)
  JOBS Cost  Non-JOBS Cost
Program and Activity or Service  Welfare Department Cost ($)  Non-Welfare Agency Cost ($)  Total Program Cost ($)  Welfare Department Cost ($)  Non-Welfare Agency Cost ($)  Total Gross Cost per Program Group Member ($)
Integrated program
Orientation and appraisal 15 0 15 0 0 15
Job search (a) 68 0 68 0 3 71
Basic education 87 203 290 0 174 464
Post-secondary education (b) 267 451 718 0 327 1,045
Vocational training 136 488 623 0 0 623
Work experience 59 0 59 0 15 74
Subtotal (operating) 631 1,141 1,773 0 519 2,292
Child care 341 0 341 170 0 511
Child care administration (c) 10 0 10 6 0 16
Participation allowance 199 0 199 0 0 199
Total 1,182 1,141 2,323 176 519 3,018
Traditional program
Orientation and appraisal 8 0 8 0 0 8
Job search (a) 90 0 90 0 0 90
Basic education 67 386 453 0 115 568
Post-secondary education (b) 69 662 731 0 209 940
Vocational training 38 243 281 0 0 281
Work experience 46 0 46 0 12 57
Subtotal (operating) 318 1,291 1,608 0 336 1,944
Child care 349 0 349 147 0 496
Child care administration (c) 10 0 10 5 0 15
Participation allowance 133 0 133 0 0 133
Total 810 1,291 2,101 152 336 2,589
Sources: MDRC calculations based on fiscal and participation data from the following: Franklin County Department of Human Services; Ohio Department of Education, Office of Vocational and Adult Education; Ohio Board of Regents; National Center for Education Statistics; and information from MDRC-collected case file data and the Two-Year Client Survey. Child care and other support service calculations are based on Ohio Department of Human Services payment data.
Notes: Rounding may cause slight discrepancies in calculating sums and differences.
(a) For program group members, this measure includes participation in life skills workshops.
(b) Courses for college credit at a two-year or four-year college.
(c) Administrative costs for determining child care needs and issuing payments were estimated as a percentage of the value of payments, i.e., by dividing total administrative costs by total payments. Child care administrative costs were 3 percent of total payments.

JOBS-Related Expenditures by Non-Welfare Agencies

Non-welfare agencies also incurred costs providing JOBS services to program group members.(8) For basic education, non-welfare costs reflect expenditures beyond those covered by contracts with the welfare department. In Columbus, non-welfare agencies spent $1,141 per integrated group sample member and $1,291 per traditional group sample member (Table 4.2, column 2). Although basic education and post-secondary education costs were higher for the traditional group (sample members in the traditional group participated in these activities for more hours while in JOBS than their counterparts in the integrated group), these differences are offset by the higher cost of vocational training activities chosen by sample members in the integrated group.

Non-JOBS Expenditures by the Welfare Department

As shown in Table 4.2 (column 4), the welfare department spent an additional $176 per integrated program group member and $152 per traditional program group member on child care services unrelated to the JOBS program. Table 4.3 shows that traditional program group members received nearly equivalent amounts from transitional and other low-income child care programs ($77 and $70, respectively). Sample members in the integrated group received similar levels of support from other child care programs ($69 per program group member), but received half again as much in transitional child care ($101 per program group member).

Table 4.3
Estimated Support Service Costs Within a Two-Year Follow-Up Period, by Program
(in 1993 Dollars)
  Per Program Group Member Who Received Service
Program and Support Service  Average Monthly Payment ($)  Average Months of Payments  Cost per Person Who Received Service ($)  Program Group Members Who Received Service (%)  Cost per Program Group Member ($)
Integrated program
Child care
  JOBS 375 6.2 2,322 14.7 341
  Transitional 396 6.3 2,482 4.1 101
  Other 441 5.0 2,223 3.1 69
Participation allowance 47 6.8 316 63.0 199
Total 710
Traditional program
Child care
  JOBS 388 6.7 2,604 13.4 349
  Transitional 427 5.3 2,263 3.4 77
  Other 411 5.4 2,209 3.2 70
Participation allowance 44 6.3 281 47.5 133
Total 629
Sources: MDRC calculations based on Ohio Department of Human Services payment data.
Notes: Rounding may cause slight discrepancies in calculating sums and differences. Totals are costs per program group member.

Non-JOBS Expenditures by Non-Welfare Agencies

The cost to non-welfare agencies for education and training activities undertaken outside the JOBS program was $519 per integrated group sample member and $336 per traditional group sample member. These costs are primarily for activities that sample members undertook during periods when they were not required to participate in JOBS. A large part of the difference between the two groups is accounted for by the cost of post-secondary education: in this case, sample members in the integrated group spent more time in post-secondary education activities outside the JOBS program than did those in the traditional group.

Total Gross Cost per Program Group Member

Table 4.4 shows that the sum of the JOBS and non-JOBS costs produces a total gross cost per person of $3,018 for the integrated program and $2,589 for the traditional program, with post-secondary education and vocational training accounting for much of the difference between the groups.

Table 4.4
Estimated Total Gross Costs and Net Costs for Employment-Related Services
Within a Two-Year Follow-Up Period, by Program
(in 1993 Dollars)
Program and Activity or Service  Total Gross Cost per Program Group Member ($)  Total Gross Cost per Control Group Member ($)  Net Cost per Program Group Member ($)
Integrated program
Orientation and appraisal 15 0 15
Job search (a) 71 6 64
Basic education 464 80 384
Post-secondary education (b) 1,045 243 802
Vocational training 623 203 420
Work experience 74 6 68
Subtotal (operating) 2,292 538 1,754
Child care 511 311 199
Child care administration (c) 16 10 6
Participation allowance 199 10 190
Total 3,018 869 2,149
Traditional program
Orientation and appraisal 8 0 8
Job search (a) 90 6 84
Basic education 568 80 488
Post-secondary education (b) 940 243 697
Vocational training 281 203 78
Work experience 57 6 51
Subtotal (operating) 1,944 538 1,406
Child care 496 311 184
Child care administration (c) 15 10 6
Participation allowance 133 10 124
Total 2,589 869 1,720
Sources: MDRC calculations based on fiscal and participation data from the following: Franklin County Department of Human Services; Ohio Department of Education, Office of Vocational and Adult Education; Ohio Board of Regents; National Center for Education Statistics; and information from MDRC-collected case file data and the Two-Year Client Survey. Child care and other support service calculations are based on Ohio Department of Human Services payment data.
Notes: Rounding may cause slight discrepancies in calculating sums and differences.
(a) For program group members, this measure includes participation in life skills workshops.
(b) Courses for college credit at a two-year or four-year college.
(c) Administrative costs for determining child care needs and issuing payments were estimated as a percentage of the value of payments, i.e., by dividing total administrative costs by total payments. Child care administrative costs were 3 percent of total payments.

As discussed earlier in this chapter, owing to the different point of random assignment in Columbus, a direct comparison of this site's costs with those of the other programs in the NEWWS Evaluation would not be meaningful. An approximation that results in more comparable figures is achieved by considering the costs of activities per participant in the activity (calculated by dividing the cost per program member by the participation rate). The integrated program had the lowest per-participant cost for basic education of all the programs in the NEWWS Evaluation; the traditional program cost was also relatively low. (The cost of basic education per participant was $1,615 in the integrated program and $2,088 in the traditional program. Among the other JOBS programs, the Riverside labor force attachment program had the lowest per-participant cost in basic education at $1,845.) The average per-participant costs for post-secondary education and vocational training were higher than the average costs in the other programs.(9)
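
The per-participant approximation is simply the cost per program group member divided by the share of the group that ever participated in the activity. A minimal sketch, using the integrated program's $464 gross basic education cost from Table 4.2 and an assumed participation rate (the rate shown is inferred so that the example reproduces the $1,615 figure cited above; it is not a published statistic):

    # Per-participant cost approximation used for cross-site comparisons.
    cost_per_group_member = 464.0    # gross basic education cost, integrated
                                     # program (Table 4.2)
    participation_rate = 0.287       # assumed share ever participating; chosen
                                     # to reproduce the cited figure
    cost_per_participant = cost_per_group_member / participation_rate
    print(round(cost_per_participant))   # about 1,617; the text cites $1,615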

The high levels of office automation and administrative support for staff, described in Chapter 2, may have contributed to lower case management costs in Columbus. In addition, because contracted job search and basic education activities were co-located at the JOBS center, most clients were funneled into services provided by lower-cost agencies. Although on-site services were provided as a convenience for clients, this arrangement may also have reduced the effort required of case managers to monitor participation in these activities.

Gross Cost per Control Group Member

Control group members participated in education and training activities on their own initiative. In addition, they were eligible for some support services from the welfare department. Therefore, the gross cost per control group member for employment-related services includes expenditures by the welfare department and non-welfare agencies. This cost serves as a benchmark against which the gross cost per program group member is compared in order to determine the net cost of the programs.

Welfare Department Costs

Control group members were eligible to receive child care for education and training activities that they participated in on their own and could receive work-related transitional and other non-JOBS child care. Table 4.4 (column 2) shows that the welfare department spent $321 per control group member for child care ($10 of this represents program administration costs) and $10 on participation allowances.

Non-Welfare Agency Costs

Table 4.4 (column 2) shows that the total non-welfare agency cost for control group members in Columbus was $538. Post-secondary education ($243) and vocational training ($203) activities accounted for the majority of these expenditures.

Total Gross Cost per Control Group Member

Summing the welfare and non-welfare agency costs produces a total gross cost of $869 per control group member for employment-related services. This control group cost is used in the next section as the benchmark to determine the net cost per integrated group member and per traditional group member.

Net Cost per Program Group Member

Table 4.4 (column 3) shows the net costs of employment-related services for the integrated and traditional programs. For the integrated group, $2,149 was spent per program group member over and above what was spent on the control group. For the traditional group, the net cost was about $400 lower.

Costs by Educational Attainment Subgroup

Table 4.5 presents gross and net costs of employment-related services for sample members with and without a high school diploma or GED at random assignment. For both programs, gross costs were higher for the subgroup with a high school diploma or GED (graduates) than for those without a credential (nongraduates). Graduates had higher gross costs than nongraduates primarily because they were more likely to participate in higher-cost activities, such as post-secondary education and vocational training. In addition, although graduates and nongraduates received similar participation allowances, graduates received substantially more child care support (both JOBS and non-JOBS). These same patterns in participation and costs are seen for graduates and nongraduates in the control group. Thus, net costs for employment-related services were also higher for graduates than for nongraduates.

Table 4.5
Estimated Total Gross Costs and Net Costs for Employment-Related Services
Within a Two-Year Follow-Up Period, by Program and High School Diploma/GED Status
(in 1993 Dollars)
Program and Activity or Service  Total Gross Cost per Program Group Member ($)  Total Gross Cost per Control Group Member ($)  Net Cost per Program Group Member ($)
For those with a high school diploma or GED:
Integrated program
Orientation and appraisal 15 0 15
Job search (a) 83 10 73
Basic education 110 14 96
Post-secondary education (b) 1,455 365 1,090
Vocational training 688 296 391
Work experience 61 2 60
Subtotal (operating) 2,412 687 1,725
Child care 615 448 167
Child care administration (c) 19 14 6
Participation allowance 194 12 182
Total 3,240 1,161 2,079
Traditional program
Orientation and appraisal 7 0 7
Job search (a) 117 10 107
Basic education 95 14 81
Post-secondary education (b) 1,568 365 1,203
Vocational training 336 296 39
Work experience 57 2 55
Subtotal (operating) 2,180 687 1,493
Child care 643 448 195
Child care administration (c) 20 14 6
Participation allowance 134 12 122
Total 2,977 1,161 1,816
For those without a high school diploma or GED:
Integrated program
Orientation and appraisal 14 0 14
Job search (a) 40 1 40
Basic education 1,070 189 881
Post-secondary education (b) 364 80 284
Vocational training 420 86 335
Work experience 50 13 37
Subtotal (operating) 1,958 368 1,590
Child care 372 128 244
Child care administration (c) 11 4 7
Participation allowance 209 5 204
Total 2,551 505 2,046
Traditional program
Orientation and appraisal 9 0 9
Job search (a) 48 1 47
Basic education 1,364 189 1,175
Post-secondary education (b) 164 80 84
Vocational training 177 86 91
Work experience 29 13 17
Subtotal (operating) 1,791 368 1,423
Child care 299 128 171
Child care administration (c) 9 4 5
Participation allowance 133 5 129
Total 2,232 505 1,727
Sources: MDRC calculations based on fiscal and participation data from the following: Franklin County Department of Human Services; Ohio Department of Education, Office of Vocational and Adult Education; Ohio Board of Regents; National Center for Education Statistics; and information from MDRC-collected case file data and the Two-Year Client Survey. Child care and other support service calculations are based on Ohio Department of Human Services payment data.
Notes: Rounding may cause slight discrepancies in calculating sums and differences.
(a) For program group members, this measure includes participation in life skills workshops.
(b) Courses for college credit at a two-year or four-year college.
(c) Administrative costs for determining child care needs and issuing payments were estimated as a percentage of the value of payments, i.e., by dividing total administrative costs by total payments. Child care administrative costs were 3 percent of total payments.

Endnotes

1.  Only clients who were mandatory for the AFDC JOBS program were eligible to be assigned to integrated case management, so integrated case managers would have had fewer clients in the other work programs. Because integrated case managers were responsible for all members of a household, they would have worked with any GA or Food Stamps-only recipients who were part of the sample member's household. In addition, sample members who stopped receiving AFDC, but received Food Stamps or GA, would have continued on the integrated case manager's caseload.

2.  As noted in Chapter 2, for every two integrated case managers, there were about 280 cases, compared with caseloads of about 260 for every pair of traditional JOBS case managers and IM workers.

3.  These activities are described in Chapter 2.

4.  For example, in calculating a cost per month of participation, the participation measure is "participant-months," which is obtained by summing the monthly total number of participants in the activity across all months in the steady-state period.

5.  A more detailed explanation of general cost methodology can be found in Hamilton et al., 1997, pp. 165-69.

6.  Until mid-1993 there were two integrated units with six case managers each. During the second half of the year, there were three integrated units with seven case managers each. During the same period, there were five traditional units with an average of six case managers each.

7.  For both programs, an additional $10 per program group member was spent in administering these payments.

8.  This analysis assumes that education and training services provided by non-welfare agencies were financed by non-welfare agencies (including the U.S. Department of Education, if program group members received Pell Grants or other financial aid) and not by sample members themselves. To the extent that sample members actually financed their own education and training, this analysis overstates the true costs to non-welfare agencies per sample member. This has distributional implications, but does not overstate the total cost of services. The GAIN evaluation of seven counties in California found that fewer than 10 percent of sample members may have spent their own or their family's resources on education and training. See Riccio, Friedlander, and Freedman, 1994, for details.

9.  The average cost per participant in post-secondary education was $4,939 in Columbus and $4,697 in the other NEWWS programs. The average cost per participant in vocational training was $4,292 in Columbus and $3,994 in the other programs. Because of data limitations, Portland and Detroit costs are not included in these averages.

5. Employment and Welfare Impacts of the Integrated and Traditional Programs

This chapter describes the integrated and traditional programs' three-year impacts on employment, earnings, AFDC receipt and payments, and combined income from earnings, AFDC, and Food Stamps.(1) The impact estimates in the chapter are based on quarterly unemployment insurance (UI) records and monthly AFDC and Food Stamp payment records.(2) As mentioned in Chapter 1, sample members were randomly assigned to either the integrated group, the traditional group, or the control group. This research design allows for three different experimental comparisons: integrated-control, traditional-control, and integrated-traditional. The first two comparisons provide estimates of the effects of each program (averages for control group members represent outcomes that are expected to occur in the absence of the programs); the third comparison provides estimates of the relative effectiveness of the two programs. Unless otherwise stated, the impacts discussed in this chapter are statistically significant.(3)
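
The table notes later in this chapter indicate that the impact estimates are regression-adjusted using ordinary least squares, controlling for sample members' pre-random assignment characteristics. The sketch below illustrates that standard approach on simulated data; it is a generic illustration of the method, not MDRC's code, and the covariate and outcome values are made up.

    import numpy as np

    # Regression-adjusted impact estimation with two program groups and a
    # control group. y is an outcome (e.g., three-year earnings); the 0/1
    # indicators mark the integrated and traditional groups; x stands in for
    # a pre-random assignment characteristic. All data here are simulated.
    rng = np.random.default_rng(0)
    n = 3000
    group = rng.integers(0, 3, n)             # 0=control, 1=integrated, 2=traditional
    integrated = (group == 1).astype(float)
    traditional = (group == 2).astype(float)
    x = rng.normal(size=n)                    # baseline characteristic
    y = (12000 + 1200 * integrated + 1000 * traditional
         + 800 * x + rng.normal(scale=9000.0, size=n))

    # OLS: y = b0 + b1*integrated + b2*traditional + b3*x
    X = np.column_stack([np.ones(n), integrated, traditional, x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # b1 and b2 estimate the integrated-control and traditional-control
    # impacts; their difference estimates the integrated-traditional impact.
    print(beta[1], beta[2], beta[1] - beta[2])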

Summary of the Impact Findings

Over three years, the integrated and traditional programs produced similar employment and earnings gains. Researchers had hypothesized that the higher participation rate in the integrated program would lead to larger impacts on employment and earnings, but this was not the case. Quarterly impact patterns suggest, however, that the integrated program may prove more successful in the fourth year of follow-up than the traditional program.

The integrated program produced somewhat larger decreases in months of AFDC receipt and AFDC payments measured over three years, probably because integrated case managers could more quickly respond to changes in sample members' employment and welfare eligibility status, and because they had more knowledge about status changes than staff in the traditional program.

Neither of the programs increased average "combined income" from earnings, AFDC, and Food Stamps. On average, people in the programs replaced some public assistance dollars with earnings.

Among sample members who had a high school diploma or GED at random assignment (graduates), the two programs produced roughly similar effects. Among nongraduates, however, the integrated program was more successful than the traditional program in increasing earnings and decreasing cash assistance payments.

Analysis Issues

As discussed in prior chapters, both programs aimed to increase welfare recipients' skills levels before they looked for work. Employment gains and welfare reductions in programs such as these may be delayed while recipients participate in education and training activities. After an initial period of investment in skills-building, integrated and traditional group members may make up for forgone earnings by obtaining more jobs or higher-paying jobs than control group members.

The evaluation designers expected that the programs would affect employment and welfare receipt to different degrees. Specifically, they hypothesized that the integrated program would be more effective than the traditional program in increasing employment and in decreasing welfare receipt.

The hypothesis that the integrated program would produce larger employment and earnings gains was based primarily on two expectations. First, as discussed in Chapter 3, the integrated approach was expected to engage more people in the program than the traditional approach, and it did. Exposing more people to the program's messages and services was expected, in turn, to result in larger effects on employment and earnings. Second, as discussed in Chapter 2, the integrated program was expected to deliver program services and monitor welfare recipients' situations more effectively than the traditional program, which could lead to larger employment and earnings effects. In fact, the implementation data suggested some differences between the programs: namely, integrated case managers provided more personalized attention than did traditional case managers and more closely monitored participation in program activities.

The hypothesis that the integrated program would produce larger decreases in welfare receipt and payments than the traditional program was predicated on two expectations. First, if the integrated program increased employment and earnings more than the traditional program (as discussed above), that, in turn, likely would result in larger welfare reductions. Second, it was expected that the integrated structure would engender more effective eligibility case management than the traditional structure by giving case managers fuller and more timely knowledge about each client and by allowing them to close ineligible cases more quickly. For example, the closer contact between integrated case managers and recipients might allow integrated staff to learn about eligibility changes that traditional staff might not. Also, if a sample member became employed, an integrated case manager might find out about this change more quickly because integrated staff see their clients more frequently. Once they had this knowledge, integrated staff would also be able to respond more quickly because they could reduce a grant amount or close a grant themselves, rather than having to ask another staff member to do so.

As Chapter 1 described, random assignment in Columbus occurred at the point of referral to the JOBS program. The impacts presented in this chapter, therefore, reflect the effects not only of the program services and mandates but also of the referral to the program and any related follow-up, such as sanctioning for orientation nonattendance. Telling someone she must participate in a welfare-to-work program could affect her labor market and welfare behavior in at least two ways: She could be motivated to quickly find a job and leave welfare to avoid the program mandate or, alternatively, to delay employment to gain access to the services offered by the program. Because random assignment occurred only at the point of referral to the program, it is impossible to isolate the effects of either the referral to the program or the program services and mandates.(4) The impacts presented in this chapter, therefore, represent estimates of the combined or average effect of the program services and mandates and the referral to the program.

As mentioned in Chapter 3, some people in the integrated and traditional groups never attended JOBS orientation and thus had no chance to attend program activities. The outcomes for these sample members are averaged together with the outcomes for orientation attenders. This may "dilute" the estimate of the effects of the welfare-to-work program services and mandates, especially for the traditional program, in which even fewer people attended orientation.

Impacts on Employment and Earnings

Table 5.1 shows the two programs' impacts on employment and earnings. The first set of columns shows the impacts of the integrated program (integrated-control comparison), and the second set shows the impacts of the traditional program (traditional-control comparison). The last column shows the difference between outcomes of the integrated and traditional programs (integrated-traditional difference).

Table 5.1
Program Impacts on Employment and Earnings
  Integrated-Control Comparison  Traditional-Control Comparison
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever employed, years 1-3 (%) 81.1 78.5 2.6** 3.3 80.7 78.5 2.2** 2.8 0.4
  Year 1 60.0 60.1 -0.1 -0.2 59.9 60.1 -0.1 -0.2 0.0
  Year 2 65.2 62.9 2.3* 3.7 64.5 62.9 1.6 2.6 0.7
  Year 3 68.9 65.3 3.6*** 5.5 67.9 65.3 2.6** 3.9 1.0
Quarters employed, years 1-3 5.75 5.46 0.29*** 5.3 5.69 5.46 0.23** 4.1 0.06
  Year 1 1.64 1.62 0.02 1.0 1.66 1.62 0.04 2.7 -0.03
  Year 2 1.97 1.82 0.15*** 8.5 1.94 1.82 0.13*** 7.0 0.03
  Year 3 2.14 2.02 0.12** 5.8 2.08 2.02 0.06 2.8 0.06
Earnings, years 1-3 ($) 13,208 12,027 1,181*** 9.8 13,027 12,027 1,000** 8.3 181
  Year 1 2,994 2,914 80 2.8 3,099 2,914 185 6.4 -105
  Year 2 4,578 3,982 596*** 15.0 4,472 3,982 490*** 12.3 106
  Year 3 5,635 5,131 505*** 9.8 5,456 5,131 325* 6.3 180
Sample size (total = 7,242) 2,513 2,159 2,570 2,159
Sources: MDRC calculations from Ohio unemployment insurance (UI) earnings records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

In the context of Columbus's strong labor market, employment rates were high even without the programs' intervention: As the table shows, 78.5 percent of control group members were employed at some point during the three years after random assignment. They were employed for an average of 5.46 quarters (just over 16 months) and earned an average of $12,027 over the three-year period (this average includes zeros for people with no earnings).

Both programs produced small increases in employment rates and the length of time employed. Over three years, 81.1 percent of the integrated group worked for pay, a 2.6 percentage-point increase, and 80.7 percent of the traditional group worked for pay, a 2.2 percentage-point increase. Integrated group members worked an average of 5.75 quarters, an increase of 0.29 of a quarter (almost a month), and traditional group members worked an average of 5.69 quarters, an increase of 0.23 of a quarter (about two-thirds of a month).

Integrated group members earned on average $13,208 over the three-year period, a $1,181, or 10 percent, increase above the control group. Traditional group members earned on average $13,027, a $1,000, or 8 percent, increase above the control mean. (The $181 difference between the program groups' average earnings is not statistically significant.) These gains are similar to the earnings impacts of the other education-focused programs studied as part of the NEWWS Evaluation.(5) The earnings gains in the Columbus programs were primarily the result of longer duration of employment and higher earnings on the job.(6) In other words, the programs raised total earnings by enabling integrated and traditional group members who would have been employed anyway to obtain better jobs.
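
The "percentage change" figures in these tables follow the convention stated in the table notes (100 times the impact divided by the control group mean); a one-line check against the earnings figures just cited:

    # Percentage-change convention from the table notes:
    # 100 * difference / control group mean.
    control_mean = 12027          # control group three-year earnings ($)
    integrated_impact = 1181      # integrated-control earnings difference ($)
    traditional_impact = 1000     # traditional-control earnings difference ($)
    print(round(100 * integrated_impact / control_mean, 1))   # 9.8
    print(round(100 * traditional_impact / control_mean, 1))  # 8.3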

As is often found for programs that emphasize building skills prior to finding a job, neither program increased employment levels or earnings during the first year of follow-up. (This indicates that the referral to a mandatory welfare-to-work program did not, on average, spur people to quickly begin a job to avoid the program.) Employment and earnings gains began in the second year of follow-up. By the end of the third year of follow-up, the integrated program's impacts had decreased but remained statistically significant. The traditional program's impacts, in contrast, were less consistent during the third year. (See Appendix Table C.1 for the programs' impacts displayed for each quarter of the follow-up period.) These patterns suggest that the integrated program will likely continue to increase employment and earnings during the fourth year of follow-up, but the traditional program may not.

Contrary to researchers' expectations, more personalized attention, closer monitoring, and the higher rate of participation in program activities in the integrated program did not translate into larger employment and earnings impacts (although quarterly patterns suggest that the integrated program may have more positive results than the traditional program during the fourth year of follow-up). A recently published MDRC analysis of participation in welfare-to-work programs found that although a minimum level of participation is necessary to produce employment impacts, above that threshold there is no linear relationship between participation levels and impacts.(7) In light of this new information, one should not expect that higher participation rates would necessarily yield larger employment and earnings impacts.

Impacts on AFDC Receipt and Payments

The employment gains in Columbus were accompanied by cash assistance reductions. Over a three-year period, control group members received AFDC for an average of about 21 1/2 months (see Table 5.2). The integrated program reduced AFDC receipt by more than 2 1/2 months, a decrease of 12 percent relative to the control group mean. The traditional program reduced receipt to a lesser extent, by about 1 2/3 months, or 8 percent. The integrated program's reduction in months of welfare receipt was the largest among the education-focused programs in the NEWWS Evaluation.(8) The Columbus program impacts on cash assistance receipt grew throughout the follow-up period. In the last quarter of year 3, 40.3 percent of the control group received AFDC benefits compared with 33.2 percent of the integrated group and 34.9 percent of the traditional group (see Appendix Table C.1).

Table 5.2
Program Impacts on AFDC Receipt and Payments
  Integrated-Control Comparison  Traditional-Control Comparison
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever received AFDC, years 1-3 (%) 96.4 96.9 -0.5 -0.6 96.3 96.9 -0.6 -0.7 0.1
  Year 1 95.8 96.6 -0.8 -0.8 96.0 96.6 -0.6 -0.6 -0.2
  Year 2 65.1 69.1 -4.0*** -5.7 65.9 69.1 -3.2** -4.6 -0.8
  Year 3 47.0 54.4 -7.4*** -13.6 49.0 54.4 -5.4*** -10.0 -2.0
Months received AFDC, years 1-3 18.87 21.48 -2.61*** -12.2 19.77 21.48 -1.71*** -8.0 -0.90***
  Year 1 8.91 9.62 -0.71*** -7.3 9.16 9.62 -0.46*** -4.8 -0.25**
  Year 2 5.91 6.79 -0.87*** -12.9 6.22 6.79 -0.57*** -8.4 -0.30**
  Year 3 4.04 5.08 -1.03*** -20.4 4.39 5.08 -0.68*** -13.5 -0.35**
AFDC amount, years 1-3 ($) 6,071 7,151 -1,079*** -15.1 6,335 7,151 -816*** -11.4 -264**
  Year 1 2,880 3,199 -318*** -10.0 2,950 3,199 -249*** -7.8 -70*
  Year 2 1,895 2,270 -375*** -16.5 1,989 2,270 -281*** -12.4 -95**
  Year 3 1,297 1,682 -386*** -22.9 1,396 1,682 -286*** -17.0 -99**
Sample size (total = 7,242) 2,513 2,159 2,570 2,159
Sources: MDRC calculations from Ohio AFDC records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

Over three years, control group members received an average of $7,151 in AFDC payments. Both programs reduced welfare payments, but the integrated program's impacts were larger. Integrated group members received an average of $6,071 in AFDC payments over the three-year period, a reduction of $1,079, or 15 percent, compared with the control mean, and traditional group members received an average of $6,335, a reduction of $816, or 11 percent. The percentage reduction in the integrated program is the largest reduction among the NEWWS Evaluation education-focused programs.(9) Most of the decrease in AFDC payments occurred because integrated and traditional group members spent less time on welfare than their control group counterparts, rather than receiving lower grant amounts.(10)

The programs reduced AFDC payments during each year of follow-up; the effects grew over time and remained substantial at the end of year 3 (see Appendix Table C.1). This suggests that the reductions are very likely to persist during the fourth year of follow-up. The fact that during year 1 the programs reduced welfare receipt and payments but did not increase employment and earnings suggests that some people may have left the welfare rolls to avoid the participation mandate.

As hypothesized, the integrated program generated larger reductions in welfare receipt and payments than the traditional program. This difference occurred because integrated group members spent less time on welfare, on average, than their traditional group counterparts.(11) In other words, the integrated case management structure facilitated case closures. Specifically, integrated case managers closed cases more quickly, on average, than traditional staff. They also closed cases that would have remained open in the traditional program, likely because they were better able to detect individuals who should not be receiving welfare.

Impacts on Combined Income

The earnings gains produced by the Columbus programs did not exceed the public assistance losses, thus providing no gain in average "combined income." As discussed in a previous NEWWS Evaluation report, there are several ways to measure a program's effect on sample members' economic self-sufficiency; one way is to examine sample members' average combined income from earnings, AFDC, and Food Stamps.(12) This income measure does not include estimates of the Earned Income Credit, a credit against federal income taxes for low-income taxpayers. Over three years, the Columbus integrated program reduced Food Stamp payments by $697, and the traditional program reduced Food Stamp payments by $483 (these numbers are not presented in a table).(13) During the three years following random assignment, control group members received on average $25,490 from earnings, AFDC, and Food Stamps. Integrated group members received $24,895 ($595, or 2 percent, less), and traditional group members received $25,192 ($298, or 1 percent, less). These small decreases in average combined income are not statistically significant.
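
The combined-income figures are internally consistent with the earnings and AFDC components reported in Tables 5.1 and 5.2; the check below backs out the implied average Food Stamp payments from the published totals (a derivation from the report's own numbers, not new data).

    # Consistency check on the combined-income figures (1993 dollars,
    # three-year totals per group member).
    earnings = {"integrated": 13208, "control": 12027}   # Table 5.1
    afdc = {"integrated": 6071, "control": 7151}         # Table 5.2
    combined = {"integrated": 24895, "control": 25490}   # cited in the text

    # Implied average Food Stamp payments: combined minus earnings minus AFDC.
    food_stamps = {g: combined[g] - earnings[g] - afdc[g]
                   for g in ("integrated", "control")}
    # {'integrated': 5616, 'control': 6312}; the implied reduction of $696
    # matches the $697 Food Stamp impact cited above, up to rounding.
    print(food_stamps, food_stamps["control"] - food_stamps["integrated"])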

Impacts for Educational Attainment Subgroups

Employment and earnings impacts for people entering the programs with a high school diploma or GED (graduates) are presented in Table 5.3. Neither program increased three-year employment levels for graduates, but the integrated program produced small increases in employment levels in years 2 and 3. Measured over the three-year follow-up period, the traditional program increased graduates' average earnings by $1,105, or 7 percent; the $633 increase for the integrated program is not statistically significant. (The difference between the earnings of the integrated group and the traditional group is not statistically significant.) Table 5.4 shows that the two programs decreased the number of months that graduate sample members received welfare and their average welfare payments.

Table 5.3
Program Impacts on Employment and Earnings
for Sample Members with a High School Diploma or GED
  Integrated-Control Comparison  Traditional-Control Comparison
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever employed, years 1-3 (%) 85.5 83.0 2.0 2.4 84.2 83.0 1.2 1.4 0.8
  Year 1 65.8 65.3 0.5 0.8 64.3 65.3 -0.9 -1.5 1.5
  Year 2 70.8 68.0 2.8* 4.2 69.1 68.0 1.2 1.7 1.7
  Year 3 72.5 69.7 2.8* 4.0 71.5 69.7 1.8 2.6 1.0
Quarters employed, years 1-3 6.37 6.12 0.25* 4.2 6.34 6.12 0.22 3.6 0.03
  Year 1 1.85 1.84 0.01 0.5 1.89 1.84 0.05 2.8 -0.04
  Year 2 2.20 2.05 0.16** 7.7 2.17 2.05 0.12* 5.9 0.04
  Year 3 2.32 2.23 0.09 3.9 2.28 2.23 0.05 2.3 0.04
Earnings, years 1-3 ($) 15,544 14,911 633 4.2 16,016 14,911 1,105* 7.4 -473
  Year 1 3,558 3,617 -59 -1.6 3,854 3,617 237 6.6 -296*
  Year 2 5,404 5,014 390 7.8 5,525 5,014 511** 10.2 -121
  Year 3 6,582 6,280 302 4.8 6,637 6,280 357 5.7 -56
Sample size (total = 4,135) 1,428 1,230 1,477 1,230
Sources: MDRC calculations from Ohio unemployment insurance (UI) earnings records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

Table 5.4
Program Impacts on AFDC Receipt and Payments
for Sample Members with a High School Diploma or GED
  Integrated-Control Comparison  Traditional-Control Comparison
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever received AFDC, years 1-3 (%) 96.6 96.6 0.0 0.0 96.2 96.6 -0.4 -0.4 0.4
  Year 1 96.1 96.2 -0.1 -0.2 95.7 96.2 -0.5 -0.5 0.4
  Year 2 62.7 65.4 -2.7 -4.1 61.7 65.4 -3.7** -5.7 1.0
  Year 3 44.0 49.7 -5.7*** -11.5 44.0 49.7 -5.7*** -11.5 0.0
Months received AFDC, years 1-3 17.84 19.94 -2.10*** -10.5 18.23 19.94 -1.72*** -8.6 -0.38
  Year 1 8.66 9.30 -0.65*** -7.0 8.82 9.30 -0.49*** -5.2 -0.16
  Year 2 5.48 6.21 -0.72*** -11.6 5.59 6.21 -0.62*** -10.0 -0.10
  Year 3 3.70 4.43 -0.73*** -16.4 3.82 4.43 -0.61*** -13.8 -0.12
AFDC amount, years 1-3 ($) 5,633 6,486 -853*** -13.2 5,720 6,486 -766*** -11.8 -88
  Year 1 2,740 3,011 -271*** -9.0 2,778 3,011 -233*** -7.7 -38
  Year 2 1,723 2,028 -304*** -15.0 1,742 2,028 -286*** -14.1 -19
  Year 3 1,169 1,447 -278*** -19.2 1,201 1,447 -246*** -17.0 -31
Sample size (total = 4,135) 1,428 1,230 1,477 1,230
Sources: MDRC calculations from Ohio AFDC records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

As Table 5.5 shows, the integrated program was much more successful than the traditional program in generating earnings gains for sample members who entered the programs without a high school diploma or GED (nongraduates). The integrated program increased nongraduates' average three-year earnings by $1,730, or 21 percent; the traditional program's gain of $734, or 9 percent, is not statistically significant. Both programs decreased time on welfare and welfare payments for nongraduates, although the integrated program did so to a greater extent (see Table 5.6). The integrated program decreased months of welfare receipt by 14 percent, compared with 7 percent for the traditional program, and decreased welfare payments by $1,404, or 17 percent, compared with $874, or 11 percent, for the traditional program. Both programs produced small reductions in average combined income from earnings, AFDC, and Food Stamps for graduates and nongraduates, as they did for the full sample; these reductions are not statistically significant.

Table 5.5
Program Impacts on Employment and Earnings
for Sample Members Without a High School Diploma or GED
  Integrated-Control Comparison Traditional-Control Comparison  
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever employed, years 1-3 (%) 75.9 72.9 2.9 4.0 76.1 72.9 3.2* 4.4 -0.2
Year 1 52.0 53.4 -1.4 -2.6 54.0 53.4 0.6 1.1 -1.9
Year 2 57.7 56.3 1.3 2.3 58.1 56.3 1.8 3.2 -0.5
Year 3 63.9 59.9 3.9* 6.5 63.0 59.9 3.0 5.1 0.9
Quarters employed, years 1-3 4.89 4.60 0.29* 6.3 4.79 4.60 0.19 4.1 0.10
Year 1 1.35 1.33 0.02 1.5 1.35 1.33 0.02 1.7 0.00
Year 2 1.66 1.52 0.14** 9.1 1.64 1.52 0.12* 7.8 0.02
Year 3 1.89 1.76 0.13* 7.5 1.80 1.76 0.05 2.6 0.09
Earnings, years 1-3 ($) 9,938 8,208 1,730*** 21.1 8,942 8,208 734 8.9 996**
Year 1 2,201 1,986 215 10.8 2,079 1,986 93 4.7 121
Year 2 3,409 2,632 777*** 29.5 3,042 2,632 410** 15.6 367*
Year 3 4,328 3,590 738*** 20.5 3,821 3,590 231 6.4 507**
Sample size (total = 3,073) 1,072 915     1,086 915      
Sources: MDRC calculations from Ohio unemployment insurance (UI) earnings records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

Table 5.6
Program Impacts on AFDC Receipt and Payments
for Sample Members Without a High School Diploma or GED
  Integrated-Control Comparison Traditional-Control Comparison  
Outcome Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change Integrated-Traditional Difference (Impact)
Ever received AFDC, years 1-3 (%) 96.0 97.4 -1.3* -1.4 96.8 97.4 -0.6 -0.6 -0.8
Year 1 95.4 97.1 -1.7** -1.7 96.7 97.1 -0.4 -0.4 -1.2
Year 2 68.5 74.2 -5.6*** -7.6 72.0 74.2 -2.2 -2.9 -3.4*
Year 3 51.0 60.8 -9.8*** -16.1 55.8 60.8 -5.1** -8.3 -4.7**
Months received AFDC, years 1-3 20.25 23.58 -3.33*** -14.1 21.93 23.58 -1.64*** -7.0 -1.68***
Year 1 9.26 10.04 -0.78*** -7.8 9.65 10.04 -0.40*** -3.9 -0.39***
Year 2 6.49 7.59 -1.09*** -14.4 7.10 7.59 -0.48** -6.4 -0.61***
Year 3 4.50 5.95 -1.45*** -24.4 5.18 5.95 -0.77*** -12.9 -0.69***
AFDC amount, years 1-3 ($) 6,661 8,065 -1,404*** -17.4 7,191 8,065 -874*** -10.8 -530***
Year 1 3,071 3,462 -392*** -11.3 3,190 3,462 -272*** -7.9 -120**
Year 2 2,124 2,603 -479*** -18.4 2,335 2,603 -268*** -10.3 -211***
Year 3 1,467 1,999 -532*** -26.6 1,666 1,999 -334*** -16.7 -199***
Sample size (total = 3,073) 1,072 915     1,086 915      
Sources: MDRC calculations from Ohio AFDC records.
Notes: Estimates were regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. "Percentage change" equals 100 times "difference" divided by "control group." Rounding may cause slight discrepancies in calculating sums and differences. A two-tailed t-test was applied to differences between outcomes for the program and control groups and to differences between outcomes for the integrated and traditional program groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Year 1 refers to quarters 2 to 5; year 2 refers to quarters 6 to 9; year 3 refers to quarters 10 to 13. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures.

Future Research

The final report in the NEWWS Evaluation will track outcomes for sample members in Columbus for up to five years following random assignment. This longer follow-up period is important when evaluating programs that engage many people in education because it can take some time for sample members to put their newly acquired skills to work in the job market. As noted in Chapter 1, however, in October 1997 control group members began to receive program services, and all sample members (from the control group, the integrated group, and the traditional group) began receiving integrated case management. These two changes, which occurred during the fourth or fifth year of follow-up for most sample members (random assignment occurred from 1992 to 1994), may have diminished the differences between the research groups' outcomes.

Endnotes

1.  As noted in Chapter 1, this report refers to cash assistance as AFDC; although the AFDC program has been converted into a block grant to states, AFDC existed throughout the study period covered by this report.

2.  UI earnings data are collected by calendar quarter (January through March, April through June, and so on). For the research, the quarter during which a sample member was randomly assigned was designated quarter 1. The first follow-up year (called year 1) covers quarters 2 through 5, the second year (year 2) covers quarters 6 through 9, and so on. Monthly AFDC and Food Stamp payments were grouped into quarters and years covering the same periods as earnings quarters and years. See Freedman et al., 2000, for more detail on the methods used in this analysis and on the impacts of the Columbus programs (estimated using two years of follow-up data), and for a more comprehensive comparison of the effects of the Columbus programs with the effects of the other NEWWS Evaluation programs.
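
As a purely illustrative restatement of this numbering convention (not code from the evaluation), the following snippet maps a follow-up quarter to its follow-up year:

```python
# Illustrative only: quarter 1 is the quarter of random assignment and is
# excluded from follow-up; each follow-up year spans four calendar quarters.
def followup_year(quarter: int):
    if quarter < 2:
        return None  # quarter 1 is excluded from follow-up measures
    return (quarter - 2) // 4 + 1

# Year 1 = quarters 2-5, year 2 = quarters 6-9, year 3 = quarters 10-13.
assert [followup_year(q) for q in range(2, 14)] == [1] * 4 + [2] * 4 + [3] * 4
```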

3.  Differences in outcomes are considered statistically significant if there is less than a 10 percent probability that they occurred by chance.

4.  In two sites in the NEWWS Evaluation (Grand Rapids and Riverside), random assignment occurred at two separate points: at the point of referral to the welfare-to-work program and at the point of entry into the welfare-to-work program (program orientation). This design allows researchers to calculate separately the impacts of the referral itself and the effects of the program services and mandates. See Knab et al., 2001, for a presentation of findings from this special study.

5.  Three-year earnings impacts for the other education-focused programs were: Atlanta HCD, $1,003, or 11 percent; Grand Rapids HCD, $892, or 10 percent; Riverside HCD, $740, or 14 percent; and Detroit, $848, or 11 percent. Oklahoma City's impact of $12 is not statistically significant. Impacts for the employment-focused programs in the evaluation ranged from $1,292 to $3,152. (These impact findings are from an unpublished MDRC analysis of NEWWS Evaluation data.)

6.  In both programs, longer duration of employment and higher earnings on the job represent about two-thirds of the earnings gain, and an increase in the number of jobs found represents one-third. This decomposition is not exact. It is based on the approximate mathematical equivalence of the "percentage difference" in average total earnings to the sum of the percentage differences in "total quarters employed if employed," "average earnings per quarter employed," and "ever employed." The contribution of each effect is obtained by dividing its percentage difference by the percentage difference in average total earnings. The sum of all three contributions does not equal 100 percent because a small portion of the earnings impact is attributable to interactions among the components. (The integrated program increased "total quarters employed if employed" by 0.13 of a quarter, or 1.9 percent, and increased "average earnings per quarter employed" by $95, or 4.3 percent. Corresponding gains in the traditional program were 0.09 of a quarter, or 1.3 percent, and $88, or 4.0 percent.)
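
A sketch of the decomposition arithmetic follows. The 1.9 and 4.3 percent figures are the integrated-program values quoted in this note; the ever-employed and total-earnings percentages are hypothetical placeholders, since the note does not restate them.

```python
# Illustrative sketch of the decomposition described in this note.
# Approximate identity:
#   %impact(total earnings) ~ %impact(ever employed)
#       + %impact(quarters employed if employed)
#       + %impact(earnings per quarter employed)
# Each component's contribution is its percentage impact divided by the
# total; the shares fall short of 100 percent because of interactions.
def contributions(ever_pct: float, quarters_pct: float,
                  per_quarter_pct: float, total_pct: float) -> dict:
    return {
        "ever employed": ever_pct / total_pct,
        "quarters employed if employed": quarters_pct / total_pct,
        "earnings per quarter employed": per_quarter_pct / total_pct,
    }

# 1.9 and 4.3 are the integrated-program figures quoted in the note;
# 3.0 (ever employed) and 9.5 (total earnings) are hypothetical stand-ins.
shares = contributions(ever_pct=3.0, quarters_pct=1.9,
                       per_quarter_pct=4.3, total_pct=9.5)
print(shares)
```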

7.  Hamilton and Scrivener, 1999.

8.  Impacts of the other programs ranged from 0.58 to 1.94 months (from an unpublished MDRC analysis of NEWWS Evaluation data).

9.  Decreases in average three-year AFDC payments for the other education-focused programs were: Atlanta HCD, 6 percent; Grand Rapids, 13 percent; Riverside HCD, 12 percent; Detroit, 3 percent; and Oklahoma City, 4 percent. Decreases for the employment-focused programs in the evaluation ranged from 8 to 21 percent. (These findings are from an unpublished MDRC analysis of NEWWS Evaluation data.)

10.  The average monthly payment amount for control group members ($333) multiplied by the reduction in number of months of AFDC receipt indicates what the AFDC savings would have been if average monthly payment amounts were the same for program and control group members who remained on welfare. In the integrated program, for example, this calculation ($333 times 2.61 months) yields $869, which represents 81 percent of the $1,079 three-year AFDC savings. The calculation for the traditional program ($333 times 1.71 months) yields $569, which is 70 percent of the $816 three-year AFDC payment impact. The remainder of the impact on three-year AFDC payments may have come from reductions in grant amounts resulting from sanctions or from increased earnings while still on welfare. Alternatively, the overall reduction in months of receipt may have fallen primarily on cases with above-average monthly grant amounts. This decomposition is not exact, since it ignores interactions between grant level and case closure.
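
The arithmetic in this note can be checked directly; the sketch below uses only the figures quoted above.

```python
# Reproducing the calculation in this note: the control group's average
# monthly AFDC payment times the impact on months of receipt approximates
# the share of the payment impact attributable to reduced time on welfare.
avg_monthly_payment = 333  # control group average monthly AFDC payment ($)

integrated = avg_monthly_payment * 2.61   # $869.13, ~ $869 of the $1,079 impact
traditional = avg_monthly_payment * 1.71  # $569.43, ~ $569 of the $816 impact

print(f"{integrated / 1079:.0%}")   # 81%
print(f"{traditional / 816:.0%}")   # 70%
```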

11.  The decomposition of the cash assistance payment impact discussed in footnote 10 indicates that $869 of the integrated program's impact on payments was generated because integrated group members spent less time on welfare than their control group counterparts; this figure for the traditional program was $569, $300 less than the figure for the integrated program. The $300 difference exceeds the $264 difference between the two programs' impacts on payments.

12.  Freedman et al., 2000.

13.  Over three years, the three research groups received Food Stamp benefits valued at the following amounts: control group, $6,312; integrated group, $5,616; traditional group, $5,830. Both programs' impacts are statistically significant at the .01 level.

Appendix A: Supplementary Tables to Chapter 2

Table A.1
Selected JOBS and Integrated Staff Survey Measures
Measure Atlanta HCDa Atlanta LFAa Grand Rapidsa,b Riverside HCDa Riverside LFAa Columbus Integrated Columbus Traditional Detroita Oklahoma City Portland
Employment preparation strategy
Percent who lean toward labor force attachment 0.0 27.3 30.4 46.7 83.0 4.6 5.3 0.0 3.0 18.9
Percent who lean toward human capital development 87.5 54.6 43.5 26.7 8.5 68.2 65.8 72.2 87.9 37.7
Percent who encourage clients to take any job 50.0 81.8 73.9 100.0 95.8 57.1 34.2 55.6 44.9 54.0
Percent who encourage clients to be selective in taking a job 25.0 0.0 4.4 0.0 2.1 14.3 31.6 5.6 23.7 16.0
Staff supervision, evaluation, and training
Percent who say they received helpful training on how to be an effective case manager 81.3 45.5 21.7 60.0 51.1 31.8 38.5 38.9 34.3 48.1
Percent who say that supervisors pay close attention to case manager performance 93.8 90.9 78.3 87.5 93.0 95.5 82.1 72.2 53.0 92.6
Percent who report good communication with program administrators 43.8 18.2 13.0 31.3 43.8 36.4 53.9 76.5 34.5 35.3
Percent who say that good performance is recognized 37.5 36.4 47.8 56.3 53.2 50.0 30.8 22.2 26.9 40.7
Percent who report high job satisfaction 12.5 9.1 26.1 25.0 27.7 4.6 28.2 5.6 9.5 22.2
Personalized attention and encouragement
Percent who try to learn in depth about clients' needs, interests, and backgrounds during program intake 93.8 50.0 21.7 75.0 47.8 63.6 46.0 16.7 39.3 61.5
Percent who try to identify and remove barriers to client participation 100.0 90.9 87.0 100.0 100.0 81.8 82.1 44.4 80.0 90.7
Percent who encourage and provide positive reinforcement to clients 31.3 36.4 27.3 62.5 50.0 52.4 38.5 22.2 23.0 39.6
Participation monitoring
Percent who report receiving a lot of information on client progress from service providers 31.3 27.3 27.3 46.7 40.0 13.6 21.6 11.8 24.7 35.4
Average number of weeks before receiving attendance information from service providers 3.4 2.8 1.6 1.7 1.7 2.5 3.1 3.7 2.7 1.9
Average number of weeks before contacting clients about their attendance problems 1.9 1.7 1.5 1.6 1.4 1.6 2.9 2.5 2.2 1.5
Rule enforcement and sanctioning
Percent who strongly emphasize penalties for noncompliance to new clients 68.8 81.8 82.6 68.8 51.1 86.4 70.6 83.3 58.6 59.1
Percent who never delay requesting sanctions for noncompliant clientsc 50.0 45.5 91.3 93.3 88.4 n/a 38.5 16.7 63.6 91.7
Perceptions of the effectiveness of JOBS
Percent who think JOBS will help clients become self-supporting 81.3 90.9 82.6 93.8 89.6 81.8 74.4 38.9 62.0 98.2
Sample sized 16 11 23 16 48 22 39 18 202 54
Sources: Integrated and JOBS Staff Activities and Attitudes Surveys.
Notes: aThese sites do not have integrated staff; the Integrated Staff Survey was not administered. bThe same Grand Rapids staff worked with both LFA and HCD sample members. cThis scale indicates responses of JOBS staff only. dSample sizes may vary because some survey items were not applicable to all staff.
Table A.2
Selected Income Maintenance and Integrated Staff Survey Measures
Measure Atlantaa Grand Rapidsa Riversidea Columbus Detroita Oklahoma Cityb Portland
Rule enforcement and sanctioning
Percent who never delay imposing sanctions on noncompliant clients 84.8 98.0 87.2 70.9 87.0 28.5 51.6
Perceptions of effectiveness of JOBS
Percent who think JOBS will help clients become self-supportingc 33.9 33.3 59.1 67.3 43.1 n/a 74.0
Sample sized 113 120 105 136 114 180 110
Sources: Income Maintenance and Integrated Staff Activities and Attitudes Surveys.
Notes: N/a = not applicable. aThese sites do not have integrated staff; the Integrated Staff Survey was not administered. bAll staff in Oklahoma City are integrated; the Income Maintenance Staff Survey was not administered. cThis measure indicates responses of income maintenance staff only. dSample sizes may vary because some survey items were not applicable to all staff.
Table A.3
Selected Client Survey Measures
Measure Atlanta HCD Atlanta LFA Grand Rapids HCD Grand Rapids LFA Riverside HCD Riverside LFA Columbus Integrated Columbus Traditional Detroit Oklahoma City Portland
Employment preparation strategy
Percent who felt pushed to take a job 29.1 39.7 38.7 47.4 46.2 56.2 43.2 28.8 32.2 24.3 44.6
Personalized attention and encouragement
Percent who felt their JOBS case manager knew a lot about them and their family 42.5 44.1 27.7 25.9 39.6 35.7 53.5 38.0 32.1 43.0 35.5
Percent who believed JOBS staff would help them resolve problems that affected their participation in JOBS 43.8 46.5 26.3 25.0 44.0 45.5 54.8 38.6 32.2 35.3 40.9
Rule enforcement and sanctioning
Percent who said they were informed about penalties for noncompliance 68.8 67.9 82.4 80.9 71.9 69.5 68.2 69.1 58.1 44.8 67.6
Percent who felt the JOBS staff just wanted to enforce the rules 52.0 57.4 63.8 71.8 64.9 61.8 64.0 59.6 58.7 49.8 58.8
Perceptions of the effectiveness of JOBS
Percent who thought the program improved their long-run chances of getting or keeping a job 39.3 39.4 28.0 30.5 34.9 32.1 42.3 37.5 43.3 32.0 42.2
Sample size 1,113 804 574 574 621 564 371 366 210 259 297
Source: MDRC calculations from the Two-Year Client Survey.
Notes: Eligible sample members in Columbus, Detroit, and Oklahoma City had an equal chance of being chosen to be interviewed. In contrast, sample members in Atlanta, Grand Rapids, and Riverside had a greater or lesser chance, depending on their background characteristics or month of random assignment. To compensate for these differences, survey respondents in these three sites were weighted by the inverse of their probability of selection.

Appendix B: Supplementary Tables to Chapter 3

Table B.1
Rates of Participation Within a Two-Year Follow-Up Period,
by High School Diploma/GED Status
Measure High School Diploma or GED No High School Diploma or GED
Integrated Group (%) Traditional Group (%) Integrated Group (%) Traditional Group (%)
For all sample members for whom case files were revieweda
Attended JOBS orientation 88.0 58.3*** 83.2 69.1**
Participated in:
Any activity 54.7 33.3*** 50.5 34.0**
Job search 22.2 12.5** 5.6 3.1d
Any education or training 27.4 20.0 41.1 29.9*
Basic education 11.1 7.5 39.3 24.7**
Post-secondary educationb 6.0 8.3 5.6 3.1d
Vocational training 12.0 6.7 2.8 3.1d
Life skills workshops 12.8 1.7*** 6.5 0.0d
Work experience 15.4 7.5* 7.5 2.1d
Sample sizec 117 120 107 97
For all sample members who attended a JOBS orientatione
Participated in:
Any activity 65.5 53.7 61.4 53.3
Job search 26.2 22.0 4.3 6.7
Any education or training 33.3 39.0 50.0 44.4
Basic education 14.3 12.2 47.1 37.8
Post-secondary educationb 6.0 17.1 8.6 2.2
Vocational training 15.5 14.6 4.3 4.4
Life skills workshops 15.5 0.0 10.0 0.0
Work experience 20.2 14.6 11.4 4.4
Sample size 84 41 70 45
Source: MDRC calculations based on MDRC-collected JOBS case file data.
Notes: aFor this sample, the follow-up period began on the day the individual was randomly assigned. Tests of statistical significance were calculated for the differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; ***= 1 percent.
bCourses for college credit at a two-year or four-year college.
cTwo individuals who did not indicate whether they had a high school diploma or GED at random assignment were excluded from the subgroup analysis.
dTests of statistical significance are not appropriate; sample sizes for this measure are too small. eFor this sample, the follow-up period began on the day of JOBS orientation. Only orientation attenders for whom there are two full years of post-orientation data are included. Differences between the integrated and traditional group outcomes, shown in italics, are not true experimental comparisons; statistical significance tests were not calculated.
Table B.2
Length of Participation Within a Two-Year Follow-Up Period,
by High School Diploma/GED Status
Measure High School Diploma or GED No High School Diploma or GED
Integrated Group Traditional Group Integrated Group Traditional Group
For all sample members for whom case files were revieweda
Average number of months receiving AFDC 16.6 17.2 17.3 18.2
Average number of months in which individuals were JOBS-mandatory 14.6 15.0 14.4 15.4
Average number of months in which individuals participated in a JOBS activity 3.2 2.2 3.5 1.6***
Sample sizeb 117 120 107 97
For participants onlyc
Average number of months in which individuals participated in a JOBS activity 5.8 6.7 7.4 4.8
Number of months in which there was participation (%)
1 14.1 7.7 13.7 15.6
2 21.9 20.5 19.6 25.0
3 10.9 10.3 13.7 6.3
4-6 18.8 23.1 5.9 21.9
7-12 21.9 23.1 27.5 25.0
13-18 10.9 2.6 11.8 6.3
19 or more 1.6 12.8 7.8 0.0
In any activity at the end of the follow-up period (%) 9.4 18.0 17.7 3.1
Sample size 64 39 51 32
Source: MDRC calculations from MDRC-collected JOBS case file data and Ohio AFDC records.
Notes: aTests of statistical significance were calculated for the differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; ***= 1 percent.
bTwo individuals who did not indicate whether they had a high school diploma or GED at random assignment were excluded from the subgroup analysis.
cDifferences between the integrated and traditional group outcomes, shown in italics, are not true experimental comparisons; statistical significance tests were not calculated.
Table B.3
Sanction Activity Within a Two-Year Follow-Up Period,
by High School Diploma/GED Status
Measure High School Diploma or GED No High School Diploma or GED
Integrated Group Traditional Group Integrated Group Traditional Group
For all sample members for whom case files were revieweda
Sanction initiatedb(%) 40.2 57.5*** 51.4 67.0**
Sanction imposed (%) 33.3 30.0 40.2 41.2
In sanction at the end of the follow-up period (%) 4.3 5.8 4.7 6.2
Sample sizec 117 120 107 97
For sanctioned individuals onlyd
Average number of months in which a sanction was in effect 3.3 4.5 4.6 5.5
Number of months in sanction (%)
1 33.3 27.8 20.9 12.5
2 20.5 13.9 18.6 5.0
3 15.4 13.9 9.3 27.5
4-6 18.0 22.2 23.3 30.0
7-12 10.3 16.7 20.9 17.5
13-18 2.6 2.8 7.0 5.0
19 or more 0.0 2.8 0.0 2.5
In sanction at the end of the follow-up period (%) 12.8 19.4 11.6 15.0
Sample size 39 36 43 40
Source: MDRC calculations from MDRC-collected JOBS case file data.
Notes: aTests of statistical significance were calculated for the differences between the integrated and traditional groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent.
b"Sanction initiated" indicates that the integrated case manager or the traditional JOBS case manager decided that a sanction should be implemented.
cTwo individuals who did not indicate whether they had a high school diploma or GED at random assignment were excluded from the subgroup analysis.
dDifferences between the integrated and traditional group outcomes, shown in italics, are not true experimental comparisons; statistical significance tests were not calculated.
Table B.4
Two-Year Impacts on Participation in Job Search, Education, Training, and Work Experience,
by Program, Based on Client Survey Data Only
Outcome Participated (%) Hours of Participation Hours of Participation Among Participants
Program Group Control Group Difference Program Group Control Group Difference Program Group Control Group Difference
Integrated program
Participated in:
Any activity 47.9 24.2 23.7*** 174.6 69.3 105.2*** 364.2 286.2 77.9
Job searcha 14.2 3.9 10.3*** 13.5 3.6 9.9** 95.3 91.2 4.1
Any education or training activity 33.0 20.3 12.7*** 161.1 65.8 95.3*** 488.1 323.7 164.4
Basic education 20.4 8.8 11.7*** 77.1 15.8 61.3*** 377.7 180.8 196.9
Post-secondary educationb 11.1 6.8 4.4** 63.1 28.3 34.8* 566.2 417.3 148.9
Vocational training 4.1 6.4 -2.3 20.9 21.6 -0.7 512.1 340.5 171.6
Work experience or on-the-job training 8.8 2.2 6.7*** n/a n/a n/a n/a n/a n/a
Sample sizec 371 357   371 357   varies varies  
Traditional Program
Participated in:
Any activity 45.3 24.2 21.0*** 262.7 69.3 193.4*** 580.4 286.2 294.2
Job searcha 11.6 3.9 7.7*** 18.4 3.6 14.8*** 158.9 91.2 67.7
Any education or training activity 34.2 20.3 13.9*** 244.3 65.8 178.6*** 715.1 323.7 391.5
Basic education 19.7 8.8 10.9*** 93.9 15.8 78.1*** 477.1 180.8 296.3
Post-secondary educationb 12.1 6.8 5.4** 99.4 28.3 71.1*** 818.1 417.3 400.8
Vocational training 6.3 6.4 0.0 51.0 21.6 29.4** 806.5 340.5 466.0
Work experience or on-the-job training 7.5 2.2 5.4*** n/a n/a n/a n/a n/a n/a
Sanctioned 30.9 4.2 26.7*** n/a n/a n/a n/a n/a n/a
Sample sizec 366 357   366 357   varies varies  
Source: MDRC calculations from the Two-Year Client Survey.
Notes: Estimates are regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. Numbers may not add to 100 percent because of rounding. A two-tailed t-test was applied to differences between outcomes for the program and control groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent. Italics are used to signal average outcomes and differences that were calculated only for participants. Unlike the full-sample program and control groups, these program and control groups may differ from each other in average background characteristics. Such differences could have influenced the types of employment-related activities people in the groups attended or their length of stay. If so, the program-control differences might understate or overstate the effects of the programs. Because these impact estimates are less reliable than those based on the full sample, statistical significance tests of these results were not conducted. N/a = not available or not applicable.
aFor integrated and traditional group members, this measure includes participation in life skills workshops.
bCourses for college credit at a two-year or four-year college.
cSample sizes for individual measures vary because of missing values.

Appendix C: Supplementary Table to Chapter 5

Table C.1
Three-Year Impacts on Employment, Earnings, and AFDC
Outcome Integrated-Control Comparison Traditional-Control Comparison
Integrated Group Control Group Difference (Impact) Percentage Change Traditional Group Control Group Difference (Impact) Percentage Change
Employed (%)
Quarter 2 35.8 37.9 -2.1 -5.5 37.4 37.9 -0.4 -1.2
Quarter 3 39.4 38.8 0.6 1.6 41.0 38.8 2.2* 5.6
Quarter 4 43.2 42.4 0.9 2.0 43.3 42.4 1.0 2.3
Quarter 5 45.2 42.9 2.3* 5.3 44.5 42.9 1.6 3.7
Quarter 6 46.2 44.3 1.9 4.4 45.7 44.3 1.4 3.1
Quarter 7 48.7 44.9 3.8*** 8.5 47.4 44.9 2.5* 5.5
Quarter 8 50.3 45.7 4.6*** 10.1 51.0 45.7 5.3*** 11.6
Quarter 9 51.9 46.8 5.1*** 10.9 50.3 46.8 3.5** 7.4
Quarter 10 51.7 49.0 2.6* 5.4 49.7 49.0 0.6 1.3
Quarter 11 53.8 50.4 3.5** 6.9 51.6 50.4 1.3 2.5
Quarter 12 53.5 51.3 2.2 4.3 53.4 51.3 2.2 4.2
Quarter 13 55.1 51.6 3.5** 6.7 53.2 51.6 1.6 3.1
Earnings ($)
Quarter 2 571 581 -10 -1.7 582 581 2 0.3
Quarter 3 700 682 18 2.7 745 682 64* 9.3
Quarter 4 830 809 21 2.6 848 809 40 4.9
Quarter 5 893 843 51 6.0 923 843 81** 9.6
Quarter 6 1,011 922 89** 9.7 1,001 922 79* 8.6
Quarter 7 1,113 968 145*** 15.0 1,087 968 120*** 12.3
Quarter 8 1,195 1,017 178*** 17.6 1,158 1,017 141*** 13.9
Quarter 9 1,258 1,075 183*** 17.0 1,226 1,075 150*** 14.0
Quarter 10 1,321 1,169 152*** 13.0 1,258 1,169 89* 7.6
Quarter 11 1,399 1,259 139*** 11.1 1,325 1,259 66 5.2
Quarter 12 1,422 1,338 85 6.3 1,390 1,338 52 3.9
Quarter 13 1,493 1,365 128** 9.4 1,483 1,365 118** 8.7
Received AFDC (%)
Quarter 2 94.5 96.0 -1.4** -1.5 95.0 96.0 -1.0 -1.0
Quarter 3 83.0 88.9 -5.9*** -6.7 85.7 88.9 -3.2*** -3.6
Quarter 4 73.3 79.3 -5.9*** -7.5 75.2 79.3 -4.0*** -5.1
Quarter 5 68.0 72.5 -4.5*** -6.3 68.9 72.5 -3.6*** -5.0
Quarter 6 60.6 65.9 -5.3*** -8.0 62.1 65.9 -3.7*** -5.7
Quarter 7 55.5 61.8 -6.3*** -10.2 57.6 61.8 -4.2*** -6.7
Quarter 8 51.4 57.8 -6.4*** -11.0 53.8 57.8 -4.0*** -7.0
Quarter 9 47.1 53.8 -6.8*** -12.5 49.3 53.8 -4.6*** -8.5
Quarter 10 42.1 50.4 -8.3*** -16.5 45.2 50.4 -5.2*** -10.2
Quarter 11 38.5 47.2 -8.7*** -18.5 41.7 47.2 -5.5*** -11.7
Quarter 12 35.2 43.8 -8.6*** -19.5 38.4 43.8 -5.4*** -12.4
Quarter 13 33.2 40.3 -7.1*** -17.6 34.9 40.3 -5.5*** -13.5
AFDC amount ($)
Quarter 2 872 923 -51*** -5.5 882 923 -41*** -4.5
Quarter 3 745 840 -95*** -11.3 769 840 -71*** -8.5
Quarter 4 658 751 -93*** -12.3 678 751 -73*** -9.7
Quarter 5 605 685 -80*** -11.7 622 685 -63*** -9.2
Quarter 6 542 630 -88*** -14.0 563 630 -67*** -10.7
Quarter 7 491 588 -96*** -16.4 513 588 -74*** -12.6
Quarter 8 448 545 -96*** -17.7 478 545 -66*** -12.2
Quarter 9 413 507 -94*** -18.6 435 507 -73*** -14.3
Quarter 10 367 475 -107*** -22.6 398 475 -77*** -16.2
Quarter 11 337 438 -100*** -22.9 365 438 -73*** -16.6
Quarter 12 306 400 -94*** -23.5 335 400 -65*** -16.3
Quarter 13 286 370 -84*** -22.7 298 370 -71*** -19.3
Sample size (total = 7,242) 2,513 2,159     2,570 2,159    
Source: MDRC calculations from Ohio unemployment insurance (UI) earnings records and AFDC records.
Notes: Estimates are regression-adjusted using ordinary least squares, controlling for pre-random assignment characteristics of sample members. Rounding may cause slight discrepancies in calculating sums and differences. Because quarter 1, the quarter of random assignment, may contain some earnings and AFDC payments from the period prior to random assignment, it is excluded from follow-up measures. "Percentage change" equals 100 times "difference" divided by "control group." A two-tailed t-test was applied to differences between outcomes for the program and control groups. Statistical significance levels are indicated as: * = 10 percent; ** = 5 percent; *** = 1 percent.

References

American Public Welfare Association. 1992. Status Report on JOBS Case Management Practices. Washington, DC: American Public Welfare Association.

Bane, Mary Jo, and David T. Ellwood. 1994. Welfare Realities: From Rhetoric to Reform. Cambridge, MA: Harvard University Press.

Bell, Winifred. 1983. Contemporary Social Welfare. New York: Macmillan.

Brock, Thomas, and Kristen Harknett. 1998a. "A Comparison of Two Welfare-to-Work Case Management Models." Social Service Review, December.

Brock, Thomas, and Kristen Harknett. 1998b. "Welfare-to-Work Case Management: A Comparison of Two Models." Paper prepared by MDRC as part of the National Evaluation of Welfare-to-Work Strategies.

Freedman, Stephen, Daniel Friedlander, Gayle Hamilton, JoAnn Rock, Marisa Mitchell, Jodi Nudelman, Amanda Schweder, and Laura Storto. 2000. Evaluating Alternative Welfare-to-Work Approaches: Two-Year Impacts for Eleven Programs. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education.

Gallagher, L. Jerome, Megan Gallagher, Kevin Perese, Susan Schreiber, and Keith Watson. 1998. One Year After Federal Welfare Reform: A Description of State Temporary Assistance for Needy Families (TANF) Decisions as of October 1997. Washington, DC: Urban Institute.

Greenberg, Mark. 1992. How States Can Reduce Welfare's Work Penalties: The "Fill-the-Gap" Option. Washington, DC: Center for Law and Social Policy.

Gueron, Judith M., and Edward Pauly. 1991. From Welfare to Work. New York: Russell Sage Foundation.

Hagen, Jan L., and Irene Lurie. 1994. Implementing JOBS: Case Management Strategies. Albany: Nelson A. Rockefeller Institute of Government, State University of New York.

Hall, George E., and Deirdre A. Gaquin, eds. 1997. 1997 County and City Extra: Annual Metro, City, and County Data Book. Lanham, MD: Bernan Press.

Hamilton, Gayle, and Thomas Brock. 1994. The JOBS Evaluation: Early Lessons from Seven Sites. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education.

Hamilton, Gayle, Thomas Brock, Mary Farrell, Daniel Friedlander, and Kristen Harknett. 1997. Evaluating Two Welfare-to-Work Program Approaches: Two-Year Findings on the Labor Force Attachment and Human Capital Development Programs in Three Sites. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education.

Hamilton, Gayle, and Susan Scrivener. 1999. Promoting Participation: How to Increase Involvement in Welfare-to-Work Activities. New York: Manpower Demonstration Research Corporation.

Hamilton, Gordon. 1962. "Editor's Page." Social Work 7, no. 1 (January).

Knab, Jean, Johannes M. Bos, Daniel Friedlander, and Joanna W. Weissman. 2001. "Do Mandates Matter? The Effects of a Mandate to Enter a Welfare-to-Work Program." Paper prepared by MDRC as part of the National Evaluation of Welfare-to-Work Strategies.

Mead, Lawrence M. 1986. Beyond Entitlement: The Social Obligations of Citizenship. New York: Free Press.

Nightingale, Demetra Smith, and Lynn C. Burbridge. 1987. The Status of State Work-Welfare Programs in 1986: Implications for Welfare Reform. Washington, DC: Urban Institute.

Rein, Mildred. 1982. Dilemmas of Welfare Policy: Why Work Strategies Haven't Worked. New York: Praeger.

Riccio, James, Daniel Friedlander, and Stephen Freedman. 1994. GAIN: Benefits, Costs, and Three-Year Impacts of a Welfare-to-Work Program. New York: Manpower Demonstration Research Corporation.

Scrivener, Susan, Gayle Hamilton, Mary Farrell, Stephen Freedman, Daniel Friedlander, Marisa Mitchell, Jodi Nudelman, and Christine Schwartz. 1998. Implementation, Participation Patterns, Costs, and Two-Year Impacts of the Portland (Oregon) Welfare-to-Work Program. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education.

Storto, Laura, Gayle Hamilton, Christine Schwartz, and Susan Scrivener. 2000. Oklahoma City's ET & E Program: Two-Year Implementation, Participation, Cost, and Impact Findings. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education.

Funder & Selected Publications from NEWWS Evaluation

The Manpower Demonstration Research Corporation (MDRC) is conducting the National Evaluation of Welfare-to-Work Strategies under a contract with the U.S. Department of Health and Human Services (HHS), funded by HHS under a competitive award, Contract No. HHS-100-89-0030. Child Trends, as a subcontractor, is conducting the analyses of outcomes for young children (the Child Outcomes Study). HHS is also receiving funding for the evaluation from the U.S. Department of Education. The study of one of the sites in the evaluation, Riverside County (California), is also conducted under a contract from the California Department of Social Services (CDSS). CDSS, in turn, is receiving funding from the California State Job Training Coordinating Council, the California Department of Education, HHS, and the Ford Foundation. Additional funding to support the Child Outcomes Study portion of the evaluation is provided by the following foundations: the Foundation for Child Development, the William T. Grant Foundation, and an anonymous funder.

The findings and conclusions presented herein do not necessarily represent the official positions or policies of the funders.

Selected Publications From This Evaluation

[Some of these publications are available on the NEWWS web site]

Evaluating Two Approaches to Case Management: Implementation, Participation Patterns, Costs, and Three-Year Impacts of the Columbus Welfare-to-Work Program. Prepared by Susan Scrivener and Johanna Walter, MDRC. 2001. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

What Works Best for Whom: Impacts of 20 Welfare-to-Work Programs by Subgroup. Prepared by Charles Michalopoulos and Christine Schwartz, MDRC. 2001. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Implementation, Participation Patterns, Costs, and Two-Year Impacts of the Detroit Welfare-to-Work Program. Prepared by Mary Farrell, MDRC. 2000. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Oklahoma City's ET & E Program: Two-Year Implementation, Participation, Cost, and Impact Findings. Prepared by Laura Storto, Gayle Hamilton, Christine Schwartz, and Susan Scrivener, MDRC. 2000. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Do Mandatory Welfare-to-Work Programs Affect the Well-Being of Children? A Synthesis of Child Research Conducted as Part of the National Evaluation of Welfare-to-Work Strategies. Prepared by Gayle Hamilton, MDRC, with Stephen Freedman, MDRC, and Sharon M. McGroder, Child Trends. 2000. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation and Administration for Children and Families; and U.S. Department of Education.

Evaluating Alternative Welfare-to-Work Approaches: Two-Year Impacts for Eleven Programs. Prepared by Stephen Freedman, Daniel Friedlander, Gayle Hamilton, JoAnn Rock, Marisa Mitchell, Jodi Nudelman, Amanda Schweder, and Laura Storto, MDRC. 2000. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Impacts on Young Children and Their Families Two Years After Enrollment: Findings from the Child Outcomes Study. Prepared by Sharon M. McGroder, Martha J. Zaslow, Kristin A. Moore, and Suzanne M. LeMenestrel. 2000. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation and Administration for Children and Families; and U.S. Department of Education.

Implementation, Participation Patterns, Costs, and Two-Year Impacts of the Portland (Oregon) Welfare-to-Work Program. Prepared by Susan Scrivener, Gayle Hamilton, Mary Farrell, Stephen Freedman, Daniel Friedlander, Marisa Mitchell, Jodi Nudelman, and Christine Schwartz, MDRC. 1998. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Evaluating Two Welfare-to-Work Program Approaches: Two-Year Findings on the Labor Force Attachment and Human Capital Development Programs in Three Sites. Prepared by Gayle Hamilton, Thomas Brock, Mary Farrell, Daniel Friedlander, and Kristen Harknett, MDRC. 1997. Washington, D.C.: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Educating Welfare Recipients for Employment and Empowerment: Case Studies of Promising Programs. Prepared by Janet Quint, MDRC. 1997. Washington, D.C.: U.S. Department of Education, Office of the Under Secretary and Office of Vocational and Adult Education; and U.S. Department of Health and Human Services.

Changing to a Work First Strategy: Lessons from Los Angeles County's GAIN Program for Welfare Recipients. Evan Weissman. 1997. New York: MDRC.

Work First: How to Implement an Employment-Focused Approach to Welfare Reform. Amy Brown. 1997. New York: MDRC.

Monthly Participation Rates in Three Sites and Factors Affecting Participation Levels in Welfare-to-Work Programs. Prepared by Gayle Hamilton, MDRC. 1995. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

How Well Are They Faring? AFDC Families with Preschool-Aged Children in Atlanta at the Outset of the JOBS Evaluation. Prepared by Kristin A. Moore, Martha J. Zaslow, Mary Jo Coiro, and Suzanne M. Miller, Child Trends, and Ellen B. Magenheim, Swarthmore College. 1995. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Early Findings on Program Impacts in Three Sites. Prepared by Stephen Freedman and Daniel Friedlander, MDRC. 1995. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Adult Education for People on AFDC: A Synthesis of Research. Prepared by Edward Pauly, MDRC. 1995. Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation; and U.S. Department of Education.

Five Years After: The Long-Term Effects of Welfare-to-Work Programs. Daniel Friedlander and Gary Burtless. 1995. New York: Russell Sage Foundation.

Early Lessons from Seven Sites. Gayle Hamilton and Thomas Brock. 1994. Washington, D.C.: U.S. Department of Health and Human Services and U.S. Department of Education.

The Saturation Work Initiative Model in San Diego: A Five-Year Follow-up Study. Daniel Friedlander and Gayle Hamilton. 1993. New York: MDRC.

From Welfare to Work. Judith M. Gueron and Edward Pauly. 1991. New York: Russell Sage Foundation.
