The 1996 welfare reform law spawned many new welfare policies and encouraged states to experiment with new approaches. Almost all of these new policies and innovations, however, presuppose and build on the quid pro quo established by FSA: welfare recipients must work or participate in some type of welfare-to-work program in order to receive welfare benefits and services from the government. The new initiatives -- which include substantial earned income disregards, welfare time limits, stricter penalties for nonparticipation, and postemployment services -- are not meant to replace welfare-to-work programs. Rather, they are intended to enhance the anticipated payoffs of those programs, such as higher employment. In light of this, the NEWWS results, which suggest how welfare-to-work programs can be made most effective for different groups of people, are highly relevant. As the following sections illustrate, NEWWS provides critical insights into how best to design and operate such programs to maximize their payoff.
Why Impacts Are Better than Outcomes for Assessing Program Effectiveness
Program operators generally have information only on a program's outcomes -- for instance, the employment and welfare exit rates among people who enrolled or participated in the program. Although these statistics are valuable, they can lead to misleading conclusions about which programs are most effective: outcome levels reflect not only what a program accomplished but also the characteristics of the people it served, factors that would have shaped results even in the program's absence.
The first column in the chart below shows the employment rates and average earnings levels in the second and fifth years of the NEWWS follow-up period for welfare recipients in two of the NEWWS programs: Portland and Grand Rapids LFA. Given only the program outcomes shown in the first column, it would be reasonable to conclude that the Grand Rapids LFA program was more successful in getting welfare recipients into jobs, whereas the Portland program was somewhat more successful in raising people's earnings.
When the experiences of the control groups in both sites (shown in the second column) are taken into account, however, the conclusions change. The program groups' experiences are compared with the control groups' experiences in the "Difference" column. These differences are the programs' impacts on employment and earnings. (The asterisks indicate whether the impacts are statistically significant, that is, very unlikely to have arisen by chance. The more asterisks appear next to an impact, the less likely the impact is to be due to chance.) The "Percentage Change" column expresses the impacts as percentage increases or decreases relative to the control group levels.
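The arithmetic behind the "Difference" and "Percentage Change" columns can be sketched as follows. The numbers below are hypothetical, chosen only to illustrate the calculation; they are not actual NEWWS figures.

```python
def impact(program_outcome, control_outcome):
    """Impact: the program group's outcome minus the control group's outcome."""
    return program_outcome - control_outcome

def percentage_change(program_outcome, control_outcome):
    """The impact expressed as a percentage of the control group level."""
    return 100.0 * (program_outcome - control_outcome) / control_outcome

# Hypothetical year-2 employment rates (percent of group ever employed):
program_rate = 60.0  # program group outcome
control_rate = 50.0  # control group outcome

print(impact(program_rate, control_rate))             # 10.0 percentage points
print(percentage_change(program_rate, control_rate))  # 20.0 percent increase
```

The key point is that the same program outcome (60 percent employed) would represent a large impact against a control level of 50 percent but no impact at all against a control level of 60 percent; only the difference, not the outcome itself, measures what the program added.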
The impact analysis reveals that the Portland program far outperformed the Grand Rapids LFA program: Looking only at the second year, during which both programs were successful, the Portland program produced a 21 percent increase in employment and a 40 percent increase in earnings, compared with an 11 percent increase in employment and an 18 percent increase in earnings for the Grand Rapids LFA program.