Both administrative and survey data have their shortcomings, but combining data from these two sources provides a rich description of the overall well-being of leavers. As Table 12-1 shows, eight studies use both survey and administrative data to study the same cohort of leavers.(8) In the following sections, we describe steps researchers can take to examine the accuracy of employment information from administrative data and to assess the accuracy and representativeness of survey data. None of these techniques can completely address the potential shortcomings in the data, but employing them helps readers weigh the findings reported in any given leaver study.
Do UI Records Understate Employment by Welfare Leavers?
With the exception of Missouri, all leaver studies using UI wage records to examine employment link into only a single state's UI system. Consequently, leavers who move out of state or who work outside their home state will not appear in the data.(9) Furthermore, not all jobs are covered by state UI systems, so there will be no record of work for a leaver who works in an uncovered job. If a leaver study uses both administrative and survey data and asks surveyed leavers about their employment status, one can assess the extent of this potential underreporting.
|State/Study|Exit Cohort|Timing of Survey|Employment Rate (%): Survey Data|Employment Rate (%): Administrative Data*|
|---|---|---|---|---|
|District of Columbia|4Q98|12 months|60.3|n.a.|
|Illinois|December 1998|6-8 months|63.2|55|
|Washington|October 1998|6-8 months|59|57|

*Based on employment rate from the fourth postexit quarter.

SOURCE: See Appendix B for a complete listing of the leaver studies referenced.
Five jurisdictions use surveys of TANF leavers to ask the leavers themselves about their current employment status. The responses generally refer to employment about 6 months to a year after exit. Table 12-6 compares these self-reported employment rates with fourth-quarter postexit employment rates computed from administrative data. The surveys consistently find higher employment rates than those reported in UI wage records; in general, they are about 7 percentage points higher. The Illinois survey presents some instructive information. In its administrative records, Illinois finds that 30 percent of leavers never worked over the first four postexit quarters. In its survey, Illinois finds that only 15 percent of leavers say they have never worked since exiting TANF.
Further, a supplemental study by Wisconsin's Department of Workforce Development (1998) examines how much employment is missed by UI wage records, comparing administrative and survey data on families leaving welfare in the first quarter of 1998. The study finds that, of the 375 surveyed leavers, 85 percent reported employment information consistent with administrative records. Among the leavers who reported in the survey that they had worked but who did not show up in Wisconsin's UI data, 38 percent claimed to be working in temporary jobs that may not be reported to the UI system. Another 32 percent worked as housekeepers, childcare workers, farmhands, or in other jobs in which they may be considered self-employed and/or for which employers may not file UI reports. Ten percent explicitly stated they were self-employed, and 17 percent had left the state.
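The Wisconsin decomposition above can be tallied in a few lines. This is a rough sketch, not part of the original analysis; the category shares are the ones cited in the study, and the small remainder is unexplained.

```python
# Shares of survey-reported workers missing from Wisconsin's UI wage
# records, by the reason cited in the supplemental study (1998).
missing_reasons = {
    "temporary jobs possibly unreported to UI": 0.38,
    "likely self-employed / employer may not file UI reports": 0.32,
    "explicitly self-employed": 0.10,
    "left the state": 0.17,
}

# The cited categories account for nearly all of the mismatches.
covered = sum(missing_reasons.values())
print(f"share of mismatches accounted for: {covered:.0%}")  # 97%
```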
Are Respondents Answering Survey Questions Accurately?
Survey data are based on self-reported information from respondents. If respondents intentionally or unwittingly provide inaccurate information, the survey findings may not reflect the well-being of leavers. When surveys gather information that duplicates information available through administrative sources, it is possible to compare a respondent's answer to the administrative report to assess accuracy. For example, a survey may ask, "In the year since you exited welfare, have you ever received food stamps?" Because this information is reported in administrative data, it is possible to see if survey respondents are providing reliable information. In general, studies that compare survey and administrative findings on common areas find fairly close agreement, as shown in Table 12-7. Finding similar results using survey and administrative data does not guarantee that all other survey responses are accurate; however, if the findings were different, it would undermine the confidence one would have in the survey results.
|State|Point in Time: Administrative (%)|Point in Time: Survey (%)|Since Exit: Administrative (%)|Since Exit: Survey (%)|
|---|---|---|---|---|
|District of Columbia(a)|18.8|18.8|21.1|24.6|
|District of Columbia(a)|37.9|40.8|n.a.|55.2|
|District of Columbia(a)|47.5|53.8|n.a.|n.a.|

a The periods of follow-up for Arizona's and the District of Columbia's survey data are 12-18 months and 12 months, respectively. The administrative data are reported for the fourth quarter after exit.

b The period of follow-up for Missouri's survey is 30 months. However, only 12 months of administrative data are available. The administrative data reported are for the fourth quarter after exit.

c The period of follow-up for Illinois's and Washington's survey data is 6-8 months. The administrative data reported are for the third quarter after exit.

d Data reported for adults.
Of course, the real value of surveys is their ability to obtain information unavailable in administrative records, and for such items external validation is not possible. This can be particularly challenging when trying to determine whether a leaver is better off after exit than before. For example, a welfare leaver interviewed 9 months after exit may not recall the trouble he or she had paying the rent before leaving welfare. One way to examine the importance of recall problems is to supplement a leaver study with a survey of families still on welfare. The Washington state study is the only study we review that conducts such a "stayer" analysis. Surprisingly, while other surveys that ask about food security (Arizona and Illinois) find that leavers generally report the same or lower levels of food insecurity before exit than after it, Washington finds that current recipients actually report higher rates of food insecurity than leavers.
How Representative Are Survey Respondents of Leavers in General?
As we discussed, nonresponse bias is a potentially significant problem for surveys of welfare leavers. Indeed, if the leavers who did not respond to the survey (either because they could not be located or because they refused to participate) are appreciably different from respondents, then survey data will paint a misleading picture of the well-being of TANF leavers. In general, the higher the response rate to a survey, the less concerned one is about its representativeness. (Table 12-4 shows response rates.)
Differences in response rates can affect the outcomes that surveys measure for welfare leavers. We report these results separately for surveys with high, moderate, and low response rates. In general, we would expect respondents to lead more stable lives than nonrespondents and to be more eager to share good news with survey takers. To the extent that nonresponse bias is a problem in these surveys, we would expect surveys with lower response rates to show welfare leavers as better off. Note, however, that even in a survey with a 75-percent response rate, the nonresponse bias may be profound.
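A back-of-the-envelope calculation shows why even a high response rate does not rule out bias. If respondents and nonrespondents differ on an outcome, the survey estimate drifts away from the true population rate in proportion to the nonresponse share. The rates below are hypothetical, not drawn from any study.

```python
def survey_estimate(rate_respondents, rate_nonrespondents, response_rate):
    """Return the true population rate and the survey's bias.

    The survey only observes respondents, so its estimate equals the
    respondent rate; the true rate is the response-rate-weighted mix.
    """
    true_rate = (response_rate * rate_respondents
                 + (1 - response_rate) * rate_nonrespondents)
    bias = rate_respondents - true_rate
    return true_rate, bias

# Hypothetical: 65% of respondents work vs. 45% of nonrespondents,
# with a 75% response rate.
true_rate, bias = survey_estimate(0.65, 0.45, 0.75)
print(f"true rate {true_rate:.0%}, survey overstates by {bias:.1%}")
```

Even with three-quarters of the sample responding, a 20-point gap between the groups leaves the survey overstating employment by 5 percentage points.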
Table 12-8 shows employment and earnings information from survey data by response rate. Out of the nine surveys with high response rates, seven report information on hours worked, with five reporting the average number of hours worked by employed leavers. These five studies find that leavers work an average of 35 to 39 hours per week. Five studies report average hourly earnings: They range from $5.77 to $7.70. Among the studies with response rates of between 50 and 70 percent, four report average or median hours worked per week, and they show that employed leavers work between 34 and 37 hours per week. Among low-response-rate studies, three report average hours, and they, too, find an average of about 35 hours per week. The range of hourly wage rates reported in low- and moderate-response-rate studies runs from a low of $5.67 in Tennessee to a high of $8.74 in the District of Columbia.
Panel A: Response Rate Greater Than 70%

|State/Study|Hours Worked|Earnings|
|---|---|---|
|Arizona-1|#|Average wage: $7.52|
|Indiana|61% worked 35 or more hours a week|40.7% earned $7 or more an hour|
|Michigan|#|53.2% earned $400 or more a month|
|Mississippi|Average number of hours worked: 35|Average wage: $5.77|
|Missouri-2|Average number of hours worked: 39|##|
|North Carolina|37.9% worked 40 or more hours|Median monthly salary: $849.76|
|South Carolina-2|Average number of hours worked: 36|Average wage: $6.44|
|South Carolina-3|Average number of hours worked: 36|Average wage: $6.45|
|Washington-5|Average number of hours worked: 36|Average wage: $7.70|

Panel B: Response Rate Between 50% and 70%

|State/Study|Hours Worked|Earnings|
|---|---|---|
|District of Columbia|Average number of hours worked: 36|Average wage: $8.74|
|Illinois-2|Median number of hours worked: 37|Median wage: $7.42|
|Massachusetts|#|63.3% had income of $250 or more a week|
|Oklahoma|Average number of hours worked: 34|Average wage: $6.15|
|Tennessee|35% worked full time|Average wage: $5.67|
|Washington-3|Average number of hours worked: 36|Average wage: $8.09|
|Wisconsin-3|57% worked 40 or more hours a week|Average wage: $7.42|

Panel C: Response Rate Less Than 50%

|State/Study|Hours Worked|Earnings|
|---|---|---|
|Idaho-1|40% worked 30 or more hours a week|21% earned $7 or more an hour|
|Illinois-1|Average number of hours worked: 35.8|Median wage: $7.11|
|Kentucky|73.5% worked 35 or more hours|40.9% earned $7 an hour or more|
|Montana|47% worked 21 or more hours|##|
|New Mexico|74.6% worked 30 or more hours|29% earned $7 or more an hour|
|New York-1|40% worked 35 or more hours|##|
|Pennsylvania|62% worked 30 or more hours|59% earned $6.50 or more an hour|
|Texas|Average number of hours worked: 34|Average wage: $6.28|
|Virginia|#|Median monthly salary: $1,160|
|Washington-2|Average hours worked: 34|Average wage: $8.42|
|Wyoming|#|83% earned $7.50 or more an hour|

(a) Average weekly earning for full-time work is $305.

# Hours worked not reported.

## Earnings not reported.
Researchers use two relatively straightforward techniques to assess the extent of nonresponse bias in surveys of welfare leavers. The first involves using administrative data on the entire survey sample to compare respondents with nonrespondents. The second involves using the survey data to compare the characteristics of easily located and interviewed leavers with those of leavers who were "hard to find."(10)
First, consider how administrative data can help uncover potentially important nonresponse bias in survey data. Three studies, the District of Columbia (DC), Missouri, and South Carolina, have compared administrative information on survey respondents and nonrespondents to see if nonrespondents appear to be very different from respondents. Missouri (Dunton, 1999) finds that nonrespondents tend to have less education and lower quarterly earnings than respondents. South Carolina (Edelhoch and Martin, 1999) compares the reasons for TANF exit for survey respondents and nonrespondents and finds that respondents are significantly less likely to have their cases closed because of a sanction and significantly more likely to have their cases closed because of earned income. These comparisons suggest that findings from these studies may present too sunny a picture of the status of welfare leavers. On the other hand, DC's leaver study finds that nonrespondents are slightly younger, have younger children, and have had shorter spells of receipt than respondents. Overall, however, DC finds that respondents are fairly similar to nonrespondents.
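The respondent/nonrespondent check these studies run can be sketched as a simple two-proportion z-test on any administrative outcome, such as sanction-related case closure. The counts below are invented for illustration; they are not taken from the DC, Missouri, or South Carolina studies.

```python
import math

def two_prop_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-statistic for comparing respondents (group a)
    with nonrespondents (group b) on a binary administrative outcome."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 15% of 400 respondents vs. 30% of 300
# nonrespondents had sanction-related closures.
z = two_prop_z(60, 400, 90, 300)
print(f"z = {z:.2f}")  # a large |z| signals potential nonresponse bias
```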
Another technique to gauge the importance and potential biases of nonresponse involves examining differences among respondents, comparing survey responses from respondents who were easy to contact and quickly agreed to be surveyed with the responses of hard-to-contact and reluctant respondents.(11) This approach is based on the idea that "hard to interview" cases fall on a continuum between the "easy to interview" and nonrespondents. If the hard to interview are very different from the easy to interview in ways that matter to the study, it is likely that nonrespondents are even more different, and nonresponse bias is likely to be a substantial problem.
Only DC explicitly uses this technique. DC finds that hard-to-interview cases are neither clearly better nor worse off than the easy-to-interview cases; rather, their experiences are more diverse. For example, easy-to-interview cases are slightly more likely to work than hard-to-interview cases but among those who work, the hard-to-interview have higher hourly wages. In a supplementary study, Missouri (1999) compares employment and earnings among survey respondents in the Kansas City area based on the timing of response. Missouri finds that respondents among the final third of completed interviews are slightly less likely to work than respondents in the first two-thirds of completed interviews (88.5 versus 91.4 percent). The harder to interview also have lower monthly incomes ($935 versus $1,094).
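Missouri's timing comparison reduces to simple gaps between the early and late respondent groups; the figures below are the ones reported above.

```python
# Kansas City respondents, split by order of interview completion:
# first two-thirds ("early") vs. final third ("late").
early = {"employment_rate": 0.914, "monthly_income": 1094}
late = {"employment_rate": 0.885, "monthly_income": 935}

gap_emp = early["employment_rate"] - late["employment_rate"]
gap_inc = early["monthly_income"] - late["monthly_income"]
print(f"late respondents trail by {gap_emp:.1%} in employment "
      f"and ${gap_inc} in monthly income")
```

If nonrespondents sit further along the same continuum, the true leaver population is likely somewhat worse off than the survey respondents suggest.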
Although we have described several techniques researchers can use to assess the potential for nonresponse bias in leaver studies, the best way to guard against nonresponse bias is to have a high response rate. Even though these techniques cannot rule out the possibility of significant nonresponse bias, they do provide readers with a sense of the potential size and direction of the bias. Interestingly, however, we find that surveys with moderate response rates (50 to 70 percent) report findings that are fairly similar to those with higher response rates (more than 70 percent).