Families on TANF in Illinois: Employment Assets and Liabilities
Survey Methodology


In this section we present the methods that were used to design and conduct the client survey. We discuss the design of the survey sample, the survey instrument, data collection and processing, and the survey completion rates.

Sample Design

The sampling frame for this survey consisted of single-parent TANF cases in Illinois in November 2001. More specifically, the sampling frame consisted of TANF cases that, according to administrative records of the Illinois Department of Human Services (DHS):

  1. Were classified as "single-parent"
  2. Were authorized to receive a cash grant during the routine benefit issuance cycle for November 2001; however, 9 percent of the cases in the sampling frame received a $0 grant(1)
  3. Included the grantee as a case member, thus excluding "child-only" cases
  4. Had a TANF status of "active" or "suspended" when the sampling frame was identified, thus excluding cases whose status had changed to "cancelled" after benefits were issued but before the sampling frame was identified(2)

These criteria were satisfied by 33,495 TANF cases. They constituted the sampling frame for the survey of families on TANF in Illinois.
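
For illustration, the sketch below shows how the four frame criteria could be applied to administrative case records. It is written in Python; the record layout and field names are hypothetical and do not reproduce the structure of the actual DHS extract.

    # Illustrative sketch: applying the four frame criteria to a case record.
    # The field names below are hypothetical, not the actual DHS layout.

    def in_sampling_frame(case):
        """Return True if a case record meets all four frame criteria."""
        return (
            case["case_type"] == "single-parent"                  # criterion 1
            and case["grant_authorized_nov_2001"]                 # criterion 2 (a $0 grant still counts)
            and case["grantee_is_case_member"]                    # criterion 3 (excludes "child-only" cases)
            and case["tanf_status"] in ("active", "suspended")    # criterion 4
        )

    # Example: a mock record that satisfies all four criteria.
    example_case = {
        "case_type": "single-parent",
        "grant_authorized_nov_2001": True,
        "grantee_is_case_member": True,
        "tanf_status": "active",
    }
    print(in_sampling_frame(example_case))  # True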

We implemented a simple stratified sample design. There were just two strata, defined by whether a case was located in Cook County or "downstate" (all other counties). A key objective for this study by Illinois DHS was that the survey data support the description of the statewide TANF caseload, as opposed to supporting separate descriptions for Cook County and downstate. Consistent with that objective, the probability of selection of a case from the frame into the sample was designed to be uniform across the two strata. We selected 532 cases into the sample, of which 431 were located in Cook County and 101 were located downstate. We attempted to interview every case in the sample and succeeded in interviewing 416 (78 percent) of them.
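
A minimal sketch of the stratified selection follows, assuming proportional allocation so that the selection probability is uniform across the two strata. Only the total frame size (33,495) and the sample sizes reported above (431 Cook County, 101 downstate, 532 total) come from the text; the stratum-level frame counts used below are hypothetical, so the allocation the code produces differs slightly from the actual 431/101 split.

    import random

    # Sketch of a stratified sample with proportional allocation, which keeps
    # the selection probability (approximately) uniform across strata.

    def select_stratified_sample(frame_by_stratum, total_sample_size, seed=0):
        rng = random.Random(seed)
        frame_total = sum(len(cases) for cases in frame_by_stratum.values())
        sample = {}
        for stratum, cases in frame_by_stratum.items():
            n_h = round(total_sample_size * len(cases) / frame_total)  # proportional allocation
            sample[stratum] = rng.sample(cases, n_h)                   # simple random sample within stratum
        return sample

    # Hypothetical frame: stratum sizes are made up but sum to the reported 33,495.
    frame = {
        "Cook County": [f"cook-{i}" for i in range(27000)],
        "Downstate":   [f"down-{i}" for i in range(6495)],
    }
    sample = select_stratified_sample(frame, total_sample_size=532)
    print({stratum: len(cases) for stratum, cases in sample.items()})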

Survey Instrument and Pretest

We developed the survey instrument in consultation with the Office of the Assistant Secretary for Planning and Evaluation (ASPE) in the U.S. Department of Health and Human Services under a separate task-order agreement. The instrument was designed for either paper-and-pencil administration or computer-assisted telephone interviewing (CATI) and was intended to take 35 minutes. Several questions were taken from the Michigan Women's Employment Survey (WES, Wave 2) and MPR's Nebraska Client Barriers Survey. Specific scales covering learning disabilities, mental health and depression, alcohol and drug abuse, and domestic violence were drawn from Washington State's learning disabilities screener, the Composite International Diagnostic Interview (CIDI), and the Conflict Tactics Scale. In addition, we developed a series of questions to assess job readiness skills in collaboration with researchers at the University of Colorado.

We drafted the survey instrument between August and October of 2001 and pretested it in early November. The pretest interviews were conducted with the heads of ten TANF cases in Illinois who had received a cash benefit in October. Those interviews averaged 40 minutes in length. The goals of the pretest were to: (1) identify ways to improve the administration procedures, (2) measure the length of the survey, (3) test the flow and sequencing of questions, (4) clarify question wording for the interviewees, and (5) clarify instructions for the interviewers. Based on interviewer debriefings and the monitoring of pretest interviews by supervisory staff, we made minor modifications to the newly developed job readiness questions.

Data Collection

Survey data collection began on November 19, 2001, and continued through March 3, 2002--a field period of 16 weeks. At the outset, our target interview completion rate was 75 percent; we exceeded that goal by three percentage points. The interviews averaged 42 minutes in length. All interviews were conducted by telephone using a hard-copy survey instrument; no in-person follow-up was employed on this study.

Immediately before interviewing began, the survey director led an eight-hour interviewer training session spread over two days, November 14 and 15, 2001. In attendance were the survey director's assistant, the telephone supervisor, the locating supervisor, the telephone interviewers, and the quality control monitors.

We contacted sample members by mail and by telephone to invite them to participate in the survey. DHS administrative records were the source of the initial addresses and telephone numbers. We mailed advance letters to all sample members prior to the first telephone contact. The letters introduced the study, identified the study sponsor and MPR, and invited sample members to call our toll-free telephone number and participate in the survey at their earliest convenience. The letter explained that participation was voluntary and that the identities and responses of all participants would be kept confidential. It offered sample members $35 if they called and completed the survey within two weeks of receiving the letter, and $20 if they completed it later.
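
The two-tier incentive described above can be summarized in a small sketch. The rule below covers only the initial $35/$20 offer, and the exact way the two-week window was measured is an assumption.

    from datetime import date, timedelta

    # Sketch of the initial incentive rule: $35 for completing within two weeks
    # of receiving the advance letter, $20 afterward. How the window was
    # actually measured is an assumption.

    def incentive_amount(completion_date, letter_received_date):
        if completion_date <= letter_received_date + timedelta(weeks=2):
            return 35
        return 20

    print(incentive_amount(date(2001, 11, 26), date(2001, 11, 19)))  # 35
    print(incentive_amount(date(2002, 1, 15), date(2001, 11, 19)))   # 20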

Our next step was to call the sample members. We timed the first telephone calls to begin after sample members had received the advance letter. These calls resulted in a number of completed interviews and also helped us identify sample members who had no phone number, or whose number in DHS records was incorrect, and who would therefore require additional searching. The advance letters themselves also identified cases that required searching: some letters were returned to us because the addresses we had obtained from DHS records were out of date. Letters returned with forwarding addresses marked on the envelopes were remailed to the new addresses; letters returned without a forwarding address prompted additional searching.

Our principal searching effort consisted of running identifying information for sample members (name, date of birth, last known address and phone number) through a database owned by Lexis-Nexis, a commercial provider of personal-records searches. That search generated some new addresses and phone numbers, to which we then sent letters or placed calls. We also obtained updated contact information for some sample members through a search of DHS records that we conducted approximately halfway through the field period.

Throughout the 16-week field period we continued mailing letters and postcards to sample members with whom we had not completed interviews. Every few weeks we varied the format and content of the letters and postcards, as well as the size and appearance of the envelopes and the method of mailing (first-class versus priority mail), to spark sample members' interest in reading the items we sent. The most salient information, however, remained the same in each version of the letter or postcard.

A small number of sample members initially refused to participate in the survey. For these cases, we waited approximately one month from the telephone contact in which the refusal occurred and then mailed them a specially crafted letter. The letter reiterated the importance of the study and of their participation. It again invited them to call our toll-free telephone number to participate and reminded them that we would pay them $35 if they completed the interview. We waited until we were confident that a sample member had received the letter, and then a specially trained "refusal-conversion" interviewer called to attempt to gain his or her cooperation. If the result of these steps was a second refusal, we ceased attempts to contact the sample member until the end of the field period. At that time, we sent out a final mailing to all sample members who had not completed an interview to alert them that the study was ending and to offer an increased incentive of $50 for participating.

Data Preparation

As interviews were completed, they were reviewed for completeness, consistency, and accuracy. Based on guidelines developed by MPR, interviewers called respondents back to obtain missing information or to clarify contradictory answers. Reviewers back-coded "other-specify" responses to prelisted choices where appropriate, or assigned new codes if responses were common enough to warrant the additions. They also assigned numeric codes to open-ended questions and to industry and occupation responses using standard coding manuals.(3)
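
The back-coding step can be sketched as follows; the prelisted codes, keyword matching, and the threshold for adding a new code are illustrative, not MPR's actual coding guidelines.

    from collections import Counter

    # Sketch of back-coding "other-specify" answers to prelisted response codes,
    # and flagging responses common enough to warrant a new code.

    PRELISTED = {"bus": 1, "car": 2, "walk": 3}   # hypothetical prelisted choices
    NEW_CODE_THRESHOLD = 10                       # hypothetical cutoff for adding a new code

    def back_code(other_specify_responses):
        coded, unmatched = {}, Counter()
        for resp_id, text in other_specify_responses.items():
            code = next((c for kw, c in PRELISTED.items() if kw in text.lower()), None)
            if code is not None:
                coded[resp_id] = code             # back-code to an existing choice
            else:
                unmatched[text.lower()] += 1      # candidate for a new code
        new_codes = {text: 90 + i                 # assign new codes in an unused range
                     for i, (text, n) in enumerate(unmatched.items())
                     if n >= NEW_CODE_THRESHOLD}
        return coded, new_codes

    coded, new_codes = back_code({101: "took the bus", 102: "rode a bicycle"})
    print(coded, new_codes)  # {101: 1} {}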

After the completed interviews had been reviewed and coded, they were sent through the data entry process. A customized data entry program restricted entries to values that were consistent with the skip patterns in the survey instrument and that fell within allowable ranges. The data were keyed twice, by different staff, to verify that they had been entered correctly. After data entry was verified, frequencies for all data elements were produced and reviewed for inconsistencies and out-of-range values. Questionable data were reconciled based on a review of the source data and, in some cases, on callbacks to sample members. Following this process, a final data file was produced and turned over to MPR's Research Division for further processing and analysis.
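
The sketch below illustrates the two checks described in this paragraph: comparing two independent keyings of the same interview and flagging out-of-range values. The variable names and allowable ranges are hypothetical.

    # Sketch of double-entry verification and range checking.

    ALLOWED_RANGES = {"age": range(18, 100), "num_children": range(0, 15)}  # hypothetical

    def keying_discrepancies(entry_1, entry_2):
        """Return fields where the two independent keyings disagree."""
        return {field: (entry_1.get(field), entry_2.get(field))
                for field in set(entry_1) | set(entry_2)
                if entry_1.get(field) != entry_2.get(field)}

    def out_of_range(record):
        """Return fields whose values fall outside the allowable range."""
        return {field: value for field, value in record.items()
                if field in ALLOWED_RANGES and value not in ALLOWED_RANGES[field]}

    first_pass  = {"age": 29, "num_children": 2}
    second_pass = {"age": 29, "num_children": 3}
    print(keying_discrepancies(first_pass, second_pass))  # {'num_children': (2, 3)}
    print(out_of_range({"age": 17, "num_children": 2}))   # {'age': 17}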

Sample Disposition and Survey Response Rate

We completed interviews with the TANF grantees in 416 of the 532 sampled cases, for an overall survey response rate of 78 percent. Of the completed interviews, 335 were with TANF clients from Cook County and 81 were with clients from downstate. Only two percent of the sample members refused to participate in the survey, and only one sample member failed to complete an interview after starting it. Table A.1 shows the final survey disposition of all cases in the sample for Cook County, downstate, and the two combined.

Table A.1
Final Disposition of Sample Cases

                            Cook County          Downstate             Total
Final Status              Number   Col. %     Number   Col. %     Number   Col. %
Complete                     335    77.7%         81    80.2%        416    78.2%
Refusal                        8     1.9%          3     3.0%         11     2.1%
Break-Off                      1     0.2%          0     0.0%          1     0.2%
Deceased                       1     0.2%          0     0.0%          1     0.2%
Language Barrier               0     0.0%          1     1.0%          1     0.2%
Located, Effort Ended         40     9.3%          6     5.9%         46     8.6%
Unlocatable                   46    10.7%         10     9.9%         56    10.5%
Total                        431   100.0%        101   100.0%        532   100.0%
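
As a quick arithmetic check, the short sketch below recomputes the column percentages in Table A.1 and the overall response rate from the counts shown.

    # Recompute Table A.1 column percentages and the overall response rate.

    dispositions = {  # (Cook County, Downstate) counts from Table A.1
        "Complete":              (335, 81),
        "Refusal":               (8, 3),
        "Break-Off":             (1, 0),
        "Deceased":              (1, 0),
        "Language Barrier":      (0, 1),
        "Located, Effort Ended": (40, 6),
        "Unlocatable":           (46, 10),
    }
    cook_total = sum(c for c, _ in dispositions.values())   # 431
    down_total = sum(d for _, d in dispositions.values())   # 101
    total = cook_total + down_total                          # 532

    for status, (cook, down) in dispositions.items():
        print(f"{status:22s} {cook / cook_total:6.1%} {down / down_total:6.1%} "
              f"{(cook + down) / total:6.1%}")

    print(f"Overall response rate: {(335 + 81) / total:.1%}")  # 78.2%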

Two factors accounted for almost 90 percent of nonresponse to this survey. Nearly half of the nonresponse occurred because sample members could not be located (10.5 percent of all sample members). These were sample members whose addresses and phone numbers, as provided by Illinois DHS, were incorrect and whom we were unable to locate by other means, such as searching databases for updated contact information and using the forwarding addresses provided by the U.S. Postal Service on letters returned to us. Another 40 percent of nonresponse (8.6 percent of all sample members) occurred because the sample members were never available to participate in the survey. We believe that our contact information for these sample members was good, but they did not call us in response to our letters, did not answer the telephone when we called, and were not available to take our calls when another household member answered.
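
The shares quoted in this paragraph follow from the counts in Table A.1 (116 nonrespondents among the 532 sampled cases), as the short sketch below shows.

    # Decompose nonresponse using the counts from Table A.1.

    sampled, completed = 532, 416
    nonrespondents = sampled - completed          # 116
    unlocatable, never_available = 56, 46

    print(f"Unlocatable:     {unlocatable / nonrespondents:.0%} of nonresponse, "
          f"{unlocatable / sampled:.1%} of all sample members")       # 48%, 10.5%
    print(f"Never available: {never_available / nonrespondents:.0%} of nonresponse, "
          f"{never_available / sampled:.1%} of all sample members")   # 40%, 8.6%
    print(f"Combined: {(unlocatable + never_available) / nonrespondents:.0%} of nonresponse")  # 88%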

A language barrier--lack of proficiency with English or Spanish--resulted in only one case of nonresponse to the survey (0.2 percent of all sample members). There were very few Spanish-only members of the survey sample and we chose to interview them in their native language so they could be included in the study. We conducted three interviews in Spanish, all with the same interviewer, who is a native Spanish speaker. That interviewer participated in the initial interviewer training session and completed roughly 40 interviews in English prior to conducting interviews in Spanish. She translated the instrument on her own and used the same translation for all three interviews. A native Spanish speaker monitored these interviews.
