National Evaluation of Welfare-to-Work Strategies: 2-Year Client Survey Files: Background Information on Selection of the 2-Year Client Survey Sample

09/10/2001

Background Information on Selection of the 2-Year Client Survey Sample


Data on participation, degree receipt, job quality, income,
transitional benefits, health care coverage, child care, child outcomes,
and several other measures used in this report come from the Two-Year
Client Survey.  The survey was administered to a subsample of the full
research sample approximately two years after random assignment.

Key analysis samples

The survey eligible sample ("eligibles"). Sample members in the full
research sample who were randomly assigned during months in which the
survey sample was selected and who met the criteria for inclusion
(described in Section I below).

The fielded sample ("fieldeds"). Members of the eligible sample who
were chosen to be interviewed.

The respondent sample ("respondents"). Members of the fielded sample
who were interviewed.

The non-respondent sample ("non-respondents"). Members of the fielded
sample who were not interviewed, either because they could not be
located or because they declined to be interviewed.
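
The nesting of these samples can be checked mechanically. The short
Python sketch below uses hypothetical member IDs (the counts are
illustrative, not the evaluation's) to show that respondents and
non-respondents partition the fielded sample, which in turn is a
subset of the eligibles.

    # Hypothetical sample-member IDs; the actual samples are defined above.
    eligibles = set(range(1, 101))            # survey eligible sample
    fieldeds = set(range(1, 61))              # subset chosen to be interviewed
    respondents = set(range(1, 49))           # fieldeds who were interviewed
    non_respondents = fieldeds - respondents  # fieldeds not interviewed

    assert fieldeds <= eligibles                      # samples are nested
    assert respondents | non_respondents == fieldeds  # an exact partition
    assert respondents.isdisjoint(non_respondents)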



I.  Survey Selection and Sampling Ratios

Several of the chapters in this report analyze program impacts
calculated from survey responses as well as impacts calculated from
administrative records for the full sample. It is important to
understand the process by which the survey samples were chosen and
survey responses collected in order to assess the comparability of
these results.

Selecting the Eligible Sample

In all sites, the survey eligible sample includes members of the full
research sample who were randomly assigned during some, but not all,
months of sample intake (see SAMTBL1.TXT). Limiting the eligible
sample in this way can introduce "cohort effects": impact estimates
that are especially large or small for sample members randomly assigned
during particular months. A cohort effect may occur because members of
the survey eligible sample differ in measured or unmeasured background
characteristics from persons randomly assigned in other months.
Changes in area labor markets or in program implementation that occur
after the start-up of random assignment may also introduce cohort
effects, for example by increasing or decreasing a program's success
in moving recipients from welfare to work. These issues are most
germane to Columbus, Detroit, Portland, and Oklahoma City, where
selection of the survey eligible samples took place over fewer months
than in Atlanta, Grand Rapids, and Riverside.

Further, the research strategy for choosing the survey eligible samples
in Atlanta, Grand Rapids, and Riverside required exclusion of sample
members with certain background characteristics:  teen parents; parents
with children less than 3 years old; men with children ages 3 to 5;
people who did not speak either English or Spanish; and people who did
not provide information on their educational status and children's ages
prior to random assignment. This selection strategy may limit the
generalizability of impact results estimated from the survey.

Fortunately, cohort effects were small. For instance, two-year earnings
gains for the full research samples and the survey eligible samples
differed by less than $100 in 9 of the 11 programs and by less than
$200 in every program (results not shown).


Selecting the Fielded Sample

The percentage of survey eligibles chosen for the fielded sample is
called the sampling ratio. Across all sites, sampling ratios ranged
from 15 percent to 100 percent.

In four sites (Atlanta, Grand Rapids, Portland, and Riverside), the
fielded sample was selected by drawing a stratified random subsample
of the survey eligible sample. In Atlanta, Grand Rapids, and Riverside,
the sampling ratio varied (for research purposes) by research group,
date of random assignment, age of youngest child, and pre-random
assignment educational attainment of the sample member. In Portland,
sampling ratios varied only by research group and date of random
assignment. Although differences in sampling ratios are corrected for
by weighting, as discussed below, they may still affect survey impact
estimates. For instance, unless the total sample size is large,
different sampling ratios increase the likelihood that persons chosen
in one research group differ (perhaps in unmeasured characteristics)
from persons chosen in another research group.

In two other sites (Detroit and Oklahoma City), the fielded sample for
program and control group members was selected by drawing a simple random
sample from the eligible sample. That is, within these sites, a single
sampling ratio was applied to all program and control group members,
irrespective of their background characteristics. This sampling strategy
was used in Columbus as well, except that the sampling ratio for control
group members was slightly higher than for members of the Integrated and
Traditional groups.
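
To make the two selection schemes concrete, the following Python sketch
draws a fielded sample both ways. The strata labels, sampling ratios,
and sample sizes are hypothetical; the actual strata and ratios varied
by site as described above.

    import random

    random.seed(1)  # fixed seed so the draw is reproducible

    # Hypothetical eligible sample; the real strata were defined by research
    # group, random assignment date, age of youngest child, and education.
    eligibles = [{"id": i, "stratum": random.choice("ABC")} for i in range(1000)]

    # Stratified selection (Atlanta, Grand Rapids, Portland, Riverside):
    # each stratum has its own sampling ratio.
    ratios = {"A": 0.25, "B": 0.50, "C": 1.00}  # hypothetical ratios
    fielded_stratified = []
    for stratum, ratio in ratios.items():
        members = [e for e in eligibles if e["stratum"] == stratum]
        fielded_stratified += random.sample(members, round(ratio * len(members)))

    # Simple random selection (Detroit, Oklahoma City): one ratio applied
    # to every eligible, regardless of background characteristics.
    fielded_simple = random.sample(eligibles, round(0.40 * len(eligibles)))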


II.  Weighting

To estimate impacts, weights were applied to the survey respondent sample to
correct for differences in sampling ratios between the strata in Atlanta,
Grand Rapids, Portland, and Riverside. In the unweighted fielded survey
sample in these sites, strata (i.e., sample members who share background
characteristics and have the same sampling ratio) with high sampling
ratios are over-represented and strata with low sampling ratios are
under-represented. To make the fielded sample more closely replicate
the background characteristics of survey eligibles, weights for each
stratum were set to equal the inverse of the sampling ratio for that
stratum. For example, a stratum in which 1 in 4 eligible persons were
chosen would receive a weight of 4 (or 4/1), whereas a stratum in which
every eligible person was chosen would receive a weight of 1 (or 1/1).
THE SAME WEIGHTS ARE USED FOR THE RESPONDENT SAMPLE. Weighting was not
required for sample members in Columbus, Detroit, and Oklahoma City,
because sample members' background characteristics did not affect their
chances of selection.


The variable FIELDWGT stores the weight value for each respondent.
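
As a minimal sketch of how such a weight can be constructed and applied,
the Python fragment below computes an inverse-sampling-ratio weight and
a weighted mean outcome. The strata, ratios, and earnings values are
hypothetical; only the weighting rule itself comes from the text above.

    # Weight = inverse of the stratum's sampling ratio, as described above.
    sampling_ratios = {"A": 0.25, "B": 0.50, "C": 1.00}  # hypothetical strata

    def fieldwgt(stratum):
        """Inverse-sampling-ratio weight for a respondent's stratum."""
        return 1.0 / sampling_ratios[stratum]

    # Hypothetical respondents, each with a two-year earnings outcome.
    respondents = [("A", 5200.0), ("B", 4100.0), ("C", 3800.0)]

    # The weighted mean restores each stratum to its share among eligibles.
    total = sum(fieldwgt(s) * y for s, y in respondents)
    weight_sum = sum(fieldwgt(s) for s, _ in respondents)
    print(f"weighted mean earnings: {total / weight_sum:.2f}")  # 4685.71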

It should be noted that under some conditions impacts for a weighted
respondent sample may still differ from those for the eligible sample.
For example, this result could occur if very different proportions of
program and control group fieldeds answered the survey, or if members
of a subgroup within one research group were more likely to be
interviewed than their counterparts in a different research group.
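
One simple diagnostic for this possibility is to compare response rates
across research groups. A sketch, with hypothetical counts:

    # Hypothetical fielded and respondent counts by research group.
    counts = {"program": (500, 440), "control": (500, 350)}

    for group, (fielded, responded) in counts.items():
        rate = 100.0 * responded / fielded
        print(f"{group}: {rate:.1f}% response rate")

    # A large gap (here 88.0% vs. 70.0%) would warn that impacts for the
    # weighted respondent sample may not match those for the eligible sample.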


III.  Response Rates

As noted above, sample members who were fielded and interviewed are
survey respondents; those who were fielded but not interviewed are
non-respondents. The table below shows the response rate (the
percentage of the fielded sample who responded to the survey) by
program and research group. As shown, in most programs, response rates
are high enough to suggest that the respondent sample probably
represents the eligible sample.

The goal of the survey effort was to obtain responses from at least 70
percent of the fielded sample. The 70 percent goal was achieved for all
research groups in all sites; in fact, response rates reached 80 percent
or above for most research groups. These results inspire particular
confidence in the impacts for respondents.


           National Evaluation of Welfare-to-Work Strategies

            Number of Fielded Survey Sample Members and
                  Two-Year Client Survey Response Rates

                                              Number of
                                              Fielded     Response
Site and Program                              Members     Rate (%)

Atlanta Labor Force Attachment                 908         88.5
Atlanta Human Capital Development             1225         90.9
Atlanta Control                               1200         90.5

Grand Rapids Labor Force Attachment            637         90.1
Grand Rapids Human Capital Development         647         88.7
Grand Rapids Control                           631         92.6

Riverside Labor Force Attachment               740         76.2
Riverside Human Capital Development            819         75.8
Riverside Control                             1396         79.8

Columbus Integrated                            455         81.5
Columbus Traditional                           459         79.7
Columbus Control                               460         77.6

Detroit Program                                261         80.5
Detroit Control                                259         83.4

Oklahoma City Program                          356         72.8
Oklahoma City Control                          360         70.0

Portland Program                               385         77.1
Portland Control                               377         83.0