Evaluation Design Options for the Long-Term Care Registered Apprenticeship Program

5.4. Data Sources

An evaluation of this type could draw on several qualitative and quantitative methods and sources of data. These include:

  • Surveys, which would gather systematic, quantitative information on large numbers of people or programs through mail questionnaires or telephone, web, or in-person interviews. At the extremes, mail and web surveys are the lowest cost but also have the lowest, and often biased, response rates, while in-person surveys are the most expensive approach. Web and mail surveys (with follow-up) might work effectively for surveys of employers. Telephone surveys typically fall in between in terms of cost, but the growth in cell phone use and the decline in landlines often make them problematic for reaching younger and lower-income populations. All of these surveys require names and contact information, such as mailing addresses, telephone numbers, or e-mail addresses, for relatively large numbers of people, which is often difficult to obtain, especially for comparison groups. Experience with the National Nursing Assistant Survey and the National Home Health Aide Survey suggests that approximately 76% of nursing homes (71% of home health agencies) would respond and that, within responding nursing homes, 70% of apprentices and other workers (79% of home health apprentices) would respond, for an overall two-stage response rate of approximately 53% (Squillace, Remsberg, and Bercovitz, 2006; National Center for Health Statistics (NCHS), undated); the two-stage calculation is shown after this list.

  • Administrative datasets, such as Unemployment Insurance (UI) quarterly earnings records and CMS quality of care data, which would provide useful information on factors such as wages, job tenure, job history, and facility/agency performance without the need to survey respondents. Because administrative datasets are collected for purposes other than research, they have limited variables on the characteristics of workers and employers. DOL has the Social Security numbers of people who have participated in LTC RAPs, which would allow apprentices to be identified in these databases (an illustrative linkage sketch follows this list). Privacy concerns may limit what information can be released, since the number of apprentices and employers is relatively small and individual records could potentially be identified in the data. At the employer level, as noted above, CMS has a large amount of quality of care data on individual nursing homes and home health agencies that is publicly available in downloadable datasets. Similar information is not available for programs serving people with developmental disabilities or for residential care facilities.

  • Focus groups, which are structured conversations with small groups of respondents (e.g., apprentices or employers) about issues of interest for the evaluation. This approach provides detailed information on the perspectives of a relatively small number of people, but it does not yield quantitative data and cannot be used to determine the effectiveness of LTC RAPs.

  • Case studies, which would include structured discussions with multiple stakeholders in an LTC RAP. For a case study of an LTC RAP, the evaluators would interview the agency or facility administrator, the LTC RAP director or liaison, mentors, instructors, apprentices, supervisors, and state officials. If implemented, these case studies would build on the site visits conducted for this contract (Kuehn et al., 2011). However, because the RTI International/Urban Institute team conducted detailed case studies in 2011 with almost all of the larger programs, additional case studies may not substantially add to the information already available.
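For reference, the overall survey response rate cited above is simply the product of the two stage-specific rates, assuming facility-level and worker-level response are independent; applying the same calculation to the home health figures yields roughly 56%:

\[
r_{\text{overall}} = r_{\text{facility}} \times r_{\text{worker}} =
\begin{cases}
0.76 \times 0.70 \approx 0.53 & \text{(nursing homes)} \\
0.71 \times 0.79 \approx 0.56 & \text{(home health)}
\end{cases}
\]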
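To make the administrative-data option concrete, the following is a minimal sketch, in Python with pandas, of how an apprentice roster might be merged with UI quarterly earnings records to summarize post-entry employment and earnings. All identifiers, file layouts, column names, and values are hypothetical; in practice, the linkage would use the Social Security numbers held by DOL under appropriate data-sharing and privacy agreements.

    # Illustrative sketch only: all names and values below are hypothetical.
    # Links an apprentice roster to quarterly UI wage records and summarizes
    # post-entry employment and earnings.
    import pandas as pd

    # Hypothetical apprentice roster (de-identified IDs stand in for the
    # Social Security numbers a real linkage would use).
    apprentices = pd.DataFrame({
        "person_id": [1, 2, 3],
        "rap_start": pd.to_datetime(["2009-01-15", "2009-06-01", "2010-03-10"]),
    })

    # Hypothetical UI quarterly earnings records: one row per person-quarter.
    ui_wages = pd.DataFrame({
        "person_id": [1, 1, 1, 2, 2, 3],
        "quarter": pd.PeriodIndex(
            ["2009Q1", "2009Q2", "2009Q3", "2009Q2", "2009Q3", "2010Q1"],
            freq="Q",
        ),
        "earnings": [4200, 4400, 4650, 3900, 4100, 4300],
    })

    # Link the roster to the wage records; an inner join keeps only
    # apprentices who appear in the UI data.
    linked = apprentices.merge(ui_wages, on="person_id", how="inner")

    # Keep quarters that begin on or after each apprentice's program entry.
    post = linked[linked["quarter"].dt.start_time >= linked["rap_start"]]

    # Crude outcome summaries: the number of quarters with covered earnings
    # (a rough tenure proxy) and mean quarterly earnings after entry.
    summary = post.groupby("person_id").agg(
        quarters_employed=("quarter", "nunique"),
        mean_quarterly_earnings=("earnings", "mean"),
    )
    print(summary)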
