
Evaluation Design Options for the Long-Term Care Registered Apprenticeship Program


Executive Summary

Joshua M. Wiener, Ph.D., and Wayne L. Anderson, Ph.D.
RTI International

Daniel Kuehn, M.P.P., and Robert Lerman, Ph.D.
Urban Institute

September 2011



The United States faces a critical need for high-quality long-term care workers. The demand for long-term care services is projected to roughly double between 2000 and 2030 as the population ages (Johnson, Toomey, and Wiener, 2007). The U.S. Department of Labor (DOL) projects that home health aides and home care personal care assistants will be among the fastest growing occupations between 2008 and 2018 (DOL, 2011).

Apprenticeship is a well-established strategy for training workers that combines classroom and experiential learning and places workers into careers that offer the opportunity for advancement. Best known for training occupations such as plumbers and electricians, the apprenticeship model is now being applied to long-term care occupations. By improving the skills of direct care workers, apprenticeship may justify higher wages through greater worker productivity. By restructuring employment in the long-term care industry, apprenticeship may provide a path for career advancement. This report assesses possible research designs to evaluate Long-Term Care Registered Apprenticeship Programs (LTC RAP).

Background

LTC RAPs, registered by the DOL Office of Apprenticeship and developed by employers, employer associations, and labor-management organizations, provide formal training and work experience for direct care workers in long-term care settings. Since the program’s inception in 2003, 119 long-term care employers have offered LTC RAP employment and training to 4,376 long-term care workers (RTI International/Urban Institute analysis of program data, May 2011).

Registered apprenticeship programs are funded primarily by employers, with some start-up assistance from government (including DOL) or foundation grants. The hours of training required for LTC RAPs exceed what is normally provided by several orders of magnitude. LTC RAPs include four main components. First, on-the-job training (OJT) occurs at a worker’s place of employment. Second, related instruction may take place onsite or at technical or community colleges, through various modes of instruction (e.g., in-person, web-based, or correspondence courses). Third, mentoring is a feature of many apprenticeships, sometimes provided by workers who have completed apprenticeships themselves. Mentors provide on-the-job coaching and help apprentices identify and acquire the competencies needed to perform their jobs successfully. Fourth, a clear wage and career progression is a key component of apprenticeship programs. Wage progressions are often tied to the completion of specific occupational competencies, demonstrated through related instruction, OJT, or both. This advancement opportunity gives apprentices an incentive to acquire the skills employers demand.

Research Questions

In broad terms, research questions about the LTC RAP can be divided into two groups:

  • How does the LTC RAP affect apprentices in terms of earnings, job tenure, job satisfaction, and increased competency?

  • How does the LTC RAP affect employer sponsors in terms of job turnover, job tenure, improved quality of care, and increased revenue?

In both cases, the comparison is to similar workers and providers who do not participate in or operate LTC RAPs.

Implications of Characteristics of LTC RAPs Relevant to Evaluation Designs

In the evaluation of any program, the particular characteristics of the intervention make it easier or more difficult to design an evaluation. Some of the characteristics of the LTC RAP that affect possible research designs include:

  • Decentralization of design responsibility to individual employers. Although employers have great flexibility in how they design and administer their LTC RAPs, there appears to be enough uniformity in goals and programs to be able to talk meaningfully about a single LTC RAP program.

  • Size of the LTC RAP program. As of May 2011, there were 119 LTC RAPs and 954 active apprentices; 1,347 people had completed an apprenticeship, and 4,376 apprentices had ever participated in the program, whether or not they completed it. Most programs are small, with just a handful of apprentices. Detecting statistically significant effects requires a large sample of apprentices, probably the entire program rather than a sample (see the sample-size sketch following this list).

  • Availability of data. Based on our site visits, it appears that few programs collect much systematic data on outcomes. Thus, almost all of the data will need to be collected by an evaluation contractor or from administrative databases collected for other purposes.

  • Selection bias and the problem of comparison groups. A key characteristic of LTC RAPs is that, for most programs, only a small percentage of direct care workers within an employer sponsor are selected to participate. These workers are typically selected because they are the best and most promising workers. Thus, they are likely to differ in important ways from other workers of the same age, gender, education, and years of work experience, making the development of comparison groups more difficult.

  • Sponsors’ use of apprentices to improve non-apprentice staff performance and the problem of comparison groups. One possible strategy to develop a comparison group is to select people working for the same employer who are not apprentices. However, employers visited during our case studies almost always assigned apprentices to act as peer-mentors for other workers. While this is a strength of the program, it means that non-apprentices are not free of the potential impact of the apprenticeship program and are, therefore, problematic as a comparison group.
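To make the sample-size concern concrete, the minimal power calculation below shows roughly how many workers per group would be needed to detect a small standardized effect on earnings. The effect size, significance level, and power target are illustrative assumptions, not figures from this report.

```python
# Illustrative power calculation for the sample-size concern above.
# Effect size, alpha, and power are assumed values, not report figures.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Suppose the LTC RAP raises earnings by a "small" standardized effect
# (Cohen's d = 0.2); how many workers per group are needed to detect it
# at the 5% significance level with 80% power?
n_per_group = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)
print(f"Workers needed per group: {n_per_group:.0f}")  # roughly 394

# With most programs enrolling only a handful of apprentices, a sample
# this large is feasible only by pooling across the entire LTC RAP program.
```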

Assessment of a Broad Range of Evaluation Options

There are many possible research designs for an evaluation of the LTC RAP, with varying costs and degrees of scientific rigor. Most evaluations of job training programs focus solely on the program participants, mainly their gains in employment and earnings. However, since many of the policy motivations for LTC RAPs have to do with improving the performance and quality of care of long-term care providers, the evaluation should address both apprentices and their employers.

EXHIBIT ES-1. Overview of Methods, Data Collection, and Potential Feasibility
Analysis methods (ordered from lower to higher cost and generalizability):
  • Qualitative analysis
  • Quantitative descriptive analysis (single point in time and no comparison group)
  • Quantitative multivariate analysis (two points in time, a comparison group, or both)

Data collection (ordered from lower to higher cost and generalizability):
  • Focus groups of workers
  • Case studies of employers
  • In-depth ethnographic studies and implementation evaluation
  • Survey of LTC RAP workers or employers across occupation/organization types
  • LTC RAP administrative data
  • Linking Medicare/Medicaid claims/OSCAR data
  • Survey only
  • Survey combined with existing secondary data (National Nursing Assistant Survey; National Home and Hospice Care Survey)

Feasibility: lower cost and lower generalizability ––––––––––> higher cost and higher generalizability

The overarching approach for most designs is to compare the apprentice and employer performance to what it would have been in the absence of the LTC RAP. Exhibit ES-1 provides a broad overview of the range of methods, types of data collection, and their relationship to costs and ability to generalize the findings to the total population of LTC RAPs.

Detailed Description of Four Approaches to Evaluating the LTC RAP

After considering a large number of possible options, the RTI/Urban team devised a four-component approach to evaluating the LTC RAP. The four components are: (1) use of the Longitudinal Employer-Household Dynamics (LEHD) administrative dataset to compare workers who have participated in the LTC RAP with workers who have not; (2) a cross-sectional telephone survey of workers who have ever participated in the LTC RAP and workers who never have; (3) focus groups with apprentices and focus groups of employers, without a comparison group; and (4) a cost-benefit analysis of the LTC RAP from the employer’s perspective. With the exception of the cost-benefit analysis, which depends in part on the analyses of the administrative dataset and the telephone survey, each component is separate but complementary and could be funded without the others. Thus, government decision makers can mix and match the approaches as they see fit, funding any one component, any combination, or all four. Exhibit ES-2 summarizes the four components and their advantages and disadvantages. The estimated cost for all four components is $985,000 in 2011 dollars.

The first design would use the LEHD administrative database to assess the effect of LTC RAPs on increased apprentice earnings and job tenure, and on the worker turnover rate at the employer level. The LEHD is a Census Bureau database that includes state-level Unemployment Insurance administrative information on employment and earnings merged with certain other Census data. The biggest challenge for this design is using the limited variables available in the Unemployment Insurance data to construct a truly comparable comparison group.

In other studies of job training programs, prior earnings are used to proxy many personal characteristics, but wages (although not hours) are highly constrained in long-term care. In addition, the dataset can identify low-wage workers in other long-term care organizations, but cannot separate direct care workers from other low-wage workers in long-term care organizations (e.g., housekeeping and dietary workers in nursing homes and assisted living facilities). Still, this option provides potentially the most viable design to credibly address the most important research questions facing the industry.
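As one illustration of the comparison-group challenge, the sketch below matches apprentices to non-apprentice low-wage workers on the few variables the Unemployment Insurance data would offer, here prior earnings and job tenure. The column names and the propensity-score nearest-neighbor approach are illustrative assumptions, not the report’s specified design, and any unmeasured differences (such as being "the best and most promising workers") would remain uncontrolled.

```python
# A minimal matching sketch under assumed column names; propensity-score
# nearest-neighbor matching is one plausible method, not the report's
# specification.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_comparison_group(workers: pd.DataFrame) -> pd.DataFrame:
    """Return a matched comparison row for each apprentice.

    Expects columns: 'apprentice' (0/1), 'prior_earnings', 'tenure_qtrs'.
    """
    X = workers[["prior_earnings", "tenure_qtrs"]]
    # Estimate each worker's propensity to be an apprentice from the
    # limited administrative variables.
    pscores = LogisticRegression().fit(X, workers["apprentice"]).predict_proba(X)[:, 1]
    workers = workers.assign(pscore=pscores)

    treated = workers[workers["apprentice"] == 1]
    pool = workers[workers["apprentice"] == 0]

    # Match each apprentice to the nearest non-apprentice on the
    # propensity score (with replacement, for simplicity).
    nn = NearestNeighbors(n_neighbors=1).fit(pool[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    return pool.iloc[idx.ravel()]
```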

EXHIBIT ES-2. Overview of Main Evaluation Design Options
Option 1. Analysis of LEHD, comparing all apprentices with a matched sample comparison group ($285,000; 27 months)

Advantages:
  • Uses data on all apprentices, regardless of when they started and whether they completed the program
  • Captures duration with the firm before, during, and after apprenticeship
  • Addresses major issues of earnings, job tenure, and continued employment in the industry
  • Dataset likely to include a very high percentage of people ever participating in LTC RAPs
  • Easy access to a large supply of low-earning people working for non-apprentice long-term care providers for the comparison group
  • No new data collection required; no Office of Management and Budget review required

Disadvantages:
  • Limited data on which to match apprentices and the comparison group, leaving the possibility of uncontrolled-for selection bias
  • No data from the perspective of apprentices on outcomes such as job satisfaction
  • No data from the perspective of employers, except for the duration of apprentices within the firm
  • Low-wage workers in the comparison group will include housekeepers and dietary staff as well as direct care workers

Option 2. One-time cross-sectional survey of apprentices and a matched comparison group ($450,000; 30 months)

Advantages:
  • Addresses more subjective outcomes, such as job satisfaction and relationship with supervisor
  • Provides more detailed data on apprentices
  • Possible to more completely control for selection bias

Disadvantages:
  • As a cross-sectional design, only able to analyze association rather than causation
  • Comparison group facilities/agencies may be reluctant to provide contact information for their workers
  • Correction for selection bias can only be made after initial contact, since providers are unlikely or unable to provide detailed information on workers, raising costs
  • Only able to include apprentices who have stayed with the employer that trained them; apprentices who left the employer or the field are lost to the analysis
  • Less consensus on measurement of “softer” outcomes
  • More expensive than other options

Option 3. Focus groups of apprentices and of employers ($150,000; 14 months)

Advantages:
  • Low-cost option
  • Provides information on the views of apprentices
  • Can provide detailed suggestions from participants for improving LTC RAPs

Disadvantages:
  • Qualitative data cannot be used to determine the effectiveness of the intervention
  • Representativeness of the views expressed cannot be directly assessed
  • Views expressed by apprentices and providers cannot be easily summarized or quantified
  • Comparisons cannot be made to workers who did not participate in the LTC RAP

Option 4. Cost-benefit analysis ($100,000; 14 months)

Advantages:
  • Attempts to measure whether benefits to the employer exceed the costs, which is key for establishing the business case for the program
  • Measures changes in turnover related to LTC RAPs
  • Consistent with approaches used in other studies of apprenticeship costs and benefits
  • Low-cost data collection

Disadvantages:
  • Measurement of the relative productivity of apprentices is not straightforward
  • Employer estimates may be biased, as some may try to justify their investments

The second design option is a cross-sectional, one-time telephone survey of apprentices and a comparison group of non-apprentices to determine the effects of apprenticeship on job satisfaction, intent to leave one’s job, relations with supervisors and other staff, and other factors that only workers can address. The survey findings could demonstrate an association between the apprenticeship program and outcomes, but causality could not be attributed to the LTC RAP because there are no measures of change over time. Moreover, among direct care staff who stay in their jobs, previous studies have found high rates of job satisfaction, suggesting either that existing measures are not very sensitive or that there is not much room for improvement among workers who stay in their jobs (Bishop et al., 2009).
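A minimal sketch of the kind of analysis this survey would support appears below: regressing a satisfaction score on apprenticeship status with controls. The variable names and the synthetic data are hypothetical; the point is that, with a single time point, the apprenticeship coefficient is an association rather than a causal effect.

```python
# Sketch of a cross-sectional association analysis on synthetic survey
# data; all variable names and values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
survey_df = pd.DataFrame({
    "apprentice": rng.integers(0, 2, n),          # ever an LTC RAP apprentice?
    "age": rng.integers(20, 60, n),
    "tenure_years": rng.uniform(0, 10, n),
    "setting": rng.choice(["nursing_home", "home_care"], n),
})
# Synthetic satisfaction score, loosely centered at 3 on a 1-5 scale.
survey_df["job_satisfaction"] = 3 + 0.3 * survey_df["apprentice"] + rng.normal(0, 1, n)

model = smf.ols(
    "job_satisfaction ~ apprentice + age + tenure_years + C(setting)",
    data=survey_df,
).fit()
# The coefficient reflects association only: workers were not randomly
# assigned to apprenticeship, and there is no before/after measurement.
print(model.params["apprentice"])
```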

The third design option would provide a much more detailed understanding of apprentice and employer opinions about how apprenticeship works. Eight focus groups would be conducted among apprentices at eight different employers, and two focus groups would be conducted among management of employer sponsors. The apprentice focus groups would be held in the general geographic area of the employer, but not at the employer’s location; the employer focus groups would be held at national provider association meetings. These focus groups would provide a rich understanding of the value of apprenticeships over traditional training and of how employers implement their LTC RAPs, but they could not provide any quantitative estimates of the impact of LTC RAPs.

A fourth evaluation design focuses on the employer-level benefits and costs of the LTC RAPs. Benefits, measured as the increased productivity achieved by the LTC RAPs, and a range of implementation costs would be gathered through an Internet survey of a selected group of employers. Costs would include supervision time, apprentice time lost to regular work, and any curriculum development done by the facility or agency. Data from the LEHD analysis would also be used to estimate benefits. The greatest challenge for this design is that employers lack the data to accurately assess the improvement in performance and productivity due to the LTC RAPs. This design would explicitly address questions related to the business case for employers.
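To illustrate the arithmetic this design implies, the sketch below nets the benefit components (turnover savings and productivity gains) against the cost components listed above. Every dollar figure is a hypothetical placeholder; actual values would come from the employer survey and the LEHD analysis.

```python
# Employer-level cost-benefit arithmetic with placeholder inputs; none
# of these figures come from the report.
def net_benefit_per_apprentice(
    replacement_cost=3500.0,    # assumed cost of replacing a departed worker
    turnover_reduction=0.15,    # assumed drop in annual turnover probability
    productivity_gain=1200.0,   # assumed annual value of improved productivity
    supervision_cost=800.0,     # supervision/mentoring time
    lost_work_cost=600.0,       # apprentice time lost to regular work
    curriculum_cost=300.0,      # per-apprentice share of curriculum development
):
    benefits = replacement_cost * turnover_reduction + productivity_gain
    costs = supervision_cost + lost_work_cost + curriculum_cost
    return benefits - costs

print(net_benefit_per_apprentice())  # 25.0 with these placeholder inputs
```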

In considering these alternatives, the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services, and DOL must answer two major questions. First, can the LTC RAP be a strong enough intervention to yield net benefits at the apprentice or employer level? Is it plausible to expect gains in wages, job tenure, job satisfaction, commitment to the industry, productivity, and quality of care, along with decreased turnover? In other words, can the LTC RAP approach be implemented on a large enough scale that it can improve outcomes for consumers, workers, employers, clients, and funders for a large number of apprentices and employers? Second, can the research designs presented here, or other possible designs, produce results that can withstand critical scrutiny from researchers and policymakers? In other words, will the evaluation provide methodologically defensible results that justify the cost of the evaluation?


The Full Report is also available from the DALTCP website (http://aspe.hhs.gov/_/office_specific/daltcp.cfm) or directly at http://aspe.hhs.gov/daltcp/reports/2011/LTCRAPedo.shtml.