Evaluation Design Options for the Long-Term Care Registered Apprenticeship Program

7. Conclusion


The long-term care industry faces shortages of highly trained direct care workers. As a result, the industry struggles to improve quality of care and the lives of workers, who often receive low wages and few fringe benefits. Jobs such as certified nursing assistant (CNA) and home health aide (HHA) are often viewed as dead-end positions, with little opportunity for career advancement. While apprenticeship has a long and successful history in other occupations and in other countries, its application to long-term care is fairly new. Whether apprenticeship can address the industry's workforce shortage and improve quality of care and workers' future prospects remains unknown.

An important element in determining apprenticeship's future in long-term care is evaluating the effects of the LTC RAP on employers and workers. Whether policymakers, direct care workers, and employers decide to promote and participate in the LTC RAP depends on its benefits and costs, both monetary and non-monetary, to each group. Potential evaluation approaches must be considered in the context of the current status of apprenticeship in long-term care organizations, how it is implemented, and the likelihood of its future dissemination among employers.

The analysis of RAPIDS data for this project (Anderson et al., 2010) and the site visits to program sponsors (Kuehn et al., 2011) provided information to identify a range of research questions at both the apprentice and employer levels. To address these questions, RTI International and the Urban Institute examined a broad range of potential research design options to measure the effects of apprenticeship on outcomes related to these questions. The research designs varied in their ability to provide generalizable findings and in their implementation costs.

Although LTC RAPs usually share the same goals of improving long-term care quality through a better trained workforce, and thus could potentially be evaluated as a whole, the nature of LTC RAPs poses certain constraints on the ability to implement any given evaluation design. For example, LTC RAPs are relatively small, have high apprentice turnover rates, include lengthy training periods, purposefully select better-than-average employees, and yield limited outcome data collected by employer sponsors. These program features pose considerable challenges to most research design options. In addition, the difficulty of identifying appropriate comparison groups threatens the validity of many of the research designs considered.

After careful consideration, the RTI International/Urban Institute team identified four potential research designs that together form a comprehensive approach to an evaluation of the LTC RAP. These designs include analyses at the apprentice worker or employer sponsor level. They span a range from less generalizable, less expensive qualitative designs to more generalizable, more expensive multivariate analyses. As shown in Exhibit 10, each has certain strengths and weaknesses. If all four components were funded, the estimated cost would be $985,000.

EXHIBIT 10. Overview of Potential Evaluation Design Options to Evaluate the LTC RAP

Option 1: Analysis of LEHD, comparing all apprentices with a matched-sample comparison group (27 months)
Advantages:
  • Uses data on all apprentices, regardless of when they started and whether they completed the program
  • Captures duration with the firm before, during, and after apprenticeship
  • Addresses major issues of earnings, job tenure, and continued employment in the industry
  • Dataset likely to include a very high percentage of people who ever participated in LTC RAPs
  • Easy access to a large supply of low-earning people working for non-apprenticeship long-term care providers for the comparison group
  • No new data collection required; no OMB review required
Disadvantages:
  • Limited data on which to match apprentices and the comparison group, leaving the possibility of uncontrolled selection bias
  • No data from the perspective of apprentices on outcomes such as job satisfaction
  • No data from the perspective of employers, except for duration of apprentices within the firm
  • Low-wage workers in the comparison group will include housekeepers and dietary staff as well as direct care workers

Option 2: One-time cross-sectional survey of apprentices and a matched comparison group (30 months)
Advantages:
  • Addresses more subjective outcomes, such as job satisfaction and relationship with supervisor
  • Provides more detailed data on apprentices
  • Possible to more completely control for selection bias
Disadvantages:
  • As a cross-sectional design, only able to analyze association rather than causation
  • Comparison-group facilities/agencies may be reluctant to provide contact information about their workers
  • Correction for selection bias can only be made after initial contact, since providers are unlikely or unable to provide detailed information on workers, raising costs
  • Only able to include apprentices who have stayed with the employer that trained them; apprentices who left the employer or the field are lost to the analysis
  • Less consensus on measurement of "softer" outcomes
  • More expensive than other options

Option 3: Focus groups of apprentices and of employers (14 months)
Advantages:
  • Low-cost option
  • Provides information on the views of apprentices
  • Can provide detailed suggestions from participants for improving LTC RAPs
Disadvantages:
  • Qualitative data cannot be used to determine the effectiveness of the intervention
  • Representativeness of the views expressed cannot be directly assessed
  • Views provided by apprentices and employers cannot be easily summarized or quantified
  • Comparisons cannot be made to workers who did not participate in an LTC RAP

Option 4: Cost-benefit analysis (14 months)
Advantages:
  • Attempts to measure whether benefits to the employer exceed the costs, which is key for establishing the business case for the program
  • Measures changes in turnover related to LTC RAPs
  • Consistent with approaches used in other studies of apprenticeship costs and benefits
  • Low-cost data collection
Disadvantages:
  • Measurement of the relative productivity of apprentices is not straightforward
  • Employer estimates may be biased, as some try to justify their investments

The first design option would use the LEHD administrative database to provide findings that could be useful for making the business case to employers, if the findings for employers and workers were positive. The LEHD is a Census Bureau database that includes state-level Unemployment Insurance administrative information on employment and earnings merged with certain other data. This design option would assess the effect of LTC RAPs on apprentice earnings and job tenure, and on the worker turnover rate at the employer level. Because an administrative dataset would be used, no additional data collection would be required and few employers or workers should be missing from the dataset. However, because the dataset was designed for other purposes, no data would be available on factors such as job satisfaction, relationships with supervisors, or quality of care provided. The biggest challenge for this design is using the limited variables available in the Unemployment Insurance data to construct a truly comparable comparison group. In other studies, however, prior earnings have been used effectively to proxy many personal characteristics. Although pre-program wage rates may vary little among potential apprentices and comparison group members, hours and weeks worked vary considerably. In addition, the data can identify low-wage workers in other long-term care organizations that do not run LTC RAPs, but it cannot separate direct care workers from other low-wage workers. In particular, in residential settings, the analysis cannot differentiate between direct care workers and housekeeping and dietary staff, making comparisons with apprentices somewhat imprecise, although matching on earnings should eliminate most of the non-direct care workers. Despite this limitation, this option is potentially the most viable design to credibly address the most important research questions facing the industry. The estimated cost of this option is $285,000.
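The matching step this design relies on can be illustrated with a minimal sketch. The data, the nearest-neighbor rule, and the use of a single matching variable (prior quarterly earnings) are all simplifying assumptions for illustration; an actual LEHD analysis would match on additional variables and use restricted-access data.

```python
import numpy as np

def match_on_prior_earnings(apprentice_earnings, pool_earnings):
    """Match each apprentice to the comparison worker with the closest
    prior-period earnings (nearest neighbor, without replacement).

    Returns a list of indices into pool_earnings, one per apprentice.
    """
    pool = np.asarray(pool_earnings, dtype=float)
    available = np.ones(len(pool), dtype=bool)
    matches = []
    for a in apprentice_earnings:
        dist = np.abs(pool - a)
        dist[~available] = np.inf  # exclude already-matched workers
        j = int(np.argmin(dist))
        available[j] = False
        matches.append(j)
    return matches

# Illustrative synthetic quarterly earnings (not real LEHD figures)
apprentices = [5200.0, 4800.0, 6100.0]
comparison_pool = [4750.0, 5150.0, 7000.0, 6050.0, 3000.0]
print(match_on_prior_earnings(apprentices, comparison_pool))  # → [1, 0, 3]
```

Matching without replacement keeps each comparison worker distinct, at the cost of making the result depend on the order in which apprentices are processed; a production analysis would weigh that trade-off explicitly.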

The second design option is a cross-sectional, one-time telephone survey of apprentices and a comparison group of non-apprentices to determine the effects of apprenticeship on job satisfaction, intent to leave one's job, relations with supervisors and other staff, and other factors that only workers can address. The survey findings would be analyzed using statistical techniques that would identify the association of apprenticeship with job satisfaction and intent to leave, but causality could not be attributed to the LTC RAP because there are no measures of change over time. While the survey could include variables that would better control for selection bias, it is not as well suited as the administrative data option for assessing the economic consequences of the LTC RAP for apprentices. Moreover, among direct care staff who stay in their jobs, previous studies have found high rates of job satisfaction, suggesting either that existing measures are not very sensitive or that there is not much room for improvement among workers who stay (Bishop et al., 2009). The cost for this option is relatively high because of the expense of data collection; the estimated cost is $450,000.
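The kind of associational analysis described here can be sketched with a small regression on synthetic data. The records, variables, and satisfaction scale below are invented for illustration; the point is only that the coefficient on the apprenticeship indicator captures an association, not a causal effect.

```python
import numpy as np

# Hypothetical survey records: (apprentice indicator, tenure in years,
# job-satisfaction score on a 1-5 scale) -- illustrative values only
records = [
    (1, 2.0, 4.2), (1, 3.5, 4.5), (1, 1.0, 3.9), (1, 4.0, 4.4),
    (0, 2.5, 3.8), (0, 3.0, 4.0), (0, 1.5, 3.6), (0, 4.5, 4.1),
]
X = np.array([[1.0, a, t] for a, t, _ in records])  # intercept, apprentice, tenure
y = np.array([s for _, _, s in records])

# Ordinary least squares via the normal equations
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef[1] is the satisfaction gap associated with apprenticeship,
# holding tenure constant -- an association, not a causal effect
print(round(float(coef[1]), 2))
```

A real analysis would add the control variables discussed in the text and compute standard errors; this sketch shows only the structure of the estimate.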

The third design option would provide a much more detailed understanding of apprentice and employer opinions about how apprenticeship works. Eight focus groups would be conducted among apprentices at eight different employers, and two focus groups would be conducted among the management of employer sponsors while they attend national provider association meetings. These focus groups would provide a rich understanding of the value of apprenticeships over traditional training and of how employers implement their LTC RAPs, but this option could not provide any quantitative estimates of the effect of LTC RAPs. After the cost-benefit analysis, this is the lowest-cost option, with an estimated cost of $150,000.

The fourth evaluation design option focuses on employer-level benefits and costs of LTC RAPs. Benefits, measured as the increased productivity achieved by the LTC RAP, and a range of implementation costs would be gathered through an Internet interview process among a selected group of employers. Data from the LEHD analysis would also be used to determine benefits. Costs would include supervision costs, the time lost to regular work, and any curriculum development the facility undertakes. One challenge to this design is the need for employers to accurately assess the improvement in performance and productivity due to the LTC RAP. In addition, if limited to relatively large LTC RAPs, the analysis would not include a large enough number of employers to generalize across all LTC RAPs. Still, this design would address questions related to the business case for employers at a much lower price than the LEHD or survey design options. The estimated cost of this option is $100,000.
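The accounting at the heart of this option reduces to benefits minus costs. The sketch below uses the cost categories named above with hypothetical dollar figures; actual values would come from the employer interviews and the LEHD analysis.

```python
def net_benefit(productivity_gain, turnover_savings,
                supervision_cost, lost_work_time_cost, curriculum_cost):
    """Employer-level net benefit of an LTC RAP: benefits minus
    implementation costs, using the categories described in the text."""
    benefits = productivity_gain + turnover_savings
    costs = supervision_cost + lost_work_time_cost + curriculum_cost
    return benefits - costs

# Hypothetical per-apprentice figures, for illustration only
print(net_benefit(productivity_gain=3000, turnover_savings=2500,
                  supervision_cost=1800, lost_work_time_cost=1200,
                  curriculum_cost=500))  # → 2000
```

A positive result under realistic inputs is what would establish the business case; the hard part, as the text notes, is measuring the productivity gain itself.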

In considering these alternatives, ASPE/HHS and DOL must answer two major questions. First, can the LTC RAP be a strong enough intervention to yield net benefits at the apprentice or employer/sponsor level? Is it plausible to expect gains in wages, job tenure, job satisfaction, commitment to the industry, productivity, and quality of care, and decreases in turnover, as a result of participation in the LTC RAP? In other words, can the LTC RAP approach plausibly improve outcomes for consumers, workers, employers, clients, and funders for a large number of apprentices and employers? Second, can the research designs presented here, or other possible designs, produce findings that can withstand critical scrutiny from researchers and policymakers? In other words, will the evaluation provide methodologically defensible results that justify its cost?
