
Strategies for Increasing TANF Work Participation Rates

Using Data to Monitor and Improve the Work Participation of TANF Recipients: Examples from New York City and Utah

December 2008

By: Jeffrey Max and Gretchen Kirby, Mathematica Policy Research

Project Page: http://aspe.hhs.gov/hsp/08/TANFWPR/index.shtml


This report is available on the Internet at: http://aspe.hhs.gov/hsp/08/TANFWPR/2/


Abstract

This practice brief profiles two strategies, one statewide and one local, for analyzing, reporting, and using data to hold case managers and administrators accountable for increasing the work participation of Temporary Assistance for Needy Families (TANF) recipients. We selected strategies in which data are used to keep staff informed about progress toward participation rate goals and to allow program managers to address nonparticipation quickly. New York City developed a special report that tracks, for each TANF office, the participation rate and the administrative processes that affect it; senior staff met regularly with program administrators to review and discuss the report. Utah developed automated tools that case managers and supervisors can use to monitor the participation of individual TANF recipients and to report participation rates for regions, offices, and individual case managers. The data management strategy used in each site represents one element of a broader effort to improve work participation rates.

Introduction

The Deficit Reduction Act of 2005 (DRA) effectively raised the work participation rates that states must achieve in their TANF programs. To address this and other program changes in the implementing regulations issued by the U.S. Department of Health and Human Services, New York City and Utah modified their data collection and reporting systems to support a goal-oriented approach to increasing the work participation of TANF recipients. In both approaches, TANF program data are being used in new ways to (1) make timely determinations about which recipients are not meeting work requirements, and (2) inform strategies used at the case management, supervisory, and administrative levels to engage recipients in work activities. Although neither site had to develop a new data collection system to implement its strategy, each had to develop new approaches to using and reporting data.

New York City and Utah also are using program data to monitor progress toward meeting federal work participation rate goals. Before the DRA, both sites relied on substantial caseload reduction credits to help them meet federal work participation rates. As a result of declining TANF caseloads, the effective participation rate was just under 40 percent in New York City and approximately 20 percent in Utah.[1] In 2006, the TANF commissioners in each site set a much higher participation goal that they wanted to achieve fairly quickly: 60 percent in New York City and 50 percent in Utah. In New York City, work or work-related activity is mandatory for 66,000 TANF recipients, while in the largely rural state of Utah, 2,500 recipients must participate in work or work-related activities.

Each site took specific steps to make the participation rate goals relevant to staff and the data management strategy useful to case managers. Because the participation rate depends on the actions of each recipient, program administrators felt that case managers needed to see the connection between their practices and the indicators and outcomes being tracked. The indicators used in New York City measure the timeliness of the application, engagement, and sanction processes, and in Utah the state provides case managers with more efficient tools to track the status of each recipient.


Program Model: The New York City Ring Report

The New York City Ring Report is an initiative of the New York City Human Resources Administration that integrates a focus on the work participation rate into an existing performance management approach used by the agency to track key indicators of Job Center performance in serving TANF recipients and transitioning them to employment. (Job Centers are the local TANF offices.) The Ring Report is an extension of JobStat, the city's long-standing data management approach that involves setting performance goals for Job Centers, tracking centers' performance in monthly reports, and holding regular meetings with senior city officials and Job Center staff to discuss results and share strategies. The city established similar data management systems to monitor the activity of case managers (CenterStat) and employment service vendors (VendorStat).

The Ring Report, which focuses exclusively on the participation rate, was designed as a competition between teams of Job Centers. The city distributes the Ring Report each month to Job Center directors. Each team of Job Centers meets to discuss Ring Report results, and senior officials hold cross-team meetings to monitor changes and trends in Job Center performance on the Ring Report indicators.

The Data Tool

The Ring Report is New York City's primary tool for using data to monitor and improve TANF participation rates. The three steps used to design the Ring Report show how policy goals and priorities can be incorporated into measures of performance.

Step 1: Identifying Performance Indicators. In developing the Ring Report, officials in New York City focused on indicators that would capture two aspects of Job Center performance: (1) their ability to achieve the outcome important to the agency (that is, an increased participation rate), and (2) their ability to carry out the processes that may affect recipient engagement and participation.

Exhibit 1. Ring Report Indicators

Outcome Indicators:
  • Participation rate of current TANF recipients
  • Participation rate of former TANF recipients receiving assistance through New York State's Safety Net program (funded with TANF MOE dollars)
  • Participation rate for other Safety Net program recipients who are not eligible for TANF (non-TANF assistance)
  • Combined participation rate for current and former TANF recipients (TANF and TANF MOE); equivalent to the federal participation rate requirement
  • Monthly change in the combined participation rate (TANF and TANF MOE)

Process Indicators:

  • Percent of recipients in the sanction process for five weeks or fewer
  • Percent of recipients in the engagement process who were engaged within a month
  • Fair hearing affirmation rate on employment issues
  • Fair hearing win rate on employment issues

Qualifying Process Indicators:

  • Application timeliness rates for public assistance and food stamps (90 percent of public assistance and food stamp applications must be completed within the required 30-day limit for TANF and 45-day limit for Safety Net)
Outcome Indicators. The outcome indicators in the Ring Report consist of five measures of the participation rate (see Exhibit 1). New York City measures this rate for each Job Center for three public assistance populations. A fourth measure indicates the combined rate for the populations funded by TANF and TANF maintenance-of-effort (MOE) dollars, which is the official rate for federal reporting purposes. Finally, a fifth measure indicates the monthly change in the participation rate to recognize Job Centers that have low, but improving, participation rates.

Process Indicators. The Ring Report tracks several Job Center processes: engaging TANF recipients in work or work-related activities, the sanction process, and the fair hearing process (see Exhibit 1). These indicators were identified by a group of senior staff of the Family Income Administration (FIA) in New York City's Human Resources Administration (HRA). The group examined the status and activities of nonparticipating TANF recipients and found that more than half either were sanctioned or were in the sanction process, and about 20 percent were in the initial intake and engagement process. New York City developed indicators to monitor the percent of recipients who complete both the sanction and the engagement processes in a timely manner. Timeliness of the sanction process is measured by the percent of recipients who complete the process within the maximum amount of time expected. The other process indicators, which relate to monitoring the fair hearing process, are intended to encourage case managers to make appropriate decisions when assigning recipients to work-related activities.

Qualifying Process Indicators. These indicators are the rates at which Job Center teams process applications for public assistance and food stamps. Although the Ring Report monitors the performance of all Job Center teams, teams must meet a 90 percent rate to be eligible for the Ring Report prize.

Step 2: Measuring Performance Relative to Agency Goals. The Ring Report compares Job Center performance on each indicator to agency targets established by FIA senior staff. This approach ensures that Job Center staff know the expected level of performance and can gauge their success in achieving agency targets. In addition to each target, FIA established a lower threshold based on a minimum or required level of performance and an upper threshold, or goal for agencies to try to reach, that represents "top performance." For example, the FIA commissioner set an agency-wide target of a 60 percent participation rate. The Ring Report specifies a lower threshold of 50 percent to identify Job Centers performing below the FIA target but above the federally required rate, and an upper threshold of 70 percent to identify Job Centers that meet or exceed agency goals.

Job Center performance is measured as the relative distance between the lower and upper threshold for each indicator. A Job Center performing at or below the lower threshold is achieving 0 percent of the agency goal, and a Job Center performing at or above the upper threshold is achieving 100 percent of the goal. Other values are converted to a percentage based on their relative distance between the lower and upper thresholds. For example, a Job Center with a 60 percent TANF participation rate is halfway between the lower threshold of 50 percent and the upper threshold of 70 percent, thus achieving 50 percent of the TANF participation rate goal of 70 percent.

Step 3: Calculating an Index Score. The index score is designed to represent overall Job Center performance on a scale of 0 to 100. Senior FIA staff weighted each indicator by assigning it a point value based on its relative importance to agency goals. For example, the engagement process indicator is worth 14 points, and the fair hearing win rate indicator is worth 5 points. Job Centers earn points toward each indicator based on their performance relative to agency goals: a Job Center achieving 50 percent of the agency goal receives 50 percent of the total point value for that indicator. The points earned for each indicator are added together to calculate an overall point value, or index score. The outcome indicators (participation rates) represent 60 of the 100 total points, and the process indicators make up the other 40 points (Exhibit 2). While participation in the Safety Net program has the largest individual indicator point value (20 points), the multiple indicators for the TANF and MOE participation rates total 40 points. (The Safety Net program provides cash assistance to single adults, childless couples, and families with children who have reached the end of their 60-month TANF time limit.)
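To make the calculations in Steps 2 and 3 concrete, the following is a minimal Python sketch of the threshold interpolation and the weighted index score, with rates expressed in percent. The function names and data layout are illustrative only; they are not part of New York City's actual system.

```python
def percent_of_goal(rate, lower, upper):
    """Share of the agency goal achieved, per the Ring Report rule:
    performance at or below the lower threshold scores 0, at or above
    the upper threshold scores 1, and values in between are
    interpolated linearly."""
    share = (rate - lower) / (upper - lower)
    return min(max(share, 0.0), 1.0)

def index_score(indicators):
    """Weighted sum of points across indicators; totals 0-100 when the
    point values sum to 100. Each entry is (rate, lower, upper, points)."""
    return sum(points * percent_of_goal(rate, lower, upper)
               for rate, lower, upper, points in indicators)

# The example from the text: a 60 percent TANF participation rate sits
# halfway between the 50 and 70 percent thresholds, so the Job Center
# achieves 50 percent of the goal.
print(percent_of_goal(60, 50, 70))  # 0.5
```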

Exhibit 2. Thresholds, Point Values, and Sample Index Score of Ring Report Indicators

Indicator | Lower Threshold | Upper Threshold | Point Value | Team Rate | Percent of Goal | Points Earned
(Thresholds and point values are actual; team rates, percent of goal, and points earned are a sample calculation.)

Outcome Indicators
TANF Participation Rate | 50% | 70% | 8 | 60% | 50% | 4
MOE Participation Rate | 50% | 70% | 8 | 50% | 0% | 0
Safety Net Participation Rate | 90% | 95% | 20 | 50% | 0% | 0
TANF & MOE Participation Rate | 50% | 70% | 14 | 55% | 25% | 3.5
Change in TANF & MOE Participation Rate | 1% | 3% | 10 | 2% | 50% | 5
Total Points for Outcome Indicators | | | 60 | | | 12.5

Process Indicators
Percent of Cases in Engagement Process for One Month or Less | 90% | 95% | 14 | 91% | 20% | 2.8
Percent of Cases in Sanction Process for Five Weeks or Fewer | 95% | 97% | 13 | 99% | 100% | 13
Fair Hearing Affirmation Rate on Employment Issues | 70% | 95% | 8 | 80% | 40% | 3.2
Fair Hearing Win Rate on Employment Issues | 70% | 95% | 5 | 85% | 60% | 3
Total Points for Process Indicators | | | 40 | | | 22

Total Index Score | | | 100 | | | 34.5
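Using the sketch above, the engagement process row of Exhibit 2 reproduces as expected:

```python
# Engagement process indicator: thresholds 90 and 95 percent, 14 points,
# sample team rate of 91 percent -> 20 percent of goal -> 2.8 points.
points = 14 * percent_of_goal(91, 90, 95)
print(round(points, 1))  # 2.8
```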

Accountability Through Competition

New York City used a competition to encourage Job Center staff to improve the work participation rate. According to FIA senior staff, Job Center directors and line staff followed the competition closely. It took place between Ring Report "teams," each consisting of the directors of three to five Job Centers plus a regional manager or specialist. The regional manager or specialist, chosen by the directors, served in an advisory role as the "team coach." One of the directors served as the team's manager, coordinating team meetings and representing the team in regular meetings with FIA senior staff. At the end of the competition, the team with the highest index score received rings.

As mentioned, the Ring Report system consisted of at least two meetings a month: a meeting of each Ring Report team (individual team meeting) and a meeting of FIA senior staff and all Ring Report team managers (cross-team meeting).

Individual Team Meetings. In the individual team meetings, Job Center directors reviewed their performance on each indicator, examined the factors that may have affected performance, and considered how to improve it. Two key activities generally took place:

  1. Review of individual cases or groups of cases. Meeting participants often reviewed a random selection of individual case files or a group of cases with something in common. The former allowed the team to discuss the flow of particular cases and whether case managers took appropriate action along the way. The latter helped Job Centers focus on policies and practices that contributed to their performance on the Ring Report. For example, to better understand an engagement process indicator that might have been low, several Ring Report teams examined TANF cases in which people were enrolled but had not established an employment plan.
  2. Discussion of strategies for improving performance. The individual team meetings provided an opportunity to address the factors that could help improve program performance. For example, when trying to understand a drop in the proportion of recipients engaged in one month (the engagement process indicator), one Ring Report team found that a large number of TANF recipients were called in for multiple appointments to complete the required upfront processes. The team decided that each Job Center director should encourage case managers to complete the necessary paperwork and processes using fewer appointments. Another team found that good cause exemptions were bringing down the participation rate. Concerned about the appropriateness of these exemptions, the directors instituted an additional approval by higher-level staff, a step that previously had not been in place.

Cross-Team Meetings with Senior Staff. These meetings allowed FIA senior staff to meet with Ring Report team managers to discuss Job Center performance. Each Ring Report indicator was reviewed to identify any month-to-month changes or longer-term trends. When a Job Center experienced a noticeable change on an indicator (an increase or decrease of at least two percentage points), it was required to account for the change during the meeting. In preparation, Job Center directors would have discussed possible explanations during their individual team meetings, such as staffing, case management processes, or external factors unrelated to Job Center activities. For example, a Job Center that improved its performance on the engagement process indicator cited its recent efforts to closely monitor recipients' attendance at case manager appointments. Cross-team meetings also provided an opportunity for further discussion of problems and program improvements.

Data Strategy Results and Status in New York City

The Ring Report competition was initially structured as a short-term event lasting from June to December 2006. Job Centers reacted enthusiastically to the competition but wanted a longer timeframe in which to produce results. The competition was therefore extended through April 2007; at the original end date in December 2006, the team in the lead received watches. The competition was then extended a second time, until December 2007, at which point it ended. After a year and a half, the Job Centers' enthusiasm for the competitive aspect of the process and for the cross-team meetings had waned, possibly because the report and the review process had become a routine part of their efforts to achieve higher work participation rates, and because the work participation rate proved difficult to increase. Use of the Ring Report for performance tracking and measurement is now left to the discretion of each Job Center. Staff can access the indicator and outcome reports online via the ongoing JobStat system.

New York City experienced only a slight improvement in its participation rate from 2006 to 2007, from 39 to 42 percent. (See Exhibit 3.) It is difficult to assess the contribution made by the Ring Report to the change in the participation rate because a variety of other factors had the potential to affect the rate.

Exhibit 3. New York City Participation Rates

Participation Rate Goal: 60%
Participation Rate (mid-2006): 39%
Participation Rate (Dec. 2007): 42%

Source: Interviews conducted in March/April 2007 and April 2008

Senior TANF program staff in New York City believe that the slight change in the work participation rate is not a poor reflection on the Ring Report strategy but a result of the universal engagement strategy they have used for TANF clients for many years. Although the Ring Report helped to focus staff on the participation rate, administrators and staff ultimately feel that, because they were already focused on engaging all recipients in program activities to the extent allowable under New York law, there were few additional actions they could have taken within their current policy framework to raise the rate.


Program Model: Utah's Case Management and Participation Reports

Utah uses two data reports to track case management activity and to bring the work participation rate into the management of case managers, offices, and regions. These reports are derived from the state's Your Online Data Access (YODA) system, which contains detailed client- and program-level information for a variety of programs (including TANF, the Food Stamp Program, and Workforce Investment Act programs). The system also produces more than 50 reports to assist administrators and staff in managing caseload activity in Utah's highly integrated social and employment services delivery system. The two reports profiled here serve two crucial purposes: improving case management activities and increasing the participation rate.

The Data Tools

Two reports form the basis of Utah's strategy for analyzing, reporting, and using data to hold TANF staff at all levels accountable for improving the work participation rate. The Case Management Customer Report (CMCR) lists all cases along with information on case management activity and work participation. The Participation Report presents the participation rate for multiple levels, including regions, offices, and individual case managers.

Case Management Customer Report. Case managers and supervisors in Utah routinely use the CMCR to monitor work participation and to quickly identify potential problems. The report provides a real-time snapshot of the assigned activities and hours for each recipient, by case manager (Exhibit 4). It also summarizes recent case manager activity for each case, including the date of the last case note, the next scheduled appointment, and the planned next date of contact. The CMCR also includes warning flags for recipients whose assigned hours are too few to meet participation requirements or who have been assigned to job search activities for more than six weeks in the current year. In addition, it identifies recipients who are not meeting the work participation requirements (a sketch of this flag logic follows the field list below).

Exhibit 4. Information in the Case Management Customer Report

Contact Information
  • Name
  • Address
  • Telephone number
  • Social Security Number

Case Management Information

  • Date of last note
  • Date of next appointment
  • Date of next contact

TANF Status

  • Number of months on TANF
  • Number of sanctions

Participation Assignment Information

  • Assigned activities
  • Activity start date
  • Assigned hours
  • Assigned to job search for more than 6 weeks
  • Assigned to fewer than required hours
  • Not participating in assigned activities
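As a rough illustration of the warning flags described above, here is a minimal Python sketch. The field names, record layout, and the 30-hour requirement are assumptions for illustration only; they do not reflect the actual YODA data schema, and required hours in practice vary by family type.

```python
from dataclasses import dataclass

REQUIRED_WEEKLY_HOURS = 30   # assumed value; actual requirements vary by case
JOB_SEARCH_WEEK_LIMIT = 6    # limit referenced in the brief

@dataclass
class CaseRecord:
    name: str
    assigned_weekly_hours: int        # hours across all assigned activities
    job_search_weeks_this_year: int
    meeting_requirements: bool        # participating as assigned

def warning_flags(case):
    """Return the CMCR-style warning flags that apply to a case."""
    flags = []
    if case.assigned_weekly_hours < REQUIRED_WEEKLY_HOURS:
        flags.append("assigned hours too few to meet participation requirement")
    if case.job_search_weeks_this_year > JOB_SEARCH_WEEK_LIMIT:
        flags.append("job search assignment exceeds six weeks this year")
    if not case.meeting_requirements:
        flags.append("not meeting work participation requirements")
    return flags

# Example: a case assigned 20 hours with 8 weeks of job search raises two flags.
print(warning_flags(CaseRecord("sample", 20, 8, True)))
```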

Participation Report. The Participation Report provides information for tracking the participation rate at all levels, from the state to individual case managers. The online report is formatted like the dashboard of a car, with one gauge for each region of the state. Exhibit 5 shows the statewide participation rate gauge and the gauges for two regions. The four dials on each gauge represent different participation rate calculations: (1) the year-to-date rate based on planned hours, (2) the year-to-date rate based on verified hours, (3) the monthly rate based on verified hours, and (4) the future rate needed to meet the required 50 percent rate for the year.

Exhibit 5. Utah Dashboard Report

(Figure: three sample gauges, each displaying the Cumulative Plan Rate, Monthly Plan Rate, Cumulative Verified Rate, and Monthly Verified Rate. The gauges are illustrative only; the values shown do not represent actual rates.)

If a user clicks on a region, the report displays a set of gauges for each office within the region. Selecting an office provides a set of gauges for each team of case managers, and selecting a team displays gauges for each case manager. At the case manager level, the Participation Report provides easy access to detailed participation information for each recipient. This information allows case managers to (1) determine whether each recipient is meeting his or her federal work participation requirement; (2) identify participation in activities that does not count toward the work participation rate; (3) address discrepancies between planned and actual hours for each recipient; and (4) determine whether recipients have exceeded limits on job search, vocational education, or excused absences. The report complements the CMCR by tracking both actual and planned activities and hours. Utah staff regularly review the information and make changes to respond to case managers' and administrators' information needs. Program administrators in Utah organized formal sessions to train case managers on participation rate requirements and the functionality of the Participation Report.
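The four dial calculations lend themselves to a simple sketch. The following Python illustrates one plausible reading, under the assumption that the annual rate is the average of the monthly rates (the federal convention); the function names and data shapes are illustrative, not Utah's documented method.

```python
def monthly_rate(cases_meeting, countable_cases):
    """Share of countable cases meeting work requirements in one month."""
    return cases_meeting / countable_cases if countable_cases else 0.0

def year_to_date_rate(monthly_rates):
    """Average of the monthly rates recorded so far in the year."""
    return sum(monthly_rates) / len(monthly_rates) if monthly_rates else 0.0

def needed_future_rate(monthly_rates, target=0.50, months_in_year=12):
    """Average monthly rate needed over the remaining months for the
    year's average to reach the target (the fourth dial)."""
    remaining = months_in_year - len(monthly_rates)
    if remaining <= 0:
        return 0.0
    shortfall = target * months_in_year - sum(monthly_rates)
    return max(shortfall / remaining, 0.0)

# Example: rates of 40 and 44 percent in the first two months mean the
# remaining ten months must average 51.6 percent to end the year at 50.
print(round(needed_future_rate([0.40, 0.44]), 3))  # 0.516
```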

Accountability Through Monitoring Performance at All Levels

Case managers use the CMCR daily to ensure that recipients do not slip through the cracks. Supervisors use it to review and discuss case management practice and performance, and they often draw on the report to review individual cases with case managers. Their discussions typically cover particularly challenging cases or issues and the appropriate responses, as well as steps to increase or change the assigned activities for flagged individuals who are not meeting participation requirements.

State and regional administrators use the Participation Report to track progress toward the 50 percent participation rate and to compare rates at multiple levels. The latter provides a snapshot of how each level contributes to the state's overall participation rate. The report also allows office supervisors to compare the rates for teams of case managers and for individual case managers. (Teams generally consist of up to 10 case managers, but there is considerable variation because of differences in caseload size.) Some local offices hold a monthly staff meeting to review their rates in the Participation Report, discuss the factors that improved or lowered their participation rates, and brainstorm about how to achieve a higher rate.

To meet the statewide 50 percent participation goal, the state's regional directors hold each case manager accountable for achieving a 50 percent rate for his or her caseload. This participation target is included in each case manager's performance plan. Staff can be placed on a corrective action plan for not meeting the rate, as is the case for any element in their performance plan. Central Utah, the largest region in the state, holds a quarterly "Passion for Participation" luncheon as a reward for case managers who meet the participation goal. These case managers also receive a half-day off from work in recognition of their performance.

Data Strategy Results and Status in Utah

Utah plans to continue to use and refine the CMCR and the Participation Report. TANF administrators believe that the data strategy was useful in raising staff awareness about the rates and in motivating staff to focus on engaging clients in activities that count toward the federal participation rate.

By the end of 2007, Utah's participation rate was more than 44 percent, a substantial improvement from 19 percent in mid-2006 (see Exhibit 6). The rate varies considerably across offices and is especially volatile in remote areas, where caseloads can be 20 or fewer clients.

Exhibit 6. Utah Participation Rates

Participation Rate Goal: 50%
Participation Rate (mid-2006): 19%
Participation Rate (Dec. 2007): 44%

Source: Interviews conducted in March/April 2007 and April 2008

The state made many simultaneous changes to TANF program policy and practice to increase the work participation rate, making it difficult to determine which changes were most effective. For example, the state created a transitional cash assistance program for TANF recipients who found employment, revised its sanction policies, and held case managers responsible for meeting individual participation rates. Nonetheless, administrators believe that the increased use of program and performance data, together with improved frontline access to these data, helped to substantially improve the state's participation rate.[2]


Conclusion

Setting goals, tracking performance, and holding staff accountable have long been standard practices in the business world, and they have increasingly gained credibility in the management of public programs. Since the advent of welfare reform, states have invested considerable resources in tracking recipient outcomes, particularly after individuals leave TANF. Changes to the TANF program as a result of the DRA gave states an added incentive to expand their data collection and analysis efforts to focus on participation rates in work and related activities.

New York City used monthly data to foster competition among Job Center teams in meeting goals across a series of process and outcome indicators for the participation rate. In Utah, administrators and staff were given easy access to participation data, and participation rate goals were incorporated into staff performance plans.

These examples show how different strategies for raising participation rates are built on the use of program data, with agencies monitoring participation rates frequently (e.g., weekly or monthly) and at multiple levels (e.g., state, region, office, or case manager). These strategies also include the use of recipient-level data to quickly identify recipients who fall short of their activity or hours requirements, and to hold case managers, supervisors, and program administrators accountable for achieving desired outcomes.

The strategies in New York City and Utah were the product of thoughtful planning and long-term efforts to better track and use performance data. The following elements, common to both strategies, may be useful for other locations as they develop a strategy for using program data to monitor and improve work participation rates.

Make the data useful for service delivery and accessible to staff at multiple levels. While the participation rate was not a new measure under the DRA, its importance in both Utah and New York City grew considerably with the federal changes to its calculation. Before the DRA, frontline staff and some administrators in both sites did not need to understand the participation rate or how it was affected by client engagement. After the DRA passed, TANF program officials in both sites felt it was important for staff at all levels to understand how the participation rate relates to everyday program functions. Each site developed a strategy for managing and reporting data in which performance measures were clearly defined, tracked, and mapped to case management practices, and in which the information was made useful to all staff.

Build in feedback and program improvement cycles. To maximize the usefulness of their data, both sites promoted ongoing dialogue among senior staff, administrators, and case managers on increasing engagement and participation among TANF recipients. Discussions took place each month after the reports were reviewed, and staff from all levels participated in some fashion. Case managers focused on the particular needs, circumstances, and client motivators that could affect the engagement and participation of recipients. Office and regional administrators considered how the broader economic and service environments could affect the rate of participation, and also focused on key administrative issues such as the performance of contractors and the allocation of staff resources. Together, administrators and staff used program data on the status of caseloads and the participation rate to assess program strategies and consider improvements.

Monitor how staff respond to new data strategies. According to program administrators, case managers in both sites had mixed responses to the two performance measurement strategies and the focus on the work participation rate. Some responded favorably to the challenge and reaped the rewards that came with high performance. Others felt that the new strategy made it difficult to balance the individual needs of clients (particularly those with significant personal and family challenges) with the overall focus on the participation rate. Staff did not want to feel pressured into rote case management or to be penalized for developing individualized case plans. Administrators in Utah, in particular, found it challenging to avoid mixed messages; they wanted to use the data to hold program staff accountable while conveying the importance of individualized case management and employment plans.

Build on existing data collection and reporting capacity. The strategies used in New York City and Utah were the result of focused efforts to develop the capacity to collect, analyze, and report data. Two essential elements formed their basis: a comprehensive case management system and a data warehouse that linked multiple TANF case management and eligibility systems. The data warehouse allowed the agencies to automatically extract the necessary elements and calculate indicators and participation rates quickly and cost-effectively. Because this infrastructure was largely in place in both sites, neither had to expend substantial financial resources to develop its strategy.


Endnotes

1. Based on interviews with TANF agency officials conducted by MPR in March and April 2007.

2. Utah TANF administrators report that the state is likely to meet the federal participation rate requirement with the assistance of the caseload reduction credit, even under the DRA revisions to this credit.


Suggested Further Readings

The Deficit Reduction Act (Title VII, Subtitle A), http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=109_cong_bills&docid=f:s1932enr.txt.pdf

Final TANF Regulations, http://www.acf.hhs.gov/programs/ofa/law-reg/finalrule/finalru.htm

General Strategies to Increase Work Participation

Gardiner, Karen, Mike Fishman, Mark Ragan, and Tom Gais. "Local Implementation of TANF in Five Sites: Changes Related to the Deficit Reduction Act." Fairfax, VA: The Lewin Group; and Albany, NY: The Nelson A. Rockefeller Institute of Government, March 2008. http://www.acf.hhs.gov/programs/opre/welfare_employ/local_impl/reports/five_sites_reduction/five_sites_reduction.pdf

Martinson, Karin, and Pamela A. Holcomb. "Innovative Employment Approaches and Programs for Low-Income Families." Washington, DC: The Urban Institute, February 2007. http://www.acf.hhs.gov/programs/opre/welfare_employ/inno_employ/index.html

Strategies for TANF Recipients Living With a Disability

Derr, Michelle K. "Providing Specialized Personal and Work Support." Washington, DC: Mathematica Policy Research, Inc., February 2008. http://www.acf.hhs.gov/programs/opre/welfare_employ/identify_promise_prac/reports/personal_employ_support/personal_employ_support.pdf

Derr, Michelle K., and LaDonna Pavetti. "Creating Work Opportunities." Washington, DC: Mathematica Policy Research, Inc., February 2008. http://www.acf.hhs.gov/programs/opre/welfare_employ/identify_promise_prac/reports/creating_wk_opportunities/creating_wk_opportunities.pdf

Pavetti, LaDonna, Michelle K. Derr, and Emily Sama Martin. "Conducting In-Depth Assessments." Washington, DC: Mathematica Policy Research, Inc., February 2008. http://www.acf.hhs.gov/programs/opre/welfare_employ/identify_promise_prac/reports/conducting_in_depth/conducting_in_depth.pdf

Sama Martin, Emily, LaDonna Pavetti, and Jacqueline Kauff. "Creating TANF and Vocational Rehabilitation Agency Partnerships." Washington, DC: Mathematica Policy Research, Inc., February 2008. http://www.acf.hhs.gov/programs/opre/welfare_employ/identify_promise_prac/reports/creating_tanf_vocational/creating_tanf_vocational.pdf

Using TANF Sanctions to Increase Work Participation Rates

Kauff, Jacqueline, Michelle K. Derr, LaDonna Pavetti, and Emily Sama Martin. "Using Work-Oriented Sanctions to Increase TANF Program Participation." Washington, DC: Mathematica Policy Research, Inc., September 2007. http://www.acf.hhs.gov/programs/opre/welfare_employ/sanction_pol/reports/sanction_pol/sanction_pol.pdf

Universal Engagement and Working With the Hard-to-Employ

Kauff, Jacqueline, Michelle K. Derr, and LaDonna Pavetti. "A Study of Work Participation and Full Engagement Strategies." Washington, DC: Mathematica Policy Research, Inc., September 2004. http://aspe.hhs.gov/hsp/full-engagement04



About This Project and Brief

This practice brief is one of a series describing state and local Strategies for Increasing TANF Work Participation Rates. The Deficit Reduction Act of 2005 (DRA) resulted in significant increases in the effective work participation rates that states must achieve. The series of briefs is designed to assist state and local officials in thinking about strategies that might aid them in meeting federal work participation requirements in their TANF programs.

The briefs in this series draw on information gathered from case studies of nine programs and describe approaches adopted by selected states and/or local offices that might be of interest to other program administrators. None of these programs has been rigorously evaluated, so their effectiveness is unknown. The U.S. Department of Health and Human Services does not specifically endorse any of the approaches described in this series. All briefs in the series can be accessed at http://aspe.hhs.gov/hsp/08/TANFWPR.

This brief was prepared by Mathematica Policy Research, Inc. (MPR) under contract to the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation, and the Administration for Children and Families.

For further information on this practice brief, contact LaDonna Pavetti at 202-484-4697 or at lpavetti@mathematica-mpr.com.
