Feasibility Study for the Evaluation of DHHS Programs That Are or May Be Operated Under Tribal Self-Governance. Description of Illustrative Evaluation Models

03/03/2004

NOTE:   These Models are presented only to illustrate possible approaches to evaluation of DHHS programs that may be managed by Tribes under a new Self-Governance demonstration and to provide a framework for the discussion of evaluation feasibility.  If a future evaluation of DHHS programs operated under a new Self-Governance demonstration were developed, there would be extensive consultation with Tribes to develop the specific evaluation approach.

  1. Comprehensive Evaluation Model:  DHHS Programs That May Be Operated by Tribes Under a Self-Governance Demonstration
    1. Objectives
      1. Conduct a comprehensive evaluation of the implementation, process, and outcomes associated with a demonstration of DHHS programs that may be operated by Tribes under Self-Governance. 
    2. Assumptions
      1. Comprehensive Evaluation of Demonstration Programs
        1. 50 demonstration Tribes
        2. Demonstration Tribes may have contracted the programs they are managing under the demonstration before the demonstration began, or they may elect to manage programs they have not previously contracted.
        3. An Annual Report format would be developed in consultation with Tribes participating in the demonstration and would be submitted by all participating Tribes.
        4. An Evaluation Data Set would be developed in consultation with Tribes. This Evaluation Data Set would include data on characteristics of individual clients/beneficiaries served by each program, services provided/received, and observed process and outcome measures at the individual client/beneficiary level.  (An illustrative record layout is sketched after this list of assumptions.)
        5. A subset of 15-25 Tribes would agree to voluntarily submit the Evaluation Data Set annually, as well as the Annual Report, for each program and for each year of the demonstration.
        6. Two rounds of site visits would be conducted to 15-25 Tribes for in-depth evaluation, once during the initial six months of implementation and again approximately 18 months after initiation of the demonstration.
        7. DHHS program offices would provide baseline reports for demonstration programs managed by the Tribes and national benchmark data for all years required.
        8. All participating Tribes would be provided uniform financial reporting formats and Evaluation Data Set reporting formats and would be provided training and technical assistance to ensure comparable and consistent data.
        9. The evaluation would be conducted over a three-year timeframe.
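        Note:  To make the Evaluation Data Set assumption more concrete, the sketch below (in Python) illustrates one possible individual-level record layout. All field names, codes, and the de-identified client identifier are illustrative assumptions only; the actual data elements would be defined in consultation with Tribes.

          # Illustrative sketch only: one possible individual-level record for the
          # Evaluation Data Set. All field names and codes are hypothetical and
          # would be replaced by data elements defined in consultation with Tribes.
          from dataclasses import dataclass, field
          from typing import Dict, List

          @dataclass
          class EvaluationDataSetRecord:
              tribe_id: str            # participating Tribe (coded identifier)
              program_id: str          # DHHS program operated under the demonstration
              reporting_year: int      # demonstration year covered by the record
              client_id: str           # de-identified client/beneficiary identifier
              age_group: str           # e.g., "0-17", "18-64", "65+"
              gender: str
              services_received: List[str] = field(default_factory=list)        # service codes
              process_measures: Dict[str, float] = field(default_factory=dict)  # e.g., {"days_to_first_service": 14}
              outcome_measures: Dict[str, float] = field(default_factory=dict)  # e.g., {"treatment_goal_met": 1}

        Under this sketch, each of the 15-25 participating Tribes would submit one such record per client/beneficiary, per program, for each year of the demonstration.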
    3. Research Questions to be Examined
      Note:  Specific research questions would be developed in consultation with the Tribes. Based on discussions conducted during the current study, the general research questions that are likely to be identified might include:
      1. Implementation Issues (first 6 months)
        1. What are the characteristics of Tribes that apply to participate in the demonstration?  Of those that are selected to participate?  Are these characteristics different from those of Tribes that do not apply?
        2. What factors are reported by demonstration Tribes as influential in their decision to participate?  What was the most important factor in their decision?  What concerns were identified during the decision process?
        3. Was the community involved in the decision to participate in the demonstration?  How was this accomplished?
        4. How was the planning for the demonstration program organized?  Where was responsibility for planning placed organizationally?  Who was involved?
        5. What changes in organization and staffing of each program occurred as a result of the demonstration planning?
        6. What goals/objectives were established for each program during planning?  Were these goals/objectives different from the goals/objectives that had been in place when the programs were contracted?  If so, what are the reasons?
        7. Was there community involvement in setting goals/objectives for each program?
        8. Were changes made in the funding available to each program under the demonstration?  If so, what were the reasons for the changes?
        9. Were changes made that resulted in cost-savings or more efficient use of resources?
        10. What problems were identified during implementation and how were they resolved?
        11. Was the implementation successful?  Why or why not?
      2. Process Questions (six months and throughout the demonstration period)
        1. What changes in programs, staffing, and organization occurred after the initial implementation period?  For each program, what were the reasons for these changes?
        2. Did the demonstration affect overall Tribal management structure and staffing?  Why or why not?
        3. Were there changes in the goals/objectives for each program after the initial implementation period?  If so, what were the reasons for the changes?
        4. How does the Tribe provide oversight and monitoring of each program?  What is the process for addressing problems or issues that are identified through monitoring?
        5. Was there ongoing community involvement in oversight and monitoring of each program?  If so, how was this achieved?
        6. Are goals/objectives for each program met on a continuing basis?  What factors are important in achieving these goals?  If the goals/objectives are not met, what are the reasons?  What changes are made in response to identifying barriers to meeting goals/objectives?
        7. What are the perceptions of Tribal leaders and program managers of the benefits of Self-Governance, generally, and as a result of the demonstration?  Are there perceived disadvantages of Self-Governance, generally, and for this specific demonstration?
        8. What are the perceptions of Tribal members who receive services from the programs regarding the benefits and disadvantages of the changes in the management and operation of each program?
        9. Were any program changes made to achieve cost-savings and increase efficiency?
        10. Were some program funds re-allocated to other priorities within the Tribe, after the initial implementation period?  How was the decision made?
      3. Quantitative Measures of Process and Outcomes Questions (to be addressed after two years of operation)
        1. Was maintenance of effort achieved?  That is, did each program serve as many people and provide at least the same quantity of services as were available prior to Self-Governance?  If not, what were the reasons?  (An illustrative calculation of this measure and of cost per person served is sketched after this list.)
        2. Did the mix of services provided change under the demonstration, for each program?
        3. For each program, was the Tribe able to achieve at least two quantifiable goals that were established at the initiation of the demonstration program?
        4. For each program, were any changes made in staff levels or types of staff employed? 
        5. Were there changes in the allocation of program funds to personnel, space, materials, and administrative costs under the demonstration, compared to the previous contracted program (if the Tribe previously operated the program under contract)?
        6. Did program costs per person receiving services change under the demonstration programs?
        7. Are program users more/less satisfied with services provided under the demonstration program than they were before implementation? 
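        Note:  As a concrete illustration, the sketch below (in Python) shows how two of the measures above, maintenance of effort and cost per person served, might be computed as a pre-post comparison from a Tribe's reported totals. All figures are hypothetical and are not prescribed evaluation criteria.

          # Illustrative sketch only: hypothetical baseline (pre-demonstration) and
          # demonstration-year totals for a single program operated by one Tribe.

          def cost_per_person(total_program_cost: float, persons_served: int) -> float:
              """Program cost per person receiving services."""
              return total_program_cost / persons_served if persons_served else float("nan")

          baseline_served, demo_served = 400, 430
          baseline_cost, demo_cost = 1_200_000.00, 1_230_000.00

          # Maintenance of effort: did the program serve at least as many people?
          maintenance_of_effort = demo_served / baseline_served        # a ratio of 1.0 or more

          # Pre-post change in cost per person served.
          unit_cost_change = (cost_per_person(demo_cost, demo_served)
                              - cost_per_person(baseline_cost, baseline_served))

          print(f"Maintenance-of-effort ratio: {maintenance_of_effort:.2f}")
          print(f"Change in cost per person served: ${unit_cost_change:,.2f}")

        The same calculations could be repeated across participating sites and set against the national benchmark data provided by DHHS program offices to support the comparisons described in the next section.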
    4. Comparison Groups for Quantitative Measures
      1. Demonstration
        1. Pre-Post comparisons
        2. Across-site comparisons
        3. National benchmark comparisons
    5. Data Necessary for the Evaluation
      Note:  Specific data needed would depend on the set of evaluation issues and research questions developed in consultation with Tribes.  Likely data needed would include:
      1. DHHS Programs Under Demonstration
        1. Annual Report data for all Tribes participating in the demonstration.
        2. Evaluation Data Set on persons served, age-gender mix, services provided, outcome measures for each year of the demonstration, for each program, for 15-25 participating Tribes
        3. Detailed financial data for demonstration Tribes, baseline through evaluation period, for each program
        4. Detailed data on staffing, for each program
        5. Consumer satisfaction survey of Tribal members receiving services from each program, baseline and second year of demonstration
        6. Two rounds of site visits to 15-25 Tribes to collect qualitative data on implementation process during first six months and again at 18 months to collect information on operational experiences
      2. Other Data Needed
        1. Socio-economic and demographic data for each Tribe (2000 Census)
  2. Limited Evaluation Model
    1. Objectives
      1. To design and conduct an evaluation that addresses a limited set of evaluation issues identified by the Tribes and DHHS as high priority and valuable for understanding and assessing DHHS programs that may be operated by Tribes under a demonstration.
    2. Assumptions
      1. 50 Tribes participating in the new demonstration.
      2. Demonstration Tribes may have contracted the DHHS programs they are managing under the demonstration before the demonstration began, or they may elect to manage programs they have not previously contracted.
      3. Site visits would be conducted to 15-25 Tribes for in-depth evaluation.
      4. A Minimum Data Set (MDS) would be developed in consultation with Tribes.
      5. All voluntarily participating Tribes would agree to submit this MDS for the baseline (pre-implementation) period and for each year of the demonstration.
      6. Additional data collection would be conducted only for the 15-25 Tribes selected for in-depth evaluation.
      7. DHHS program offices would provide baseline reports for demonstration programs managed by Tribes and national benchmark data for all years required.
      8. All participating Tribes would be provided uniform financial reporting formats and Minimum Data Set reporting formats and would be provided training and technical assistance to ensure comparable and consistent data.
      9. The evaluation would be conducted over a three-year timeframe.
    3. Research Questions
      Note:  Specific research questions would be developed in consultation with the Tribes.  Based on discussions conducted during the current study, the general research questions that are likely to be identified would include:
      1. What are the overall benefits to Tribes of participating in Self-Governance of Federal programs?
      2. Do the Tribes use the flexibility of Self-Governance to make changes to programs?
      3. How are decisions made about goals of programs and changes that are made to achieve those goals?  To what extent is the community involved in those decisions?
      4. Do the Tribes meet the specific goals that are established for each program?
      5. Are there innovative approaches that are developed by the Tribes that contribute to effective and efficient management of programs and resources?
      6. What problems are encountered?  How are those problems resolved?
    4. Comparison Groups for Quantitative Measures
      1. Pre-post comparisons
      2. National benchmark comparisons
    5. Data Necessary for the Evaluation
      Note:  Specific data needed would depend on the set of evaluation issues and research questions developed through DHHS consultation with Tribes.  Likely data needed would include:
      1. Baseline data on persons served, age-gender mix, services provided, outcome measures, for each program.
      2. Minimum Data Set on persons served, age-gender mix, services provided, outcome measures for each year of the demonstration, for each program.
      3. Detailed financial data on programs operated by demonstration Tribes, baseline through evaluation period, for each program.
      4. Detailed data on staffing, for each program.
      5. Two rounds of site visits to 15-25 Tribes to collect qualitative data on implementation process during first six months and again at 18 months to collect information on operational experiences.
      6. Socio-economic and demographic data for each Tribe (2000 Census)
  3. Evaluation Model Using Only Aggregate Reporting Data
    1. Objectives
      1. To conduct a limited evaluation that relies on aggregate periodic reports on programs managed by Tribes under the DHHS demonstration program.
    2. Assumptions
      1. 50 demonstration Tribes.
      2. A set of Annual Report Requirements, including Financial Reporting Requirements, would be developed in consultation with Tribes.
      3. Participating Tribes would agree to submit these reports for each year of the demonstration.
      4. DHHS demonstration program officers would provide additional qualitative information to the evaluation team on implementation and process for demonstration Tribes, based on their ongoing interactions with the demonstration Tribes.
      5. Agencies responsible for programs included in the demonstration would provide national benchmark data for baseline and for all years of the demonstration.
      6. All participating Tribes would be provided uniform reporting formats and training and technical assistance to ensure comparable and consistent data.
      7. No individual-level analyses would be conducted of program clients/beneficiaries.  All evaluation analyses and program descriptions would be conducted at the aggregate level.
    3. Research Questions to be Examined
      Note:  Specific research questions would be developed in consultation with the Tribes.  Based on discussions conducted during the current study, the general research questions that are likely to be identified might include:
      1. Implementation Issues (first 6 months)
        1. What are the characteristics of Tribes that apply to participate in the demonstration?  Of those that are selected to participate?  Are these characteristics different from those of Tribes that do not apply?
        2. What changes in organization and staffing of each program occurred as a result of the demonstration planning?
        3. What goals/objectives were established for each program during planning? Were these goals/objectives different from the goals/objectives that had been in place when the programs were contracted?  If so, what are the reasons?
        4. Were changes made in the funding available to each program under the demonstration?  If so, what were the reasons for the changes?
        5. Were changes made that resulted in cost-savings or more efficient use of resources?
        6. What problems were identified during implementation and how were they resolved?
        7. Was the implementation successful?  Why or why not?
      2. Process Questions (six months and throughout the demonstration period) 
        1. What changes in programs, staffing, and organization occurred after the initial implementation period?  For each program, what were the reasons for these changes?
        2. Were there changes in the goals/objectives for each program after the initial implementation period?  If so, what were the reasons for the changes?
        3. Are goals/objectives for each program met on a continuing basis?
        4. How does the Tribe provide oversight and monitoring of each program?  What is the process for addressing problems or issues that are identified through monitoring?
        5. Were any program changes made to achieve cost-savings and increase efficiency?
        6. Were some program funds re-allocated to other priorities within the Tribe, after the initial implementation period? 
      3. Quantitative Measures of Process and Outcome Issues
        1. Was maintenance of effort achieved?  That is, did each program serve as many people and provide at least the same quantity of services as were available prior to Self-Governance?
        2. Did the mix of services provided change under the demonstration, for each program?
        3. For each program, were any changes made in staff levels or types of staff employed?
        4. Were there changes in the allocation of program funds to personnel, space, materials, contracted services, and administrative costs under the demonstration, compared to the previous contracted program and/or national benchmark data?
        5. Did program costs per person receiving services change under the demonstration?
    4. Comparison Groups for Quantitative Measures
      1. Pre-Post comparisons/Patterns over time
      2. National benchmark comparisons
    5. Data Necessary for the Evaluation
      Note:  Specific data needed would depend on the set of evaluation issues and research questions developed by the Tribal Working Group.  Likely data needed would include:
      1. Baseline data on persons served, age-gender mix, services provided, outcome measures, for each program.
      2. Annual Report data on persons served, services provided, outcome measures for each year of the demonstration, for each program.
      3. Annual Report financial data for programs managed by Tribes under the demonstration, baseline through evaluation period, for each program.
      4. Annual Report narrative information on goals/objectives, program changes, problems encountered, and how problems were resolved.
      5. Other Data Needed
        1. Socio-economic and demographic data for each Tribe (2000 Census).