Management of the PHS Evaluation Program, which is carried out on a regular basis by the PHS Agencies and coordinated by OASH, involves five basic functions:
- Evaluation planning to establish priorities and coordinate development of new evaluations;
- Quality assurance to maintain the standards of evaluation projects;
- Project management and tracking;
- Dissemination of evaluation results to program managers and other appropriate audiences; and
- Application of these results for program improvement.
These functions are described below in general terms. Additional information on individual Agency and OASH evaluation programs is included in chapter III.
1. Evaluation Planning
Evaluation planning is conducted annually in concert with program planning, legislative development, and budgeting cycles. Before the start of each fiscal year, the Assistant Secretary for Health (ASH) issues evaluation guidance signaling the priority PHS program and policy areas for evaluation. Typically, these priorities include new programs, programs undergoing major change, candidates for reauthorization, and programs for which key budget decisions are anticipated. In addition, emphasis has been given to evaluations that support strategic planning goals and objectives; evaluation syntheses that address program areas currently under review by both Congress and the Department; and evaluations that cut across PHS Agencies and have broad implications for program or policy change. In FY 1995, a new element was introduced into the PHS evaluation planning process: a panel of experts from universities and private research centers offered consultation before the development of the PHS guidance for FY 1996.
In response to the ASH guidance, each PHS Agency submits a plan of its evaluation strategies and proposed projects for the immediate and subsequent fiscal years. Sharing information allows the Agencies to learn about evaluation proposals being considered throughout PHS, promotes PHS-wide coordination, and avoids duplicative efforts. The evaluation planning process also facilitates coordination with Agencies outside PHS, such as the Departments of Agriculture, Education, and Justice.
2. Quality Assurance
Most evaluation projects are developed at the program level, and the initial review is conducted by a committee of Agency-level policy and planning staff members. Before a project is approved, it is reviewed for technical quality, generally by a second committee of staff who are skilled in evaluation methodology. Technical review committees follow a set of criteria for quality evaluation practice established by each Agency. Some PHS Agencies also have external evaluation review committees composed of evaluation researchers and policy experts from universities and research centers. More details on the quality assurance procedures for the various Agencies and OASH are presented in chapter III.
3. Project Management and Tracking
A computer database, the PHS Evaluation Management System (PHS-EMS), has been operating since 1992 for managing information on evaluations and policy research studies. Information is continuously entered into the database, based on reports from PHS Agencies, and it is used to produce reports that track the progress of evaluations under way or to locate information about recently completed evaluations. In FY 1996, a component will be added to allow Agency evaluation staff remote computer access to the database. This database system is also coordinated with the HHS Policy Information Center (PIC)--the departmental evaluation database and library maintained by the Office of the Assistant Secretary for Planning and Evaluation. Updated records on PHS evaluations and the final reports on completed evaluations are regularly transferred to the PIC. As an information database and library resource, the PIC contains nearly 6,000 completed, ongoing, and planned evaluation and policy research studies conducted by the Department, including key studies completed outside the Department by the General Accounting Office (GAO) and private foundations.
4. Dissemination
Typically, the results of PHS evaluations are disseminated through targeted distribution of final reports, articles in refereed journals, and presentations at professional meetings and conferences. Although the major responsibility for disseminating results lies with the PHS Agencies, departmentwide efforts are under way to expand dissemination to the public health community. Abstracts of all studies maintained in the PIC database are now accessible to the public health community through the Department's World Wide Web server (http://www.hhs.gov) on the Internet. The preparation of this report is another effort to expand dissemination of results; the report will be distributed widely to State and local public health agencies, schools of public health and other university health programs, and private foundations. Finally, a number of evaluation syntheses are being developed to enhance the accessibility of results from multiple evaluations that address similar issues.
5. Application
PHS evaluations are used most directly by program managers to improve program operations and efficiency. In addition, in accordance with a 1993 GAO recommendation (see Publication No. GAO/PEMD-93-13), greater emphasis will be placed on outcome and impact evaluations that Congress and others can apply to program planning, budgeting, and legislative action. This shift is also consistent with implementation of the Government Performance and Results Act (GPRA) and the National Performance Review (NPR), both of which seek to improve the efficiency and effectiveness of Federal Government programs and the services provided to customers of those programs and directly to citizens.
NPR stresses the importance of consumer satisfaction surveys (two of which are described in chapter II) and performance measures as tools for improving Government programs. Performance measures have been a cornerstone of the total quality management (TQM) movement in business and industry, where production outputs and quality outcomes can often be readily defined. Likewise, NPR has challenged PHS program managers to examine their outputs and outcomes and to conduct studies that would establish valid and reliable indicators of program performance.
While NPR has spurred Federal Agencies to begin developing performance measures, it is GPRA that will link performance measures to the Federal Government's annual planning and budgeting process. The ultimate goal of GPRA is to move Federal Agencies toward performance budgeting. Performance budgets will provide Agencies with information on the direct relationship between program spending and expected results, and on the anticipated effects of varying spending levels on results. For the PHS Evaluation Program, meeting the goal of having performance measures in place by FY 1999 will require a substantial investment of evaluation resources and will consequently change the nature and use of PHS evaluations.