
Advancing State Child Indicator Initiatives: Summary of November 1998 Meeting

A Summary of the Meeting of
November 16 & 17, 1998

Advancing State Child Indicator Initiatives is sponsored by the Office of the Assistant Secretary for Planning and Evaluation of the U.S. Department of Health and Human Services. Martha Moorehouse of ASPE is the Project Officer. This summary was prepared by the Chapin Hall Center for Children at the University of Chicago. Harold Richman is the project Principal Investigator and Mairéad Reidy is the Project Director.

Chapin Hall Center for Children
1313 East 60th Street
Chicago, Illinois 60637

Opening Remarks

Harold Richman, Director of the Chapin Hall Center for Children

Harold Richman welcomed the delegates to Chapin Hall and expressed the hope that the meeting would result in learning both across states and within delegations. Mr. Richman noted that the meeting agenda was developed by gleaning themes of common interest from the states' proposals. He then sketched the meeting agenda, noting the session topics and explaining how the sessions would be organized. Those sessions were titled:

  • Indicator Conceptualization and Development
  • Using An Indicator Framework to Monitor Welfare Reform
  • The Uses of Indicators in the Policy-Making Process
  • Technological, Analytical, and Data Availability Issues in Indicator Development
  • Cross-State Discussion of Meeting State Technical Assistance Needs and Next Steps

Mr. Richman then introduced Martha Moorehouse.

Martha Moorehouse, Senior Research Policy Analyst, Office of the Assistant Secretary for Planning and Evaluation of the U.S. Department of Health and Human Services

Ms. Moorehouse expressed her pleasure at the presence of the state delegations and noted the high quality of the applications made by the states to be part of the project. In large part, Ms. Moorehouse focused on the importance of partnerships. She mentioned collaborations that are underway within the Office of the Assistant Secretary for Planning and Evaluation (ASPE) and between ASPE and state agencies, other federal agencies and departments, and universities. Ms. Moorehouse noted the complementarity between the work of developing and employing indicators and the tracking of welfare reform outcomes for children and families.

Ms. Moorehouse stressed the importance of the collaborative work among the 13 indicator states, pointing out that the project will not provide lengthy, individualized assistance, but will focus on providing technical assistance through collaboration. "We want to focus you on planning your own Technical Assistance future," she said, "so that you have the resource lines within your own partnerships."

Jody McCoy, Policy Analyst, Office of the Assistant Secretary for Planning and Evaluation of the U.S. Department of Health and Human Services

Ms. McCoy began by welcoming the delegates and summarizing key reasons that the applications submitted by these 13 states were successful. Among those reasons were:

  • States demonstrating that they understand the importance of indicator work and the value of the correct use of indicators in policy decision making
  • The desire of the federal government to bring together states that have different levels of expertise in the process of developing indicators
  • The demonstrated ability of the grantee states to create or utilize partnerships among state agencies with responsibility for children's programs and with university-based research organizations
  • The commitment of senior state leadership to indicators work
  • The states' acknowledgment of the importance of working with other states in this effort

Ms. McCoy expressed the hope that the project will help draw connections between the attending states and others working on indicator efforts at the state, federal, and local levels and also create conditions that allow other states to profit from the experiences of the 13 project states.

Ann Segal, Deputy Assistant Secretary for Children's Policy Initiatives, Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services

Ms. Segal touched on some of the activities of ASPE, mentioning in particular a 13-state effort that is seeking to track people who have moved off welfare or have been diverted from the welfare rolls. Ms. Segal made a number of points about the indicators work. She noted that the trends revealed by the indicators will not illuminate the success or failure of welfare reform, because there are too many policy changes underway for the influences of welfare reform to be isolated, but they will help policy makers determine trends in children's well-being, allowing for program course corrections. She warned that indicators are not performance measures of particular programs. "What we're talking about are collaborations and not holding one place accountable, but making everybody accountable," she said.

Like the previous speakers, Ms. Segal expressed pleasure in seeing the assembled delegations.


Summary of Session 1
Indicator Conceptualization and Development

Moderator: Harold Richman, Chapin Hall


Harold Richman introduced the session by detailing its agenda. The session featured two lead states, Delaware and Georgia, followed by two reacting states, Minnesota and Utah. Most states see themselves as needing indicators that cross programmatic or departmental boundaries. One area in which indicators seem to be lacking--school readiness--served as the topic of a roundtable discussion at the end of the session.


The Indicator Development Process in Delaware

Delaware sketched its top-down process of developing indicators. The first speaker was Gwendoline B. Angalet, a Special Assistant to the Cabinet Secretary, Delaware Department of Services for Children, Youth and Their Families. The second speaker was Assistant Professor Maria Aristigueta of the University of Delaware.

The children's cabinet. The Delaware Family Services Cabinet Council, created by Governor Thomas R. Carper, is composed of the cabinet secretaries of a number of state departments that are involved in child and family issues, such as human services, health, education, housing, corrections, public safety, and labor. The priorities of the children's cabinet include reform of the welfare system; improving the quality of the educational system; improving access to health care, especially among children; helping parents improve their parenting skills; addressing families' needs for adequate housing; addressing the pervasiveness of substance abuse; and linking early care and education services.

Integrated services and indicators. Delaware's goal of integrating human and education services led to the development of a service model that employed partnerships among schools, community agencies, and state government. Indicators were seen as a way to measure the impact of the integrated system on children and families. In 1997, a working group that included state government officials and staff from the University of Delaware began to conceptualize and develop indicators, building on the Kids Count indicators. (The University of Delaware is the state Kids Count grantee.) The working group examined indicators work underway in other states and looked at the sources of data available. They also crafted a mission statement and outcome statements that were child-centered, were family and community focused, and took data availability into account. The working group developed a list of approximately 90 possible indicators, eventually refined to a list of 32, that will help reveal the contexts in which Delaware children live and also allow comparison with national and regional circumstances, as data permit. Additional indicators, which focus on the educational attainment of children, are still in development. The children's cabinet was closely involved in each step of the development process, from the crafting of the mission and outcome statements to the decision on the final list of indicators, providing important executive-level support to ensure the institutionalization of using indicators to measure the quality of life for Delaware's children and their families.

This work led to the publication of Delaware Families Count, released the week of this meeting. Ms. Angalet summed up by saying: "Now these indicators are helping us communicate our goals with respect to the health and well-being of children, to measure the progress of our education system, to measure the abilities of families to be nurturing and self-sufficient, and to really measure the effects of welfare reform."

Future Efforts and Challenges

Professor Aristigueta addressed what Delaware had learned through this process and suggested some challenges that Delaware currently faces. Among the things the state learned were:

  • The value of partnering with Kids Count.
  • The importance of agencies and individuals involved in the process understanding that the data need to be published within a year.
  • The importance of securing public input. Delaware is seeking such input to complement its top-down approach.

The challenges and open questions included:

  • What impact will indicators have on other systems?
  • Will the indicators effort capture the interest of citizens and policy makers in Delaware?
  • Will the indicators information be utilized for internal management, policy making, and resource allocation?

Delaware will move to increase public awareness and utilization of this work in a number of ways. For example, the governor will speak at the press conference at which Families Count is released. A campaign to engage the public in a discourse on the health and well-being of children will be implemented. Also envisioned is a review of agency strategic plans and performance measures to determine how they align with the Governor's priorities and how they may be linked to the indicators. Delaware is interested in exploring whether new legislation should be developed, or existing legislation strengthened, to link the indicators to the State's strategic planning and budgeting process.


James W. Buehler, Chief of the Perinatal Epidemiology Unit of the Georgia Department of Human Resources, outlined the framework Georgia is employing in the conceptualization of indicators. Georgia has developed 26 benchmarks that are, in general, broad measures used to describe the population. Part of the task of Georgia's indicators project is to "expand or supplement these benchmarks so that they become more responsive to or more reflective of the program or policy efforts of the project." Georgia also wishes to capture the effects of a number of changes to the child and family landscape--including changes to welfare and child care programs and a heightened emphasis on managed care in Medicaid service delivery--and to translate an understanding of those effects into program and policy outcomes.

Ideal Indicators

Georgia sketched some attributes of ideal indicators, while noting that the indicators to be developed will fall short of ideal. Ideal indicators would

  • Reflect the outcomes of programs and policies
  • Capture changes or fluctuations that could be attributable to the effects of programs or policies
  • Be applicable to program purposes, such as accountability and management
  • Communicate their meaning effectively and, in that way, be useful in driving actions
  • Be definable, have supporting data available, be timely and consistent so that trends could be monitored, and be applicable at state, district, and local levels
  • Be integrated with other indicators systems, such as Kids Count
  • Be integratable across service programs

Critical Tensions

Georgia sees a number of critical tensions that come into play in trying to meet its ideals. One is that it is relatively easy to come up with process measures that capture program effects, but it is more difficult to create intermediate measures of outcomes with great attributability to program activities. Capturing good information on the variety of populations of interest--some receiving services, some eligible but not receiving services, still others in need but not eligible--is also challenging. Georgia is investigating how much reliance it will need to place on approximate or proxy measures, which proxy measures to employ, and how to make use of survey and other available data.


Georgia sees its challenges as defining a matrix that includes program and policy objectives; that articulates short, intermediate, and long-term effects; and that relates to various levels of the population. Georgia must match that matrix, once defined, to data resources, trying to balance what is desirable to measure with what is possible to measure in a way that maximizes the impact and utility of the indicators.

Reactant States


Mark Larson, Project Manager of Minnesota Milestones, began by inviting each delegation to pick up a copy of Minnesota Milestones and other state publications. Minnesota has been involved in indicators work for almost a decade.

Regarding Georgia. The Minnesota delegate noted the importance of developing a framework that links broad indicators to more policy-specific indicators and also of providing technical assistance to help local communities improve their ability to use indicators.

Cautions. The delegate mentioned a Minnesota effort, headed by the local United Way, that seeks to develop monitoring indicators for welfare reform. The delegate noted that the fear of welfare reform among advocacy groups has skewed this effort, so that there is little anticipation of potential positive outcomes of reform, only an attribution of any potentially negative developments to welfare reform. The Minnesota delegate offered a second caution, saying that, if the size of the TANF population is small relative to the size of the child population and the labor force, states "really have to look closely at what indicators are sensitive to changes in policy." He recommended more predictive indicators, which he termed "canary" indicators, that do not focus on outcomes but do identify emerging problems on the horizon. He suggested, in particular, indicators of early childhood experiences that are predictive of school readiness. He also emphasized the importance of cultivating among policy makers an understanding of indicators and of the research base from which they are drawn.

Delaware. The Minnesota delegate said that he was impressed with Delaware's Family Services Cabinet Council and by the links between the indicators work and the governor's office. He also acknowledged the public/private council in Georgia. He praised the fact that the Delaware governor released Families Count, and that he did so in combination with Kids Count, feeling that such steps would increase public interest.

The speaker from Minnesota asked if Delaware and Georgia were moving beyond indicators to numerical objectives or targets. He noted that, in Minnesota Milestones, the state has this year introduced "comparisons," choosing that term because they could not agree on a definition of the term "benchmark" and because they did not feel their data would support use of that term as it is used in conversations about quality management or private industry.

The Minnesota delegate concluded by noting that one challenge of conceptualizing and developing indicators is balancing the input of the technical people and the public. He said:

I'm glad that Delaware is going back to the public now; obviously, for its work to be meaningful it has to be embraced by and communicated well to the public. The technical people, on the other hand, have to help us ensure that we have the soundest research base possible . . .

Minnesota echoed Georgia in pointing out the need to balance the desire for ideal indicators with what it is possible to measure and warned that the process can become paralyzed on the technical side by the inability to achieve perfect indicators and on the public side by public desire for indicators for which data are not available or for which data are not particularly valid. He offered congratulations to both states.


Utah was represented by Rita Penza, coordinator of the Utah Child Indicators project. Ms. Penza recommended that states look to indicators to provide information about the populations of interest and look to performance measures to assess the functioning of programs. She stressed the articulation of positive goals, such as healthy births, rather than negative goals, such as capturing the low birthweight rate. In response to discussion by Delaware and Georgia about working with agencies and groups at the community level, Ms. Penza noted the importance of establishing a common set of terms--a common language--for public use as well as establishing common ground across the agencies involved in this project.

In the development of indicators, Ms. Penza suggested that state grantees play a leading role, and she noted the helpfulness of site visits to agencies. She said: "We are here because we are involved in coordinating or leading this indicators initiative. So as leaders we have a responsibility to facilitate communication and understanding among our state collaborators. It is important to meet with the people involved on an individual level--e.g., going on a site visit to learn about their agenda and what they deal with in their jobs."

She proposed three criteria for indicator power (criteria she credited to Mark Friedman):

  • Communication Power: Its ability to convey a common meaning among different people;
  • Proxy Power: Its ability to say something of central importance about a goal (similar to validity);
  • Data Power: The indicator corresponds to the availability of high-quality data that can be obtained on a consistent basis.


As the discussion opened, states offered different perspectives on their processes for involving a statewide constituency in indicator development, including how indicators work can help communities know more about their local situation, how that knowledge can support the tailoring of services to match local needs, or how the indicators work can be threatening or problematic to communities. Approaches sketched by states included those offering grassroots-level involvement in the process, public meetings or surveys of key public groups, and an initial top-down approach. A number of states thought mixed top-down and bottom-up approaches had merit. Some delegates pointed to examples in their states of communities using indicators to tailor services to meet local circumstances.

Open Roundtable on School Readiness

Following the general discussion, the delegates took up the issue of indicators of school readiness. Minnesota spoke first, describing its own progress in developing indicators of school readiness and its mandated health and developmental screening of children, done at the school district level at their entry to school. In response to questions, Minnesota noted that it uses no single screening instrument for this assessment, but allows districts to choose from a list of approved instruments.

Other states responded, describing their approaches to evaluating school readiness, noting the lack of a widely regarded instrument for capturing school readiness, and reporting some concerns of communities and educators in response to assessments. Some states noted the complexity of the idea of school readiness and suggested that it might be thought of as an outcome, rather than as an indicator. The issue of multiple influences on children's school readiness, including the effects of welfare reform, resurfaced.

Martha Moorehouse had several suggestions for states. These included looking at other data collections done on health matters, exploring the kinds of state population data collections done by the universities in each state, and examining the work that Head Start is doing in relation to performance measures. She also noted that ASPE is fighting hard to make sure that school readiness is not defined in educational terms alone, but that emotional and social skills are part of the school readiness appraisals.


Summary of Session 2
Using an Indicator Framework to Monitor Welfare Reform

Moderator: Bong Joo Lee, Chapin Hall


In the second session, participant states were asked to discuss the strategies, technical needs, and benefits and problems involved in using indicators to monitor welfare reform. The session, moderated by Bong Joo Lee, began with remarks by Martha Moorehouse and Jody McCoy of ASPE, and by Tom Corbett of the Institute for Research on Poverty at the University of Wisconsin. Maryland and Rhode Island served as reactant states. Discussion followed the presentations.

Martha Moorehouse

Ms. Moorehouse began by alluding to a project in which ASPE and representatives from several states--including Florida and Minnesota--examined the contrast between the effectiveness of AFDC and welfare reform. In the process of examination, a set of performance instruments common to a number of states was utilized to measure the effects that welfare reform may have on services and their clients. At the culmination of the study, researchers explicitly noted that a number of things affect child well-being that are beyond the scope of welfare reform. The study also concluded that it is limiting to single out one indicator and that it is more important to recognize the relationships among indicators.

For a long period of time now, experiments and analyses of very specific kinds concerning welfare reform have been conducted. Ms. Moorehouse suggested that, in order for work on reform to evolve, and perhaps thrive beyond specialized experiments, researchers have an obligation to pay particular attention to the type and purposefulness of their data. Data are not autonomous; they are a tool used for analysis and experimentation. Before analyzing data, researchers should ask whether the data they are using can be used in contexts other than those for which they were originally gathered, and whether the data will be useful even after the experiment for which they were collected has ended.

Jody McCoy

Ms. McCoy framed her presentation on child care indicators around three elements.

Highlighting why it is important to create and monitor child care indicators. With an increasing number of women entering the labor force, there is a growing demand for child care. Such an increased demand is likely to lead to greater use of unregulated care, since it is less expensive and there may be a shortage of regulated care in certain low-income communities. However, unregulated care is likely to be of lower quality than regulated care. This is important because the quality of care that children receive matters to their cognitive, social, emotional and physical development and school readiness.

Identifying key aspects of child care that should be tracked with indicators. ASPE has given some thought to the aspects of care that should be examined. These include:

  • Accessibility: Are child care providers conveniently located? Do mothers have adequate transportation to reach them?
  • Availability: Is the supply of child care in an area adequate, especially for infants, sick children, and families needing care during nonstandard hours?
  • Affordability: How much of their income are families spending on child care? Do they receive financial assistance with their child care costs?
  • Quality: This multidimensional concept is one of the most difficult aspects to measure. Two of the most rudimentary measures of the quality of care children receive could be the stability of the arrangement and whether care is regulated or unregulated. Other measures of quality could include class size, child/staff ratios, the education and training of providers, whether health and safety standards are met, caregiver turnover, and caregiver wages.

Finding potential sources of data for indicators. Currently, the Annie E. Casey Foundation is funding Child Trends to find potential data sources for statewide child care indicators. Their last report stated that not many sources existed and that the ones that did would support only a one-time report. Child Trends is also in the process of developing a survey in which all the questions are uniform.


Tom Corbett, Institute for Research on Poverty, University of Wisconsin

A fundamental change has allowed welfare effects to expand beyond the scope of welfare reform. This has occurred for three reasons, which Professor Corbett termed "the three Rs." First, redirection: the current system asks for behavioral change rather than supplying income support as it did in previous years. Second, reinvention: the success of the program is focused on the individual and is thus performance-based. Lastly, reallocation: states and communities have assumed authority for programs that the federal government formerly held. Combined, these three changes have cost agencies and their workers their definite boundaries. Before, an organization may have focused on one aspect of service; now, the same organization may house a number of services. Likewise, whereas a worker used to work individually in a specialized field, he or she may now work in a team-oriented atmosphere in an agency providing a number of services. As a result of these changes, researchers need to adopt a variety of different approaches and processes in their studies.

Professor Corbett also said that in this working environment, the causal relationship between service and outcome can no longer be clearly determined. Indicators can serve only as monitoring tools or as tools to help inform changes in direction.

State Respondents


David Ayer, Director of Research, Evaluation, and MIS in the Maryland Governor's Office for Children, Youth, and Families, spoke for Maryland, commenting that there is a general fear that when families leave services, their children are put at risk. Mr. Ayer said that there has not been a lot of connection between Maryland's indicators work and its efforts to track results for children leaving welfare. The state's Department of Human Resources will be part of an effort to see if indicators of child well-being can help to monitor welfare reform effectiveness. If this is thought feasible, the University of Maryland School of Social Work will examine how to include those indicators in ongoing projects and may present indicators of outcomes for families leaving welfare in periodic reports titled Life After Welfare. The Social Work School may also involve indicators in possible studies comparing child outcomes under AFDC and TANF. Mr. Ayer concluded by referring to Ms. McCoy's presentation, suggesting that, in addition to collecting information on the availability of child care, information on the kinds of child care parents want could be useful.

Karen Finn, Administrator, Caroline County Human Services Council, described the experience of Caroline County as one of the three Maryland counties trying to prioritize a list of nine outcomes. Caroline County's choice of stable and economically independent families surprised those who expected a focus on other areas, such as school performance and health. Ms. Finn stressed the importance of using indicators, not just as measures, but to help drive service planning. She proposed a total quality management model for services, rather than a rewards-and-punishment model. Jim Witherspoon added that he saw a difficulty in demonstrating whether welfare reform was producing good or bad outcomes for children, because Maryland lacks information on what has happened to the people who left the welfare rolls in the last year and a half.

Rhode Island

Sherry Campanelli, Associate Director of the Division of Individual and Family Support Services of the Department of Human Services, spoke for Rhode Island. Ms. Campanelli introduced herself and noted some of the responsibilities of her department. The Rhode Island delegation had previously distributed copies of its Kids Count book to each of the tables, and Ms. Campanelli pointed out that the book contained the state's welfare reform indicators and 27 or 28 other indicators the state is tracking. She then turned to the state's comprehensive administrative database.

Rhode Island's administrative database, initially created to support federal reporting requirements, now helps inform human service decision making. Begun with the establishment of a cash assistance and food stamp database in 1990, it now contains information on a wide range of cash and noncash assistance and other support programs, and on the characteristics of the service populations. The database has been used to investigate such areas as the relationships among the use of different services. She expects to be able to use the database to examine what happens to recipients after they exit cash assistance and whether those exiting assistance for employment are using available low-income child care programs or state child health care coverage for low-income families.

Ms. Campanelli said that Rhode Island does not use this database to the fullest extent and said that one reason the state is involved in the indicators project is to provide access to "multi-disciplinary people who have expertise in this area to help us mine this database."


Ann Segal made four points.

  • The decline in participation in services, such as a decline in Medicaid rolls, reveals nothing about whether families are doing well or doing poorly under welfare reform.
  • Many states have only anecdotal information about what happens to families who exit the rolls and families that we want to track are being lost.
  • It is important to sell indicators well. The continuous improvement idea is a good one, and it helps TANF agencies buy into the indicators idea because it helps them avoid being held solely accountable for changes in those indicators.
  • It is very difficult to disentangle cause and effect in people's minds.

Discussion also covered ways to gather information on what is happening to families, including direct data collections. The importance of links among agencies and data was stressed. Near the end of the discussion, Martha Moorehouse emphasized the difficulty of this topic and the importance of keeping children as a key part of the story.

Summary of Session 3
The Uses of Indicators in the Policy-Making Process

Moderator: Fred Wulczyn, Chapin Hall


Fred Wulczyn introduced Con Hogan, Secretary of the Vermont Agency of Human Services, and spoke about the leading role played by Vermont in the use of indicators in the policy process.

Con Hogan, Secretary, Vermont Agency of Human Services and President of the Board of the American Public Humane Association

Mr. Hogan began his remarks by acknowledging the contributions to indicators work in Vermont made by Mr. Richard Mills, formerly of Vermont state government and now Commissioner of Education in New York, and by Kids Count. Mr. Hogan also noted that Vermont, which is small and rural, may be more similar to rural counties in some of the attending states than to the states themselves.

"Policy making is politics," Mr. Hogan declared, and building on this theme, he described the importance of thinking of the nation as a collection of neighborhoods, and of crafting an explanation of the meaning of indicators in simple language that is understandable by the public and that captures the developmental process humans go through from birth to death. "Essentially," Mr. Hogan said, "it ranges from babies born healthy to elders living in places they prefer, with dignity."

Connections among indicators. Because all aspects of the developmental process are related, influencing one portion of the process influences other portions. Mr. Hogan calls this web of influence the "interactivity" among indicators. "The business community," Mr. Hogan said, "gets this. They understand balance sheets and they know what the equity of their business is. And if you start thinking about this, the equity of this work, being the well-being of children and families in this country, then it begins to make sense from another direction."

Local involvement. After outlining some of the advances Vermont has made in improving the lives of children and other citizens, Mr. Hogan moved to the role of indicators in securing involvement by locally minded individuals and organizations. Mr. Hogan uses profiles of local situations to demonstrate to legislators on the state's appropriations committee the situations in their particular districts. He also noted that local efforts can "enrich statewide indicators" by creating indicators of their own modeled on the state's framework. Mr. Hogan cited a number of examples where an examination of indicators in particular sectors of Vermont led to legislative action to address some of the conditions demonstrated by those indicators.

Presentation. Mr. Hogan also touched on presentation issues. He noted that his department presents indicators graphically and that the printed indicators book is intentionally simple, one indicator per page. Mr. Hogan believes that this enhances the news coverage by making the book more accessible to the media.

Essays by state employees. Mr. Hogan concluded by remarking that he was pleased that the federal government was playing a key role in bringing the indicator states together. He also detailed a new policy within his agency. Since July, in addition to the regular civil service evaluation to which employees are subject, every employee of the Human Services Agency, and every Agency contractor, is required to write an essay on what they and their organization have done to improve the well-being of Vermonters as it relates to the state's human service outcomes.

Florida

Speaking for Florida were F. Patricia Hall, Office of Economic Self-Sufficiency, Florida Department of Children and Families, and Christine Johnson, Florida Center for Public Management, Florida State University. Ms. Hall noted that in Florida, welfare reform is a public-private partnership including a statewide board and 24 local coalitions. This complex structure is a challenge for Florida as it implements welfare reform and assesses its policy direction.

Over the past five years, Florida has been developing a measurement framework which includes (1) Florida Benchmarks, developed by the Commission on Government Accountability to the People, (2) Performance-based Program Budgeting (PBPB), legislated in 1996, and (3) operational measures useful primarily for state agency internal management. The Florida Benchmarks, a set of 260 quality-of-life indicators, are most relevant to citizens and potentially could be used to measure progress toward goals in the State Comprehensive Plan. PBPB requires agencies to develop outcome and output measures for broadly defined programs under their management. In conjunction with PBPB, some agencies are also improving internal measurement systems, composed primarily of activity/workload measures. Although much work has been done on this measurement framework, it is still in development and not yet integrated into the policy and decision-making process.

Ms. Johnson said that Florida's performance-based program budgeting promises to be an important means of integrating measures with decision-making, noting that "if it's tied to money, people are going to pay attention to it." Performance-based budgeting has received broad support in Florida's executive and legislative offices and in some, but not all, agencies. Support has also been strong from the business community, which has stressed the importance of telling citizens what they get for their tax dollars.

Ms. Johnson described two recent Florida innovations. One was a cross-agency study that identified funds spent throughout state government on a single concern--juvenile crime. She reported that the state found $1.2 billion expended by 23 state agencies to address juvenile crime--an example of what can be learned once program and agency boundaries are crossed. The second innovation was the FGAR Website, which provides information to policymakers and citizens on 400 state agency services and programs and identifies Florida Benchmarks and performance measures relevant to those programs.

Reactant States

Hawaii

Nancy Kuntz, Chief of the Family Health Services Division of the state Department of Health, spoke first for Hawaii. Ms. Kuntz announced her intention to discuss two themes. One was that, in order to affect policy, data must be meaningful. The second theme was sustainability.

Meaning. Ms. Kuntz noted that the meaningfulness of data is related to issues of timeliness, local ownership, and the special problems that come with data on local areas with small populations. She said that a way to motivate people to action is to make the data relevant at the community and neighborhood levels. Ms. Kuntz noted some of the particular circumstances that influence data collection in Hawaii--the geographic fact of being located on seven islands, the need to track seventeen different ethnic groups, and the difficulty of interpreting changes in indicators in population groups of small size.

Sustainability. Ms. Kuntz made three points about sustainability. The first was that Hawaii is fostering sustainability in part through the involvement of Hawaiian youth. The second was that when the economy is strained, as Hawaii's is, interagency collaboration and other innovations become more difficult because of the fear that innovation will be tied to cutbacks. Third, she said that legislation was passed this year to try to get programs and outcomes linked to the state budget.

Sheila Forman of the governor's office spoke briefly. She began by noting how indicators work supports efforts to make government cost less, an outcome the public accepts and understands. She also described the collaboration of business, labor unions, and public and private agencies to establish statewide performance accountability standards and the role of the federal government in rewarding states that move systematically toward greater accountability.

New York

Deborah Benson, Director of Policy at the New York State Council on Children and Families, represented New York. She began her presentation by explaining the role of the Council, a coordinating body composed of the 13 commissioners of state health, education, and human services agencies, in developing the goals and outcomes for the state and in devising indicators in order to measure progress toward those goals. These goals were presented in New York State Touchstones, copies of which were provided to the delegations. The Touchstones are intended to be used by the state agencies to guide their operations, as a basis for interagency and intra-agency collaborations, and as a tool to support collaborations and initiatives at the local level.

Ms. Benson stressed that the Touchstones effort is closely related to Kids Count, saying that when she was asked "what's the difference between Touchstones and Kids Count?" she replied, "There isn't any difference; this is one big effort. Touchstones is a framework and Kids Count is about outcomes for children and families." Ms. Benson also touched on data collection issues. She noted that work is underway to consolidate many of the in-school data collections into a single survey usable by many agencies.


The discussion in the third session focused mainly on the political and media aspects of indicator work.

Concerns and Strategies for the Political Arena

The point was made that, with changes of personnel in the political arena, comes the need to educate officials about the situations of children in their districts and about the need to invest funds to reach outcome goals. Many strategic suggestions were made concerning the role of data in the political community. For instance, it was recommended that agencies be careful not to confuse program and budgeting indicators with outcome indicators. Outcome indicators need to be broad enough that all agencies can contribute to and embrace them, and broad enough to be widely understood by the public.

The manner in which data are presented was noted as crucial to the way in which they are accepted within a community. As a result, the implications data may have for a community should be considered, and one may want to give those the data may affect an early look at a publication. One way to avoid provoking sensitivities is to present data community by community. Hand in hand with that notion was the idea that agencies should take a role as a resource to communities.

A representative from Minnesota commented that parents are players in the policy arena too, and that should not be forgotten.

Concerns and Strategies for the Media

  • Attracting the media's interest. Put out new reports frequently; the media get bored with old news.
  • Guaranteeing accurate citation. Break down conclusions in layman's terms and present them in small chunks. Don't give reporters the chance to get confused.
  • Assuring citation in the press. Show reports to the press a week before their general release. This gives them time to ask questions and lets them advertise for you. Invite reporters to interview you.
  • Creating data the press is interested in. Create data and analyses relevant at the local level so that media members can report on them locally.


Summary of Session 4
Technological, Analytical, and Data Availability Issues in Indicator Development

Moderator: Robert Goerge, Chapin Hall


Robert Goerge opened the session by stating three objectives of the session: to discuss how indicators are created, how they will be created in the future, and methods of dissemination. He then introduced the first state, Vermont.

Vermont

David Murphey and Robert McNamara spoke for Vermont, noting that state human services agencies are confronted with the issue of transforming data into information and using that information to motivate action. In an effort to produce positive action based on information, Vermont has taken a number of steps. For example, it has assembled a group of summaries of evaluation research on human service programs and has made these summaries available to communities as tools.

One tool Vermont has created based on its own data is the Vermont School Report, a Web-based examination of each Vermont school based on 33 indicators. The Web-based presentation offers advantages and disadvantages.

Advantages:

  • It is easy to update
  • The Web allows for easy access
  • Mistakes can be easily corrected

Disadvantages:

  • Data are presented only as tables
  • There is a need to protect data from being used incorrectly

Other Vermont tools for community use are its community profiles. These focus on school supervisory units (a high school and those schools which feed into it). The 40 to 50 indicators for these profiles are grouped by outcome and can be presented over time to reveal trends for the past decade. This approach is advantageous because it illustrates variation at the state, county, and community levels over time, thereby helping to smooth out the extreme fluctuations to which small numbers are subject. Vermont avoids ranking communities, but communities may compare themselves against state or county outcomes.

The speakers noted that the current presentation of the profiles has some drawbacks. One is that data are reported in a single format, and there are no tools on the site to allow the user to perform independent analyses (although the data can be downloaded into a spreadsheet program). Vermont is looking to take the next step, providing a more sophisticated Web-based tool, and is at work implementing a Web tool based on "data-smart questions." This program is designed to anticipate the follow-up questions a sophisticated user of the data may ask and to respond with a graphical presentation.

Vermont is also in the process of developing statewide indicators of school readiness and effectiveness.

West Virginia

Mr. Steven Heasley, Financing and Program Development consultant for West Virginia's Governor's Cabinet on Children and Families, spoke for West Virginia. Mr. Heasley spoke first to the issue of what indicators could accomplish and where the state needed further information to supplement findings. West Virginia perceived its current indicators as good enough to assess the condition or well-being of kids, to monitor trends, and to build a political constituency around the agenda for children and families. The state believes its existing administrative data, however, are inadequate for determining which practices and programs work and for assessing the impact of public policy initiatives.

Taking a closer look at the weaknesses of the data, Mr. Heasley cited four challenges. The first was the lack of available data in the areas of school readiness, early childhood development, and mental health and substance abuse. Second, the state is concerned about discrepancies in the intervals at which data are gathered. With some indicators based on data collected at yearly intervals and others based on Census data, the state finds it difficult to assess the current status of children. A third concern is the consistency of West Virginia's data for monitoring long-term trends: as the state develops innovative and improved indicators that may replace current ones, it worries about how to conduct meaningful trend analysis over time. Finally, while West Virginia would like to move in the direction of asset-based indicators, it is concerned about the potential costs of this new direction in both data gathering and analysis.

Mr. Heasley then turned to the technology issues facing the state. West Virginia would like to take advantage of the fact that all its public schools are connected to the Internet, using this point of entry to move toward on-line data collection. The state also cites a need to give critical thought to its current state agency administrative data systems, which collect a great deal of data, very little of it useful for measuring outcomes. Given that the state invests millions of dollars in these systems, West Virginia would like to see a purposeful effort to use them more strategically to monitor indicators and measure change.

Alaska

Alaska was represented by Yvonne Chase, Director of the state's Division of Community and Rural Development, and by Brad Whistler, Chief of the Medicaid Service Unit. Ms. Chase spoke first. She began by highlighting many of the challenges unique to Alaska, among them developing and implementing measures appropriate to the state's diverse population. Ms. Chase cited as an example the language barrier encountered trying to interview tribal elders in small villages. Cultural differences also mean that indicators other states find useful may have relatively little meaning for Alaska. For example, "are children read to every day?" has less meaning for an oral-tradition culture, and Alaska's large seasonal labor market and subsistence economy complicate the monitoring of parental employment.

Mr. Whistler focused on a number of challenges within the context of health indicators. First, he noted that when Alaska is included in national studies, the size of the sample population is often too small to be of use to state policy makers. This small sample size can also affect the utility of other indicators. For instance, an increase in the number of teen pregnancies in a small community can appear to be a radical change. Second, Alaska is challenged by a lack of good baseline data, making it difficult to measure the effects of foster care reform, welfare reform, the Medicaid CHIP program, and other programs. Data are further confounded by the fact that these programs have been implemented simultaneously, making it difficult to gauge cause and effect.

Alaska would like to move in the direction of community-based indicators for two reasons. First, they would like their work to become meaningful on the community level to help address issues of confidentiality in data collection and to help effect meaningful community change. Second, they note that national data collections with Alaskan components often gather data from Anchorage and Fairbanks, two cities that differ vastly from the remainder of the state.

Maine

Michel Lahti of the University of Southern Maine's Institute for Public Sector Innovation began by echoing the remarks made by Alaska on the importance of developing indicators appropriate to diverse populations. He then turned to Maine's concerns with capacity. The technical capabilities of bureaus and offices within the state vary: some are just getting computerized, and there is variation among the systems currently in use. As a result, it has been difficult to share data across systems and to analyze the great amount of data that are available.

Further, because Maine is still in the early phase of developing databases and forms of analysis, another consideration is creating uniform definitions of child well-being and program performance measures. One obstacle is that, because non-profit agencies have generally controlled the design of measures and the reporting of information, approaches have been inconsistent. Other factors that limit data availability include the state's weak county government system, the fact that state human services data systems are frequently focused on the budgeting process rather than on providing information applicable for other purposes, and the fact that school district data span counties and communities. Like many states, Maine has confidentiality concerns related to small population groups.

Mr. Lahti also sketched some other developments in Maine, including work with the federal government in designing maternal and child health indicators, the presence in state government of a children's cabinet, the development of risk and protective factor data at the county level, and a federal grant to develop a program for schools to report on their progress toward safe schools and drug-free schools requirements.


The session closed with a wide-ranging discussion on the effective uses of data. State representatives cited personal experiences and suggested approaches to data presentation. The following is a summary of the discussion points.

Data Quality

Data quality is key because it influences the ability to transform data into information.

Some states have found it helpful to hire a data quality technician whose job is limited to looking at databases and pointing out important issues related to the data.

It was suggested that a data expert be hired and shared among a number of small communities that do not have the capacity, or the need, to have one to themselves.

Some states have implemented a training program in which people are trained in one technical assistance session and then go out and train local staff, producing a domino effect. This helps assure data quality at all levels.

To be most effective, data must fit into a framework of standards that already exist, for example, state testing standards in math or reading.

It was suggested that all the research pointing to why a specific indicator is legitimate be summarized in a book.

States have different consent requirements to interview children. Both passive and active consent were mentioned.

Getting People Involved

Representatives considered how to get people outside the social service and research sectors involved with data collection and analysis.

A conference might be held in which people in school systems interested in data use are able to meet, network, and potentially push for improvements in the quality and quantity of data.

Knowing the best media outlets in which to present findings and contacting those outlets is important, especially in small communities, and can enhance dissemination.

The media in some states reports on data daily and weekly through the voice of children or administrators, exposing the general public to data and data analyses involving children in understandable language.

Examples of ways to get youth involved included a state senate program in which children assessed their needs as a prelude to creating policy based on those needs. Also, an interactive Website directed at youth that asks questions and displays results to those questions is being constructed in one state. Such approaches result in asset mapping by the children.

Website presentation of indicators can allow quick analysis of a community in relation to other communities similar in size, practice, poverty level, or any number of areas.

Teachers have begun to take data from surveys and use them in lessons on interpreting data. This gives students the chance to speak about their own lives in relation to the data and may interest them in child well-being issues.

Because the issues being dealt with extend beyond social services, it was suggested that states look beyond the social services sector when seeking partners for their projects.

Data Linking

Because linking is a key element in statewide indicator work, strategies for linkage were also discussed.

The extent to which tracking and linking are used was questioned in light of who would be able to see the records, and to what extent. For example, should a teacher know that a student appeared before the juvenile court and, if so, should a teacher know with what crime the student was charged?

To be sure that data can be put in a useful context, researchers have asked both schools and social services providers what concerns they have regarding child well-being issues and formed studies around their answers.

Common Identifiers

The importance and the ways of achieving a set of common identifiers were discussed.

The ability to identify relationships among people within a database is helpful because relationships can serve as a stable identifier.

A strategy of assigning a cross-coded identifier to a client as soon as he/she enters services is being implemented in some states.

Giving children identification numbers at birth, much like Social Security numbers, was suggested.


There were a number of references to potentially informative projects and publications.

The National Center for Education Statistics has established data standards for transmittal of data as well as data standards based upon data dictionaries and coding systems.

Evaluation Software Publishing in Austin, Texas, has designed a software program called "Periodicity" which sends all children's information on-line as soon as they leave school.

Vermont is in the process of setting up a database using the two types of software listed above. The preliminary phases of the work can be viewed on the Web as of December 1, 1998.

The National Archive has examples of how to manage and preserve data over time and is accepting grant applications for such projects through May 30, 1999.


Summary of Session 5:
Cross-State Discussion of Meeting State Technical Assistance Needs and Next Steps

Facilitators: Harold Richman and Mairéad Reidy, Chapin Hall

For the concluding session of the meeting, participants gathered by state to discuss and summarize both their technical assistance needs and the areas in which they have expertise to offer to other states. Participants caucused for about 30 minutes, then reconvened and detailed their needs and expertise.

Technical Assistance Needs

States expressed needs for technical assistance in four areas: indicator conceptualization and development; the uses of indicators; data issues, including data analyses, data linking, confidentiality, outreach to the public, and the location of other datasets; and information sharing among grantees.

Indicator Conceptualization and Development

About one-third of states seek technical assistance related to indicator development. Areas identified for attention included developing frameworks detailing what should be measured and why; developing indicators applicable to particular interest areas, such as early childhood development, school readiness, school success, child care quality, and the concerns of Native Americans and other minority populations; and disseminating information among grantee states regarding previously developed and tested indicators. This latter need was one of a number of TA topics that involved information sharing among grantee states and Chapin Hall, a subject addressed in more detail below. States also looked for assistance in involving multiple state agencies in the indicator work and data sharing and in developing strategies to obtain support for indicators work at high levels in state government.

Uses of Indicators

Needs related to indicator use were expressed by a number of states. Five states seek assistance in informing or influencing state policy or budgeting processes through the use of indicators and performance measures. One state suggested that it would be useful for the grantees to share information on how indicators can be used by communities and on the development of state-level policy tools that would encourage "best practices" in the use of indicators. Other states called for information on effective training for local organizations and state agencies in indicator use.

Data Issues

Data analyses. The need for assistance in analyzing existing datasets was raised. Topics for analyses mentioned by states included examining available datasets to identify potential applications and assessing the impact of welfare reform on children.

Data linking. Data linking was approached from a number of angles. At the conceptual level, a state requested support in developing an appreciation for the value of linking and analyzing datasets among appropriate constituencies. Other linkage issues about which states desired TA included building bridges between state initiatives, state agencies, and communities to help ensure public engagement; establishment of common definitions across agencies and communities; and support in understanding the political complications that can stem from common systems.

Confidentiality. States discussed the possibility of linking data across datasets by developing common identifiers for cases or individuals. The need for TA in this area was mentioned by more than half of the states. This discussion included the confidentiality problems associated with analysis of case-level data, a special concern in states with small populations.

Outreach to the general public and to particular audiences. States seek to involve the public in two distinct ways in order to promote understanding of indicators and their meanings. One of these ways was to work with particular segments of the interested public in order to shape the indicators and make them applicable to community concerns. The second was to use the general interest media to inform the public about the meaning of particular indicators and the policy goals they suggest. States looked for technical assistance support in both areas.

One tool for involving the public would be the presentation of information on a Website. A number of states asked for technical assistance regarding Website development, with concerns spanning software, manner of data presentation, which data are appropriate to present, and conceptual development. States indicated that web presentations might allow the general public and particular community constituencies to interpret the indicators and act on them.

The location of other datasets. The acquisition of new datasets was also raised as a technical assistance need. States look for support in the collection of data in particular areas, such as data from and about children in schools. They also seek assistance in exploring and accessing state and federal tax data.

One state called for centralized advocacy around issues of government data quality that might take the form of states voicing support for new data collections related to the development of indicators. Alaska noted a particular concern, that national sample surveys seldom include enough Alaska cases to allow state-level analysis.

Information Sharing Among the Grantees

Four states called for the creation of an email link or list-server to circulate information among the states. Others called for the creation of an information clearinghouse, possibly at Chapin Hall. Among the information that might be circulated by this clearinghouse would be the indicators developed by each state, knowledge of effective policy tools, information and agency decisions on confidentiality issues, and discussion of federal reporting requirements. One state asked for help identifying funding sources that would support Website development and other design and implementation activities.

Expertise States Can Make Available to Other States

Each state delegation identified some areas in which they had expertise that might be of potential use to other states. Many of these areas of expertise matched areas of need identified by other states. Areas of expertise included: indicator conceptualization and development; the uses of indicators; data issues, including data analyses, data linking, confidentiality, outreach to the public; the location of other datasets; and information sharing among grantees.

Indicator Conceptualization and Development

A number of states indicated that they had particular expertise in areas of indicator conceptualization and development. In particular, expertise was identified in the development of indicators overall and in such specific program areas as early childhood, maternal and child health, family literacy, youth suicide, substance abuse, and economic issues. States suggested that they could help each other with a range of collaboration activities in indicator development, including achieving interagency collaboration in creating common indicators and outcome standards and obtaining the involvement of communities, community-based agencies, state-level actors, key policy makers, and others in the work of developing and using indicators.

The Uses of Indicators

States mentioned expertise in achieving the involvement of a variety of interested parties in indicator work. Expertise in sharing data within the policy environment, establishing public/private partnerships around the use of indicators in assessing needs, and other applications of indicators were noted.

Data Issues

Data issues about which state delegations volunteered expertise were varied.

Data gathering and data management. Several states noted their experience in data collections of different types and in managing and warehousing data. Conducting a statewide kindergarten assessment, surveying child care, and collecting data in schools were among the kinds of experience cited. States also mentioned different degrees of expertise in employing research findings from other sources and in devising "best practices" in data use. Hawaii noted its experience in creating functional outcome measures and employing those measures to look at children and also at systems.

Data linking. A central concern was the alignment or linking of data from different state government activities in order to help present a comprehensive picture. A number of states volunteered expertise in this area, including joining education and human services data and linking Medicaid data to other datasets. Two states indicated experience in linking their administrative data with those found in their state Kids Count volumes.

Data quality. New York noted that, through its Center for Technology in Government at the University at Albany, it can provide help in determining the value of information or data.

Confidentiality. Expertise on confidentiality covered both substantive confidentiality problems, including legal issues associated with linking and sharing datasets, and governmental management issues, such as a state agency's use of confidentiality concerns as a way of avoiding other issues. Utah offered to make available an opinion by its attorney general on sharing data among state agencies.

Outreach to the general public and particular audiences. A number of states noted their experience in engaging the public or engaging communities. Hawaii noted its particular expertise in working with culturally diverse populations.

Information Sharing Among Grantees

Vermont volunteered to share information from an upcoming conference of New England states on indicators. Rhode Island indicated its willingness to help other states work on graphic presentation of data. Alaska has compiled a list of benchmarks that states are already using and will make it available to those interested.

Other Expertise

A number of other areas of expertise were mentioned, including the involvement of youth at all stages of the process, ways of engaging local business communities, developing performance-based program budgets, and innovative use of funding to support indicators work focused on assessing community needs. Utah offered to make available copies of its legislation (known as the "Families, Agencies, and Communities Together for Children and Youth at Risk Act"), which promotes community-based planning. The legislation can be accessed via the Internet; of particular interest are sections 63-75-5.7, 63-75-6, and 63-75-6.5. Florida mentioned that its FGAR Website illustrates links across service programs.

Meeting Participants

Alaska

Yvonne M. Chase (primary contact)
Division of Community and Rural Development
Department of Community and Regional Affairs
333 West 4th Avenue, Suite 220
Anchorage, AK 99501-2341
907-269-4635 fax

Margaret Thomas (primary contact)
Project Coordinator
Governor's Office
3261 Nowell Avenue
Juneau, AK 99801
907-465-3533 fax

Norm Dinges
Project Director
Kids Count Alaska
University of Alaska--Anchorage Institute of Social and Economic Research
3211 Providence Drive
Anchorage, AK 99508
907-786-7739 fax

Carolyn Spalding
Research Analyst
Alaska Division of Public Assistance
P.O. Box 110640
Juneau, AK 99811-0640

Brad Whistler
Chief, Medicaid Service Unit
P.O. Box 110610
Juneau, AK 99811-0610
907-465-2898 fax


Gwendoline B. Angalet (primary contact)
Special Assistant
State of Delaware
Department of Services for Children, Youth & Their Families
1825 Faulkland Road
Wilmington, DE 19805
302-633-2735 or 302-633-2735 fax

Maria Aristigueta
Assistant Professor
University of Delaware
182 Graham Hall
Newark, DE 19716
302-831-3587 fax

Leslie Cooksy
Policy Scientist
University of Delaware
298 N. Graham Hall
Newark, DE 19716-7350
302-831-4225 fax

Debra Lightsey
Chief Policy Advisor
Delaware Health & Social Services
1901 N. Dupont Hwy.
New Castle, DE 19720
302-577-4510 fax

Nancy Wilson
Department of Education
State of Delaware
P.O. Box 1402, Townsend Building
Dover, DE 19903
302-739-3744 fax

Florida


F. Patricia Hall (primary contact)
Program Administrator
Florida Dept. of Children & Families
1317 Winewood Blvd.
Building 3, Room 408E
Tallahassee, FL 32399-9700
850-922-5581 fax

Christine Johnson (primary contact)
Senior Management Consultant
Florida Center for Public Management
Florida State University, University Center
Bldg. C, Suite 4400
Tallahassee, FL 32306-2670
850-644-4339 fax

William Hudgens
Database Administrator
Florida Dept. of Children & Families
1317 Winewood Blvd.
Building 3, Room 413-J
Tallahassee, FL 32399-0700
850-921-1806 fax

Georgia


James W. Buehler (primary contact)
Perinatal Epidemiology Unit
Dept. of Human Resources
2 Peachtree St., NW
4th Floor Annex, Room 522
Atlanta, GA 30303
404-657-2586 fax

Laurie B. Dopkins
Georgia Policy Council for Children & Families
100 Peachtree Street
Suite 500
Atlanta, GA 30303
404-527-7443 fax

Gina Kirkpatrick
Director of Children's Programs
Georgia Dept. of Medical Assistance
2 Peachtree Street
37th floor
Atlanta, GA 30303
404-651-9496 fax

Lyn Myers
Georgia Dept. of Human Resources
Room 19-402
Two Peachtree Street
Atlanta, GA 30303
404-657-3325 fax

Carroll S. Nason
Project Coordinator
900 Hancel Road
Equality, AL 36026
334-857-2825 fax

Hawaii


Nancy L. Kuntz (primary contact)
Chief, Family Health Services Division
Hawaii State Dept. of Health
P.O. Box 3378
Honolulu, HI 96801
808-586-9303 fax

Elisabeth Chun
Executive Director
Good Beginnings Alliance
828 Fort Street Mall, Suite 203
Honolulu, HI 96813
808-531-5702 fax

Sheila Forman
Special Assistant to the Governor
Office of the Governor, Children and Families
State Capitol, Room 410
415 South Beretania Street
Honolulu, HI 96813
808-586-0122 fax

Marcia Hartsock
Hawai`i Kids Count
University of Hawaii, Center on the Family
2515 Campus Rd. - Miller 103
Honolulu, HI 96822
808-956-4136
808-956-4147 fax

Sylvia Yuen
Center on the Family
University of Hawaii
Miller 104-B
Honolulu, HI 96822
808-956-4147 fax

Maine


Michel Lahti (primary contact)
Institute for Public Sector Innovation
295 Water Street
Augusta, ME 04330
207-626-5210 fax

Timothy Clifford
Director, Dept. of Human Services
Bureau of Medical Services
249 Western Avenue
Augusta, ME 04333-0011
207-287-2675 fax

Susan L. Dustin
Director of Policy & Programs
Maine Dept. of Human Services Bureau of Family Independence
11 State House Station
Augusta, ME 04333-0011
207-287-5096 fax

Meredith Fossel
Research & Planning Specialist
Dirigo Prevention Coalition
124 State Street
Augusta, ME 04330
207-622-9050 fax

Maryland


David Ayer (primary contact)
Director of Research, Evaluation and M.I.S.
Governor's Office for Children, Youth and Families
301 West Preston Street
15th Floor
Baltimore, MD 21201
410-333-5248 fax

Karen M. Finn
Caroline County Human Services Council
15 S. Third Street
Denton, MD 21629
410-479-4617 fax

Colleen Mahony
Policy Advisor
Office of Lt. Governor Kathleen Kennedy Townsend
The State House
100 State Circle
Annapolis, MD 21403
410-974-2077 fax

Roann Tsakalas
Special Assistant
Governor's Office for Children, Youth and Families
301 West Preston Street
15th Floor
Baltimore, MD 21201
410-333-5248 fax

James B. Witherspoon
Director of Planning
Department of Human Resources
Office of the Secretary for Planning
311 W. Saratoga Street
Baltimore, MD 21201-3521
410-333-0637 fax

Minnesota


Janel K. Harris (primary contact)
Research Scientist
Minnesota Dept. of Health, Division of Family Health/MCH
717 SE Delaware St.
P.O. Box 9441
Minneapolis, MN 55440-9441
612-676-5442 fax

Mark C. Larson
Minnesota Milestones Project Manager
Minnesota Planning
300 Centennial Office Bldg.
St. Paul, MN 55155
651-296-3698 fax

Michael Linder
Supervisor, Department of Human Services
444 Lafayette Road North
St. Paul, MN 55155-3839
651-297-1949 fax

Debby Kay Peterson
Minnesota Dept. of Children, Families & Learning
550 Cedar Street
St. Paul, MN 55101
651-582-8577 fax

New York

Deborah Benson (primary contact)
Director of Policy
NYS Council on Children & Families
5 Empire State Plaza
Suite 2810
Albany, NY 12223-1553
518-473-2570 fax

Michael Medvesky
Director, Public Health Information Group
NYS Dept. of Health
Room 750 Tower Bldg.
Empire State Plaza
Albany, NY 12237
518-473-0476 fax

Lorraine Noval
Special Projects Coordinator, Commissioner's Office
NYS Office of Temporary & Disability Assistance
40 North Pearl Street, #16A
Albany, NY 12243
518-486-6255 fax

Theresa A. Pardo
Project Director, Center for Technology in Government
University at Albany
1535 Western Avenue
Albany, NY 12203
518-442-3886 fax

Alana M. Sweeny
Executive Director
NYS Council on Children & Families
5 Empire State Plaza
Suite 2810
Albany, NY 12223-1553
518-473-7568 fax

Rhode Island
Ann-Marie Harrington (primary contact)
Research Analyst
Rhode Island Kids Count
70 Elm Street
Providence, RI 02903
401-351-1758 fax

Barbara Burgess
Early Childhood Consultant
Rhode Island Dept. of Education
255 Westminster Street
Providence, RI 02903
401-222-4600 ext. 2363
401-222-4979 fax

Elizabeth Burke Bryant
Executive Director
Rhode Island Kids Count
70 Elm Street
Providence, RI 02903
401-351-1758 fax

Sherry Campanelli
Associate Director
Division of Individual & Family Support Services
600 New London Avenue
Cranston, RI 02902
401-464-1881 fax

Utah


Rita Penza (primary contact)
Utah Child Indicators Project Coordinator
Utah Department of Health, Bureau of Surveillance & Analysis
288 North 1460 West, P.O. Box 142101
Salt Lake City, UT 84114-2101
801-536-0947 fax

Marie Christman
Assistant Director
Employment Development Division
Dept. of Workforce Services
1385 South State, Room 200
Salt Lake City, UT 84115
801-468-0160 fax

Scott D. Williams
Deputy Director
Utah Department of Health
P.O. Box 141000
Salt Lake City, UT 84114-1000
801-538-6306 fax

Vermont


David Murphey (primary contact)
Senior Policy Analyst
Vermont Agency of Human Services
Planning Division
103 S. Main Street
Waterbury, VT 05671
802-241-4461 fax

Roy C. Haupt
Director of Research & Planning
Vermont Dept. of Social Welfare
103 South Main Street
Waterbury, VT 05671
802-241-3934 fax

Jennifer Jewiss
Graduate Student
University of Vermont
828 Snipe Ireland Road
Richmond, VT 05477
802-434-7077 fax

Robert McNamara
Director, Policy, Planning & Research
Vermont Dept. of Education
State Office Building
Montpelier, VT 05620
802-828-3146 fax

Jason Roberts
SDI statistician
Vermont Dept. of Health
108 Cherry Street
Burlington, VT 05402
802-865-7701 fax

West Virginia

Steven Heasley (primary contact)
Consultant to the Cabinet
West Virginia Governor's Cabinet on Children & Families
Building 5, Room 218
Capitol Complex
Charleston, WV 25305
304-558-0596 fax

Julia Howell
Evaluation Coordinator
West Virginia Governor's Cabinet on Children & Families
Building 5, Room 218
Capitol Complex
Charleston, WV 25305
304-558-0596 fax

Yvonne Katz
West Virginia Prevention Resource Center at Marshall University
Marshall University Graduate College
100 Angus E. Peyton Drive
South Charleston, WV 25303
304-746-2061 or 800-642-9842
304-746-1942 fax

Other Participants

Tom Corbett
Professor, Institute for Research on Poverty
The University of Wisconsin
1180 Observatory Drive
Social Science Building 3424
Madison, WI 53706
608-265-3119 fax

Cornelius D. Hogan
Secretary, Vermont Agency of Human Services
103 South Main Street
Waterbury, VT 05671-0204
802-241-2220 (Michele)
802-241-2979 fax

Helen Howerton
Director, Child and Family Development Division, OPRE
Administration for Children & Families
370 L'Enfant Promenade, SW
Aerospace Building, Room 7 West
Washington, DC 20447
202-205-3598 fax

Jody McCoy
Policy Analyst
Department of Health & Human Services, Office of the Secretary
200 Independence Avenue, S.W.
Room 450-G
Washington, DC 20201
202-690-5514 fax

Debra McLaughlin
Director of Interagency & Community Collaboration
Mass. Executive Office of Health & Human Services
1 Ashburton, Room 1109
Boston, MA 02108
617-727-7600 x405
617-727-1396 fax

Martha Moorehouse
Senior Research Policy Analyst
Department of Health & Human Services, Office of the Secretary
200 Independence Avenue, S.W.
Room 450-G
Washington, DC 20201
202-690-5514 fax

William O'Hare
Kids Count Coordinator
Annie E. Casey Foundation
701 St. Paul Street
Baltimore, MD 21202
410-223-2956 fax

Ingrid Rothe
Researcher, Institute for Research on Poverty
The University of Wisconsin
1180 Observatory Drive
Social Science Building 6401
Madison, WI 53706-1393
608-265-3119 fax

Ann Segal
Deputy Assistant Secretary for Policy Initiatives
Department of Health and Human Services, Office of the Secretary
200 Independence Avenue, S.W.
Room 450-F
Washington, DC 20201
202-690-7383 fax

Chapin Hall Center for Children

Robert M. Goerge
Associate Director
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Jeff Hackett
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Rosemary Gill
Administrative Assistant
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Allen Harden
Senior Research Associate
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Bong Joo Lee
Research Fellow
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Susie Quern
Research Staff
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Mairead Reidy
Senior Research Associate
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Harold A. Richman
Center Director and
Hermon Dunlap Smith Professor
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Ada Skyles
Associate Director
Chapin Hall Center for Children
at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637

Fred Wulczyn
Research Fellow
Chapin Hall Center for Children at the University of Chicago
1313 E. 60th Street
Chicago, IL 60637
518-877-0721
773-753-5940