Advancing States' Child Indicator Initiatives: Promotional Indicators Forum

A Summary of a Meeting Attended by Grantees of the Advancing States' Child Indicator Initiatives Project and the STATES Initiative/Family Support America Project

Saint Paul, February 3 & 4, 2000

The host for the meeting was the Minnesota KIDS Initiative. Janel Harris of the Minnesota Department of Health is the KIDS Initiative Principal Investigator and Beth Haney, now of the Minnesota Department of Human Services, was the Project Consultant. The Advancing States' Child Indicator Initiatives project is supported by the Office of the Assistant Secretary for Planning and Evaluation of the U.S. Department of Health and Human Services. Martha Moorehouse is the Project Officer. Harold Richman of Chapin Hall is the Principal Investigator for the Advancing States' Child Indicator Initiatives project and Mairéad Reidy is the Project Director. The STATES Initiative/Family Support America project is an initiative of Family Support America (formerly the Family Resource Coalition of America) and supported by the Robert Wood Johnson Foundation. David Diehl, Gail Koser, and Rob Rosenkrantz are the primary staff supporting the Cross-State Work Team on Promotional Indicators, a group that has been working on this issue for some time.

The Chapin Hall Center for Children at the University of Chicago prepared this summary. This paper is downloadable from the Chapin Hall web site.

Chapin Hall Working Paper CS-61
Chapin Hall Center for Children
1313 East 60th Street
Chicago, Illinois 60637


Introduction and Orientation: February 3

Welcome and Goals of the Meeting

Janel Harris

Research Scientist Janel Harris of the Minnesota Department of Health welcomed participants. She thanked the organizing states — Georgia, Minnesota, New York, and West Virginia. She also thanked three organizations: Family Support America (FSA), the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of the Department of Health and Human Services, and the Chapin Hall Center for Children. Harris presented three goals for the meeting.

  • To explore promotional indicators frameworks
  • To move states toward agreement on some promotional indicators in order to support cross-state compatibility
  • To gain insight into the federal perspective on promotional indicators

Gail Koser

Following Harris, Gail Koser sketched FSA's perspective on family support. She said that family support is the cornerstone of the organization's work. Under a Robert Wood Johnson Foundation grant, FSA works to advance a family support agenda in eight states. In practice, family support principles are embodied in activities such as creating a welcoming voice for families in society through public information efforts, discussion, and training mechanisms.

Martha Moorehouse

Martha Moorehouse of ASPE noted her pleasure at being at the meeting and thanked FSA and the state of Minnesota. She said that one of ASPE's goals for its project was helping states work together, noting that it is exciting to see the four states who convened the meeting working together to integrate their ASPE- and FSA-supported ventures.

Ann Segal

Ann Segal, Deputy Assistant Secretary for Policy Initiatives at ASPE, pointed out that indicators work is going on at many governmental levels in the U.S. and that two factors — data and cost — are among key concerns. Confidentiality issues, especially threats to confidentiality posed when service data are linked, are also a concern.

Brief Overview of Promotional Indicators

David Diehl of FSA and Betty Cooke of the Minnesota Department of Children, Families, and Learning provided a brief overview. They began by providing definitions. (These, and other definitions that follow, are taken from Diehl's PowerPoint presentation.)

What are outcomes and indicators?

An outcome is a desired condition of well-being for children, families, and communities. (Similar terms include "result" or "goal".)

An indicator is a measure that helps to quantify the achievement of an outcome. (Similar terms include "benchmark" and "milestone".)

What are traditional and promotional indicators?

A traditional indicator is a measure of the reduction or elimination of diseases or dysfunctional or at-risk behaviors and conditions.

Promotional indicators are measures of the functioning or development of children, youth, families, and communities that reflect an increased capacity to successfully address challenges.

Diehl and Cooke said that accountability frameworks operate at the state, community, and program levels. Although the indicators chosen at each of these levels can be the same, more detailed and specific data are easier to collect as the boundaries become closer to the target families (i.e., it is easier to collect detailed information in a program than in an entire community and easier to collect more detailed information in a community than in an entire state.) An example of a state-level indicator might be the percentage of children whose skills are within normal range for preschoolers. A community-level indicator might look at the percentage of children who are read to three or more times a week. At the program level, the critical issue might be measuring reciprocal engagement, whether a child is engaged in play and interaction. Other examples of promotional indicators include:

  • The percentage of parents/caregivers who practice child-rearing skills supportive of children's development
  • The percentage of parents who have back-up adult support to assist in educating, protecting, and nurturing their children
  • The percentage of children and youth who have frequent involvement with and receive emotional support from both parents

Traditional indicators often measure other things, such as:

  • The rate of substantiated child abuse and neglect
  • The percentage of children in out-of-home placement
  • The rate of pregnancy among teenage girls

Diehl and Cooke offered three reasons to use promotional indicators.

To bring a strengths-based approach to how we measure conditions of well-being for children, youth, families, and communities

To identify intermediate markers of growth, development, and functioning that are highly correlated with successful long-term outcomes for children and families

To demonstrate the value of family support strategies

They also quoted Michael Patton, who at an earlier meeting, said:

I would propose to you that a primary use of indicators of this kind, that is a real challenge, but that you are well positioned as a coalition to take on, is to use indicators to promote dialogue about healthy families rather than provide answers about the state of the world.

Diehl and Cooke then noted that their goal was to expand the role of promotional indicators, but not replace traditional indicators entirely. They closed with a number of key questions.

  • What are the key promotional indicators that states, communities, and programs can use to measure progress?
  • What is the link between promotional indicators and the achievement of long-term outcomes?
  • What are the best practices and strategies that are linked to achieving progress on promotional indicators?
  • What strengths-based instruments are available to measure progress on promotional indicators?
  • What are the most effective approaches for working with states and communities to infuse promotional indicators into their day-to-day thinking?
  • What kind of framework can we use to organize our work and communicate better about promotional indicators and related ideas?

Orientation to Projects and State Summaries

Printed summaries of state projects were part of the materials package distributed before the meeting. In addition, the four convening states offered brief verbal summaries.


Georgia

Rebekah Hudgins said that Georgia had developed 26 benchmarks and is working to make these benchmarks more promotional.


Minnesota

Minnesota's interests include promoting federal interest in indicators and enhancing cross-state data comparability.

New York

Toni Lang of the New York State Council on Children and Families said that New York is involved in a number of indicators projects to complement the traditionally based New York Touchstones. It is also now training practitioners in promotional approaches, looking to expand the Youth Risk Behavior Survey to support promotional measures, and working on a web site.

West Virginia

Steve Heasley, a consultant to the Governor's Children's Cabinet, said that in the last three or four years of working on indicators, he has been struck by the limitations of available data. He has not found state administrative data particularly useful. The quest to find data that are useful has led West Virginia to promotional indicators. Just talking about these indicators, even if nothing else is accomplished, "changes the whole gestalt" and that is useful.

Promotional Categories/Frameworks Related to Indicators of Child and Family Well-Being

Dr. Carol M. Trivette of North Carolina's Orelena Hawks Puckett Institute introduced herself and her organization. She stressed the Institute's dedication to family support and research-supported, evidence-based best practices. She described some of the background pieces that conference participants had received in their information packages and said that, in preparing for this meeting over the past few months, she pulled together more than fifty indicator frameworks. She found little commonality among these frameworks and, as a result, she started trying to understand the categories and dimensions of indicators. She also began compiling a running list of promotional indicators. She included indicators from Arizona and Iowa in the packet for meeting participants and recommended that participants look at the web sites of these two states. What follows is a summary of Dr. Trivette's presentation. Her framework and other materials from her presentation are found in appendix A.


Targets. These are children, families, or communities. It's important to think about the unit, the targets of the work. Thinking through targets helps to focus the rest of your work. Big initiatives may have many targets.

Categories. These are broad areas of application — such as physical health, education, and shelter — that relate to particular targets of interest.

Dimensions. Dimensions for a target category such as physical health might include nutrition or immunizations.

Trivette said that the values and culture of the state and the community will influence the choice of indicators. When asked about linking indicators to research, she said that, once dimensions are defined, states need to look at research to identify links between promotional indicators and long-term outcomes. An audience member observed that much of the research on relevant topics is done at the individual or program level and is not necessarily applicable at a population level. (HHS has, however, posted the Healthy People 2010 objectives, which include data reported by income level and race, on the web.)

Other comments regarding research included the suggestion that the complexity of the human development process makes it advisable to bundle indicators rather than look at a single indicator.

Trivette identified three different types of indicators:

  • Process
  • Intervening
  • Outcome

When asked to define these indicator types, she provided examples.

Process. The number of times that social workers provide family support activities. (She warned states not to assume that activities happen as planned at the program level.)

Intervening. Intervening indicators might be "mediating variables" — for example, at the program level, the number of mothers and children spending significant amounts of time in interactive play.

Outcome. Whether children are able to engage in elaborated play.

Discussion ranged broadly across service delivery and social justice issues. Pat Seppanen of the University of Minnesota said, early in the session, that indicators of well-being might neglect social justice concerns and obscure a need for more aggressive income redistribution. Throughout the remaining discussion, she commented on how the definition and development of indicators can be shaped by existing power relationships. Another participant asked who decides on an appropriate outcome for a target population.

Other participants sought to explain how they addressed such challenges. Dee Gillespie of Georgia noted how that state struggled to define a target group that was as broad as possible (to avoid creating something for "those kids over there"). Another participant stressed the need to be guided by research in picking targets, pointing out that a range of attention needs to be devoted to healthy communities and a healthy state and that the development process should engage the targets in a discussion of what health means to them.

When Trivette presented examples of targets, categories, and dimensions, discussion turned to such issues as categorization. One participant cautioned against spending too much time on taxonomy, saying (paraphrase)

You can do it the age-graded way or using traditional departmental domains (e.g., health, education, etc.). It doesn't matter which road you take. The indicators have to have face validity. The unit of change is the community. There's a danger in developing elaborate systems and then not doing the work.

Others sought clearer definitions. David Diehl of FSA noted that the meeting originated in the desire of the four sponsoring states to come up with an indicator framework.

Margaret Gressens of the Healthy Anchorage Indicators project sought comment on their Success by Six indicators as represented in a pyramid diagram.

Near the end of the session, the four sponsoring states were asked to sketch their wishes for the shape of the afternoon session.

In-Depth Discussion of Categories & Working Toward Agreement

The afternoon session opened with Nilofer Ahsan's observation that the last session of the morning had been marked by tension between those who want to work on the indicator domains and those who want to look at the larger picture. In preparation for small group discussions, Trivette led the group through a discussion of the indicator domain physical health of young children. Dimensions of this indicator that were mentioned included:

  • Food security
  • Health insurance
  • Safety
  • Physical activity
  • Self-care and hygiene
  • Reaching appropriate physical milestones
  • Quality care
  • Environmental quality (possibly including lead poisoning)
  • Routine well-child care visits
  • A child's health knowledge and awareness of his or her own health needs
  • Healthy diet and nutrition (this could be a component of food security and there was discussion over how availability and consumption were distinct but related)
  • Preventive dental care
  • Access to emergency and specialized care
  • Having models of healthy behavior
  • Having access to playgrounds and recreational equipment

Following this conversation, the group divided into five small groups to discuss these indicator domains: families, parents, young children, youth, and communities. Each group held an in-depth discussion of dimensions and then proceeded to identify possible indicators. Both efforts are presented in the next section.

Summary of Small Group Discussions on Dimensions and Sample Promotional Indicators

The small groups presented their ideas in a session facilitated by Ada Skyles.

Target: Families, Category: Emotional Health

  • Connections within the family and to the outside (including joint activities, time together, and traditions)
  • Healthy relationships (includes open communication and trust)
  • Spiritual development
  • Clear rules, regulations, and boundaries
  • Hopes and aspirations for the future
  • Arts and culture
  • Diversity — valuing it within the family
  • Respect
  • Family esteem
  • Family volunteerism; civic participation

The group felt that all of these dimensions were important.

Sample Promotional Indicators


Family Volunteerism; Civic Participation

  • Number of hours family actually volunteers
  • The extent to which family members value giving back to the community (which includes the issue of saying versus doing)

Connections Within Families and Connections to the Outside (Time Together)

  • Number of interactive activities pursued together by 2 or more family members
  • Number of interactive conversations between 2 or more family members

Connections Within Family and to Outside (Tradition)

  • Regularity, consistency, and meaningfulness of rituals/celebrations

Clear Rules, Regulations, and Boundaries

  • Whether commonly accepted boundaries and consequences can be identified by family members
  • Bedtime exists and is followed
  • Developmentally appropriate curfews
  • Whether rules exist for adult family members (e.g., don't stay at the bar all night)
  • Whether family has roles and expectations for all family members that contribute to healthy family functioning (circular)
  • Adult caregiving of young children (not vice versa)
  • Number of child-initiated activities

The group judged the third, fifth, sixth, and seventh indicators to be the most important.

Target: Families, Category: Self-Sufficiency/Economic Security

  • Adequate housing
  • Job security
  • Stable and adequate income
  • Access to health care and insurance
  • Child care (availability, affordability, quality)
  • Life skills (financial management, nutrition, parenting)
  • Education/skills to achieve economic goals
  • Transportation
  • Emotional support within family/social network
  • Access to/utilization of community facilities/resources
  • Internal locus of control
  • Family decision-making

Target: Parents, Category: Emotional Health

The parents group felt strongly that "parenting" should be its own category and consequently excluded it from discussion.

  • Social connection
  • Communication skills
  • Self-esteem
  • Social skills
  • Supportive family
  • Coping skills
  • Accurate self-concept (realistic)
  • Self-sufficiency (financial, dependents, survival, home)

Sample Promotional Indicators

  • Number of people an individual can rely on in a crisis
  • Percentage of individuals who score at or above the norm on self-esteem scale
  • Number of hours per month people can choose their recreational activity
  • Membership in or affiliation with religious organizations and/or community or civic groups
  • Percentage of people at or above norm on self-control scale (locus of control)

Target: Parents, Category: Economic Security

  • Stable employment
  • Livable wage (transportation, medical care, child care, housing)
  • Social safety net

Target: Young Children, Category: Emotional Health

High-Priority Dimensions
  • Sense of self (age-appropriate, part of family and community)
  • Attachment
  • Sense of future/hope
  • Laughter
  • Empathy
  • Self-regulates appropriately — coping skills

Other Dimensions

  • Health care
  • Consistent caregiver
  • Self-confidence/self-efficacy
  • Adequate child care
  • Empowerment (independence, have some control)
  • Consistent boundaries
  • Opportunities for socialization with peers and adults
  • Interacts appropriately with peers

Target: Young Children, Category: Shelter

High-Priority Dimensions
  • Safe shelter (structurally, neighborhood)
  • Accessible shelter (affordable, available)

Other Dimensions

  • Environmental quality (water, light, sound, heat, inside/outside)
  • Stable shelter (in one place for extended period; same composition)
  • Integrated shelter — ability to connect to community services and supports
  • Spatially adequate (the group noted potential cultural issues)

Sample Promotional Indicators


Integrated Shelter

  • Number of libraries, school services, etc., by distance
  • Use of services/facilities
  • Knowledge of available services/facilities

Accessible Shelter

  • Percentage of median income spent on housing
  • Number of units available
  • Percentage of disposable income

The first and second indicators were judged to be the most significant.

Safe Shelter

  • Access to safe play areas
  • People are out of their homes using the neighborhood in evenings and on weekends
  • Parents/caregivers feel safe at home (MN)
  • Youth feel safe in home (MN proxy)

The third and fourth indicators were judged to be the most significant.

Target: Youth, Category: Education

Dimension priority — high or other — was assigned using practicality as a deciding factor.

High-Priority Dimensions
  • High-school graduation
  • School engagement
  • Academic achievement

Other Dimensions

  • Educational support services
  • Computer literacy
  • Creativity
  • Love of learning
  • Safe driving
  • Lifelong learning skills
  • Parenting skills
  • Economic literacy
  • Physical education
  • Arts
  • Involvement in governance
  • Liking school
  • Meaningful participation in school
  • Community service

Sample Promotional Indicator for Education

  • Percentage of youth going on to higher education

Target: Youth, Category: Emotional Health

High-Priority Dimensions

Sense of identity

  • Self-esteem
  • Sexual identity
  • Ethnic/community
  • Self respect
  • Autonomy
  • Body image


  • Peers
  • Family
  • Adults

Other Dimensions

  • Respected by others
  • Emotional skills in decision-making and in conflict resolution
  • Personal responsibility
  • Social competency
  • Spirituality
  • Happiness
  • Accessibility of services

Sample Youth Emotional Health Promotional Indicators

  • Percentage of kids who say they have 2-3 peers/friends that care about them (nonfamily)
  • Percentage of kids who have 2-3 caring adults in their lives (nonparental)
  • Percentage of kids who feel loved and valued by their main caretaker

Target: Community, Category: Economic Security

Using clarity as a filter, the community group assigned priority to the dimensions.

High-Priority Dimensions
  • Income levels (living wage)
  • Meaningful employment
  • Full employment
  • Diverse job base (jobs, size of employers)
  • Job preparation (training/education; quality public schools)
  • Cost of living
  • Job supports (transportation, child care, job security, etc.)
  • Business climate (infrastructure)

Sample Promotional Indicators

  • Percentage of people earning a living wage
  • Percentage of people reporting job satisfaction
  • Percentage of new jobs created
  • Percentage of individuals receiving health care and employment benefits (through employers)
  • Number of individuals employed in small businesses
  • Number of individuals actively involved in job training
  • Number of individuals in secondary education

Target: Community, Category: Emotional Health

  • Supportive community (interconnectedness)
  • Civic participation (voting)
  • Volunteering
  • Recreational availability
  • Culture and arts
  • Sense of belonging — identity
  • Perception of safety
  • Diversity of leadership
  • Leadership opportunities
  • Spirituality
  • Cultural awareness and respect

Sample Promotional Indicators

  • Percentage of eligible voters voting
  • Percentage of people volunteering
  • Percentage of individuals attending number of events (cultural)
  • Satisfaction with volunteer role
  • Leadership reflective of the community (race, sex, age, protected classes)
  • Recreational opportunities (number of opportunities; budget by number of kids; after-school activities/slots by children; diversity of legal after-school activities)
  • Awareness of leadership activities
  • Number of religious institutions

Lessons, Observations, and "Should Nots"

Following these two sets of small group discussions, the groups were asked to reflect on lessons learned, to offer observations, and to identify "should nots" with which they struggled.


Lessons

  • Start with what you know. Lay out traditional indicators and ask, "What is missing? What aren't we getting?" At the same time, it is important not to abandon traditional indicators; they capture things promotional indicators sometimes do not. (An example, from Toni Lang of New York: although the answer to the question "Do you feel safe going out at night?" is useful, crime rates are still important.)
  • Adult emotional health is a difficult area in which to find indicators (an example might be measures of coping skills).
  • A critical question is how the work is done: is it promotional?
  • Because traditional indicators often take a negative perspective and promotional indicators seek to assess the presence of positive qualities, there is sometimes a temptation to flip a traditional indicator. However, flipping a traditional indicator does not produce a promotional indicator. In addition, not every indicator can be flipped. For example, there is no logical flip for the number of jobs created.
  • Social justice can be addressed if we identify indicators that are sensitive to social justice and they become operational. The composition of the group identifying indicators needs to be representative and developers need to look at data balance of SES/race/ethnicity.
  • As measures become more specific, they are more likely to raise cultural issues.
  • Trying to identify indicators to measure a dimension is sometimes difficult.
  • Report data and offer cautions when using data (however, cautions may be ignored) and produce different reports for different times.
  • Small group discussions produced a sense of common understanding.
  • Some indicators are not easily categorized (e.g., when targeting youth should access to mental health be a dimension of emotional health or community?).
  • Need to understand what we are talking about.


Observations

  • Indicators need to be linked to research.
  • What are the rules? Does it matter if indicators are flipped, or if they are reported as a bundle or as single measures? The key issue is whether the focus is on fixing problems or building assets and strengths.
  • It is sometimes difficult not to flip a traditional indicator.
  • If resources are not available to develop surveys, find other means to collect data.
  • Validity/reliability issues surround self-report data.
  • Promotional indicators are social goods, not merely protection from bad things.
  • Promotional indicators measure the building of a healthier place.
  • Promotional indicators may measure not an action but rather the result of an action.
  • Identifying promotional indicators can lead to other questions.
  • To avoid misinterpretation of data, be careful and precise in reporting findings.
  • Since ranking can distort data, separate reports may produce more accurate understanding.
  • It is hard to capture social justice issues.
  • Social justice includes access to power and distribution of resources.
  • Working arrangements of caregivers can influence interactions, and opportunities for interactions, among family members.
  • The costs of cultural events may be prohibitive for low-SES families and should be cautiously included in indicators.
  • Promotional indicators may raise cultural and generational issues.
  • There are different cultural ways to define volunteerism.
  • If one is working for the minimum wage, how does he or she attain greater economic opportunity?
  • If an indicator is based on science, and/or target population has been included in its identification, cultural concerns are lessened.
  • Promotional indicators measure conditions and qualities of life.

"Should Nots"

  • Do not limit the list of indicators simply because measures might not be available.
  • If resources are not available for new research, look for existing research.
  • How valid is self-reported data from youth?
  • Because performance indicators are positivistic, and come from people in power, they can have limited application to social justice.

During the large group discussion, participants identified the following words that describe promotional indicators:

  • Positive
  • Strengths
  • Developmental
  • Asset
  • Relationships
  • Capacity
  • Activities
  • Process
  • Aspirations, hopes
  • Optimistic

Taking Promotional Indicators to States

State Level Challenges to Using a Strength-Based Approach

David Murphey of the Vermont Agency of Human Services discussed the state's experiences with promotional indicators.


Survey overload. Vermont was interested in gathering more information with an additional survey, but did not want to bog down students and educators. Vermont currently fields the Youth Risk Behavior Survey (YRBS) every other year and was interested in also adding a Search Institute survey. To ease the burden, Vermont made the Search survey voluntary and slated it for years when the YRBS is not fielded. Also, the state agreed to pay for the surveys and provide reports for the smaller communities and school districts. (The survey costs about $2 per questionnaire and $500 per report.) Although the survey is voluntary, approximately half of the students who received it took it. (Vermont did not add the Search survey questions to the YRBS because of the length of a combined instrument. They felt that to combine the two would have compromised both.)

Potentially confusing language. Vermont is careful to organize and summarize findings in ways that reduce confusion.

The state of the science. Certain indicators and assets are better researched than others. Vermont tends to rely on the better-researched indicators.

Following up on the data. Beyond sharing data with communities, states need to work with the communities to incorporate findings into their planning. Vermont has school-building-level action plans that involve the use of data beyond grades and test scores.

Walking the talk. New ideas need to be incorporated into policies, not thought of as new management techniques.
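As a rough illustration of the per-unit costs quoted above for the Search survey (about $2 per questionnaire and $500 per community report), the total cost of fielding the instrument can be sketched as below. The student and community counts used here are invented for illustration, not Vermont's actual figures.

```python
# Hypothetical cost sketch using the per-unit figures quoted in the text:
# $2 per questionnaire and $500 per community report.
QUESTIONNAIRE_COST = 2.00  # dollars per questionnaire
REPORT_COST = 500.00       # dollars per community report

def survey_cost(n_students: int, n_reports: int) -> float:
    """Total cost of administering questionnaires and producing reports."""
    return n_students * QUESTIONNAIRE_COST + n_reports * REPORT_COST

# Example (invented numbers): 5,000 participating students
# across 20 small communities and school districts.
print(survey_cost(5_000, 20))  # 10,000 + 10,000 = 20000.0
```

At these rates the questionnaires and the reports contribute comparable amounts, which may help explain why the state chose to absorb the cost for smaller communities.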


Murphey presented five assets selected for examination in Vermont:

Parent involvement in schooling

Percentage of students reporting family love and support

Percentage of students reporting parents set rules and consequences

Children who have 2 to 3 (or more) nonparental adults that care about them

Young persons who feel that young people are seen as resources in their community.

Murphey says Vermont is interested in looking at how to train youth leaders and how to get adults to work in a collaborative and positive way with youth.

Debbykay Peterson: Minnesota Department of Children, Families and Learning

Minnesota has a universal health and development screening program, a screening required of all children before they enter the public school system. The early childhood data that come from this screening are:


A snapshot of health development and other factors in young children

A complement to other data sources (such as maternal and child health and Census data) and provides a system of accountability

Why is the Screening Program Important?

It provides information on the status of young children.

It connects rural regions of the state to data.

It provides a means of outreach to diverse clientele.

The data gathered is used in many different ways, including

GIS mapping

Trend analysis

Analysis with different denominators (county, state, economic development regions, school districts, etc.)

Analysis that overlaps different data

The early childhood screening outcome data can be added to K-12 data to provide information on the percentage of children within normal ranges for hearing, vision, and immunization, on the primary language spoken at home, and on other measures. The screening program evolved with the implementation of graduation standards and early benchmarks.
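The kind of aggregation described above, computing the percentage of children within a normal range under different denominators, can be sketched as follows. This is an illustrative sketch only, not Minnesota's actual system; the records, county names, and field names are hypothetical.

```python
# Illustrative sketch (not Minnesota's actual system) of aggregating
# hypothetical screening records to a percent-within-normal-range
# indicator, grouped by a chosen denominator (here, county).
from collections import defaultdict

records = [
    {"county": "Hennepin", "hearing_normal": True},
    {"county": "Hennepin", "hearing_normal": False},
    {"county": "Ramsey",   "hearing_normal": True},
    {"county": "Ramsey",   "hearing_normal": True},
]

def percent_normal(records, group_field, measure_field):
    """Percentage of children within the normal range, per group."""
    totals = defaultdict(int)
    normals = defaultdict(int)
    for r in records:
        totals[r[group_field]] += 1
        normals[r[group_field]] += bool(r[measure_field])  # True counts as 1
    return {g: 100.0 * normals[g] / totals[g] for g in totals}

print(percent_normal(records, "county", "hearing_normal"))
# {'Hennepin': 50.0, 'Ramsey': 100.0}
```

Changing `group_field` to, say, a school-district code is what the text means by using "different denominators" with the same underlying records.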

Rebekah Hudgins, Georgia


Regarding the issue of training youth leaders and adults working with youth. There is a need to have family representatives at the table to help make the best decisions.

Regarding data uses. How are we using these data? We (researchers and practitioners) need to be aware of how data are used and measured.

What do legislators want? How do we package and present information?

Welcome and Introduction: February 4

Janel Harris introduced Steve Heasley and David Diehl to discuss attempts in West Virginia to help communities build their own set of indicators and in steering them to a promotional approach. Following the discussion of West Virginia's experiences, David Murphey of Vermont sketched his state's activities. The last speaker was reactant Susan Ault of Cass County, Minnesota.

Taking Promotional Indicators to Communities

West Virginia

West Virginia works to support the development of community-level indicators that fall under the umbrella of a state-level effort. Forty-five West Virginia communities were invited to participate in the selection process, which led to the selection of two communities for pilot studies. Each community chose its own outcomes and indicators, so the product is a local report card. West Virginia defined an outcome by asking, "What do you want for your children, youth, families and communities?" Indicators are then identified to help measure whether or not communities are approaching those outcomes.

These pilot efforts worked to orient communities toward strength-based approaches and measures of positive development. Some data, such as pre-K assessment data, are not reported to the state and are available at the community level only. The promotional indicators chosen as part of the pilot project included:

  • The percentage of children who are read to by their parents more than three times a week. (Martha Moorehouse said that data on parents reading to children is collected by a federal household survey and that this measure is included in a federal indicators report, offering communities using such measures the opportunity of benchmarking.)
  • The percentage of children who receive a satisfactory social responsibility grade (based on 6th and 9th grade assessments unique to the schools in one of the pilot sites).
  • The percentage of people who get regular exercise.
  • The percentage of children who successfully complete the pre-K assessment (which collects information on cognitive and language development, health, vision, and hearing).
  • The percentage of people who report volunteering (like the social responsibility grade, this is a community-specific measure).

The pilot efforts encountered a number of challenges, including:

  • Obtaining broad community involvement
  • Keeping it simple
  • Data collection and management
  • Tendency to rely on existing data
  • Bridging program-level and community-level information
  • Impacting state practices

Lessons the state felt it learned were that such efforts:

  • Require work, resources, and commitment
  • Need to be connected to other efforts
  • Need to identify a neutral "home" for the work, such as a cross-agency home
  • Need to identify key people and ensure broad representation
  • Need to ensure that the community "gets it" and supports the effort


David Murphey reported that, like West Virginia, Vermont has a rather direct relationship between the communities and the state. Vermont uses the tools developed by the Search Institute to get the conversation started, but Vermont doesn't contend that Search's approach is the only way. Murphey sketched some local-level challenges:

Appropriateness. There is controversy over the appropriateness of some of the questions on the survey when asked of younger children. To address this circumstance, each principal or other school staff member responsible for the survey had to complete the instrument during training so that they would be familiar with it.

Academic language. Some find it difficult to deal with the academic language. To address this, the state took care to explain the data carefully. They found that describing resiliency connected better with communities than did asset or promotional terminology.

Involving youth. Giving youth meaningful, responsible roles requires a lot of support.

Holding to the vision. It is easy to fall back on old ways of thinking and easy to think of humans as a collection of deficits. As is true in West Virginia, communities understand the danger of negative thinking and also recognize that, although they sometimes feel powerless to impact negative measures, they can impact positive ones.

Reactant: Susan Ault, Cass County, Minnesota

Susan Ault supervises Child Protection Services in Cass County. A few years ago, the county and the adjoining Leech Lake Reservation were given a chance to work together, along with the Pew Charitable Trusts, on child protection issues. From this beginning, with the support of Pew and other nongovernment and government funding sources, they began a seven-year process to find out what they needed to know about what was going on in their communities. They used strength-based family support principles as a guide.

The process began with the creation of a vision and outcomes developed through a broad-based community dialogue. Cass County/Leech Lake Reservation's vision is that

All families have what they need to do what is best for themselves and their children.

Steps in this process have included

  • Focusing on a target and directing funds to particular programs to reach it
  • Understanding the context they are working in (organizational structure and strategic planning framework)
  • Examining implementation (that is, examining how services are delivered in the county)
  • Self-assessment

Ault says that her community understands that this work ultimately leads to better services.

Federal Perspectives on Promotional Indicators

Casey Hannan of the Centers for Disease Control and Prevention of the U.S. Department of Health and Human Services discussed the national initiative to improve adolescent health by the year 2010. His presentation relied in large part on a series of slides that are summarized below.

Why Are Adolescence and Young Adulthood So Important?

Pivotal and enduring changes
  • Biological
  • Intellectual
  • Emotional
  • Social


Establish patterns of behaviors and lifestyles

Societal institutions are very influential

Young people are influenced by a number of societal institutions.

Influential Societal Institutions

  • Parents and families
  • Schools
  • Health care providers
  • Community agencies that serve youth
  • Religious organizations
  • Media
  • Postsecondary institutions
  • Employers
  • Government agencies

Proposed Age Group Parameters

Adolescents and young adults: 10-24 years old
  • Young adolescents: 10-14 years
  • Older adolescents: 15-19 years
  • Young adults: 20-24 years

Mortality Rates Among 15-19 Year-Olds in 50 Nations, 1995 U.N. Report

Females (rank, rate)             Males (rank, rate)
(1)  Netherlands     20          (1)  Sweden            50
(6)  Poland          30          (6)  Hungary           70
(11) France          30          (11) Italy             80
(16) Romania         40          (16) Czech Republic    90
(21) Chile           40          (21) Bulgaria         100
(22) United States   50          (26) Argentina        120
                                 (31) United States    130

Birth Rates Among 15-19 Year-Olds in 104 Nations, 1995 U.N. Report

(1) Japan 3.9
(11) Spain 11.0
(21) Ireland 16.5
(31) Austria 23.1
(41) Martinique 31.6
(51) Thailand 41.6
(61) Romania 47.6
(71) Sarawak 55.2
(79) United States 63.5

Leading Causes of Mortality Among 15-24 Year Olds in the U.S., 1997

Motor Vehicle Crash 33%
Homicide 20%
Suicide 13%
HIV Infection 1%
Other Injuries 10%
Other 23%

Contributing Behaviors, 1997

Behaviors that result in unintentional and intentional injury
Rode with a drinking driver 36.6%
Physical fighting 36.6%
Weapon carrying 18.3%
Injurious suicide attempt 2.6%
Alcohol and drug use
Binge drinking 33.4%
Marijuana use 26.2%
Sexual risk behaviors
Engaged in intercourse 48.4%
Did not use condom at last intercourse 43.2%

Leading Causes of Mortality Among Adults 25 Years-Old and Older in the U.S., 1997

Cardiovascular Disease 42%
Cancer 24%
Other 34%

Contributing Behaviors, 1997 (Many of these behaviors begin in youth)

Tobacco use
Use of any tobacco product 42.7%
Inadequate physical activity
Does not engage in vigorous physical activity 36.2%
Unhealthy dietary patterns
Overweight/at-risk of being overweight 24% (12-19 year-olds; 1994, NHANES)

Some of the most serious problems are caused by six behaviors.

Behaviors that Contribute to Education, Health, and Social Problems

  • Behaviors that result in unintentional and intentional injury
  • Alcohol and drug use
  • Sexual risk behaviors
  • Tobacco use
  • Inadequate physical activity
  • Unhealthy dietary patterns

Youth Risk Behaviors Among High School Students That Improved, 1991-1997

  1991 1993 1995 1997
Weapon carrying 26.1% 22.1% 20.0% 18.3%
Physical fighting 42.5 41.8 38.7 36.6
Ever had intercourse 54.1 53.0 53.1 48.4
Used condom at last intercourse 46.2 52.8 54.4 56.8

Youth Risk Behaviors Among High School Students That Worsened, 1991-1997

  1991 1993 1995 1997
Current cigarette use 27.5% 30.5% 34.8% 36.4%
Current marijuana use 14.7 17.7 25.3 26.2
Used birth control pills at last sexual intercourse 20.8 18.4 17.4 16.6
Participated in vigorous physical activity 66.3 65.8 63.7 63.8
Attended physical education class daily 41.6 34.3 25.4 27.4

More information on Healthy People 2010 is found in the HHS volume Developing Objectives for Healthy People 2010. Overall, there are some 400 objectives and 95 of those relate to youth and young adults. There are no process objectives for health outcomes or contributing behaviors, which is one place in which promotional indicators might have been featured. They were not defined for two reasons. One is that the core work group wanted to focus on behaviors. The second is that federal datasets have not been designed to accommodate promotional indicators.

A Draft List of 20 Critical Objectives

  • Mortality 10-14, 15-19, 20-24 year-olds
  • Motor vehicle fatalities
  • Alcohol/drug-related motor vehicle fatalities and injuries
  • Safety belt use
  • Riding with a drinking driver
  • Suicides
  • Injurious suicide attempts
  • Homicides
  • Physical fighting
  • Weapon carrying
  • Binge drinking
  • Use of marijuana
  • Feeling sad, unhappy, or depressed
  • Pregnancies 15-17 year-olds
  • HIV infection
  • Chlamydia
  • Abstinence or used condoms at last intercourse
  • Used any tobacco product
  • Overweight/obese
  • Vigorous physical activity

On November 4, 1998, the Surgeon General convened a National Interactive Television Conference with State Health Departments. It featured participation by representatives of key state societal institutions, who reviewed national progress in attaining more than 70 of the Healthy People 2000 objectives and also reviewed draft critical objectives for the year 2010. During the conference, participants discussed what each of the societal institutions could do to support these efforts. A conference videotape is available.

Draft Strategies of the National Initiative to Improve Adolescent Health by the year 2010
  • Publish every two years state progress on critical health objectives
  • Publish state adolescent health performance measures (this would be similar to a community report card)
  • Convene all state adolescent health coordinators every year
  • Increase state core capacity in adolescent health program and service delivery
  • Identify best policies, practices, and partners to attain critical health objectives
  • Publish annual review of state health policies
  • Develop on-line database of funding sources for adolescent health programs (this is in progress)
  • Implement and apply findings from Healthy Futures: Community-based Longitudinal Study of Adolescent Health
  • Broadcast live to state departments of health the national Healthy People 2010 progress reviews on adolescents and young adults
  • Develop a "companion document" on the National Initiative to Improve Adolescent Health by the Year 2010

Possible Partners


Association of Maternal & Child Health Programs
Association of State & Territorial Health Officials
National Association of County & City Health Officials
State Adolescent Health Coordinators Network

Centers for Disease Control & Prevention
Health Resources & Services Administration
National Institutes of Health
Office of Disease Prevention & Health Promotion
Office of Minority Health
Office of the Surgeon General
Office of Women's Health

National Nongovernmental
American Academy of Pediatrics
American Medical Association
Institute of Medicine
Society for Adolescent Medicine

Kristin Teipel, State Adolescent Health Coordinator Network

Kristin Teipel, Adolescent Health Coordinator in the Family Health Division of the Minnesota Department of Health, talked about the State Adolescent Health Coordinator Network. The Network's purposes include trying to ease cooperation between states and their communities and bringing national attention to adolescent health concerns.

The Network takes a strengths-based approach to adolescent health and operates from a practice-oriented and population-oriented perspective. It held a January meeting that focused on the uses of data and included a look at the possibility of developing intermediate indicators.

Ms. Teipel said that she recently spoke with 400 Minnesota young people about health issues. She asked them to define health and the issues they mentioned included "self-confidence." In identifying factors that interfere with health they mentioned challenges such as drugs, but also "lack of support."

Martha Moorehouse, ASPE

Martha Moorehouse of the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of the U.S. Department of Health and Human Services sketched ongoing federal data collection projects that focus on state-level data and identified some ways in which those data can be applied to the indicators projects. She noted that a usual purpose of federal data collections was to yield state-level, not community-level, estimates. In part, this makes federal data collections a guide and a point of comparison for states, not a source of data relevant to communities. (Nevertheless, state data needs are influencing federal data collections in ways that can be useful in community-level work. Examples include changes being made to the Youth Risk Behavior Survey (YRBS).)

Useful Data Sources

Moorehouse began by detailing two data collections: the State and Local Area Integrated Telephone Survey (SLAITS) conducted by the National Center for Health Statistics of the Centers for Disease Control and Prevention, and the American Community Survey (ACS). Both were detailed on overheads (reproduced below). The ACS is designed to collect information of the type collected on the Census long form, but to collect it annually. Goals of the ACS include:

  • Provide federal, state, and local governments with an information base for the administration and evaluation of government programs
  • Improve the 2010 Census
  • Provide data users with timely demographic, housing, social, and economic data, updated every year, that can be compared across states, communities, and population groups

Although data from the ACS will be available more frequently than the decennial Census long-form data, samples in small areas, such as Census tracts, will require more than a single year's data collection to yield analyzable samples. Full implementation of the ACS is slated for 2003. More information on the ACS can be found at

Moorehouse's American Community Survey Overhead

What is the American Community Survey (ACS)?

The ACS is:

A way to provide data communities need every year instead of every decade.

An on-going survey that the Census Bureau plans will replace the long form in the 2010 Census.

The ACS will provide estimates of demographic, housing, social, and economic characteristics every year for all states, as well as for all cities, counties, metropolitan areas, and population groups of 65,000 people or more. For smaller areas, it will take 2 to 5 years to accumulate sufficient sample to produce data for areas as small as census tracts.

Goals of the Program

The goals of the American Community Survey are to:

Provide federal, state, and local governments an information base for the administration and evaluation of government programs.

Improve the 2010 Census.

Provide data users with timely demographic, housing, social, and economic data updated every year that can be compared across states, communities, and population groups.


The American Community Survey is being implemented in three parts:

Demonstration period 1996-1998

Comparison sites 1999-2002

Full implementation nationwide starting in 2003 in every county of the U.S.

Data Dissemination

ACS goals

To provide data to the users within six months of the end of a collection or calendar year.

For states, populous counties, and other governmental units or population groups with a population of 65,000 or more, the American Community Survey can provide direct estimates for each year.

For smaller governmental units or population groups (those with a population of less than 65,000), estimates can be provided each year through refreshed multi-year accumulations of data.
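The multi-year accumulation described above can be sketched as pooling several years of a small area's sample before computing an estimate. This is a simplified illustration of the idea, not the Census Bureau's actual estimation methodology; the numbers are invented.

```python
# Simplified sketch of multi-year accumulation for small areas: single-year
# samples below the 65,000-population threshold are too small, so several
# years of sample are pooled, refreshed annually by dropping the oldest
# year. Illustrative numbers only; not the Census Bureau's actual method.

def pooled_estimate(yearly_samples, window):
    """Combine the most recent `window` years of (hits, sample_size)
    pairs into one rate estimate."""
    recent = yearly_samples[-window:]
    hits = sum(h for h, n in recent)
    total = sum(n for h, n in recent)
    return hits / total

# (hits, sample size) per year for a hypothetical census tract
samples = [(12, 80), (9, 75), (15, 90), (11, 85), (14, 88)]

single_year = pooled_estimate(samples, 1)  # noisy: based on 88 responses
five_year = pooled_estimate(samples, 5)    # steadier: based on 418 responses
```

Each new year of data refreshes the window, so even the smallest areas receive an updated (if smoothed) estimate annually rather than once a decade.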

Next, Moorehouse suggested meeting participants investigate the information found on the web site of the Federal Interagency Forum on Child and Family Statistics (

Moorehouse's SLAITS Overhead

New Directions for the State and Local Area Integrated Telephone Survey (SLAITS)

Originally designed by NCHS to generate high-quality state-level data for tracking and monitoring current and emerging health and welfare policy-related issues.

Its design and approach are based upon the telephone survey used by the National Immunization Program.

Current SLAITS Projects

1. Survey of children with special health care needs (funded by HRSA). Goal: Provide baseline estimates for federal and state performance measures, year 2010 national prevention objectives, and data for each state's Title V five-year needs assessment.

  • Includes 50 states and D.C.
  • Will screen 3300 households with children per state
  • Will complete 750 interviews in each state on children with special health care needs
  • A control sample of children without special health care needs will receive health insurance questions
  • Data collection runs July 2000-July 2001
  • Five-minute screener to identify children with special health care needs and 15-minute survey
  • Survey content: severity of ongoing condition, medical home, access and barriers to care, care coordination, satisfaction with care, health insurance, adequacy of health care coverage, impact on family.

2. Survey of pediatric care (funded by American Academy of Pediatrics). Goal: Provide data on the characteristics of pediatric care of children age 4-35 months.

  • Will include 2,000 families, including 800 minority families.

Contact: Marcie Cynamon at or (301) 458-4174.

Influencing Federal Data Collections

The National Institute of Child Health and Human Development (NICHD) has made a grant to Child Trends, Inc., to explore what topics might be added to federal data collections, including a look at whether measures from the Adolescent Health Study might be added to other surveys.

School Readiness

As an example, Moorehouse noted that the National Center for Education Statistics has a measure of early literacy drawn from a study of early childhood experiences. She is among government staff encouraging NCES not to use this measure as the sole measure of school readiness, arguing that readiness also includes a number of asset-linked and environmental measures. More complete school readiness measures might be found in the measures that Head Start has developed. Its measures of social competence are similar to the objectives found in the national education goals for this population.

Balancing Traditional and Promotional Indicators

Moorehouse said that there are particular roles for both traditional (or deficit-focused) indicators and promotional (asset-focused) indicators. Traditional indicators can galvanize attention — including the attention of policy makers — regarding a particular topic in ways that asset-based indicators cannot. Many in government believe that it is government's role to address particular problems, not to craft promotional strategies. Moorehouse advised meeting participants to develop a strategy balancing both types of indicators, exploiting the strengths of each type.

Next Steps

David Diehl of FSA and Mairéad Reidy of Chapin Hall facilitated this session. Reidy sketched the meeting's development by the four states and expressed her pleasure at being able to work with them, FSA, and ASPE. She then asked each state to respond to a few questions:

  • Where do you go from here?
  • What will you draw on from this session and where else will you go to get help in moving forward?
  • What indicators and domains discussed during this session might you add to your ongoing development of indicators?

State responses follow.


Janel Harris of the Minnesota Department of Health indicated that she expects Minnesota to be an important presence at April's FSA meeting in Chicago and hopes to continue at that meeting the kind of dialogue undertaken at this meeting. Minnesota also looks forward to continuing that state's work with Carol Trivette and Carl Dunst of the Orelena Hawks Puckett Institute. In its continuing work, Minnesota will also draw on its own resources and on FSA and Chapin Hall.

Keeping in Touch

She also said that she thought the session earlier that morning was a great exercise and that she would like to get the states together again, depending on their feelings, to see about collecting some of these data. Diehl added that Family Support America has been convening monthly conference calls among the promotional indicators projects and invited other states to sign up for these calls.

Kids Gateway

Minnesota is about to put up its Kids Gateway on the web. The site will have data on children's circumstances and information on how to interpret those data. Harris promised to use the child indicators list server to notify interested parties when the site is functional.

Harris concluded by saying that she would like to work through any channels in order to get the news about promotional indicators to the public.


Rebekah Hudgins of Georgia said that the next step would take place on the following Monday, when she will discuss the St. Paul meetings with the state's policy development working group. Hudgins will also attend the FSA April meetings.

As Georgia moves ahead, it will draw on the perspectives of a broad cross-section of interests, including state agencies and Georgia's universities. She suggested that Georgia would build a chat capacity into a state web site in order to encourage dialogue about indicators and to spread the word on this work.

New York

Toni Lang of the New York Council on Children and Families said that their indicators project is developing a web-based information clearinghouse. They have used traditional indicators to develop the site presentation and are now preparing to include promotional indicators.

Lang said that they will share what they have learned with other asset-building projects in New York. She expects that the state indicator project will draw on this meeting, which she called "extremely helpful" and on indicators being collected by the United Way, as their work proceeds.

New York is identifying and planning to incorporate promotional indicators into its fielding of the YRBS. For example, New York intends to include questions to help assess students' access to supportive adults.

West Virginia

Steve Heasley of the Governor's Cabinet on Children and Families said that he was eager to get back to work. He expects that West Virginia will continue to focus on putting together and trying to publish the dataset on child well-being. They are going to proceed with community-level work on indicators and are at work on a model project in two communities. They are also trying to make their web site on community indicators more family and community friendly.

Regarding this meeting, Heasley said that he expected to work with the participating states to reach consensus on the indicators and domains discussed earlier that morning. He called for an expansion of the FSA-moderated conference calls on promotional indicators.

Heasley would like to talk with federal representatives about the incorporation of asset-based approaches into federal policy, and he was disappointed by the lack of time to do so at this meeting. He hopes for opportunities for more talk. Heasley said that a deficit-focused approach has been damaging in West Virginia.


Margaret Gressens spoke for Alaska. Alaska's continuing efforts will include

  • Institutionalizing the child data project
  • Conducting an Anchorage child health survey
  • Identifying a tool to measure wellness
  • Exploring the linkages between promotional and traditional indicators
  • Having conversations with federal agencies

She also noted that she would like to be part of the cross-state work on indicators and praised the ASPE list-serve.


Oshi Ruelas of the California Department of Social Services said that the meetings had:

  • Informed their current work on developing domains (community, basic needs, opportunity and diversity, education, and health)
  • Helped fill in indicator gaps

Ongoing work will include:

  • Looking at cultural and social justice issues
  • Comparing the frameworks in use by other states with ongoing work in California

They will explore

  • Head Start measures and measures used by other youth programs
  • Program performance measure work
  • Linkages between promotional and deficit indicators


Carolyn Harrington of Florida State University said that the state has legislatively mandated performance measures in all budgets. Last year, Florida worked to align indicators with budgets. She noted that promotional indicators are hard to sell to the Florida legislature, but stressed the need to include promotional indicators. She also said that the ASPE list-serve is "great."


Michael Lahti of the University of Southern Maine said that Maine's progress on indicators development is in part influenced by the state's partnership with Kids Count. He noted that Maine's Children's Cabinet was to meet on February 28 to look over a draft indicators list. In addition, Maine has involved magnet school students in collecting data on youth groups and in developing web products.

Lahti noted that this meeting provided their first experience working with FSA.

Maine will be looking at the family and community target area and Lahti will be writing a resource paper on moving toward the use of promotional indicators. Maine has been looking at the experiences of Florida because of the way Florida is linking indicators to strategic planning.


Rita Penza of the Utah Department of Health said that she was pleased to see that the state already has a number of promotional indicators in such areas as

  • Healthy weight
  • Nutrition and exercise
  • Parent input
  • School readiness

She said that Utah's next step would be to shift from deficit-based indicators to asset-based indicators in the community domain. They will also work to package indicators for a policy audience. Utah might also work toward expanding the YRBS. One step might include oversampling particular populations.


David Murphey said that Vermont's next steps will include expanding their survey of assets and looking for ways to incorporate data into planning for actions. Vermont will be staging 12 regional youth summits to investigate the needs of Vermont's young people. These examinations will include the perspectives of young people. Vermont will also continue working to adapt the technical language of data analysis prepared for academic audiences so that it is more accessible to other audiences, including the media. Murphey also said that Vermont is less concerned with developing new domains, as a path to comprehensiveness, than it is with getting started.

Family Support America

Gail Koser of Family Support America said that she would schedule sessions at which states can discuss where they might want to go from here. David Diehl proposed brainstorming to help FSA plan a session on promotional indicators to be held at the April FSA meetings in Chicago. Suggestions included looking at indicators frameworks and data availability. Koser asked if people would want to do more on indicators and if they thought their frameworks were strong enough. Another participant asked how states might sell promotional indicators systemically to help foster a paradigm shift.

Office of the Assistant Secretary for Planning and Evaluation

Martha Moorehouse said that states thinking about how to sell the idea of promotional indicators need to think about organizing that effort around different tasks. Trying to change which data are collected in order to produce federal statistics is very hard. Program performance measures, in contrast, do relate to indicator work. Head Start, a program that arose out of both deficit and empowerment models, was asked to demonstrate what it was accomplishing. Head Start has worked to document its accomplishments in positive ways. First, it began to use measures from the National Household Survey and then developed the Head Start Family and Child Experiences Survey (FACES). The federal government is at work on a randomized experiment to see what Head Start does for children. All of this demonstrates one way in which the federal government is involved in taking a promotional approach.

Mairéad Reidy thanked everyone who attended and worked on the conference, singling out for particular recognition Beth Haney and Janel Harris.


This summary was produced by the Chapin Hall Center for Children from notes taken by Nilofer Ahsan, David Diehl, Jeff Hackett, Beth Haney, Steve Heasley, Monica Herk, Holly Miller, and Lee Schutz and written materials from Casey Hannan, Martha Moorehouse, Mairéad Reidy, Carol Trivette, and FSA.


Appendix A: Family Resource Programs, Promotion Models, and Enhancement Outcomes

Carol M. Trivette, Ph.D.

Research Scientist
Orelena Hawks Puckett Institute
Morganton, North Carolina

Carl J. Dunst, Ph.D.

Research Scientist
Orelena Hawks Puckett Institute
Asheville, North Carolina


Research Director
Family, Infant and Preschool Program
Western Carolina Center
Morganton, North Carolina

Process for Developing Promotional Indicators

Carol M. Trivette, Ph.D.

Orelena Hawks Puckett Institute

February 2000


Promotion refers to enhancing, bringing about, and optimizing positive growth and functioning. This can occur for individuals (i.e., children, youth, young adults, and senior adults) and for groups (i.e., families, neighborhoods, and communities).

Development of Promotional Indicators

The process of developing promotional indicators begins by focusing on three areas:

  • The targets (e.g., children, families, youth, community) of the process;
  • The categories (e.g., physical health, education, shelter) of interest as they relate to the specific target(s) of the evaluation (e.g., children's physical health, families' emotional health); and
  • The dimensions (e.g., nutrition, primary care provider) within each category (e.g., for children's physical health, the important dimensions might be nutrition, primary care provider, and immunizations).

Once the targets, categories, and dimensions of the process have been determined, then the development of promotional indicators can begin. The values and culture of the state or community that is the focus of this process will influence the development of specific promotional indicators. For example, a process indicator of literacy in young Native American children might deal with the amount of storytelling the children experience. For a middle class group of families, the indicator of literacy might be the amount of time parents spend reading to their children each day.

Though the targets, categories, and dimensions will remain the same, the specific promotional indicators may vary across locations. There are three types of promotional indicators: process indicators, intervening indicators, and outcome indicators. There will not necessarily be one indicator of each type (process, intervening, outcome) for every dimension, and there may be more than one indicator of a given type for a single dimension.

Examples of Targets, Categories, and Dimensions

Young Children
  • Emotional Health: Stability; Trusting Relationship; Social Responsiveness; Child Affect
  • Physical Health: Nutrition - Quality; Nutrition - Quantity; Emergency Health Care; Dental Care; Safety - Accidents; Primary Health Care
  • Education: Peer Interactions; Stimulating Non-Social Environment; Stimulating Social Environment; Learning Valued; Responsive Social Environment; Responsive Non-Social Environment
  • Shelter: Stable

Youth
  • Emotional Health: Nurturance; Self Esteem; Social Connections; Social Competency; Personal Responsibility
  • Physical Health: Nutrition - Quality; Nutrition - Quantity; Health Care Access; Dental Care; Primary Health Care
  • Education: Achievement Motivation; Educational Attainment; Math Competency; Problem Solving
  • Shelter: Stable

Parents
  • Emotional Health: Self Esteem; Social Skills; Social Connections; Parental Efficacy/Control; Communication Skills; Parenting Style
  • Physical Health: Nutrition - Quality; Nutrition - Quantity; Health Care Access; Dental Care; Primary Health Care; Emergency Health Care
  • Transportation: Dependable
  • Education: Advanced Training; Educational Attainment; Education Valued
  • Economic Security: Stable Employment; Quality Employment
  • Shelter: Stable

Community
  • Emotional Health: Family Friendly Work Places; Diversity of Leadership; Caring Neighborhoods
  • Physical Health: Clean Air; Nutrition - Quality; Water Quantity; Infectious Disease
  • Education: Child Care Quantity; Child Care Quality; High School Alternatives; Community Colleges; Technical Schools
  • Economic Security: Employment Benefits; Quality Employment Options
  • Shelter: Safe

Copyright © 2000 Orelena Hawks Puckett Institute


[Figure: Framework for Developing Promotional Targets, Categories, and Dimensions. The diagram relates targets (Young Children, Parenting Adults, Senior Adults) to categories (Physical Health, Social Support, Economic Security) and illustrative dimensions: Sharing, Nurturance, Stability, Empowerment, Self-Esteem; Exercise, Primary Health Care, Dental Care; Nutrition - Quality, Quantity; Literacy, Education Valued, High School Alternatives; Stable Employment, Job Options; Safe, Stable, Affordable; Values, Shared Beliefs.]
Copyright © 2000, Orelena Hawks Puckett Institute

References (1/31/00)

Annie E. Casey Foundation (1999). Kids count data book: State profiles of child well-being. Baltimore, MD: Author.

ASPE Child Indicator Grantees (1999, November). Indicators of school readiness and child care. Presented at the New England Meeting of HHS/ASPE Child Indicators States and the Carnegie Foundation Starting Points Sites, Providence, RI.

Benedict, M., Strobino, D., & Stacy, H. (1997, February). Evaluation of child abuse prevention projects funded in 1992. Final Report: Pennsylvania Children's Trust Fund. Harrisburg, PA.

Bryant, D., Bernier, K., & Taylor, K. (1999, June). Summary of Smart Start Evaluators' Meeting April 30, 1999. Paper presented at the Smart Start evaluators' meeting, Charlotte, NC.

Bruder, M. B. (1997). Social competence in early childhood: The effects of a specific curriculum focus.

Carnegie Corporation of New York. (1994, April). Starting points: Meeting the needs of our youngest children.

Center for Schools and Communities. (1997, May). A report on Pennsylvania's family center initiative. Lemoyne, PA: Author.

Chamberlin, R. W. (1994). Primary prevention: The missing piece in child development legislation. In R. J. Simeonsson (Ed.), Risk, resilience & prevention: Promoting the well-being of all children (pp. 33-53). Baltimore: Paul H. Brookes Publishing Co.

Children 1st. Guidelines for completing the Children 1st family assessment instrument.

Children's Defense Fund (1995, March). Children in the states. Washington, DC: Author.

Children's Defense Fund (1999). The state of America's children yearbook. Washington, DC: Author.

Department of Health and Human Services (1999). Advancing states' child indicator initiatives. Washington, DC: Department of Health and Human Services.

Developmental Research and Programs, Inc. (1993). Risk-focused prevention using the social development strategy: An approach to reducing adolescent problem behaviors. Washington: Author.

Dewar, T. (1997). A guide to evaluating asset-based community development: Lessons, challenges, and opportunities. Chicago: ACTA Publications.

Dunst, C. J., & Trivette, C. M. (1992). Measuring family functioning as an outcome of social action programs: A framework and relevant indicators. Position paper prepared for the Pew Charitable Trusts. Philadelphia.

Dunst, C. J., & Trivette, C. M. (1988). Toward experimental evaluation of the family, infant and preschool program. In H. Weiss & F. Jacobs (Eds.), Evaluating Family Programs (pp. 315-346). New York: Aldine de Gruyter.

Emery, R. E., & Forehand, R. (1996). Parental divorce and children's well-being: A focus on resilience. In R. J. Haggerty, L. R. Sherrod, N. Garmezy, & M. Rutter (Eds.), Stress, risk, and resilience in children and adolescents: Processes, mechanisms, and interventions (pp. 64-99). New York: Cambridge University Press.

FACES. Pages 34-335, 61-62.

Family Resource Coalition of America (1999). Proceedings of the outcome-based accountability and evaluation frameworks.

Federal Interagency Forum on Child and Family Statistics (1997). America's children: Key national indicators of well-being. Washington, DC: Author.

Frank Porter Graham Child Development Center (1999, June). The children of the cost, quality, and outcomes study go to school. [On-line]

Haggerty, R. J., Sherrod, L. R., Garmezy, N., & Rutter, M. (Eds.). (1996). Stress, risk, and resilience in children and adolescents: Processes, mechanisms, and interventions. New York: Cambridge University Press.

Hunt, J. B. (1999). Report to the people. Raleigh, NC: Office of the Governor, State of North Carolina.

Improved Outcomes for Children Project (1994). A start-up list of outcome measures with annotations. Washington, DC: Author.

Iowa Department of Human Services, The Alliance for Statewide Family Resource and Support Initiative, & Family Resource Coalition of America (1998, July). Peer review meeting on evaluation and outcomes. Des Moines, IA: Iowa Department of Human Services.

Jarrett, R. L. (1998). Indicators of family strengths and resilience that influence positive child-youth outcomes in urban neighborhoods: A review of qualitative and ethnographic studies.

Kagan, S. L., & Weissbourd, B. (Eds.) (1994). Putting families first: America's family support movement and the challenge of change. San Francisco: Jossey-Bass.

Knitzer, J., & Page, S. (1996). Map and track: State initiatives for young children and families. New York: National Center for Children in Poverty.

Leffert, N., Benson, P. L., & Roehlkepartain, J. (1997). Starting out right: Developmental assets for children. Minneapolis, MN: Search Institute.

Levey, J. L. (1999). National Center for Service Integration: Reinventing common sense. Des Moines, IA: Child and Family Policy Center.

Littell, J. H. (1986). Building strong foundations: Evaluation strategies for family resource programs. Chicago: Family Resource Coalition.

Love, J. M., Aber, J. L., & Brooks-Gunn, J. (1994). Strategies for assessing community progress toward achieving the first national educational goal. Princeton, NJ: Mathematica Policy Research, Inc.

Lopez, M. E. & Hochberg, M. R. (1993). Paths to school readiness: An in-depth look at three early childhood programs. Cambridge, MA: Harvard Family Research Project.

McCroskey, J. & Meezan, W. (1997). Family preservation & family functioning. Washington, DC: CWLA Press.

McMillen, J. C., & Fisher, R. H. (1998). The perceived benefit scales: Measuring perceived positive life changes after negative events. Social Work Research [On-line], 22(8). Available: EBSCO Item No. 1070-5309.

National Center for Children in Poverty School of Public Health (1990). Five million children: A statistical profile of our poorest young citizens. New York: Author.

National Education Goals Panel (1995). The national goals report: Building a nation of learners (ISBN: 0-16-048364-6). Washington, DC: U.S. Government Printing Office.

North Carolina Child Advocacy Institute (1995). Early childhood index. Raleigh, NC: Author.

Overcoming language and cultural barriers to early intervention. (1999, July). Early Childhood Report. LRP Publications.

Pratt, C., Katzev, A., & Grobe, D. (1999). Building results: From goals to measurable outcomes for Oregon's children and families revised outcomes, 1999. Family Policy Program, Oregon State University.

QLF/Atlantic Center for the Environment (1995). Guide to Sustainable Community Indicators. (p. 36-37/p. 62-63). Ipswich, MA: Author.

Richardson, B., & Landsman, M. J. (1999). Outcomes consultation: Lessons from the field (Part II), Matrix model: The automated assessment of family progress [On-line]. Available:

Rodriguez, G. G., & Cortez, C. P. (1988). The evaluation experience of the AVANCE Parent-Child Education Program. In H. Weiss & F. Jacobs (Eds.), Evaluating Family Programs (pp. 287-313). New York: Aldine de Gruyter.

Roggman, L. A., Moe, S. T., Hart, A. D., & Forthun, L. F. (1994). Family Leisure and social support: Relations with parenting stress and psychological well-being in Head Start parents. Early Childhood Research Quarterly, 9, 463-480.

Rolf, J., Masten, A., Cicchetti, D., Nuechterlein, K. H., & Weintraub, S. (Eds.). (1990). Risk and protective factors in the development of psychopathology. New York: Cambridge University Press.

Rutter, M. Protective factors in children's responses to stress and disadvantage. p. 49-74.

Sameroff, A., Seifer, R., Barocas, R., Zax, M., & Greenspan, S. (1987). Intelligence quotient scores of 4-year-old children: Social-environmental risk factors. Pediatrics, 79(3).

Schorr, L. B. & Schorr, D. (1988). Within our reach: Breaking the cycle of disadvantage. New York: Doubleday.

Simeonsson, R. J. (Ed.). (1994). Risk, resilience & prevention: Promoting the well-being of all children. Baltimore: Paul H. Brookes Publishing Co.

Snell, W. E., & Johnson, G. (1997). The Multidimensional Health Questionnaire. American Journal of Health Behavior, 21(1), 33-42.

Smith, G. C. (1999). Prevention and promotion models of intervention for strengthening aging families. In M. Duffy (Ed.), Handbook of counseling and psychotherapy with older adults (pp. 378-394). New York: Wiley.

State and Community Examples of Outcomes and Indicators for Children and Families, Appendix 2-3.

The Accreditation Council on Services for People with Disabilities (1995). Children have the best possible health. Towson, MD: The Accreditation Council.

The Aspen Institute (1996). Measuring community capacity building: A workbook-in-progress for rural communities. Version 3-96. Washington, DC: BR Publications.

U.S. Department of Health and Human Services. (1990). Healthy People 2000: National health promotion and disease prevention objectives (DHHS Publication No. (PHS) 91-50212). Washington, DC: U.S. Government Printing Office.

United States General Accounting Office Health, Education, and Human Services Division. (1998). Head Start: Challenges in monitoring program quality and demonstrating results (Publication No. GAO/HEHS-98-186). Washington, DC: U.S. Government Printing Office.

Weiss, H. B. (1988). Family support and education programs: Working through ecological theories of human development. In H. Weiss & F. Jacobs (Eds.), Evaluating Family Programs (pp. 3-36). New York: Aldine de Gruyter.

Weiss, H., & Jacobs, F. (Eds.) (1988). Evaluating Family Programs. New York: Aldine de Gruyter.

Appendix B: Promotional Indicators Forum Participants

Nilofer Ahsan
Family Resource Coalition of America
20 N. Wacker Drive, Suite 1100
Chicago, IL 60626
Phone: (312) 338-0900 x126
Fax: (312) 338-1522

Susan Ault
Social Services Supervisor
Cass County Human Services
P.O. Box 519
Walker, MN 56484
Phone: (218) 547-1340

Diane Benjamin
Children's Defense Fund/Minnesota
200 W. University, Suite 210
St. Paul, MN 55103
Phone: (651) 227-6121
Fax: (651) 227-2553

Leesa Betzold
Minnesota Department of Human Services
444 Lafayette Road, North
St. Paul, MN 55155
Phone: (651) 296-2831

Becky Buhler
Minnesota Planning
658 Cedar Street
St. Paul, MN 55155
Phone: (651) 297-5239
Fax: (651) 296-3698

Ruth Curwen Carlson
MCH Principal Planner
Division of Family Health
Minnesota Department of Health
PO Box 64882
St. Paul, MN 55164-0882
Phone: (651) 281-9894
Fax: (651) 215-8953

Betty Cooke
Minnesota Department of Children, Families, and Learning
1500 Highway 36 West
Roseville, MN 55113
Phone: (651) 582-8329
Fax: (651) 582-8494

Wayne Coombs
Director, West Virginia Prevention Resource Center
Marshall University Graduate College
Angus Peyton Drive
South Charleston, WV
Phone: (304) 746-2061
Fax: (304) 746-1942

David Diehl
Evaluation Specialist
Family Resource Coalition of America
328 Wagner Road
Morgantown, WV 26501
Phone: (304) 296-3307
Fax: (304) 296-2992

Dee Gillespie
Family Connections
700 Mitchell Bridge Road, #133
Athens, GA 30606
Phone: (706) 548-4465
Fax: (706) 548-2657

Margaret Gressens
Healthy Anchorage Indicators Project
Municipal Department of Health & Human Services
825 "L" Street
P.O. Box 196650
Anchorage, AK 99519-6650
Phone: (907) 343-4655
Fax: (907) 249-7377

Jeff Hackett
Chapin Hall Center for Children
University of Chicago
1313 E. 60th Street, #2E
Chicago, IL 60637
Phone: (773) 256-5139
Fax: (773) 753-5139

Beth Haney
(as of 2/22/2000)
Minnesota Department of Human Services
444 Lafayette Road, North
St. Paul, MN 55155

Casey Hannan
Assistant to the Director of Adolescent Health
Division of Adolescent School Health
Centers for Disease Control and Prevention
2858 Woodcock Blvd., Room 1037
Chamblee, GA 30341
Phone: (770) 488-3190
Fax: (770) 488-3110

Janel Harris
Research Scientist
Minnesota Department of Health
85 East 7th Place, Suite 400
P.O. Box 64882
St. Paul, MN 55164-0882
Phone: (651) 281-9940
Fax: (651) 215-8953

Marcia Hartsock
Project Director, Hawaii Kids Count
Center on the Family
University of Hawaii at Manoa
College of Tropical Agriculture & Human Resources
2515 Campus Road-Miller 103
Honolulu, HI 96822
Phone: (808) 956-4136
Fax: (808) 956-4147

Terry Haven
Utah Kids Count
757 E. South Temple, Suite 250
Salt Lake City, UT 84102
Phone: (801) 364-1182
Fax: (801) 364-1186

Steve Heasley
Governor's Cabinet on Children and Families
P.O. Box 155
Beverly, WV 26253
Phone: (304) 636-8277
Fax: (708) 575-5800

Monica Herk
Community Partners
3433 Allen Drive
Atlanta, GA 30340-1901
Phone: (770) 454-8182

Carolyn Herrington
Florida Education Policy Studies
Learning Systems Institute
Florida State University
4600 C. University Center
Tallahassee, FL 32306-2540
Phone: (850) 644-2573
Fax: (850) 644-4952
E-mail: cherrington(

Rebekah Hudgins
423 Adams Street
Decatur, GA 30030-5207
Phone: (404) 373-7939
Fax: (404) 373-4908

Jennifer Jewiss
University of Vermont
823 Snipe Ireland Road
Richmond, VT 05477
Phone: (802) 434-4995

Gail Koser
Family Resource Coalition of America
13 Sage Hill Lane North
Albany, NY 12204
Phone: (518) 462-2445
Fax: (518) 462-9098

Michel Lahti
Institute for Public Sector Innovation
Edmund S. Muskie School of Public Service
University of Southern Maine
295 Water Street
Augusta, ME 04330
Phone: (207) 626-5274
Fax: (207) 626-5210

Toni Lang
Policy Analyst
NYS Council on Children and Families
5 Empire State Plaza, Suite 2810
Albany, NY 12223
Phone: (518) 486-9153
Fax: (518) 473-2570

Mike Linder
Minnesota Department of Human Services
444 Lafayette Road, North
St. Paul, MN 55155
Phone: (651) 296-2373
Fax: (651) 297-1949

Carolyn Micklem
FRIENDS Outcome Accountability Project
Chapel Hill Outreach Training Project
11 Altamont Circle, #11
Charlottesville, VA 22902
Phone: (804) 979-8825
Fax: (804) 977-8106

Holly Miller
Minnesota Department of Human Services
444 Lafayette Road, North
St. Paul, MN 55155
Phone: (651) 296-5416

Martha Moorehouse
Office of the Assistant Secretary for Planning and Evaluation
Department of Health & Human Services
Office of Human Services Policy
Division of Children and Youth Policy
Room 404E, Hubert H. Humphrey Building
200 Independence Ave., SW
Washington D.C. 20201
Phone: (202) 690-6939
Fax: (202) 690-5514

David Murphey
Project Coordinator
Vermont Child Indicators Project
Agency of Human Services
Planning Division
103 S. Main Street
Waterbury, VT 05671
Phone: (802) 241-2238
Fax: (802) 241-4461

Reeva Sullivan Murphy
Child Care Administrator
Rhode Island Department of Human Services
Louis Pasteur Building, #57
600 New London Avenue
Cranston, RI 02920
Phone: (401) 462-6875
Fax: (401) 462-6878

Larry Pasti
Community Program Specialist
NYS Office of Children and Family Services
144 Boynton Avenue
Plattsburgh, NY 12901
Phone: (518) 561-8740
Fax: (518) 562-8665

Ann Peisher
University of Georgia
Cooperative Extension Service
226 Hoke Smith Annex
Athens, GA 30602
Phone: (706) 542-2920
Fax: (706) 542-1799

Rita Penza
Utah Child Well-Being Indicators Project
Utah Department of Health
Center for Health Data
Office of Public Health Assessment
288 North 1460 West, P.O. Box 142101
Salt Lake City, UT 84114-2101
Phone: (801) 538-6676
Fax: (801) 536-0947

Debbykay Peterson
Minnesota Department of Children, Families, and Learning
1500 Highway 36 West
Roseville, MN 55113
Phone: (651) 582-8426
Fax: (651) 582-8494

Mairéad Reidy
Chapin Hall Center for Children
University of Chicago
1313 E. 60th Street
Chicago, IL 60637
Phone: (773) 256-5174
Fax: (773) 753-5940

Tonja Rolfson
Children's Mental Health Division
Minnesota Department of Human Services
444 Lafayette Road, North
St. Paul, MN 55155-3860
Phone: (651) 582-1988
Fax: (651) 582-1831

Rob Rosenkrantz
Meridian Consultants
1692 Central Avenue
Albany, NY 12205
Phone: (518) 869-6198
Fax: (518) 869-3429

Oshi Ruelas
Research Program Specialist II
California Department of Social Services
Research and Evaluation Branch
Program Planning and Performance Division
744 P Street, MS 12-56
Sacramento, CA 95814
Phone: (916) 654-2067
Fax: (916) 653-1178

Joseph Ryan
Chapin Hall Center for Children
University of Chicago
1313 E. 60th Street
Chicago, IL 60637
Phone: (773) 256-5180
Fax: (773) 753-5940

Lee Schutz
Minnesota Planning
658 Cedar Street
St. Paul, MN 55155
Phone: (651) 296-9534
Fax: (651) 296-2820

Ann Segal
Deputy Assistant Secretary for Policy Initiatives
ASPE, Room 415F
U.S. Dept of Health and Human Services
200 Independence Ave., SW
Washington D.C. 20201

Pat Seppanen
Center for Applied Research and Educational Improvement
University of Minnesota
265-2 Peik Hall
159 Pillsbury Drive, S.E.
Minneapolis, MN 55455-0208
Phone: (612) 625-6364
Fax: (612) 625-3086

Ada Skyles
Chapin Hall Center for Children
University of Chicago
1313 E. 60th Street
Chicago, IL 60637
Phone: (773) 256-5185
Fax: (773) 753-5940

Kristin Teipel
Adolescent Health Coordinator
Division of Family Health
Minnesota Department of Health
P.O. Box 64882
St. Paul, MN 55164-0882
Phone: (651) 281-9956
Fax: (651) 215-8953

Carol Trivette
Orelena Hawks Puckett Institute
128 S. Sterling Street
P.O. Box 2277
Morganton, NC 28655
Phone: (828) 432-0065
Fax: (828) 432-0068