The Adoption and Safe Families Act of 1997 directed the Secretary of HHS to develop this report to Congress. This report was prepared with the input of the Advisory Panel on Kinship Care, which met in October 1998 and January 1999. The report has two parts. Part I reviews the academic and related research literature on kinship care, including what
This brief highlights the downward trend in the number of children entering the foster care system and the increasing proportion of children who leave care in a timely manner. This is likely due to changes in both federal and state policy, as well as a shift in resources to upfront services. With time in care decreasing for the majority of children
A Temporary Haven: Children and Youth Are Spending Less Time in Foster Care. Time in Care: National Trends and State Differences
While the entry cohort analyses presented above reflect the total amount of time children spend in the foster care system, point-in-time (PIT) data are a snapshot reflecting the amount of time children had been in the system as of a particular date. PIT measures are useful for trend analyses as they provide a picture of what a “typical” case
Length of Time Spent in Foster Care
Data from the Adoption and Foster Care Analysis and Reporting System (AFCARS) for fiscal years 2006 to 2013 were combined to create an eight-year longitudinal file, from which three first-time entry cohorts (2006, 2007, and 2008) were identified. Each cohort was followed for five years to identify how much cumulative time a child spent in the foster care system.
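The cohort construction described above can be sketched in miniature: identify each child's first-ever entry, keep only children whose first entry falls in the cohort's fiscal year, and sum days in care (including re-entries) within a five-year follow-up window. This is an illustrative script using made-up episode records, not the actual AFCARS processing; the record layout, fiscal-year boundaries, and the 365-day-year follow-up cutoff are simplifying assumptions for the example.

```python
from datetime import date, timedelta

# Hypothetical episode records: (child_id, entry_date, exit_date).
# Real AFCARS files carry many more fields; this is a minimal sketch.
episodes = [
    ("A", date(2006, 3, 1), date(2006, 9, 1)),    # first entry in FY2006
    ("A", date(2008, 1, 1), date(2008, 7, 1)),    # re-entry, still counted
    ("B", date(2007, 2, 1), date(2007, 4, 1)),    # first entry in FY2007
    ("C", date(2004, 5, 1), date(2004, 8, 1)),    # pre-2006 entry: not first-time
    ("C", date(2006, 6, 1), date(2006, 12, 1)),
]

FOLLOW_UP_YEARS = 5

def first_entries(episodes):
    """Earliest entry date per child, across all episodes."""
    first = {}
    for child, entry, _ in episodes:
        if child not in first or entry < first[child]:
            first[child] = entry
    return first

def cumulative_days(episodes, cohort_start, cohort_end):
    """Total days in care within five years of first entry, for children
    whose first-ever entry falls inside the cohort window."""
    first = first_entries(episodes)
    cohort = {c: d for c, d in first.items() if cohort_start <= d <= cohort_end}
    totals = {}
    for child, entry, exit_, in episodes:
        if child not in cohort:
            continue  # entered before (or after) this cohort's window
        window_end = cohort[child] + timedelta(days=365 * FOLLOW_UP_YEARS)
        start = max(entry, cohort[child])
        end = min(exit_, window_end)  # truncate episodes at the follow-up cutoff
        if end > start:
            totals[child] = totals.get(child, 0) + (end - start).days
    return totals

# FY2006 cohort (Oct 1, 2005 - Sep 30, 2006): only child A qualifies,
# and both of A's episodes fall inside the five-year window.
print(cumulative_days(episodes, date(2005, 10, 1), date(2006, 9, 30)))
```

Note that a child's re-entry episodes count toward cumulative time in care, which is what distinguishes this entry-cohort measure from a point-in-time snapshot.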
Trends in the Number of Children in Foster Care
Over the past 11 years, the total number of children in foster care has declined markedly, as has the number of children entering the system (either for the first time or re-entering). From 2002 to 2013, the number of children in foster care on September 30th declined from an estimated 524,000 to approximately 402,000, a decrease of 23 percent
This comparative framework is a heuristic tool rather than a prescriptive how-to manual for assigning rapid evaluation methods to different projects. There is no one best rapid evaluation method that works in all circumstances. The right rapid evaluation design addresses the goals of the evaluation and captures the complexities of the intervention
Rapid Evaluation Approaches for Complex Initiatives. VI. Complex Systems Change—Systems Change Evaluation
This section provides an example of a systems change evaluation used in the implementation of the Medicaid and CHIP Learning Collaboratives (MAC LC) project. Table 4 features the unique factors of systems change evaluations. In Table 4, the ten evaluation factors are listed in the left-hand column, and the factors are applied to systemic change projects
V. Organizational Change—Rapid Cycle Evaluation
This section provides an example of an organizational change evaluation used in the implementation of the Partnership for Patients (PfP) program. Table 3 features the unique factors of organizational change evaluations. In Table 3, the ten evaluation factors are listed in the left-hand column, and the factors are applied to organizational change projects
This section provides an example of a process change evaluation used in the implementation of the Safe Surgery Checklist. Table 2 features the unique factors of process change evaluations. In Table 2, the ten evaluation factors described in the previous section are listed in the left-hand column, and the factors are applied to process change projects
Factor 8: Purpose. What are the goals of the evaluation? Are the evaluation questions about implementing and testing the efficacy of a particular best practice or program model in a specific context, or making a judgment of the program’s value? Or, are the questions addressing how best to move forward in a complex initiative?
Factor 2: Intervention complexity. Is the intervention a simple, direct process change, a test of a program model, or a larger initiative addressing multisector, multilevel population or systems change?
Factor 3: Governance structure. Who is funding and overseeing the initiative—a single organization, a federal funder of a cohort of grantees
Factor 1: Situational dynamics. How complex are the dynamics of the context or environment in which the intervention is operating?
When is it appropriate to use each type of rapid evaluation method? The right design is one that best fits the evaluation’s purpose(s) and captures the complexities of the intervention and its environment (Funnell and Rogers 2011). When system dynamics are not considered in an evaluation’s design, the evaluation will inevitably miss crucial aspects
All situations or systems share certain basic attributes or conditions: boundaries, relationships, and perspectives. Together, these conditions generate patterns of system-wide behavior known as situational or system dynamics.
It can be a challenge to identify which rapid evaluation methods are best suited for particular HHS studies. To allow for easier comparison of different rapid evaluation approaches, this framework identifies ten points of comparison: the intervention’s (1) situational dynamics; (2) complexity; (3) governance structure; (4) scale of outcomes; (5)
Developed by John Kania and Mark Kramer of FSG, Collective Impact is a social change process for “highly-structured cross sector coalitions.” In Collective Impact projects, there is “heightened vigilance of multiple organizations looking for resources and innovations through the same lens, learning is continuous, and adoption happens simultaneously
Created by Michael Q. Patton, Developmental Evaluation (DE) applies complexity concepts to evaluation to support innovation development. Information collected through the evaluation is used to provide quick and credible feedback for adaptive and responsive development of an innovation.
Attributed to work done in the 1940s by Kurt Lewin, action research is an iterative research process involving researchers and community stakeholders. The approach creates a cycle of inquiry through three elements: “an ongoing analysis of contextual conditions, discrete actions taken to improve those conditions, and an assessment of the efficacy
Innovators within and outside of government are also seeking to conduct systems change evaluations of large-scale, multisector, multilevel, and community-based initiatives, including those aimed at transforming juvenile justice systems, building healthy communities, and developing and implementing health reforms. Mathematica is currently evaluating