Rapid Evaluation Approaches for Complex Initiatives: Developmental Evaluation

03/31/2014

Created by Michael Q. Patton, Developmental Evaluation (DE) applies complexity concepts to evaluation to support the development of innovations. Information collected through the evaluation provides quick, credible feedback for the adaptive and responsive development of an innovation. The evaluator works with the social innovator to co-create the innovation through an engagement process that involves conceptualizing the social innovation throughout its development, generating inquiry questions, setting priorities for what to observe and track, collecting data, and interpreting the findings together to draw conclusions about next steps, including how to adapt the innovation in response to changing conditions, new learnings, and emerging patterns (Patton 2008, 2011). This evaluation approach involves double-loop learning, in which preliminary theories and assumptions about the innovation are reality-tested and revised.

Developmental evaluation is designed to address specific dynamic contexts, including (1) the ongoing development and adaptation of a new program, strategy, policy, or initiative to new conditions in complex, dynamic environments; (2) the adaptation of effective principles to new contexts; (3) the development of a rapid response to a sudden major change or crisis, such as a natural disaster or economic meltdown; (4) the early development of an innovation into a more fully realized model; and (5) the evaluation of major cross-scale, multilevel, multisector systems change (Patton 2011). The approach can use any kind of qualitative or quantitative data or design, including rapid evaluations. Methods can include, for example, surveys, focus groups, community indicators, organizational network analyses, consumer feedback, observations, and key informant interviews with influential community leaders or policymakers. The frequency of feedback depends on the nature and timing of the innovation: during slow periods, there is less data collection and feedback; when the initiative, such as a policy advocacy campaign, reaches a critical decision point, data collection and feedback become more frequent (Patton 2011).
