Ali Alkhalaf, ALM Zarudeen, Michelle Corby, Yu Kyoung Park. WF ED 597. March 23, 2012

Developmental Evaluation

Definition of developmental evaluation

Developmental evaluation supports the development of social innovations and the adaptation of interventions in complex, dynamic environments, providing innovators with ongoing, real-time evaluative feedback rather than a fixed judgment of a stable program model (Patton, 2011).

Complexity concepts:

- Nonlinearity: Sensitivity to initial conditions, in which small actions can stimulate large reactions; hence the butterfly-wings metaphor (Gleick, 1987).
- Emergence: Innovators can't determine in advance what will happen, so evaluators can't determine in advance what to measure.
- Adaptation: Interacting elements and agents respond and adapt to each other, and to their environment.
- Uncertainty: Under conditions of complexity, processes and outcomes are unpredictable, uncontrollable, and unknowable in advance.
- Coevolution: As interacting and adaptive agents self-organize, ongoing connections emerge that become evolutionary as the agents evolve together (coevolve) within and as part of the whole system, over time (Anderson, 1999).
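The nonlinearity concept can be made concrete with a small numeric sketch. The Python snippet below (our illustration, not part of the original slides) iterates the logistic map, a standard toy model of nonlinear dynamics: two trajectories whose starting points differ by one part in a million become completely uncorrelated within a few dozen steps, which is the sense in which small actions can stimulate large reactions.

    # Illustrative sketch (not from the slides): sensitivity to initial
    # conditions in the logistic map, x -> r * x * (1 - x), with r = 4.0
    # (the chaotic regime).

    def logistic_map(x0, r=4.0, steps=40):
        """Iterate the logistic map from x0 and return the full trajectory."""
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    a = logistic_map(0.300000)   # baseline initial condition
    b = logistic_map(0.300001)   # a "butterfly wing" perturbation of 1e-6

    # The gap between the two trajectories grows from 0.000001 to order 1.
    for step in (0, 10, 20, 30, 40):
        print(f"step {step:2d}: |a - b| = {abs(a[step] - b[step]):.6f}")

For an evaluator, the practical implication is the same as on the slide: if trajectories this simple cannot be forecast from starting conditions, the outcomes of a social innovation cannot be fully specified in advance either.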

Summative Evaluation
- Overall judgment of merit or worth about a stable and fixed program intervention, based on explicit criteria like effectiveness, efficiency, relevance, and sustainability.
- Determine the future of the program and model, including especially whether it should be disseminated as an exemplar and taken to scale.

Formative Evaluation
- Improve the program. Fine-tune the model, clarifying key elements and linkages from inputs to activities and processes to outputs, outcomes, and impacts.
- Determine efficacy and effectiveness at a pilot level to establish readiness for summative evaluation.

Source: Scriven (1991). Beyond formative and summative evaluation.

Summative Evaluation
- Well-defined intervention model supported by an explicit and testable theory of change.
- Clear, specific, measurable, attainable, and time-bound outcomes.
- Asks: Is this an especially effective practice that should be funded and disseminated as a model program, a best practice?

Formative Evaluation
- Draft program model to be fine-tuned.
- Establish criteria for quality implementation to guide and focus process improvement.
- Processes and instruments for getting participant feedback.
- Asks: Is the program model ready for summative evaluation?

Source: Scriven (1991). Beyond formative and summative evaluation.

Purposes of developmental evaluation:

- Support adaptation in complex, uncertain, and dynamic conditions.
- Identify emergent processes and outcomes that accompany innovation, and support making sense of their implications.
- Support ongoing development and adaptation to changing conditions.
- Determine when and if an innovation is ready for formative evaluation as a pilot intervention.
- Help social innovators explore possibilities for addressing major problems and needs, and identify innovative approaches and solutions.

Source: Patton (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use.

Traditional Evaluation
- Formative-summative distinction dominant: formative improves; summative tests, proves, and validates program models; accountability.
- Finding out if a program model works; focus on effectiveness, efficiency, impact, and scalability.
- Outcome-driven; systems viewed as context.

Complexity-Sensitive (Developmental) Evaluation
- Supports development of innovations and adaptation of interventions in dynamic environments.
- Exploring possibilities; generating ideas and trying them out; pre-formal model, so pre-formative; non-summative in that ongoing innovation and development are expected, never arriving at a fixed intervention.
- System-change-driven; specific outcomes emergent and dynamic.

Source: Patton (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use.

References

Anderson, P. (1999). Complexity theory and organization science. Organization Science, 10(3).
Gleick, J. (1987). Chaos: Making a new science. New York, NY: Penguin Books.
Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York, NY: The Guilford Press.
Scriven, M. (1991). Beyond formative and summative evaluation. In M. W. McLaughlin & D. C. Phillips (Eds.), Evaluation and education: At quarter century. Chicago, IL: The University of Chicago Press.