Developmental Evaluation as Harvesting


Developmental Evaluation as Harvesting
Practitioner's Gathering: Harvesting, Evaluation and Research, March 21-23, 2017

Action-Reflection Tension
"We do not learn from experience… we learn from reflecting on experience." ~John Dewey

Traditional Evaluation
Suited for: linear, simple, and complicated situations
Formative: tweaking an established program or model
Summative: evaluating whether something succeeded or failed (test, prove, validate)
Measurement: measures performance and success against predetermined goals (e.g., a logic model)
Goal: improve proposals, improve reporting, prove effectiveness, produce generalizable findings

Developmental Evaluation (DE)
Created for: complexity, innovation, changing goals, changing contexts, probing, prototyping (vs. a machine worldview: linear, predictable)
Measurement: develop new measures and monitoring mechanisms (or change them) as goals emerge and evolve; rapid and real-time
Goal: deepen a reflective culture of data-driven decisions; produce context-specific insights

DE can provide evidence and justification for a course change
Sources: Mintzberg; Michael Quinn Patton

Diverge/Converge

DE Can Harvest For…
Intangibles: e.g., a change in language, a slight change in practice, a document being referenced
Contribution vs. attribution: e.g., "I did X, which contributed to this larger change alongside others and their actions" vs. "I did X, which caused Y"
Policy influence: e.g., a report written, discussed, or cited; practice influenced; policy changed; making the case for emergent, participatory methods

DE as Harvesting
Both are about reflection to inform wise action and next steps
DE can enhance the depth and rigor of data collection
DE research methods capture data outside of live events (i.e., working with people's availability and bandwidth)
Data can inform and deepen sense-making at participatory gatherings

When to Use DE… and When Not to (it's up for debate!)
If you want to make good use of an evaluation required by a funder
If you need to make the case for participatory methods within a conventional framework
If you can fund a robust harvesting strategy through developmental evaluation
In other cases, draw on or be inspired by DE to add rigor to harvesting strategies

Methods for Harvesting Data
Deep dialogue interviews
Surveys
Small, participatory focus groups
Photovoice
World Café with a robust harvest
SenseMaker
Many more…

Easy DE Tactics
Tactic: 1-hour reflection session
Take stock; identify tangible and more intangible changes
What are the key activities we've done?
What impacts/changes/outcomes have I/we experienced?
Share together and discuss motivation, understanding, and rationale

Easy DE Tactics
Tactic: Collective sense-making session
Provide collected data in easily digestible forms (infographic, one-pager, document for review)
Guiding questions for conversation:
What does this data tell you?
What surprises you about this data?
What factors might explain some of the trends we're seeing?
Do these findings lead to any new questions?
What key insights can inform us moving forward?

Easy DE Tactics
Tactic: Partner/relationship tracking
An Excel sheet listing partner, meeting, content, next steps, barriers, strengths, and comments
Builds institutional memory to pass on
Look back after a year to understand how things shifted, what worked, and what didn't
Improve tactics
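For teams without Excel, the same tracking sheet can be kept as a plain CSV file. A minimal sketch, assuming the column names from the slide above; the filename and helper function are illustrative, not part of the original deck:

```python
import csv
from pathlib import Path

# Columns taken from the slide; order and names are this sketch's assumption.
FIELDS = ["partner", "meeting", "content", "next_steps",
          "barriers", "strengths", "comments"]

def log_partner_touchpoint(path, row):
    """Append one partner interaction to the tracking sheet,
    writing a header row first if the file is new."""
    is_new = not Path(path).exists()
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(row)

# Hypothetical example entry for illustration only.
log_partner_touchpoint("partner_tracking.csv", {
    "partner": "Ecology Action Centre",
    "meeting": "2017-03-21 check-in",
    "content": "Discussed harvesting strategy",
    "next_steps": "Share draft survey",
    "barriers": "Limited staff time",
    "strengths": "Strong trust",
    "comments": "",
})
```

Because each interaction is appended as a dated row, the year-end look-back described above becomes a simple scan of the file.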

Easy DE Tactics
Tactic: Add evaluative/reflective thinking in small ways
At weekly team meetings or quarterly reflection retreats:
What's working? What's not working? What needs to change?
What's up? So what? Now what?
Goal: identify, understand, pivot, impact

Reflecting on Data-Driven Action
Questions:
What do I/we need to know in order to make good decisions or have greater impact?
Where do and can I/we easily collect data?
What obstacles, if any, do I/we experience in collecting data?

Resources
Gamble, J. A. A. (2008). A developmental evaluation primer. The J.W. McConnell Family Foundation.
Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. Guilford Press.
Patton, M. Q. (2014). Evaluation flash cards: Embedding evaluative thinking in organizational culture. Otto Bremer Foundation.
Cobb, M., & Donnelly, G. (2015). Community-based, participatory and developmental evaluation approaches: An introductory toolkit. Ecology Action Centre. Find at: bit.ly/2nsxcRM

Contact
Gabrielle Donnelly, Ph.D.
gabrielle@bravespace.ca
Miranda Cobb, Research and Evaluation Coordinator, Ecology Action Centre
mirandajcobb@gmail.com