Developmental Evaluation as Harvesting
Practitioner's Gathering: Harvesting, Evaluation and Research, March 21-23, 2017
Action-Reflection Tension
"We do not learn from experience… we learn from reflecting on experience." ~ John Dewey
Traditional Evaluation
- Suited for: linear, simple, and complicated situations
- Formative: tweaking an established program or model
- Summative: evaluating whether something succeeded or failed (test, prove, and validate)
- Measurement: measures performance and success against predetermined goals (e.g., a logic model)
- Goal: improve proposals, improve reporting, prove effectiveness, produce generalizable findings
Developmental Evaluation (DE)
- Created for: complexity, innovation, changing goals, changing contexts, probing, prototyping (vs. a machine worldview: linear, predictable)
- Measurement: develop new measures and monitoring mechanisms (or change them) as goals emerge and evolve; rapid and real-time
- Goal: deepen a reflective culture of data-driven decisions; produce context-specific insights
DE can provide evidence and justification for a course change.
Sources: Mintzberg; Michael Quinn Patton
Diverge/Converge
DE Can Harvest For…
- Intangibles: e.g., a change in language, a slight change in practice, a document being referenced
- Contribution vs. attribution: e.g., "I did X, which contributed to this larger change alongside others and their actions" vs. "I did X, which caused Y"
- Policy influence: e.g., report written, report discussed, report cited, practice influenced, policy changed
- Making the case for an emergent, participatory methods approach
DE as Harvesting
- Both are about reflection to inform wise action and next steps
- DE can enhance the depth and rigor of data collection
- DE research methods capture data outside of live events (i.e., working with people's availability and bandwidth)
- Data can inform and deepen sense-making at participatory gatherings
When to Use DE… and When Not To (it's up for debate!)
- If you want to make good use of an evaluation required by a funder
- If you need to make the case for participatory methods within a conventional framework
- If you can fund a robust harvesting strategy through developmental evaluation
- In other cases, draw on or be inspired by DE to add rigor to harvesting strategies
Methods for Harvesting Data
- Deep dialogue interviews
- Surveys
- Small, participatory focus groups
- Photovoice
- World Café with a robust harvest
- SenseMaker
- Many more…
Easy DE Tactics
Tactic: 1-hour reflection session
- Take stock; identify tangible and more intangible changes
- What are the key activities we've done?
- What impacts/changes/outcomes have I/we experienced?
- Share together and discuss motivation, understanding, and rationale
Easy DE Tactics
Tactic: Collective sense-making session
- Provide collected data in easily digestible forms (infographic, one-pager, document for review)
- Guiding questions for conversation:
  - What does this data tell you?
  - What surprises you about this data?
  - What factors may explain some of the trends we're seeing?
  - Do these findings lead to any new questions?
  - What key insights can inform us moving forward?
Easy DE Tactics
Tactic: Partner/relationship tracking
- Spreadsheet listing partner, meeting, content, next steps, barriers, strengths, and comments
- Builds institutional memory to pass on
- Look back after a year to understand how things shifted, what worked, and what didn't
- Use what you learn to improve tactics
Easy DE Tactics
Tactic: Add evaluative/reflective thinking in small ways
- Weekly team meetings or quarterly reflection retreats
- What's working? What's not working? What needs to change?
- What's up? So what? Now what?
- Goal: identify, understand, pivot, impact
Reflecting on Data-Driven Action: Questions
- What do I/we need to know in order to make good decisions or have greater impact?
- Where can I/we easily collect data?
- What obstacles, if any, do I/we experience in collecting data?
Resources
- Gamble, J. A. A. (2008). A developmental evaluation primer. The J.W. McConnell Family Foundation.
- Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. Guilford Press.
- Patton, M. Q. (2014). Evaluation flash cards: Embedding evaluative thinking in organizational culture. Otto Bremer Foundation.
- Cobb, M., & Donnelly, G. (2015). Community-based, participatory and developmental evaluation approaches: An introductory toolkit. Ecology Action Centre. Available at: bit.ly/2nsxcRM
Contact
Gabrielle Donnelly, Ph.D. | gabrielle@bravespace.ca
Miranda Cobb, Research and Evaluation Coordinator, Ecology Action Centre | mirandajcobb@gmail.com