Ripple Effect Mapping: A Participatory Strategy for Measuring Program Impacts

Presentation transcript:

© 2011 Regents of the University of Minnesota. All rights reserved.

Ripple Effect Mapping: A Participatory Strategy for Measuring Program Impacts
Presented at the American Evaluation Association 2012 Annual Conference

Scott Chazdon, Ph.D., Evaluation and Research Specialist, Extension Center for Community Vitality
Lynette Flage, Ph.D., Regional Director, North Dakota State University Extension

Overview

The Challenge of Capturing Impact
- Socially complex interventions
- Requiring collaboration among stakeholders or sectors
- Highly emergent/evolving

Ripple Effect Mapping: A Possible Solution
- Method
- Analysis
- Real-world example

Ripple Effect Mapping

- Purpose: to better understand the intended and unintended results of a program, intervention, or collaborative for individuals, groups, communities, and regions.
- Completed post-program as part of impact evaluation.

Mind Mapping – Radiant Thinking

Pictorial method for:
- Note taking
- Brainstorming
- Organizing
- Problem solving
- Evaluation

Image: Mindmap, Graham Burnett. For more on mind mapping, see Buzan, T. (2003). The Mind Map Book. London: BBC Books.

Ripple Effect Mapping: Related Approaches

- Concept Mapping (Trochim, 1989)
- Mind Mapping (Eppler, 2006)
- Outcome Mapping (Outcome Mapping Learning Community, 2011)
- Participatory Impact Pathway Analysis (Douthwaite et al., 2008)
- Most Significant Change (Davies, 2005)

How Does It Work?

- Identify the intervention
- Schedule the event and invite participants
- Hold the group mapping session
- Conduct follow-up interviews
- Clean, code, and analyze the data

RIPPLE EFFECT MAPPING: Method

Identify the intervention
- High-engagement program or position
- Cross-sector initiative
- Collaboration

Invite a stakeholder group
- Participants
- Non-participant stakeholders
- 12 to 20 participants
- Two moderators

RIPPLE EFFECT MAPPING: Method

Appreciative Inquiry interview
- Conducted in pairs of participants
- Example questions:
  - Tell me a story about how you have used the information from the program.
  - Is there anything resulting from the program that you are proud to share?
  - List an achievement or a success you had based on what you learned.

RIPPLE EFFECT MAPPING: Method

Mapping
- On a wall, or using concept-mapping software with a data projector
- Group cross-validation
- Potential for probing using the Community Capitals Framework
- Probe examples:
  - What happened as a result of this?
  - Did this lead to anything else?
  - Have you seen changes in the natural environment (or substitute another capital)?
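If the map is built in software rather than on a wall, the session's nodes can be saved in a mind-mapping file. The following is a minimal Python sketch, assuming FreeMind's nested <node TEXT="..."> .mm XML format (FreeMind is listed on the software slide near the end of this deck); the program name, ripple items, and file name are invented for illustration.

```python
# Minimal sketch: write a ripple map as a FreeMind .mm file.
# Assumes FreeMind's nested <node TEXT="..."> XML format; the program name,
# ripples, and output file name below are invented for illustration.
import xml.etree.ElementTree as ET

# Each first-order effect maps to the second-order "ripples" named in the session.
ripple_map = {
    "Business breakfasts held": [
        "New networking among downtown businesses",
        "Joint promotion at community events",
    ],
    "City identity workshop": [
        "Key attributes of the city identified",
        "Marketing materials updated",
    ],
}

root = ET.Element("map", version="1.0.1")
center = ET.SubElement(root, "node", TEXT="Hypothetical BR&E program")  # central topic

for first_order, ripples in ripple_map.items():
    branch = ET.SubElement(center, "node", TEXT=first_order)
    for ripple in ripples:
        ET.SubElement(branch, "node", TEXT=ripple)  # nesting = the next ripple outward

ET.ElementTree(root).write("ripple_map.mm", encoding="utf-8")
```

Opening the file in FreeMind (or any tool that reads .mm files) shows the program at the center with ripples radiating outward, which keeps the group's own wording intact for later coding.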

RIPPLE EFFECT MAPPING: Demonstration of the Mapping Process

Think back to the first time you attended an AEA conference.
- Is there anything resulting from attending the conference that you are proud to share?
- List an achievement or a success you had based on what you learned or who you met.

- Floating topics
- Beginning to categorize

Example: Ripple Effect Map of the Naturally Occurring Retirement Communities Collaborative

Example: Ripple Effect Map of the Hugo, MN Business Retention and Expansion (BR&E) program

RIPPLE EFFECT MAPPING: Cleaning, Coding, Analysis

- Organize the map to better identify or combine pathways
- Download the data to Excel for coding
- Code using the Community Capitals Framework and type of outcome:
  - KASA = something learned
  - Behavior change = action taken
  - Impact = change in a system
- Conduct follow-up interviews if more clarity is needed
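As a rough illustration of the coding and tallying step, here is a minimal Python/pandas sketch. It assumes the mapped items have been exported to a spreadsheet with one row per item; the column names, file name, and example rows are illustrative rather than the authors' actual coding template.

```python
# Minimal sketch of tallying coded ripple items by community capital and
# outcome type (KASA = something learned, Behavior change = action taken,
# Impact = change in a system). Column names and rows are illustrative.
import pandas as pd

# In practice the coded map would be read from the coding spreadsheet, e.g.:
# items = pd.read_excel("ripple_map_coded.xlsx")
items = pd.DataFrame([
    {"item": "Learned how to attract residents and businesses",
     "capital": "Human", "outcome_type": "KASA"},
    {"item": "Five business breakfasts held",
     "capital": "Social", "outcome_type": "Behavior change"},
    {"item": "Traffic light installed at Highway 61 and 147th",
     "capital": "Built", "outcome_type": "Impact"},
])

# Cross-tabulate: how many coded items fall under each capital and outcome type.
print(pd.crosstab(items["capital"], items["outcome_type"]))
```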

The Community Capitals Framework (Emery and Flora, 2006)

RIPPLE EFFECT MAPPING: Coding Demonstration

Coding Example (Hugo BR&E ripple map, excerpt)

Spreadsheet columns: First order (core outputs) | Second order ripples | Third order | Fourth order | Human capital effects (knowledge and behavior change) | Social capital effects | Civic effects | Financial effects | Built capital effects | Health, food and nutrition effects | Cultural effects | Natural environment effects. Each mapped item is marked (X) under its ripple order and the community capitals it reflects.

Market the City of Hugo
- City identity workshop
- How to attract residents and businesses (continuing work)
- Have identified key attributes about the City

Create, Coordinate, and Encourage Events
- New position at the City for park and recreation planning
- ~10 new recreation programs
- Hanifl Fields attracted over 20,000 kids
- Entrepreneurial Bootcamp
- Businesses have used City resources
- Promotion opportunities provided for businesses
- Coupons at football tourney

Host Business and Breakfast Workshops
- 5 business breakfasts were held (… attendees)
- Lots of business networking

Address Highway 61 Access Issues and Improve Downtown Hugo
- Installed a traffic light at 61 and 147th
- Removed 4 blighted buildings on 61
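A coding sheet laid out like the excerpt above is "wide": one row per ripple item and one X-marked column per community capital. Below is a minimal pandas sketch of reshaping it for counting; the rows and X placements shown are hypothetical, not the actual Hugo coding.

```python
# Minimal sketch: reshape a wide, X-marked coding sheet into one row per
# (item, capital) pair and count items per capital. Rows and marks are hypothetical.
import pandas as pd

wide = pd.DataFrame([
    {"item": "City identity workshop",      "Human": "X", "Social": "X", "Civic": ""},
    {"item": "~10 new recreation programs", "Human": "",  "Social": "X", "Civic": "X"},
])

# Melt capital columns into rows, then keep only the cells marked with an X.
long = wide.melt(id_vars="item", var_name="capital", value_name="marked")
long = long[long["marked"] == "X"].drop(columns="marked")

print(long.groupby("capital")["item"].count())  # coded items per capital
```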

Reporting Example (Hugo BR&E program)

Direct vs. Indirect Impacts

- Collaboratives and high-engagement programs often build social capital, but don't take credit for it.
- People do not act in isolation: strengthened social capital is a necessary precondition for other impacts.
- Other impacts may occur that were not foreseen in the program theory.

Benefits

- Simple and inexpensive retrospective tool
- Captures impacts of complex or evolving work
- Captures intended and unintended impacts
- Participatory, appreciative approach that engages stakeholders
- Group validation of results

Limitations

- Risk of bias in participant selection and data collection
- Participants may not have complete information about a program or its outcomes
- Potential for inconsistency in implementation

Suggestions

- Use the same facilitator, recorder, and "mapper"
- Expect to probe for responses; think about some of those probes beforehand
- Decide before the mapping session whether to use the community capitals as probes during the group interviews
- Be clear that no single organization is trying to take credit for all of the change
- It is important to probe for negatives

Lessons Learned Thus Far

- Find the right balance between breadth and depth
- Schedule the event alongside another activity
- Invest effort in recruitment and in explaining the process
- Choose a good setting (not too informal)
- Use external facilitators, not program staff

Hopes for REM

- A broader community of practice
- Extension professionals and others will use this regularly as an evaluation tool
- Start a discussion in the evaluation literature

Mind Mapping Software

- FreeMind (free)
- iMindMap ($99–$225)

References

- Baker, B., Calvert, M., Emery, M., Enfield, R., & Williams, B. (2011). Mapping the impact of youth on community development: What are we learning? [PowerPoint slides].
- Buzan, T. (2003). The mind map book. London: BBC Books.
- Cooperrider, D. L., & Whitney, D. Appreciative inquiry: A positive revolution in change. In P. Holman & T. Devane (Eds.), The change handbook (2nd ed.). San Francisco: Berrett-Koehler Publishers.
- Douthwaite, B., Alvarez, S., Thiele, G., & MacKay, R. (2008). Participatory impact pathways analysis: A practical method for project planning and evaluation. ILAC Brief 17.
- Emery, M., & Flora, C. B. (2006). Spiraling-up: Mapping community transformation with community capitals framework. Community Development: Journal of the Community Development Society, 37(1).
- Eppler, M. J. (2006). A comparison between concept maps, mind maps, conceptual diagrams, and visual metaphors as complementary tools for knowledge construction and sharing. Information Visualization, 5.
- Hansen Kollock, D. A., Flage, L., Chazdon, S., Paine, N., & Higgins, L. (2012). Ripple effect mapping: A "radiant" way to capture program impacts. Forthcoming in Journal of Extension (www.joe.org).
- Hearn, S. (2010). Introduction to outcome mapping [presentation].
- Kollock, D. A. (2011). Ripple effects mapping for evaluation. Washington State University curriculum. Pullman, WA.
- Outcome Mapping Learning Community. (2011).

Contact Information

Scott Chazdon, Ph.D.
Evaluation and Research Specialist
Center for Community Vitality
University of Minnesota Extension

© 2011 Regents of the University of Minnesota. All rights reserved. The University of Minnesota is an equal opportunity educator and employer. This PowerPoint is available in alternative formats upon request. Direct requests to …

Thank you!