Developmental Evaluation in Practice: Lessons and Tips from an Evaluation of the Convergence Partnership
Jme McLean, MCP, MPH – PolicyLink
Kari Cruz, MPH, and Dana Keener Mast, PhD – ICF International
American Evaluation Association | October 27, 2012 | Minneapolis, MN

Developmental Evaluation
"Developmental evaluation emerged in response to the need to support real-time learning in complex and emergent situations. Traditional forms of evaluation work well in situations where the progression from problem to solution can be laid out in a relatively clear sequence of steps (Gamble, 2008). However, initiatives with multiple stakeholders, high levels of innovation, fast-paced decision-making, and areas of uncertainty require more flexible approaches (Patton, 2008)."
– Developmental Evaluation 201: A Practitioner's Guide to Developmental Evaluation, by Elizabeth Dozois, Marc Langlois, and Natasha Blanchet-Cohen

Session Objectives
– Orient participants to Developmental Evaluation and the Convergence Partnership
– Describe how we applied Developmental Evaluation to the Convergence Partnership
– Share our lessons learned and tips for others
– Provide time for discussion and questions

Applying Developmental Evaluation Approaches to the Convergence Partnership
Kari Cruz, MPH, and Dana Keener Mast, PhD – ICF International
American Evaluation Association | October 27, 2012 | Minneapolis, MN

Five Characteristics of Developmental Evaluation
1. DE's focus is on social innovations where there is no accepted model (and might never be one) for solving the problem.
2. Continuous learning is intentionally embedded into the developmental evaluation process.
3. An emergent and adaptive evaluation design ensures that the evaluation has purpose and can respond in nimble ways to emerging issues and questions.
4. The developmental evaluator serves as a strategic learning partner and facilitator, which is a different role for most evaluators and their clients.
5. The developmental evaluator brings a complex systems orientation to the evaluation.
Source: Evaluating Social Innovation by Hallie Preskill and Tanya Beer, 2012

Developmental Evaluator's Role
– An integrated part of the intervention team
– Primarily elucidates the innovation and adaptation processes, tracks their implications and results, and facilitates ongoing, real-time, data-based decision-making

Why is Developmental Evaluation Right for the Convergence Partnership?
The Convergence Partnership:
– Engages multiple partners from various sectors and fields
– Employs environmental and policy change to create communities of healthy people living in healthy places
– Seeks to create major systems changes within institutions, namely philanthropic and governmental institutions
– Adapts and innovates based on data and the changing political and economic environment

Evaluation Priorities
1. Demonstrate the progress and impact of the partnership
– Provide an objective perspective on progress
– Data and conclusions feed stakeholder reporting
2. Provide feedback to partners to inform strategic thinking and decision-making
– Evaluation to inform strategic discussions
– Continuous feedback loop needed

Applying Developmental Evaluation to the Convergence Partnership
– Scope and budget
– Evaluation framework
– Data collection
– Timely reporting
– Interpretation of findings

Scope and Budget
– Scope prioritizes rapid data collection in priority areas, noting progress and outcomes
– Budget ceiling with set deliverables forecasted, then updated
– Room for emergent work, which is negotiated
Lesson learned:
– Build in the necessary time to negotiate emergent work
– Clear communication and trust go a long way in uncertain budget circumstances

Evaluation Framework
– Identified key areas where the partnership expects to effect change
– Developed evaluation questions, indicators, and priority outcomes covering these areas

Evaluation Priority Outcomes
1. Increased intellectual capacity (skill building, knowledge)*
2. New/improved multi-field partnerships*
3. Changes in institutional practices*
4. Leveraged resources*
5. New/improved policies or programs*
*Equity is a cross-cutting aspect that may or may not be reflected in any one of these outcomes

Evaluation Framework (cont.)
Lesson learned:
– Identify what stakeholders want to know and where they expect to see results
– Use a wide lens

Data Collection
Systematic and ongoing:
– Attend key meetings
– Monthly interviews with program staff (or as needed)
– Surveys of participants at meetings, trainings, etc.
– Align reporting mechanisms for grant-funded initiatives
Emergent:
– Key informant interviews at critical junctures (snowball sampling)
– Document review
– Others
Lesson learned: Data collection is also strategic

Interpretation of Findings
Sensemaking:
– Stage 1: provide data summaries
– Stage 2: interpret data with key stakeholders
Lesson learned: Interpretation cannot be done in isolation from stakeholders

Timely Reporting
– Briefer reporting
– Participation and direct communication at standing monthly partner meetings, staff meetings, etc.
Lesson learned:
– Timely reporting means reporting at opportune times (e.g., standing meetings)
– Anticipate or identify decision-making needs
– Semi-annual reports are not enough

Key Challenges
– Changes in scope and audience
– Changes in our evaluation style, interaction, and dynamics with stakeholders
– Defining the boundaries of the evaluation
– Resource limitations
– Uncertainty

Resources

Questions, ideas, or feedback?
Our contact information:
– Jme McLean
– Kari Cruz
– Dana Keener Mast