Logic Models: How to Develop, Link to M&E and Adapt (Evaluating Int'l Development Projects: One-Day Skills Building Workshop on M&E)

Presentation transcript:

Logic Models: How to Develop, Link to M&E and Adapt
Evaluating Int'l Development Projects: One-Day Skills Building Workshop on M&E
Cornell International Institute for Food and Agriculture Development
November 5, 2011
Lesli Hoey, PhD Candidate, Cornell Department of City and Regional Planning

Outline
1. How to develop a logic model
2. Using logic models to design M&E
3. M&E across program phases
4. Linear vs. complex interventions

Developing a Logic Model
Step 1: Purpose and use. Why are you developing a logic model? Who will use it? How?
Step 2: Involve others. Who should participate in creating the logic model?
Step 3: Set the boundaries for the logic model. What will the logic model depict: a single, focused endeavor; a comprehensive initiative; a collaborative process? What level of detail is needed?
Step 4: Understand the situation. What is the situation giving rise to the intervention? What do we know about the problem/audience/context?
Adapted from: Taylor-Powell and Henert, 2008

Process Options
1) Everyone identifies resources, activities, participants and outcomes on post-it notes arranged on a wall. Check for "if-then" relationships, edit duplicates, identify gaps, etc.
2) Small subgroups develop their own logic model of the program. The whole group merges these into one.
3) Participants bring a list of program outcomes. Sort them into short- and long-term outcomes by target group. Edit duplicates, identify gaps, etc. Discuss assumptions about the chain of outcomes and external factors. Link resources and activities.
4) Use web-based systems or other distance methods.
5) A subcommittee creates the model and reviews it with others.
Adapted from: Taylor-Powell and Henert, 2008
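The "if-then" chain that the group checks for in option 1 can also be sketched in a few lines of code. Below is a minimal, illustrative Python sketch, not part of the original workshop materials; the class, field names, and the community nutrition program entries are hypothetical examples.

    from dataclasses import dataclass

    @dataclass
    class LogicModel:
        """Holds the components of a simple linear logic model."""
        inputs: list
        activities: list
        outputs: list
        short_term_outcomes: list
        mid_term_outcomes: list
        long_term_outcomes: list

        def chain(self):
            # Order mirrors the "if-then" logic: if inputs, then activities,
            # then outputs, then short-, mid-, and long-term outcomes.
            return [
                ("inputs", self.inputs),
                ("activities", self.activities),
                ("outputs", self.outputs),
                ("short-term outcomes", self.short_term_outcomes),
                ("mid-term outcomes", self.mid_term_outcomes),
                ("long-term outcomes", self.long_term_outcomes),
            ]

        def gaps(self):
            # Flag any empty link so the group knows where to keep working.
            return [label for label, items in self.chain() if not items]

    # Hypothetical example for a community nutrition program
    model = LogicModel(
        inputs=["staff", "curriculum", "funding"],
        activities=["train educators", "deliver workshops"],
        outputs=["20 workshops held", "400 caregivers reached"],
        short_term_outcomes=["increased knowledge of infant feeding"],
        mid_term_outcomes=["improved feeding practices"],
        long_term_outcomes=[],  # deliberately left empty to show the gap check
    )
    print(model.gaps())  # prints: ['long-term outcomes']

A structure like this simply makes the chain explicit; the substantive work of agreeing on assumptions and external factors still happens in the group process described above.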

Logic Models & Evaluation
Helps us match evaluation to the program.
Helps us know what and when to measure: Are you interested in process and/or outcomes?
Helps us focus on key, important information: Where will you spend limited evaluation resources? What do we really need to know?
Source: Taylor-Powell and Henert, 2008

Types of Evaluation Mapped Across the Logic Model
Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are potential barriers and facilitators? What is most appropriate to do?
Process evaluation: How is the program implemented? Are activities delivered as intended, and with fidelity? Are participants being reached as intended? What are participant reactions?
Outcome evaluation: To what extent are desired changes occurring? Are goals being met? Who is benefiting or not benefiting, and how? What seems to work or not work? What are unintended outcomes?
Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Source: Taylor-Powell and Henert, 2008
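One way to carry this mapping into an M&E plan is as a simple lookup from logic model stage to evaluation type and guiding questions. The Python sketch below is illustrative only; the stage labels are assumptions of this write-up and the questions paraphrase the slide.

    # Illustrative mapping: logic model stage -> (evaluation type, guiding question).
    EVALUATION_MAP = {
        "situation / needs": (
            "needs/asset assessment",
            "What are the needs, priorities, barriers and facilitators?"),
        "activities / outputs": (
            "process evaluation",
            "Are activities delivered and participants reached as intended?"),
        "short- and mid-term outcomes": (
            "outcome evaluation",
            "To what extent are desired (and unintended) changes occurring?"),
        "long-term outcomes / impact": (
            "impact evaluation",
            "Can changes be attributed to the program, and are they worth the cost?"),
    }

    def evaluation_for(stage: str) -> str:
        """Return the evaluation type and guiding question for a logic model stage."""
        eval_type, question = EVALUATION_MAP[stage]
        return f"{eval_type}: {question}"

    print(evaluation_for("activities / outputs"))
    # process evaluation: Are activities delivered and participants reached as intended?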

Water Quality Project Example
[Slide shows a worked table with columns for formative evaluation questions, summative evaluation questions, and indicators.]
Source: Taylor-Powell, 2002

Program Phases and Evaluation
Initiation (formative): Need dynamic, flexible, rapid feedback about implementation and process. Includes monitoring, post-only feedback, unstructured observation, and sharing of implementation experiences. Mostly qualitative.
Development (formative): Focus on observation, assessment of change in key outcomes, and emerging consistency. Includes pre-post differences. Qualitative or quantitative.
Mature (summative): When a program is routinized and stable, compare outcomes with expectations, with performance in alternative programs, or with sites with no program. Includes experimental and quasi-experimental designs, and more structured and comparative qualitative approaches.
Dissemination (summative): Focused on transferability, generalizability, or external validity. Measure consistency of outcomes across different settings, populations, or program variations.
Source: Trochim, 2006
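To make the phase-to-design pairing concrete, here is a small illustrative Python helper; it is an assumption of this write-up rather than part of Trochim's paper, and the phase names and design lists simply restate the slide.

    # Illustrative lookup: program phase -> (evaluation emphasis, candidate designs).
    PHASE_DESIGNS = {
        "initiation": ("formative",
                       ["monitoring", "post-only feedback", "unstructured observation",
                        "sharing implementation experiences (mostly qualitative)"]),
        "development": ("formative",
                        ["pre-post comparisons of key outcomes (qualitative or quantitative)"]),
        "mature": ("summative",
                   ["experimental designs", "quasi-experimental designs",
                    "structured, comparative qualitative approaches"]),
        "dissemination": ("summative",
                          ["consistency of outcomes across settings, populations or program variations"]),
    }

    def suggest_designs(phase: str):
        """Return the evaluation emphasis and candidate designs for a program phase."""
        emphasis, designs = PHASE_DESIGNS[phase]
        return emphasis, designs

    emphasis, designs = suggest_designs("mature")
    print(emphasis, designs)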

Three Ways of Conceptualizing and Mapping Theories of Change
1. Linear Newtonian causality
2. Interdependent systems relationships
3. Complex nonlinear dynamics
Source: Patton, 2008

Interdependent Systems Relationships
[Diagram: outputs from four departments (Dept 1-4) feed into shared short-term, mid-term, and long-term outcomes.]
Adapted from Chapel, 2006, in Taylor-Powell and Henert, 2008

Complex, Non-Linear Intervention
[Diagram: interconnected elements contributing to effective advocacy: strong, timely national/grassroots coordination; opportunistic lobbying and judicial engagement; strong, high-capacity coalitions; a disciplined, focused message and effective communications; collaborating funders and strategic funding; a solid knowledge and research base.]
Source: Patton, 2008

Conditions That Challenge Traditional Model-Testing Evaluation
- High innovation
- Ongoing development
- High uncertainty
- Dynamic, rapid change
- Emergent (difficult to plan and predict)
- Systems change
- Interdependence
- Adaptive management
Adapted from: Patton, 2008

Ideal Type Evaluation Models (Adapted from: Patton, 2008)
Traditional | Developmental
Tests models | Supports innovation and adaptation
Renders definitive judgments of success or failure | Provides feedback, generates learning, and affirms changes in a given direction
Measures success against predetermined goals | Develops new measures and monitoring mechanisms as goals emerge and evolve
Evaluator is external and objective | Evaluator is part of the team, a 'learning coach'
Evaluator determines the design | Evaluator collaborates on the design
Design based on a linear cause-effect model | Design captures system dynamics, interdependencies, and emergent interconnections
Aims to produce generalizable findings across time and space | Aims to produce context-specific understanding to inform ongoing innovation
Accountability directed externally, toward control | Accountability focused on commitment to learning, for responding to lack of control
Engenders fear of failure | Engenders desire to learn

Useful Resources
See the CIIFAD website for evaluation institutes and WMU.
Visit the University of Wisconsin-Extension website.
Look at these books:
Bamberger, M., Rugh, J. and Mabry, L. (2nd Ed.). Real World Evaluation: Working Under Budget, Time, Data, and Political Constraints. Los Angeles: Sage.
Patton, M.Q. (4th Ed.). Utilization-Focused Evaluation. Los Angeles: Sage.
Patton, M.Q. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press.
Williams, B. and Imam, I. Systems Concepts in Evaluation: An Expert Anthology. Point Reyes, CA: Edge Press/AEA.
World Bank. Conducting Quality Impact Evaluations Under Budget, Time and Data Constraints. Washington, DC: Author.

References Cited
Patton, M.Q. "Evaluating the complex: Getting to maybe." PowerPoint presented in Oslo, Norway. Available online: aidontheedge.files.wordpress.com/2009/09/patton_oslo.ppt
Taylor-Powell, E. and Henert, E. (2008). "Developing a logic model: Teaching and training guide." Madison: University of Wisconsin-Extension.
Trochim, W. "Evolutionary perspectives on evaluation: Theoretical and practical implications." Paper presented at the Colorado Evaluation Network.