Logic Models: How to Develop, Link to M&E and Adapt
Evaluating Int'l Development Projects: One-Day Skills Building Workshop on M&E
Cornell International Institute for Food and Agriculture Development
November 5, 2011
Lesli Hoey, PhD Candidate, Cornell Department of City and Regional Planning
Outline
1. How to develop a logic model
2. Using logic models to design M&E
3. M&E across program phases
4. Linear vs. complex interventions
Developing a Logic Model
Step 1: Purpose and use. Why are you developing a logic model? Who will use it? How?
Step 2: Involve others. Who should participate in creating the logic model?
Step 3: Set the boundaries. What will the logic model depict: a single, focused endeavor; a comprehensive initiative; a collaborative process? What level of detail is needed?
Step 4: Understand the situation. What is the situation giving rise to the intervention? What do we know about the problem/audience/context?
Adapted from: Taylor-Powell and Henert, 2008
Process Options
1) Everyone identifies resources, activities, participants and outcomes on post-it notes arranged on a wall. Check for "if-then" relationships, edit duplicates, identify gaps, etc. (the sketch after this list makes the if-then check explicit).
2) Small subgroups develop their own logic model of the program. The whole group merges these into one.
3) Participants bring a list of program outcomes. Sort into short- and long-term outcomes by target group. Edit duplicates, identify gaps, etc. Discuss assumptions about the chain of outcomes and external factors. Link resources and activities.
4) Use web-based systems, e-mail or other distance methods.
5) A subcommittee creates the model and reviews it with others.
Adapted from: Taylor-Powell and Henert, 2008
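The "if-then" check in option 1 has a simple mechanical structure. As a purely illustrative sketch (not part of the original workshop materials; all component names and contents below are invented), the Python snippet stores a logic model as an ordered chain of components and prints its if-then reading:

```python
# Illustrative sketch: a logic model as an ordered chain of components.
# All component names and example contents are hypothetical.
logic_model = [
    ("Resources", ["extension staff", "grant funding"]),
    ("Activities", ["train farmers in nutrient management"]),
    ("Outputs", ["120 farmers complete the training"]),
    ("Short-term outcomes", ["farmers understand runoff risks"]),
    ("Mid-term outcomes", ["farmers adopt nutrient management plans"]),
    ("Long-term outcomes", ["improved local water quality"]),
]

# Walk adjacent pairs and print the "if-then" reading of the chain,
# the same relationship the post-it exercise checks by eye.
for (step, items), (next_step, next_items) in zip(logic_model, logic_model[1:]):
    print(f"IF {step}: {'; '.join(items)}")
    print(f"THEN {next_step}: {'; '.join(next_items)}")
```

Laying the chain out this way makes gaps and duplicates easy to spot before the group commits to a final diagram.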
Logic Models & Evaluation
Helps us match evaluation to the program.
Helps us know what and when to measure: Are you interested in process and/or outcomes?
Helps us focus on key, important information: Where will you spend limited evaluation resources? What do we really need to know?
Source: Taylor-Powell and Henert, 2008
Types of Evaluation Mapped Across the Logic Model
Needs/asset assessment: What are the characteristics, needs and priorities of the target population? What are potential barriers/facilitators? What is most appropriate to do?
Process evaluation: How is the program implemented? Are activities delivered as intended, with fidelity to the design? Are participants being reached as intended? What are participant reactions?
Outcome evaluation: To what extent are desired changes occurring? Are goals being met? Who is benefiting or not benefiting, and how? What seems to work? Not work? What are unintended outcomes?
Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Source: Taylor-Powell and Henert, 2008
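One way to keep this mapping handy when planning an evaluation is as a small lookup from logic model stage to evaluation type and its lead question. The Python sketch below simply restates this slide as data (stage names abbreviated, questions paraphrased); it is illustrative, not part of the original materials:

```python
# Illustrative sketch: evaluation types keyed to the logic model stage
# they interrogate. Stage names and question wording are paraphrased.
evaluation_map = {
    "situation":  ("Needs/asset assessment",
                   "What are the needs and priorities of the target population?"),
    "activities": ("Process evaluation",
                   "Are activities delivered, and participants reached, as intended?"),
    "outcomes":   ("Outcome evaluation",
                   "To what extent are desired changes occurring, and for whom?"),
    "impact":     ("Impact evaluation",
                   "What changes can be attributed to the program, net of other factors?"),
}

for stage, (eval_type, question) in evaluation_map.items():
    print(f"{stage:>10} -> {eval_type}: {question}")
```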
Water Quality Project Example
[Slide shows a table mapping the project's logic model to formative evaluation questions, summative evaluation questions, and indicators.]
Source: Taylor-Powell, 2002
Program Phases and Evaluation
(Evaluation shifts from formative toward summative as a program moves through these phases.)
Initiation: Need dynamic, flexible, rapid feedback about implementation and process. Includes monitoring, post-only feedback, unstructured observation, sharing of implementation experiences. Mostly qualitative.
Development: Focus on observation, assessment of change in key outcomes, emerging consistency. Includes pre-post differences. Qualitative or quantitative.
Mature: When a program is routinized and stable, compare outcomes with expectations, with performance in alternative programs, or with sites that have no program. Includes experimental and quasi-experimental designs, and more structured and comparative qualitative approaches.
Dissemination: Focused on transferability, generalizability or external validity. Measure consistency of outcomes across different settings, populations or program variations.
Source: Trochim, 2006
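The "pre-post differences" mentioned for the Development phase can be as simple as a paired comparison of each participant's outcome measure before and after the program. A minimal sketch, assuming SciPy is available; all scores are invented for illustration:

```python
# Minimal sketch of a pre-post outcome comparison for a program in its
# Development phase. The scores below are invented for illustration.
from scipy.stats import ttest_rel

pre  = [52, 61, 48, 70, 55, 63, 58, 49]  # baseline score per participant
post = [60, 66, 55, 74, 61, 70, 63, 56]  # same participants after the program

mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
t_stat, p_value = ttest_rel(post, pre)   # paired t-test on the differences

print(f"mean change = {mean_change:.1f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```

A design like this shows change but not attribution, which is why the Mature phase adds comparison programs and sites, and experimental or quasi-experimental designs.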
Three Ways of Conceptualizing and Mapping Theories of Change
1. Linear Newtonian causality
2. Interdependent systems relationships
3. Complex nonlinear dynamics
Source: Patton, 2008
Interdependent Systems Relationships
[Diagram: four departments (Dept 1 through Dept 4) whose outputs feed into shared short-term, mid-term and long-term outcomes.]
Adapted from Chapel, 2006, in Taylor-Powell and Henert, 2008
Complex, Non-Linear Intervention
[Diagram: interlinked conditions feeding effective advocacy: a solid knowledge and research base; strong, high-capacity coalitions; strong, timely national/grassroots coordination; opportunistic lobbying and judicial engagement; a disciplined, focused message and effective communications; and collaborating funders with strategic funding.]
Source: Patton, 2008
Conditions That Challenge Traditional Model-Testing Evaluation
High innovation
Ongoing development
High uncertainty
Dynamic, rapid change
Emergent (difficult to plan and predict)
Systems change
Interdependence
Adaptive management
Adapted from: Patton, 2008
Ideal Type Evaluation Models
Traditional | Developmental
Tests models | Supports innovation and adaptation
Renders definitive judgment of success or failure | Provides feedback, generates learning and affirms changes in a certain direction
Measures success against predetermined goals | Develops new measures and monitoring mechanisms as goals emerge and evolve
Evaluator is external and objective | Evaluator is part of the team, a 'learning coach'
Evaluator determines the design | Evaluator collaborates on the design
Design is based on a linear cause-effect model | Design captures system dynamics, interdependencies and emergent interconnections
Aims to produce findings generalizable across time and space | Aims to produce context-specific understanding to inform ongoing innovation
Accountability directed externally, to control | Accountability focused on commitment to learning, responding to lack of control
Engenders fear of failure | Engenders desire to learn
Adapted from: Patton, 2008
Useful Resources
See the CIIFAD website for evaluation institutes and WMU.
Visit the University of Wisconsin-Extension website.
Look at these books:
Bamberger, M., Rugh, J. and Mabry, L. 2011 (2nd Ed.). Real World Evaluation: Working Under Budget, Time, Data, and Political Constraints. Los Angeles: Sage.
Patton, M.Q. 2008 (4th Ed.). Utilization-Focused Evaluation. Los Angeles: Sage.
Patton, M.Q. 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press.
Williams, B. and Imam, I. 2006. Systems Concepts in Evaluation: An Expert Anthology. Point Reyes, CA: EdgePress/AEA.
World Bank. 2006. Conducting Quality Impact Evaluations Under Budget, Time and Data Constraints. Washington, DC: Author.
References Cited
Patton, M.Q. 2008. "Evaluating the Complex: Getting to Maybe." PowerPoint presented in Oslo, Norway. Available online: aidontheedge.files.wordpress.com/2009/09/patton_oslo.ppt
Taylor-Powell, E. and Henert, E. 2008. "Developing a Logic Model: Teaching and Training Guide." Madison: University of Wisconsin-Extension.
Trochim, W. 2007. "Evolutionary Perspectives on Evaluation: Theoretical and Practical Implications." Paper presented at the Colorado Evaluation Network.