MODULE 8: PROJECT TRACKING AND EVALUATION


MODULE 8: PROJECT TRACKING AND EVALUATION
Rehabilitation Engineering Research Center on Technology Transfer (T2 RERC) Training Modules on Technology Transfer

Objective of the Module: Participants will: explain basic concepts and terms used in program evaluation and research; define indicators of effectiveness and efficiency for a project of their interest; suggest methods to track those indicators; and identify how the results would be used for project monitoring and evaluation.

BASIC CONCEPTS AND TERMS
Research and Evaluation
Merit and Worth
Management Orientation: Evaluation for Development
Effectiveness and Efficiency
Formative and Summative Evaluation
Outcome and Impact

Merit and Worth
Merit refers to the intrinsic quality of a person, product, or system: its potential, or promise, to be "valued" in its broader context. Ex: program inputs (resources, design, theory), processes (operations), and outcomes indicate its merit.
Worth refers to the extrinsic value of a person, product, or system: its actual "value" in the broader context. Ex: program effects and impacts on internal and external environments indicate its worth.

Fig 2. Evaluation and the Planning Cycle. [Diagram: the planning cycle and the evaluation cycle run in parallel.] Context evaluation asks "What needs to be done?" (defining the objective); Input evaluation asks "How to do it?" (defining resources); Process evaluation asks "What is the optimal path to get to the goal?" (process improvement); Product evaluation asks "Are we there yet?" and whether to recycle the output or disseminate it.

System characteristics: INPUT → PROCESS → OUTPUT, with FEEDBACK looping from the output back to the input.
Effective: gets the outcome at the desired quality/quantity level. Efficient: maximizes effectiveness while minimizing cost.
A system: uses the "input" resources; "processes" them (facilitates and manages the dynamics of their interaction); works toward a pre-defined objective (i.e., effort directed at achieving the corresponding output); and self-regulates (i.e., checks and obtains feedback on the product taking shape, constantly adjusting its processes so the desired product quality is obtained).
Efficiency = Output / Input
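As a rough illustration of this feedback loop and the Efficiency = Output / Input ratio, the short Python sketch below runs a toy input-process-output cycle; the unit counts, yield rate, and target are invented for the example and are not part of the module.

```python
# Minimal sketch of an input -> process -> output system that self-regulates
# via feedback and reports a simple efficiency ratio. All numbers are illustrative.

def run_system(input_units: float, target_output: float,
               yield_rate: float = 0.6, max_cycles: int = 10) -> dict:
    """Process input, check output against the target (feedback), and keep
    committing input until the desired output level is reached."""
    output = 0.0
    used_input = 0.0
    for _ in range(max_cycles):
        used_input += input_units
        output += input_units * yield_rate      # the "process" step
        if output >= target_output:             # feedback check (self-regulation)
            break
    return {
        "effective": output >= target_output,   # desired quantity level reached?
        "efficiency": output / used_input,      # Efficiency = Output / Input
        "output": output,
        "input_used": used_input,
    }

print(run_system(input_units=10, target_output=24))
```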

Formative vs. Summative Evaluation
FORMATIVE: assesses the quality of the outcome during the process, while the outcome is taking shape; provides a basis for improving or maintaining program actions.
SUMMATIVE: assesses the quality of the outcome at the end of the process, when the outcome is in its final stage; provides a basis for establishing program effectiveness and efficiency (accountability).
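To make the timing distinction concrete, here is a hedged Python sketch, with invented milestone figures, that reads the same outcome indicator formatively (mid-process, to prompt adjustments) and summatively (at the end, to report against the target).

```python
# Illustrative only: the same outcome indicator used formatively (while the outcome
# is taking shape) and summatively (once it is final). Milestones are hypothetical.

milestones = [(3, 10, 8), (6, 20, 19), (9, 30, 31)]  # (month, planned, actual) cumulative
final_target, final_actual = 40, 38

# Formative reading: checked during the process to improve or maintain program actions.
for month, planned, actual in milestones:
    status = "on track" if actual >= planned else "behind plan - adjust actions"
    print(f"Month {month}: {actual}/{planned} ({status})")

# Summative reading: taken at the end to establish effectiveness (accountability).
print(f"End of program: {final_actual / final_target:.0%} of target achieved")
```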

Outcomes and Impacts
Outcomes: direct outputs or immediate effects; assess the program's performance (its effectiveness and efficiency); an indicator of merit.
Impacts: long-term effects; benefits to staff, participants, other stakeholders, and the community at large; assess the program's validity and relevance; an indicator of worth.

Framing program evaluation: three basic questions
Is the program effective? Has it achieved its goals?
Is it efficient? Did it optimize its methods and resources to achieve the goals?
Is it valid? Are its goals relevant to the context? Is it valued by its stakeholders and beneficiaries?

In Response: What to Assess
Effective? Outcome quality and outcome quantity (over the duration of the program).
Efficient? Cost per outcome; time per outcome.
Valid? Worth it? Internal and external impacts; long-term effects on, and benefits to, stakeholders and others involved.
NOTE: Outcome, cost, and time (merit indicators) are assessed by tracking program performance. Impact (the worth indicator) is assessed by tracking effects beyond the program period.
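A minimal sketch, not from the module itself, of computing the merit indicators named above (cost per outcome, time per outcome) from tracked program data; all figures are made-up placeholders.

```python
# Hypothetical tracked totals for one reporting period.
outcomes_achieved = 48        # e.g., devices transferred, clients served
total_cost = 120_000.0        # program spend over the period (currency units)
total_staff_hours = 3_600.0   # effort recorded for the same period

cost_per_outcome = total_cost / outcomes_achieved
time_per_outcome = total_staff_hours / outcomes_achieved

print(f"Cost per outcome: {cost_per_outcome:,.2f}")
print(f"Time per outcome: {time_per_outcome:.1f} staff-hours")
```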


Use of Information
Monitoring project performance: improve intermediary outcomes (quality vs. timelines); improve the process (re-distribute personnel, review protocols and timelines).
Describing the performance level (effectiveness and efficiency).
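A hypothetical sketch of how monitoring data might feed the management actions listed above; the thresholds and action wording are illustrative assumptions, not part of the module.

```python
def recommend_action(quality_score: float, schedule_slip_weeks: int) -> str:
    """Map tracked intermediary-outcome data to a process-improvement action."""
    if quality_score < 0.7 and schedule_slip_weeks > 2:
        return "re-distribute personnel and review protocols/timelines"
    if quality_score < 0.7:
        return "review protocols (quality issue)"
    if schedule_slip_weeks > 2:
        return "review timelines (schedule issue)"
    return "no change; continue monitoring"

print(recommend_action(quality_score=0.65, schedule_slip_weeks=3))
```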

Questions, Comments, Suggestions, and Related Issues

Workshop Session Two
For your project, identify, or discuss the applicability of:
Goals, protocols, and an action plan (or similar framework)
Expected outcomes: final (and intermediary, if applicable)
Outcome tracking: (a) process and (b) use, for improving outcomes (formative evaluation) and determining effectiveness (summative evaluation)
Tracking resource utilization (time/cost …): (a) process and (b) use, for process improvement and determining efficiency (summative evaluation)
Instrument and database alternatives (see the sketch below)
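One possible shape for a lightweight tracking instrument, sketched as a CSV log in Python; the field names and rows are hypothetical, and a spreadsheet or a small SQL table would serve equally well.

```python
import csv

# Assumed fields for a combined outcome/resource tracking log.
FIELDS = ["date", "outcome_id", "status", "quality_score", "staff_hours", "cost", "notes"]

records = [
    {"date": "2009-04-01", "outcome_id": "prototype_review", "status": "in_progress",
     "quality_score": 0.7, "staff_hours": 12, "cost": 450, "notes": "awaiting vendor quote"},
    {"date": "2009-04-15", "outcome_id": "prototype_review", "status": "complete",
     "quality_score": 0.9, "staff_hours": 20, "cost": 900, "notes": ""},
]

# Write the log; formative reviews and summative summaries can both read this file.
with open("tracking_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
```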