Monitoring and Evaluation Orientation, 17 May 2011, MCA Namibia


Presentation Overview
- Introduction
- Basic M&E concepts
- MCA-N's M&E expectations
- Overview of MCA-N's grant-related M&E activities

Introduction
A focus on results is one of the core principles on which the MCC was founded, and an important aspect of this focus is the M&E of programs. M&E helps to boost the effectiveness, accountability, and transparency of development assistance. In the short term it improves management decision-making, and over the long term it contributes to better design of development projects. The MCC Policy for Monitoring and Evaluation guides MCA's M&E requirements.

Basic concepts of MONITORING AND EVALUATION
Monitoring – the systematic collection of data on specified indicators to provide indications of progress.
Evaluation – measures the changes in individual, household, or community income and other aspects of well-being that result from a particular project or program.

Basic concepts of MONITORING AND EVALUATION
Activity – Actions taken or work performed through which inputs, such as funds, technical assistance, and other types of resources, are mobilized to produce specific outputs.
Indicator – A quantitative or qualitative variable that provides a simple and reliable means to measure achievement of an intervention.

Basic concepts of MONITORING AND EVALUATION
Input – Financial, human, and material resources used during an intervention.
Output – The direct results of a project activity: the goods or services produced by the implementation of an Activity.
Outcome – Intermediate- or medium-term effects/results of an intervention's Outputs.
Objective – Long-term effects/results of an intervention's Outputs.
Target – The expected result for a particular indicator, to be met by a certain point in time.

Results Chain
Inputs → Activities → Outputs → Outcomes → Impact (with Indicators and Assumptions/Risks at each link of the chain)
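The results chain above can be sketched as an ordered mapping. All the example entries below are hypothetical, loosely based on the farmer-training example used later in this presentation:

```python
# Illustrative sketch: the results chain as an ordered mapping from stage to a
# hypothetical example (Python dicts preserve insertion order).
results_chain = {
    "Inputs": "Funds, trainers, training materials",
    "Activities": "Deliver farmer training on new farming technologies",
    "Outputs": "400 farmers trained",
    "Outcomes": "Trained farmers adopt new farming technologies",
    "Impact": "Improved livestock productivity and incomes in NCA",
}

# Walk the chain from Inputs to Impact
for stage, example in results_chain.items():
    print(f"{stage}: {example}")
```

Each stage would carry its own indicators, and the links between stages rest on the assumptions/risks noted in the diagram.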

Result statements and Indicators to monitor results

Principle of Results-Based Planning
If a problem is caused by three conditions, all three conditions must be addressed. Interventions must not only be necessary, but also sufficient, to achieve the expected result.

Results Language = Change Language
Action Language: expresses results from the provider's perspective; can be interpreted in many ways; focuses on completion of activities.
Change Language: describes changes in the conditions of people; sets precise criteria for success; focuses on results, leaving options open on how to achieve them.

Quality Criteria for a Results Statement
Is the scale/scope realistically within the control of you and your partners? Is it stated using change language? Make sure it is SMART – Specific, Measurable, Achievable, Relevant, Time-bound. Take any reference to strategy out of the sentence.
Example result statement: Improved knowledge base of livestock producers in NCA

Don't confuse indicator formulation with the results statement! An indicator is neutral, does not pre-judge or set targets, and is therefore "empty of data", i.e., data still has to be collected:
Indicator: % of all trained farmers in NCA using new farming technologies – NOT "90% of all trained farmers in NCA using new farming technologies".
Definition:
Numerator: # of trained farmers using new technologies
Denominator: Total # of farmers trained
Baseline: 10%
Target Year 1: 45%; Target Year 2: 95%
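The indicator arithmetic above can be sketched in a few lines of Python. The 10% baseline and the 45%/95% targets come from the slide; the monitoring counts (180 of 400 trained farmers) are hypothetical:

```python
# Illustrative sketch: computing an indicator value from its numerator and
# denominator, then comparing it against the baseline and yearly targets.

def indicator_value(numerator: int, denominator: int) -> float:
    """% of trained farmers in NCA using new farming technologies."""
    if denominator == 0:
        raise ValueError("No farmers trained yet; indicator is undefined")
    return 100.0 * numerator / denominator

baseline = 10.0                 # % at the start of the intervention (from slide)
targets = {1: 45.0, 2: 95.0}    # year -> target % (from slide)

# Hypothetical monitoring data: 180 of 400 trained farmers use new technologies
value = indicator_value(numerator=180, denominator=400)
print(f"Indicator: {value:.1f}% (baseline {baseline}%, year-1 target {targets[1]}%)")
print("Year-1 target met" if value >= targets[1] else "Year-1 target not met")
```

Note that the indicator itself ("% of trained farmers using new technologies") stays neutral; only the target dictionary carries the judgment about what counts as success.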

An indicator should…
state whom/where/what is being measured, e.g. # of girls in district X
be expressed in quantifiable units (unit of measure: %, number, ratio) or in descriptive words
be disaggregated by sex whenever possible!

Types of indicators
Quantitative statistical measures: number of, frequency of, % of, variance with
Qualitative judgments or perceptions: presence of, quality of, extent of, level of

Checklist for INDICATORS
Validity – Does it measure the result?
Reliability – Is it a consistent measure over time and, if supplied externally, will it continue to be available?
Sensitivity – When a change occurs, will it be sensitive to those changes?
Simplicity – Will it be easy to collect and analyze the information?
Equality – Is the status or situation of women and men compared? Are comparisons made across ethnicity and geography?
Utility – Will the information be useful for decision-making and learning?
Affordability – Do we have the resources to collect the information? What baseline do we have?

MCA-N's M&E expectations
All Grantees are required to have M&E Plans specifying expected results, data collection processes, and reporting frequency.
M&E Plan: objectives, activities, results, indicators and their definitions, targets, data sources, frequency.
Data collection process: Who is going to collect data on the indicators? How is the data going to be stored? How frequently is data going to be collected? Are you going to collect baseline data before your intervention, and how, or by whom, is it going to be done?
Reporting: Report back on indicators – how often? Who is going to compile the report?
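One row of such an M&E Plan can be sketched as a small data structure covering the fields the slide lists. The field names and all concrete values below are illustrative assumptions, not MCA-N's actual template:

```python
# Illustrative sketch: one row of a grantee M&E plan, with the elements named
# on the slide (objective, activity, result, indicator and definition, target,
# data source, frequency, plus who collects and reports).
from dataclasses import dataclass

@dataclass
class MEPlanRow:
    objective: str
    activity: str
    result: str
    indicator: str
    indicator_definition: str
    baseline: float
    target: float
    data_source: str
    collection_frequency: str  # e.g. "quarterly"
    responsible: str           # who collects the data and compiles the report

# Hypothetical example row, reusing the farmer-training indicator
row = MEPlanRow(
    objective="Improved livestock productivity in NCA",
    activity="Farmer training on new farming technologies",
    result="Improved knowledge base of livestock producers in NCA",
    indicator="% of trained farmers in NCA using new farming technologies",
    indicator_definition="(# trained farmers using new technologies / total # trained) x 100",
    baseline=10.0,
    target=45.0,
    data_source="Follow-up survey of trained farmers",
    collection_frequency="quarterly",
    responsible="Grantee M&E officer",
)
print(row.indicator, "- year-1 target:", row.target)
```

Laying the plan out this way makes the slide's three questions explicit: the `responsible` field answers who collects, `data_source` and `collection_frequency` answer how and how often, and `baseline` forces the before-intervention measurement to be stated up front.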

Overview of MCA-N's grant-related M&E activities (CS/INP)
CS (WWW and Grantees) and INP (NRI and Grantees) activity implementation
CS/INP baseline survey
M&E Plans/Reports from Grantees
Data collection plan by activity implementers
CS/INP evaluation

Overview of MCA-N's grant-related M&E activities (LMEF)
LMEF activity implementation (Grantees)
M&E Plans/Reports from Grantees
LMEF evaluation

THANK YOU!!! Q&A