From Monitoring Through Evaluation To Impact Assessment The case of NHDRs and vulnerable groups Andrey Ivanov Human Development Adviser, Bratislava RSC.


Main issues in this presentation
- Measuring what?
- Measuring how?
- The role of quantitative data
- The case of NHDRs
- Links to poverty measurements

Some terminological clarity
Three distinct (but often confused) concepts:
- Monitoring
- Evaluation
- Impact assessment
All three relate to quality, but to the quality of different aspects of the development process, and they differ in purpose, applicability and scope.

Measuring what?
- Monitoring: primarily of progress. Relatively easy if the major components are clear and traceable.
- Evaluation: of the process, but also of outputs and outcomes. Feasible if these elements are replicable and provide grounds for comparability.
- Impact assessment: of long-term changes in the development situation. Difficult because of correlations and mutual influences.

Applicability: monitoring
Applicable to the process (NHDR elaboration) and to implementation (of both programs and projects). For that purpose the following are needed:
- Progress indicators related both to outputs and outcomes (reaching benchmarks)
- Consistency of procedures (administrative, accounting)
- Consistency of process (participation, consultation of stakeholders, etc.)

Applicability: evaluation
Applicable to the process, but even more so to its outcomes (the NHDR itself). For that purpose the following are needed:
- Measurable outputs and outcomes (linked to benchmarks). But what is an "NHDR outcome"?
- Measurable inputs (to assess efficiency)
- Again, consistency of procedures and process (participation, consultation of stakeholders, etc.)
One common "trap": aligning the evaluation to whichever indicators happen to be available, which biases it towards measuring inputs instead of outputs.

Applicability: impact assessment
Applicable only to outcomes (but what is the outcome: the NHDR itself, or the process?). "Sexy" but difficult to achieve.
- Applicable at different levels: that of programs (long-term change in the development situation) but also of projects (small-scale development interventions)
- The higher the level, the higher the correlations
- Outcomes are not always quantifiable (what is quantifiable is usually not an outcome). And when an outcome is quantifiable, the data are often not available (disaggregated poverty or unemployment figures)
The common "trap": the term is broadly used, but with vague conceptual justification and argumentation.

Why impact assessment in different areas?
Of legislation (regulatory IA):
- Could outline consistency with other pieces of legislation
- Could suggest additional areas needing legislative involvement
Of policies:
- Could outline consistency with priorities
- Could measure the advance in strategic areas
(These two areas amount to a kind of "advance warning" scenario building.)
Of projects:
- Could measure the change after the intervention (and not just count the inputs)
- Could measure the efficiency of the process (to what extent better alternative solutions existed)

Impact assessment: NHDRs
What is the outcome and impact?
- A book? A number of pages?
- A process? A change in the paradigm?
Possible indicators:
- Press coverage?
- Content analysis of political documents?
- Policies implemented?
- Projects implemented?
- "Idea leakages"?

Impact assessment: Roma Report
What is the outcome and impact?
- Again, a book? A knowledge site?
- The database behind it?
- A new attitude to quantitative information in Roma-targeted programs and projects?
- Again, a change in the paradigm?
Possible indicators (behind a "Policy Impact" award):
- HR paradigm influenced
- Policies influenced
- Ideas "leaked"

What is the (often) existing practice?
Where policies are concerned:
- Governments are usually convinced they possess "universal wisdom".
- Impact assessment is usually done post factum by the opponents, and is politically biased.
Where projects are concerned:
- Usually focused on "compliance with the budget"
- "Properly" defined objectives (easy to monitor)
- Confusion between means and objectives, outputs and outcomes
- A "static" approach is often applied
- "Corporate interest": often nobody really needs an adequate assessment

How to get there?
- Clearly defined objectives, with clear and realistic targets
- Involvement of the target group and other beneficiaries
- Measurable baseline indicators
- Consistent and adequate data
- Constituencies genuinely interested in measuring the impact
- Externalities taken into consideration, both positive and negative

Example: employment generation projects
- Inputs: training courses, presentations, practical exercises
- Outputs: number of people who passed the requalification course
- Outcome: number of formerly unemployed people who found jobs
- Sustainability: duration of the job
- Impact: household incomes increased, poverty indicators improved
- Positive externalities: reduced drop-out rates, reduced societal fragmentation
Minimum necessary data:
- Employment status, unemployment rates, incidence of poverty, income levels, qualification levels
- The cost of "reaching the beneficiary", the costs of alternative approaches, opportunity costs
- The cost of no intervention at all
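The results chain above (outputs, outcomes, sustainability, cost) can be sketched as a small calculation. This is a minimal illustration only: the beneficiary records, field layout and budget figure are hypothetical assumptions, not data from any real project.

```python
# Minimal sketch of results-chain indicators for a hypothetical
# employment generation project. All records and figures are invented
# for illustration.

beneficiaries = [
    # (passed_course, employed_after, months_in_job)
    (True,  True,  14),
    (True,  False,  0),
    (True,  True,   6),
    (False, False,  0),
]
total_project_cost = 20_000.0  # hypothetical budget

# Output: how many completed the requalification course
outputs = sum(1 for passed, _, _ in beneficiaries if passed)
# Outcome: how many of those who passed then found a job
outcomes = sum(1 for passed, employed, _ in beneficiaries if passed and employed)
# Sustainability: jobs that lasted at least a year
sustained = sum(1 for _, employed, months in beneficiaries if employed and months >= 12)

cost_per_beneficiary = total_project_cost / len(beneficiaries)
cost_per_outcome = total_project_cost / outcomes if outcomes else float("inf")

print(f"Output (trained): {outputs}")
print(f"Outcome (employed): {outcomes}")
print(f"Sustained jobs (>= 12 months): {sustained}")
print(f"Cost per beneficiary: {cost_per_beneficiary:.0f}")
print(f"Cost per job created: {cost_per_outcome:.0f}")
```

The point of the sketch is the distinction the slide draws: the output (3 trained) is not the outcome (2 employed), and the cost per job created, not the cost per trainee, is what an alternative approach should be compared against.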

Example: employment generation projects (continued)
Possible sources:
- Targeted small-sample surveys
- Community-level data collection (not sample-based)
- Interviewing the beneficiaries about the specific intervention and how it influenced them
- Interviewing the other actors involved
- Comparing trends within the group with the overall trends
In the end, the data should allow building alternative scenarios against which to compare, and thus measure the efficiency of the specific project.
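The last point, comparing trends within the target group against the overall trends, is essentially a difference-in-differences comparison. A minimal sketch, with entirely hypothetical employment rates, assuming the group would otherwise have followed the overall trend:

```python
# Minimal difference-in-differences sketch. The employment rates below
# are hypothetical, purely to illustrate comparing the group trend with
# the overall trend.

# Share employed before and after the project period
group_before, group_after = 0.20, 0.35      # beneficiaries' community
overall_before, overall_after = 0.50, 0.55  # general population (counterfactual trend)

group_change = group_after - group_before        # change within the group
overall_change = overall_after - overall_before  # background change

# The gap between the two trends is a crude estimate of the project effect,
# valid only under the assumption that, absent the project, the group would
# have moved in parallel with the general population.
estimated_effect = group_change - overall_change

print(f"Group trend: {group_change:+.2f}, overall trend: {overall_change:+.2f}")
print(f"Estimated project effect: {estimated_effect:+.2f} (share employed)")
```

This is the "alternative scenario" the slide calls for in its simplest form; real applications would need the disaggregated baseline data the previous slides list as a precondition.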

Main conclusions
- A clear understanding of the differences between monitoring, evaluation and impact assessment is a must
- The three should not be treated as substitutes for one another
- They should not be susceptible to short-term political agendas
- Data are crucial. Quantitative components feeding the progress indicators should be embedded from the very beginning of the project
- A wide range of stakeholders should be involved
- Composite indices should be used carefully, for the evaluation of complex, multidimensional processes

What can we offer?
- Statistical capacity development support (the "Measuring Human Development" manual and KM)
- Targeted data collection (the "Vulnerable Groups" survey)
1. Methodological aspects of vulnerability research