Methodological Proposal for the Evaluation of the Relevance and Scope of Indicators of Social Programs
Gilberto Moncada, Consultant, Banco Mundial, November 2008

1. Background
The methodological proposal was developed in the framework of a Fee-for-Services agreement with the National Evaluation Council of Mexico. The proposal has to be piloted with the indicators of different programs in order to test its utility in different situations and make refinements. The proposal can serve as a basis for the development and application of a methodology for the evaluation of indicators in different contexts.

2. Objective of the methodology
To facilitate the measurement and evaluation of the relevance and scope of indicators used by social programs, with the objective of refining and improving them. Are the selected indicators the most adequate for measuring the progress of social programs? How can we examine whether these indicators are the best?

3. Focus of the evaluation
Indicators of social programs created through a logical framework (logframe) process. Indicators at the level of: outcome, objective, and component.

4. Area of evaluation

Dimension of evaluation: General
Category to be evaluated: Variables related to basic capacities of indicators (consensus in the formulation, information capacity, dissemination and access)
Variables: Specific aspects of the basic capacities of indicators, as defined by a guide of questions

Dimension of evaluation: Quality of the indicator
Category to be evaluated: CREAM criteria (clear, relevant, economic, adequate, monitorable)
Variables: Specific aspects of the quality of an indicator, as defined by a guide of questions

Dimension of evaluation: Use of the indicator
Variables: Specific aspects of the use of an indicator, as defined by a guide of questions
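To make the structure above concrete, here is a minimal sketch of the evaluation dimensions as a Python data structure. This is purely illustrative; the proposal does not prescribe any implementation, and all field names are hypothetical.

```python
# Hypothetical representation of the evaluation structure in section 4.
# Each dimension groups categories that are assessed through a guide of
# questions; names are illustrative, not from the proposal.
EVALUATION_AREAS = {
    "general": {
        "categories": ["consensus in formulation", "information capacity",
                       "dissemination and access"],
        "assessed_by": "guide of questions on basic capacities",
    },
    "quality": {
        "categories": ["clear", "relevant", "economic",
                       "adequate", "monitorable"],  # CREAM criteria
        "assessed_by": "guide of questions on indicator quality",
    },
    "use": {
        "categories": [],
        "assessed_by": "guide of questions on indicator use",
    },
}
```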

5. Method
5.1 Guide of Questions
5.2 Information Collection
5.3 Question Scoring
5.4 Weights and Points
5.5 Metrics of Evaluation

5.1 Method: Guide of Questions
The evaluation should follow a guide of questions related to the variables. It is proposed that the variables can relate to a group of indicators and/or to single indicators. Each dimension of evaluation is then assessed on the basis of the results of the questions.

5.2 Method: Collection of Information
Desk review of key documents (matrix of indicators, evaluation reports, registers, normative documents, beneficiary information, databases, etc.). In-depth interviews with operators of the program, senior authorities, and external and internal users of the indicators of a program. The guide of questions should be answered by the evaluation team based on the information obtained.

5.3 Method: Question Scores
Every question has response alternatives which represent a scale:
- Maximum score is 1;
- Second is 0.5;
- Third is 0.25;
- Lowest score is 0.
Some questions are binary (yes/no), with a value of 1 or 0. Each response must be justified clearly and concisely, with evidence. If a group of indicators is evaluated rather than an individual indicator (for example, for basic capacities), each indicator in the group is given the corresponding score.
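As a rough illustration (not part of the original proposal), the following Python sketch maps the response alternatives above to the proposed score values; the function name and interface are hypothetical:

```python
# Hypothetical sketch of the proposed question-scoring scale.
SCALE_SCORES = [1.0, 0.5, 0.25, 0.0]     # best alternative first
BINARY_SCORES = {"yes": 1.0, "no": 0.0}  # binary (yes/no) questions

def score_question(response, binary=False):
    """Score a response: for scale questions, `response` is the index
    of the chosen alternative (0 = best); for binary questions it is
    the string "yes" or "no"."""
    if binary:
        return BINARY_SCORES[response.lower()]
    return SCALE_SCORES[response]

# Examples: second-best alternative on a scale question, and a binary "no".
assert score_question(1) == 0.5
assert score_question("no", binary=True) == 0.0
```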

5.4 Method: Weights and Points

Calculation of weights
It is proposed to determine the relative weights of the different dimensions of evaluation (general variables, quality, and use), as well as of the criteria within each dimension (CREAM, etc.), through a process of consulting national experts. The experts should use a scoring table, with a scale of 1 to 10, to indicate the value they would attach to each dimension and criterion given their experience. The results obtained through this consultation can serve as the basis for determining the average weight of each dimension and criterion.
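A minimal sketch of this weighting step, with illustrative expert ratings. The proposal does not specify how averages are turned into weights; normalizing them to sum to 1 is one plausible choice and is an assumption here.

```python
# Hypothetical sketch: dimension weights from an expert consultation.
# Each expert rates each dimension on a 1-10 scale; the weight is the
# average rating, normalized so that the weights sum to 1.
expert_ratings = {
    "general": [8, 7, 9],    # one rating per consulted expert (illustrative)
    "quality": [9, 9, 10],
    "use":     [6, 7, 5],
}

averages = {dim: sum(r) / len(r) for dim, r in expert_ratings.items()}
total = sum(averages.values())
weights = {dim: avg / total for dim, avg in averages.items()}

print(weights)  # approx. {'general': 0.343, 'quality': 0.400, 'use': 0.257}
```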

5.5 Method: Metric to Evaluate the Indicator
Example of the calculation for the quality dimension.
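The slide's example table is not preserved in the transcript. As a hedged reconstruction of what such a calculation could look like, the sketch below combines question-based criterion scores (0 to 1) with expert-derived criterion weights and rescales the result to 0 to 100; all numbers are illustrative.

```python
# Hypothetical worked example for the quality dimension (CREAM).
# Criterion scores come from the guide of questions (0-1 scale);
# criterion weights come from the expert consultation (sum to 1).
cream_scores  = {"clear": 1.0, "relevant": 0.5, "economic": 1.0,
                 "adequate": 0.25, "monitorable": 0.5}
cream_weights = {"clear": 0.25, "relevant": 0.25, "economic": 0.15,
                 "adequate": 0.20, "monitorable": 0.15}

quality = 100 * sum(cream_scores[c] * cream_weights[c] for c in cream_scores)
print(round(quality, 1))  # 65.0 -> "medium quality" under the ranges below
```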

Score of indicators
It is proposed to classify each dimension using three equal ranges:
- Low quality/use/general capacity: 0 – 33.3
- Medium quality/use/general capacity: 33.4 – 66.7
- High quality/use/general capacity: 66.8 – 100
The total score of an indicator, in terms of its relevance and scope, is obtained as the weighted average of its dimension scores:
- Low relevance and scope: 0 – 33.3
- Medium relevance and scope: 33.4 – 66.7
- High relevance and scope: 66.8 – 100
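A small sketch of the proposed classification and total score, assuming the weighted-average reading above; the dimension scores and weights are illustrative, not from the proposal.

```python
# Hypothetical sketch of the classification and total-score step.

def classify(score):
    """Map a 0-100 score onto the three equal ranges proposed above."""
    if score <= 33.3:
        return "low"
    if score <= 66.7:
        return "medium"
    return "high"

def total_score(dimension_scores, dimension_weights):
    """Weighted average of the dimension scores (relevance and scope)."""
    return sum(dimension_scores[d] * dimension_weights[d]
               for d in dimension_scores)

dims = {"general": 70.0, "quality": 65.0, "use": 40.0}   # illustrative
wts  = {"general": 0.343, "quality": 0.400, "use": 0.257}
print(classify(total_score(dims, wts)))  # medium
```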

6. Methodology to evaluate the quality of indicators at the level of activity
The proposal is to use a guide of 10 questions that explore the basic characteristics of an indicator. Responses are binary (yes/no), representing values of 1 and 0. The points received by each indicator determine the following ranking:
- Low quality: 0 – 3
- Medium quality: 4 – 6
- High quality: 7 – 10
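A minimal sketch of this activity-level check, assuming one boolean answer per question (the function name is hypothetical):

```python
# Hypothetical sketch of the activity-level quality check:
# 10 binary questions, each worth 1 point, ranked on the sum.

def activity_quality(answers):
    """`answers` is a list of 10 booleans, one per question."""
    points = sum(answers)
    if points <= 3:
        return points, "low quality"
    if points <= 6:
        return points, "medium quality"
    return points, "high quality"

print(activity_quality([True] * 8 + [False] * 2))  # (8, 'high quality')
```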

Thank you very much