Lesson 8: Effectiveness. Macerata, 11 December. Alessandro Valenza, Director, t33 srl.

Agenda  What does Effectiveness mean?  Assessing Effectiveness  Multicriteria analysis

Effectiveness (definition 1) The extent to which the development intervention’s objectives were achieved, or are expected to be achieved, taking into account their relative importance. Note: also used as an aggregate measure of (or judgement about) the merit or worth of an activity, i.e. the extent to which an intervention has attained, or is expected to attain, its major relevant objectives efficiently, in a sustainable fashion and with a positive institutional development impact. (OECD DAC)

Effectiveness (definition 2) The term effectiveness has many possible meanings. The most common definition identifies effectiveness with “achievement of objectives”, which leaves the definition open to the different meanings of “objectives”. Objectives can be expressed quantitatively in terms of expected outputs or results, and effectiveness is then evaluated simply by comparing what has been obtained with what had been planned: output and result indicators are all that is needed. (European Commission – DG REGIO, EVALSED Guide)

EFFECTIVENESS: WHAT TO EVALUATE?
European Commission – Quality: effectiveness is evaluated by comparing results with quality standards. Ability of a given action to produce a desired change: comparing what is observed after the action has taken place with what would have happened without the action. The latter needs data that allow recovery of the counterfactual situation.
OECD DAC – Effectiveness assesses whether the results outlined in the logframe are delivered and whether they are likely to produce the expected objective. Evaluating effectiveness should include an assessment of how women and men benefit from the results brought by the project.

Evaluation Questions
To what extent were the originally defined objectives of the development intervention realistic?
To what extent have the (direct) objectives of the development intervention been achieved, in accordance with the (adjusted, if applicable) targets?
What are the (concrete) contributions of the interventions to achieving the objectives of the development intervention?
What factors were crucial for the achievement of, or failure to achieve, the project objectives so far (indication of strengths and weaknesses, e.g. the monitoring and evaluation system)?
What is the quality of development policy, technical planning and coordination?

Information and Data on OUTCOME
Indicators. Quantitative: financial, physical, procedural. Qualitative: opinions on the level of achievement, perceptions of satisfaction.
Methods: literature review, interview, dialogic interview, community interview, project visit, focus group, case study, survey.

Different ways of collecting information

How to analyse the information? Quality: parameters and standards. Economic/financial aspects: costs and revenues. Satisfaction: expectations against perceptions. Opinions: majority/minority. Physical: achievement against targets (logframe). Time/environmental protection: savings relative to the situation before the intervention (business as usual). Counterfactual: situation with/without the intervention.

Objective or Objectives? Normally a project has more than one objective, even if not too many (no more than 3 – OECD DAC). For example, a new road construction can make the connection: 1. Safer 2. Cheaper 3. Faster

Multicriteria A tool used to compare several interventions in relation to several criteria. Multicriteria analysis is also used in ex ante evaluation for comparing proposals. It can likewise be used in the ex post evaluation of an intervention, to compare the relative success of its different components. Finally, it can be used to compare separate but similar interventions for classification purposes. Multicriteria analysis may involve weighting, reflecting the relative importance attributed to each of the criteria. It may result in a single judgement or synthetic classification, or in different classifications reflecting the stakeholders' diverse points of view. In the latter case, it is called multicriteria-multijudge analysis. (from EVALSED)

Process: Step 1, define criteria; Step 2, scoring or ranking; Step 3, weighting; Step 4, aggregating.

STEP 1: setting criteria

              Criterion 1:             Criterion 2:   Criterion 3:
              Financial performance    Procedural     Physical realisation
Project 1
Project 2
Project 3
Project 4
Project n…

STEP 2: Score or rank for judgement. We need a way to appraise the projects on their different aspects, since different aspects are measured in different units. We can opt for: A) Scoring: assigning a numeric value to different intervals of performance, for example 3 for “above average”, 1 for “in line with average” and 0 for “below average”. B) Ranking: simply ordering the projects according to their performance, from first to last.
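The two options can be sketched in a few lines of Python. This is only an illustration, not part of the original lesson: the interval thresholds in `score()` and the sample turnover figures are assumptions made for the example.

```python
# Minimal sketch of Step 2: turning raw performance values into
# comparable scores or ranks. Thresholds and data are illustrative.

def score(value, average):
    """Interval scoring: 3 above average, 1 in line, 0 below (assumed scale)."""
    if value > average * 1.1:       # clearly above average
        return 3
    elif value >= average * 0.9:    # roughly in line with average
        return 1
    else:                           # below average
        return 0

def rank(values):
    """Order projects from best (rank 1) to worst by raw performance."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0] * len(values)
    for position, i in enumerate(order, start=1):
        ranks[i] = position
    return ranks

# Hypothetical turnover-growth figures for four projects
turnover_growth = [0.80, 0.70, 0.30, 0.70]
print(rank(turnover_growth))   # → [1, 2, 4, 3]
```

Scoring keeps an absolute sense of performance, while ranking only preserves the order; which one to use depends on whether the intervals can be defined credibly.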

STEP 2: Scoring

              Criterion 1   Criterion 2   Criterion 3
Project 1     1             1             1
Project 2     0             2             1
Project 3     3             1             1
Project 4     3             3             0
Project n…

STEP 3: Establish the weights. If some criteria are more important than others, they should be given more weight. To do this we simply apply a multiplication factor > 1 (e.g. 1.5). Some criteria may be so important that they have to be singled out. This is the case for criteria governed by a veto threshold (for example “Physical”: if a project has a performance of 0, it is excluded from the analysis).

STEP 3: Apply the weight

              Criterion 1 (×1.5)   Criterion 2   Criterion 3
Project 1     1.5                  1             1
Project 2     0                    2             1
Project 3     4.5                  1             1
Project 4     4.5                  3             out
Project n…

STEP 4: Aggregate the scores

              Criterion 1 (×1.5)   Criterion 2   Criterion 3   Total (with weight)   Total (without weight)
Project 1     1.5                  1             1             3.5                   3
Project 2     0                    2             1             3                     3
Project 3     4.5                  1             1             6.5                   5
Project 4     4.5                  3             out           out                   6
Project n…
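Steps 3 and 4 can be combined in a short sketch using the scores from the table above. The ×1.5 weight on Criterion 1 and the veto rule on Criterion 3 follow the slides; the code itself is only an illustration of the mechanics.

```python
# Sketch of Steps 3-4: apply weights, enforce the veto, aggregate.
scores = {
    "Project 1": [1, 1, 1],
    "Project 2": [0, 2, 1],
    "Project 3": [3, 1, 1],
    "Project 4": [3, 3, 0],
}
weights = [1.5, 1, 1]   # Criterion 1 counts 1.5 times
VETO = 2                # index of the "Physical" criterion: a 0 excludes the project

results = {}
for name, s in scores.items():
    unweighted = sum(s)
    if s[VETO] == 0:
        results[name] = ("out", unweighted)   # excluded by the veto threshold
    else:
        weighted = sum(v * w for v, w in zip(s, weights))
        results[name] = (weighted, unweighted)

for name, (weighted, unweighted) in results.items():
    print(f"{name}: weighted total {weighted}, unweighted total {unweighted}")
```

Running this reproduces the totals in the table: Project 3 leads with 6.5 weighted, while Project 4 is out despite the highest unweighted total, which is exactly the point of a veto criterion.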

WORK OUT: SME INCUBATOR
Criterion 1: Economic (average increase in turnover). Criterion 2: New jobs. Criterion 3: Satisfaction with service quality.

              Criterion 1   Criterion 2   Criterion 3   Total
Project 1
Project 2
Project 3
Project 4

WORK OUT: DATA

Project       1       2       3       4
Economic      80%     70%     30%     70%
Physical
Quality       High    Low     High    Medium

Quality levels: Low, Medium, High.

Apply weight
Scheme A: Economic ×1.5; Physical: veto if number of new jobs < 10.
Scheme B: Physical ×2; Quality: veto for “Low”.
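The veto rules of the two schemes can be checked mechanically. The quality ratings below come from the data slide; the job counts are placeholder values invented for the example (the real figures are not in the source), so scheme A is shown only to illustrate the mechanism.

```python
# Sketch of the veto rules in schemes A and B. Quality ratings are from the
# slides; job counts are PLACEHOLDER values, not real exercise data.
quality = {"Project 1": "High", "Project 2": "Low",
           "Project 3": "High", "Project 4": "Medium"}
jobs = {"Project 1": 12, "Project 2": 15, "Project 3": 8, "Project 4": 20}  # hypothetical

# Scheme A: Economic weighted x1.5, veto if fewer than 10 new jobs
admitted_a = [p for p, n in jobs.items() if n >= 10]

# Scheme B: Physical weighted x2, veto for "Low" quality
admitted_b = [p for p, q in quality.items() if q != "Low"]

print(admitted_b)   # → ['Project 1', 'Project 3', 'Project 4']
```

Note that the two schemes can exclude different projects: under scheme B, Project 2 is vetoed for its “Low” quality rating regardless of how well it scores elsewhere.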

See you