Lesson 7: Performance Macerata, 28th October


Lesson 7: Performance Macerata, 28th October Andrea Gramillano, t33 srl

Verify the consistency of the analysis of needs: obstacles (needs, i.e. bottlenecks) vs. potential for solutions.

Agenda:
- Definition and meanings
- Indicators
- Multicriteria

Performance: a problematic definition.
OECD definition: the degree to which a development intervention or a development partner operates according to specific criteria/standards/guidelines or achieves results in accordance with stated goals or plans.
Commission definition: the meaning of the word performance is not yet stable; it is therefore preferable to define it whenever it is used. Performance might mean that intended results were obtained at a reasonable cost, and/or that the beneficiaries are satisfied with them. Efficiency and performance are two similar notions, but the latter extends, more broadly, to include qualitative dimensions.

Performance: when does it take place? Ongoing evaluation criteria: the project activities are delivered on time, the outputs respect the targets, the resources are duly absorbed, and the procedures are carried out according to the rules. Intervention logic: Needs → Resources (inputs) → Output (implementation) → Result (specific objective).

Performance questions. The project: is it able to spend all the financial resources? does it meet the procedural deadlines? has it achieved the targets in terms of physical realisation? And for the next period: what is needed to increase the project's performance (human resources? political support? administrative enforcement?)? Will the project be successfully completed? What might the future challenges be? Performance evaluation is based on monitoring indicators (procedural, financial, physical/output indicators).

Operational life of a project: setting; tender or purchase procedures; start of work/service; implementation; test, end and final payment.

Procedural monitoring
Most public activities have to follow a more or less rigid schedule in which the different steps are mandated and the deadlines fixed. Procedural monitoring usually provides information about how project pipelines are progressing (where and when calls for tenders have been published, contracts have been awarded, …); for each milestone, both the expected and the actual date can be recorded.

Procedure  Status  Specifications ready  Call published  Contract awarded  ……  Final payment
P1         Open    08-08-10              14-08-10        19-08-10
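For illustration, procedural slippage can be computed as the gap between each milestone's expected and actual date. This is a minimal sketch; the milestone names and the expected/actual date pairings are assumptions, not taken from the slide's partial table:

```python
# Hypothetical sketch of procedural monitoring as schedule slippage.
# Dates and milestone pairings are illustrative assumptions.
from datetime import date

milestones = {
    # milestone: (expected date, actual date)
    "specifications ready": (date(2010, 8, 8), date(2010, 8, 14)),
    "call published": (date(2010, 8, 14), date(2010, 8, 19)),
}

# Positive delay = the milestone was reached later than planned
delays = {m: (actual - expected).days for m, (expected, actual) in milestones.items()}
print(delays)  # {'specifications ready': 6, 'call published': 5}
```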

Physical monitoring
Example of physical monitoring (unit of measurement: N):

Indicator (number of enterprises)  Target  Achievement
Micro                              142     91
Small                              133     111
Medium                             39      21
Owner (women)                      50      40
Owner (<30y)                       26      3
Start-up                           54

Financial monitoring
Example of financial monitoring:

Priority  Expected expenditures (a)  Resources committed (b)  % (b/a)  Expenditures (c)  % (c/a)
P1        133.4                      100.4                    75.2     71.8              53.9
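The two percentages in the table are simple ratios: the commitment rate (b/a) and the expenditure rate (c/a). A minimal sketch using the slide's P1 figures (the small difference from the slide's percentages presumably comes from rounding in the source data):

```python
# Financial monitoring ratios for priority P1, figures from the slide.
expected = 133.4    # (a) expected expenditures
committed = 100.4   # (b) resources committed
spent = 71.8        # (c) expenditures

commitment_rate = round(100 * committed / expected, 1)   # % (b/a)
expenditure_rate = round(100 * spent / expected, 1)      # % (c/a)
print(commitment_rate, expenditure_rate)  # 75.3 53.8
```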

Multicriteria: a tool used to compare several interventions in relation to several criteria. Multicriteria analysis is also used in the ex ante evaluation for comparing proposals. It can also be used in the ex post evaluation of an intervention, to compare the relative success of the different components of the intervention. Finally, it can be used to compare separate but similar interventions, for classification purposes. Multicriteria analysis may involve weighting, reflecting the relative importance attributed to each of the criteria. It may result in the formulation of a single judgement or synthetic classification, or in different classifications reflecting the stakeholders' diverse points of view. In the latter case, it is called multicriteria-multijudgement analysis. (from EVALSED)

Process:
- Before Step 1: definition of the projects or actions to be judged (here: results/impacts?)
- Step 1: define criteria
- Step 2: scoring or ranking
- Step 3: weighting
- Step 4: aggregating

STEP 1: setting the criteria
A judgement grid is set up with the criteria as columns (Criterion 1: financial performance; Criterion 2: procedural; Criterion 3: physical realisation) and the projects as rows (Project 1, Project 2, Project 3, Project 4, … Project n).

STEP 2: score or rank for judgement
We need a way to appraise the projects across different aspects, since different aspects of the process are measured in different units. We can opt for:
A) Scoring: assign a numeric value to each "interval" of performance, for example 3 for "above average", 2 for "in line with average", 1 for "below average".
B) Ranking: simply order the projects according to their performance, from first to last.
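The two options can be sketched as follows; the performance values and the score-band thresholds are illustrative assumptions for three hypothetical projects:

```python
# Illustrative Step 2: scoring vs ranking. Performance values and
# thresholds are made-up assumptions, not from the slides.
performances = {"Project 1": 0.62, "Project 2": 0.91, "Project 3": 0.45}

def score(value):
    """Map a raw performance value to a score band (assumed thresholds)."""
    if value > 0.8:
        return 3   # above average
    if value > 0.5:
        return 2   # in line with average
    return 1       # below average

# A) Scoring: every project gets a comparable numeric score
scores = {p: score(v) for p, v in performances.items()}

# B) Ranking: order the projects from best to worst performance
ranking = sorted(performances, key=performances.get, reverse=True)

print(scores)   # {'Project 1': 2, 'Project 2': 3, 'Project 3': 1}
print(ranking)  # ['Project 2', 'Project 1', 'Project 3']
```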

STEP 2: scoring

           Criterion 1  Criterion 2  Criterion 3
Project 1  1
Project 2  2
Project 3  3
Project 4
Project n…

STEP 3: establishment of the weights
If some criteria are more important than others, they shall be given more weight. To do so, we simply apply a multiplication factor > 1 (e.g. 1,5). Some criteria may be so important that they have to be singled out: this is the case for criteria with a veto threshold (for example "Physical": if a project has 0 performance, it is excluded from the analysis).

STEP 3: apply the weight

           Criterion 1 (× 1,5)  Criterion 2
Project 1  1,5                  1
Project 2  2
Project 3  4,5
Project 4  3 (out)
Project n…

STEP 4: aggregate the scores

           Criterion 1 (× 1,5)  Criterion 2  Criterion 3  Total (with weight)  Total (without weight)
Project 1  1 (1,5)              1            1            3,5                  3
Project 2  2
Project 3  3 (4,5)                                        6,5                  5
Project 4  out                                            —                    6
Project n…
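Steps 3 and 4 together can be sketched as follows. The scores for Projects 1 and 3 follow the tables above; Project 4's unweighted criterion scores are assumptions added to illustrate the veto threshold:

```python
# Sketch of Steps 3-4: weight criterion 1 by 1.5, exclude any project with
# physical score 0 (veto threshold), then sum the weighted scores.
# Project 4's individual scores are illustrative assumptions.
WEIGHTS = {"financial": 1.5, "procedural": 1.0, "physical": 1.0}

projects = {
    "Project 1": {"financial": 1, "procedural": 1, "physical": 1},
    "Project 3": {"financial": 3, "procedural": 1, "physical": 1},
    "Project 4": {"financial": 2, "procedural": 3, "physical": 0},  # vetoed
}

totals = {}
for name, scores in projects.items():
    if scores["physical"] == 0:   # veto: excluded from the analysis
        totals[name] = "out"
        continue
    totals[name] = sum(WEIGHTS[c] * s for c, s in scores.items())

print(totals)  # {'Project 1': 3.5, 'Project 3': 6.5, 'Project 4': 'out'}
```

Note how the weighted totals reproduce the table above: Project 1 gets 1,5 + 1 + 1 = 3,5 and Project 3 gets 4,5 + 1 + 1 = 6,5.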

WORK OUT: SME incubator
Criteria: 1) Economic (average increase of turnover); 2) New jobs; 3) Satisfaction with service quality. Projects 1–4 are to be scored on each criterion and totalled.

WORK OUT: data

          Project 1  Project 2  Project 3  Project 4
Economic  80%        70%        30%
Physical  50         10         30         5
Quality   High       Low        Medium

Quality levels: Low / Medium / High

Apply the weights:
A) Economic: × 1,5; Physical: veto if number of new jobs < 10
B) Quality: × 2; veto for "Low"

See you www.t33.it a.gramillano@t33.it