Welfare Economics, Project and Programme Appraisal and Evaluation


Welfare Economics, Project and Programme Appraisal and Evaluation Lecture 2: Programme Evaluation David Hegarty 7 October, 2011

Structure Part 1: Definition, purpose and key issues in programme evaluation Part 2: Overview of methodological tools Part 3: Evaluation capacity and practice in Ireland

Part I: Definition and Purpose No single, preferred definition of evaluation. "Evaluation" is a common, everyday activity: people evaluate films, restaurants etc.; firms evaluate investments. "Evaluation is an elastic word that stretches to cover judgements of many kinds" (Weiss)

Some Definitions “The process of collecting and analysing information and reaching conclusions on specific questions” (Dept. of Finance VFM guidance manual) “Judgement of interventions according to their results, impacts and needs they aim to satisfy” (EU Commission) “Evaluation is the process of determining the merit, worth, and value of things, and evaluations are the products of that process” (Scriven) “The systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy” (Weiss)

Key Features Definitions point to some key features. Evaluation can be carried out at various levels: policy, programme or project. A systematic exercise based on accepted social science research standards. Involves forming a judgement based on certain criteria. Focus can be either on the process (operation) of the programme or on its impacts (outcomes). Purpose of the exercise is to improve the intervention under evaluation

Evaluation Purposes: Why evaluate? Planning Is the programme justified? Programme design Resource allocation Implementation Is the programme working and/or how can the programme be improved? Accountability What was achieved?

Evaluation Purposes Knowledge development: what interventions work and in what circumstances? Does the logic of the programme and its assumptions need to be questioned? Development: institutional performance and strengthening; service quality

Evaluation Purposes But… evaluation sometimes used for other, “covert” purposes Justify decisions already made Postpone decisions Public relations Compliance “A rational exercise often undertaken for non-rational reasons” (Weiss)

Summary Evaluation can be seen as serving an overarching learning purpose “To learn through systematic enquiry how to better design, implement and deliver public programmes and policies” (EU Evalsed Guide)

Formative and Summative Evaluation Summative evaluation: accountability focus; what has been achieved? Formative ("Process") evaluation: development or learning focus; how can we improve performance and delivery of the programme? "When the cook tastes the soup that's formative evaluation. When the guest tastes it, that's summative evaluation" (Scriven). Most evaluations lie along this continuum, combining elements of each; both are relevant and useful to the public sector. The Dept. of Finance VFM manual has a strong summative emphasis

Key issues in programme evaluation What's the basis for evaluation judgements? Dept. of Finance VFM Manual refers to 5 main evaluation criteria Relevance Rationale Effectiveness Efficiency Impact Framework originally developed for evaluation of EU-funded programmes (CSF Evaluation Unit, 1996)

Evaluation Questions Rossi et al. (2004) identify 5 main question types in evaluation of social programmes: needs assessment; assessment of programme theory; assessment of programme process; impact assessment; efficiency assessment. The DOF framework has a stronger summative or economic emphasis

Relevance Two main dimensions: policy relevance (domestic and EU) and external relevance. External relevance: what societal needs or problems does the programme address? (needs assessment); is the programme "fit-for-purpose"?; implications of external changes for the programme (continued relevance)

Rationale Why is the State involved? Is there a market failure? "A necessary but not sufficient condition for government intervention to improve economic efficiency is that there is some form of market failure" (HM Treasury). Could the problem be addressed through more direct means? Danger of "second-best" solutions. Types of market failure: public goods, externalities, redistribution

Effectiveness Is the programme meeting its objectives? Generally addressed at level of Inputs: Is the money being spent? Outputs: Results or immediate benefits Are the above in line with expectations? If not, why not? Almost a monitoring question Effectiveness and impact questions often overlap A lot depends on how objectives are framed

Efficiency Some definitions: “Efficiency in the public sector involves making the best use of resources available for the provision of public services” (Gershon UK efficiency review 2004) “Optimising the ratios of inputs to outputs” (DOF VFM Manual) Can be viewed in a number of ways Reduced inputs for same level of service Additional outputs for same level of inputs Improved unit cost ratio Changing mix of activities/outputs to better deliver a given objective for same input level Using alternative delivery approaches, e.g., outsourcing to private sector Efficiency a core, perhaps overarching, element of value for money agenda Getting the best return from a given level of resources is the essence of value for money
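The input-to-output ratio view of efficiency lends itself to a quick worked example. A minimal sketch, with entirely hypothetical figures:

```python
# Unit-cost view of efficiency: an improvement means delivering more outputs
# for the same inputs, or the same outputs for fewer inputs.
# All figures below are hypothetical, for illustration only.

def unit_cost(inputs_eur: float, outputs: float) -> float:
    """Cost per unit of output delivered."""
    return inputs_eur / outputs

baseline = unit_cost(1_000_000, 2_000)  # baseline year: EUR 500 per unit
reformed = unit_cost(1_000_000, 2_500)  # same inputs, more outputs: EUR 400

saving = (baseline - reformed) / baseline
print(f"Unit cost: {baseline:.0f} -> {reformed:.0f} EUR ({saving:.0%} improvement)")
```

The same function covers both routes to efficiency on the slide: reduced inputs for the same service level shrinks the numerator, additional outputs for the same inputs grows the denominator.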

Impact What difference has the programme made? Need to consider To its beneficiaries In terms of wider socio-economic objectives Need to consider Deadweight effects Displacement Unintended side-effects So-called horizontal issues a sub-set of impact Rural development Poverty Gender equality
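Deadweight and displacement are commonly applied as simple discount factors to gross outcomes when estimating net impact. A back-of-envelope sketch, with hypothetical rates and figures:

```python
# Net additional impact: gross outcomes corrected for deadweight (outcomes
# that would have occurred anyway) and displacement (activity merely shifted
# from elsewhere). Rates and figures below are hypothetical.

def net_impact(gross: float, deadweight_rate: float, displacement_rate: float) -> float:
    additional = gross * (1 - deadweight_rate)   # strip out deadweight
    return additional * (1 - displacement_rate)  # strip out displaced activity

# e.g. 1,000 gross jobs, 30% deadweight, 20% displacement:
print(net_impact(1_000, 0.3, 0.2))  # 560.0
```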

Evaluation Cycle Evaluation cycle is a function of the wider programme and policy cycles (see diagram) Aim should be to conduct evaluations at the right time to influence programme design and policy formulation Easier said than done!

Policy, Programme & Evaluation Cycles (diagram showing the interlinked policy, programme and evaluation cycles, with stages including formulation, delivery, review, design, implementation, evaluation and conclusions; source: the EVALSED Guide)

The Evaluation Cycle EU Structural Fund regulations require evaluations at three stages: ex ante (before), interim or ongoing, and ex post (after). Value for Money reviews generally take the form of ongoing evaluations

Ex Ante Evaluation Focus: "to optimise the allocation of resources and improve the quality of programming" (EU Regulation). A planning purpose. Key evaluation questions: what is the rationale for the programme and is it robust? Is the programme relevant or fit-for-purpose? Programme design issues

Interim Evaluation Focus: largely an implementation purpose, but much depends on programme maturity. Key questions: relevance (or continued relevance), effectiveness, efficiency

Ex Post Mainly an accountability purpose: what has been achieved and at what cost? Summative in character. Not widely practised in Ireland except for EU programmes

Evaluation Cycle and Focus
Stage:    Ex ante  | Ongoing        | Ex post
Purpose:  Planning | Implementation | Accountability
Questions: Rationale, Relevance, Effectiveness, Efficiency, Impact (the emphasis given to each question varies with the stage)

Overview of methodological tools Sourcing information and data Data analysis techniques Tools to inform evaluation judgements

Data Sourcing All evaluations require data: "the raw material that once collected is organised, described, grouped, counted and manipulated by various methods and techniques" (EVALSED Guide). Primary and secondary data: primary data are generated as a consequence of the programme (uptake of services, data relating to beneficiaries); secondary data are generated for other purposes and pre-exist the programme (e.g., socio-economic and administrative data)

Data Types Key distinction between quantitative and qualitative approaches. Quantitative methods used to gather "hard" data: who, what, how many; expressed in terms of averages, ratios or ranges. In practice much hard or quantitative data may be categorical or ordinal in nature. Qualitative methods used to gather "soft" data: focus on understanding or "why" questions. Quantitative/qualitative is a continuum; the distinction is stronger in terms of analytical intent: quantitative for aggregation and generalisation, qualitative to understand complexity

Data sourcing techniques Main techniques/sources include Monitoring indicators Documentary analysis Administrative data Socio-economic data Beneficiary surveys Stakeholder interviews Focus groups Case studies

Data Analysis Techniques Once the data are collected, how do we analyse them? Main techniques include statistical analysis, SWOT analysis, econometric models, experimental designs, quasi-experimental designs (control groups)
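One of the quasi-experimental designs with a control group, difference-in-differences, can be sketched in a few lines. The group means below are hypothetical:

```python
# Difference-in-differences: compare the change in an outcome for programme
# participants with the change for a non-participating control group over
# the same period. All group means below are hypothetical.

def diff_in_diff(treated_before: float, treated_after: float,
                 control_before: float, control_after: float) -> float:
    """Estimated programme effect = treated change minus control change."""
    return (treated_after - treated_before) - (control_after - control_before)

# Participants' mean outcome rose from 100 to 112; controls' from 100 to 105:
print(diff_in_diff(100, 112, 100, 105))  # 7
```

The control group's change stands in for what would have happened to participants without the programme, which is why such designs address the deadweight problem raised under impact assessment.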

Tools to inform evaluation judgements Having gathered and analysed the data, how do we arrive at evaluation judgements? Main tools Benchmarking Multi-criteria analysis Cost benefit analysis and cost effectiveness analysis (will be addressed in project evaluation stream) Economic impact assessment Macro Micro Intervention logic analysis Specialist thematic tools Gender impact assessment Strategic environmental assessment
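Of the tools listed, multi-criteria analysis is simple enough to sketch directly: each option is scored against weighted criteria and the weighted totals compared. The weights and scores below are hypothetical:

```python
# Multi-criteria analysis in miniature: score each option against weighted
# evaluation criteria, then rank options by total weighted score.
# Weights and scores are hypothetical, for illustration only.

WEIGHTS = {"relevance": 0.2, "effectiveness": 0.3, "efficiency": 0.3, "impact": 0.2}

def weighted_score(scores: dict[str, float]) -> float:
    """Total weighted score for one option (scores on, say, a 1-5 scale)."""
    return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

option_a = {"relevance": 4, "effectiveness": 3, "efficiency": 5, "impact": 2}
option_b = {"relevance": 3, "effectiveness": 4, "efficiency": 3, "impact": 4}

best = max([("A", option_a), ("B", option_b)], key=lambda o: weighted_score(o[1]))
print(best[0], round(weighted_score(best[1]), 2))
```

Unlike cost-benefit analysis, the weights here embody an explicit value judgement, which is exactly why the technique sits among tools to "inform" rather than determine the evaluation judgement.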

Factors Affecting Choice of Method Programme type Stage in programme/evaluation cycle Evaluation purpose Evaluation scope and questions Data availability Resources


Part 3: Evaluation capacity and practice in Ireland Concept of evaluation capacity Influence and evolution of EU Structural Funds evaluation systems Development of national programme evaluation processes Expenditure Review Initiative Value for Money and Policy Review Initiative Where are we now?

The concept of evaluation capacity Evaluation capacity concerns the process of setting up the necessary systems and infrastructure to undertake evaluation. Some definitions: "the development of national or sectoral evaluation systems" (MacKay, World Bank); "the institutional, human, resource, skill and procedural base for conducting evaluations in public policy and public management systems" (Evalsed Guide). Concerned with creating and sustaining factors that support evaluation in the government sector

Key dimensions of evaluation capacity Fair degree of consensus in the literature as to the key building-blocks. 4 key dimensions generally highlighted: architecture (organisation of the evaluation function); demand (is there an effective demand for evaluation?); supply (evaluation methods, resources, skills); institutionalisation (building evaluation into policymaking systems; the wider cultural factors or conditions that determine the influence of evaluation on policy)

Critical Success Factors Key lessons as to factors needed to strengthen government evaluation systems Substantive government demand essential Incentives important for demand Limitations of reliance on rules and regulations Need to work on demand and supply sides in parallel Need for evaluation champions Adequate evaluation resources Including good data systems Importance of structural arrangements/architecture including links with other functions Danger of over-engineering the system Utilisation is key A long-haul effort requiring patience, persistence and leadership

Evolution and development of EU evaluation systems Evaluation context (as of the late 1980s): little tradition of programme evaluation in Ireland prior to the Structural Funds; evaluation limited in scope and largely peripheral to decision making; a low evaluation capacity baseline. Evaluation impetus driven by compliance considerations: EU Commission pressure and support; political priority attached to EU funds. Leading to gradual creation of evaluation structures (CSF1, 1989-1993) and major expansion in evaluation capacity and output (CSF2, 1994-1999)

Key Developments in Capacity Development The 1994 to 1999 CSF saw gradual establishment of programme evaluation structures: 3 internal evaluation units; 6 external evaluators assigned to other programmes. By end-1996 each of the 9 programmes had a dedicated evaluation function. A central CSF unit with a coordination and good-practice promotion remit was set up in 1996. Lessons learnt influenced the design of evaluation arrangements for the 2000-2006 period

2000 to 2006 Evaluation System: Key Features Evaluation system extended from just EU-funded elements to entire NDP (€51 bn) Applied to up to 20% of total public expenditure Centralised system with 1 Evaluation Unit (NDP/CSF Evaluation Unit) Main responsibilities and activities Development of performance indicators Advice on project appraisal techniques Drafting evaluation terms of reference Commissioning evaluations Doing evaluations Extensive ongoing evaluation effort

Development of National Evaluation Systems Background/origins: increased emphasis on public service management; public management reforms in the early/mid 1990s; international developments in public management. Key milestones: C&AG (Amendment) Act, 1993, which gave the C&AG a mandate to carry out VFM audits and to examine the adequacy of departments' systems to evaluate the effectiveness of their operations (76 VFM audits to date); launch of the Strategic Management Initiative (1994); Delivering Better Government (1996); Public Service Management Act, 1997, under which departments are required to produce Statements of Strategy and Annual Reports

Expenditure Review Initiative The Expenditure Review Initiative (ERI): a non-EU evaluation system introduced in 1997, influenced by the Australian system; a "whole of government" evaluation strategy. Objectives: to provide a systematic analysis of what is actually being achieved by expenditure in each programme; and to provide a basis on which more informed decisions can be made on priorities within and between expenditure programmes

ERI: Key Features Aim was to review all expenditure areas every 3 years Programme of reviews agreed by each department with Department of Finance Central Steering Committee and secretariat in DOF Reviews undertaken by line departments and by programme managers Department of Finance represented on steering committees

Evolution of ERI ERI reviewed by the C&AG in 2001. Key findings: 3-year target not met, significant delays; reviews focused on minor programmes; quality highly variable; limited influence on resource allocation. A number of reforms introduced: establishment of a network of reviewers and training supports; independent quality review procedure; efforts to track the impact of reviews and the review process generally

Evolution of ERI The central steering committee (ERCSC) reviewed progress in October 2004. Key findings: it had taken time for the earlier reforms to take effect; slippage in the timeframe for completion of reviews; topics selected for review relatively small scale; evaluative capacity of departments variable; the process had led to improvements in departments' approach to evaluation and evaluation culture; the extent to which reviews were driving resource allocation decisions unclear

ERI Review Series of recommendations made by ERCSC Changes to structures and reporting arrangements in departments Independent steering committees Reporting on review results Intensify efforts to develop performance indicators Use trainee analysts and graduates from IPA policy analysis masters course to support review process

Value for Money and Policy Review Initiative ERI replaced in 2006 by the "Value for Money and Policy Review Initiative", with a somewhat wider evaluation focus. Ninety reviews approved for the 2006-2008 period (2 per department per year). Guidance manual published 2007. Mix of internal and external reviews. The target for the number of reviews does not appear to have been reached; progress rather uneven

Current Situation Post-2007, EU funding very limited: just €900 mn. for 2007 to 2013, with limited evaluation requirements under EU regulations. The earlier NDP/CSF Evaluation Unit replaced by a Central Expenditure Evaluation Unit in the Department of Finance, responsible for evaluation of all national programmes. Main focus of the Unit now on the Value for Money and Policy Review Initiative: undertaking and overseeing VFM reviews; issuing guidelines. The Unit also has an important role in the project evaluation area

Reflections on Irish Experience EU requirements and external influences a key driver EU funds contributed to development of capacity and expertise Increased awareness and understanding of evaluation amongst policymakers Led to creation of internal evaluation structures And improved supply-side capacity in response to evaluation demand Important long-term benefits of Structural Funds

Reflections on Irish Experience Ireland now in a "post Structural Funds" era: the evaluation system is no longer organised around EU Funds. Slow progress under the ERI and VFM processes; some signs of a loss of momentum in evaluation practice over recent years. Current economic difficulties mean the heavy emphasis is on expenditure control and reduction and on broader expenditure review exercises: the McCarthy review (2009); the Comprehensive Spending Review

Conclusions: Some quotes "In the age of evaluation Ireland has been encouraged or even compelled by the pressure of very influential external forces to adopt a culture of evaluation. Despite the fact that this culture dates back over some three decades it remains a somewhat uneasy and unconvincing addition to the tools of governance" "One thing seems clear: policy developments in the field of evaluation will continue to be largely driven by external pressures since there is very little evidence of an appetite for evidence-driven policy among senior political or public-sector leaders" (McNamara et al., Developing a Culture of Evaluation in Ireland, 2009)