Provincial M&E Forum 18 August 2011 The Presidency Department of Performance Monitoring and Evaluation Draft National Evaluation Policy Framework.


Process on the Framework
- Evaluation is a core part of the GWM&ES, though to date there has been more emphasis on monitoring
- A draft policy framework was produced 6 months ago
- Study tour to Mexico/Colombia/US focusing on evaluation (with DBE/DSD/OPSC/GCIS)
- Writeshop with the same departments plus GP (FS invited)
- Draft framework developed together and edited by DPME
- Comments due by 31 August

Structure of the Framework

Part A: Introduction
1. Background
2. Why evaluate?
3. Approach to evaluation

Part B: How do we evaluate?
4. Uses and types of evaluations
5. Assuring credible and quality evaluations
6. The process of evaluation
7. Assuring follow-up

Part C: How do we make this happen?
8. Institutionalising evaluation in the Government system
9. Management and coordination of evaluation across government

1. Background

Challenges:
- Lack of clear policy and strategic direction around evaluation;
- A need to promote the use of knowledge from both evaluation and research;
- Confusion about what constitutes evaluation, performance auditing, research, etc.;
- Evaluation work exists but is not necessarily known, within departments or externally;
- Lack of coordination between organisations and fragmentation of approaches;
- Inadequate use of evaluation, leading to a perception that it is a luxury, and a lack of institutionalisation.

Problem: evaluation is applied sporadically and does not sufficiently inform planning, policy-making and budgeting, so the opportunity to improve Government's effectiveness, efficiency and impact is being missed.

Focus of document

Focus of this policy framework:
- A common language and conceptual base for evaluation in Government;
- An institutionalised system across Government, linked to planning and budgeting;
- Clear roles and responsibilities;
- Improved quality of evaluations;
- Utilisation of evaluation findings to improve performance.

Target groups:
- Political principals and senior managers in the public sector, who must improve their performance and incorporate evaluation into what they do;
- Other actors who need to be involved in the evaluation process, such as potential evaluators (including academics and other service providers);
- Training institutions, which must ensure that public servants understand evaluation and that there is a wider cadre of potential evaluators with the required skills and competences.

2. Why evaluate?

- To judge the merit or worth of something: Was the programme successful? Was it effective? Did the intended beneficiaries receive the intervention? Did it impact on their lives?
- To improve policy or programme performance (evaluation for learning): providing feedback to programme managers. Questions could include: Was this the right intervention for this objective? Was it the right mix of outputs? What is the most effective way to do X?
- To improve accountability: Where is public spending going? Is this spending making a difference?
- To generate knowledge (for research): increasing knowledge about what works and what does not with regard to a public policy, programme, function or organisation.

3. Approach to evaluation

For this Evaluation Policy Framework, evaluation is defined as: the systematic collection and objective analysis of evidence on public policies, programmes, projects, functions and organisations, to assess issues such as relevance, performance (effectiveness and efficiency) and value for money, and to recommend ways forward.

It is differentiated from monitoring: monitoring involves the continuous collecting, analysing and reporting of data in a way that supports effective management. Monitoring aims to provide managers with regular feedback on progress in implementation and results, and early indicators of problems that need to be corrected. It usually reports actual performance against what was planned or expected (adapted from the Policy Framework on the GWM&ES).

Comparing concepts

5. How do we evaluate – proposed types

Applying evaluations to different objects

When to apply the evaluations

Priority for existing programmes/policies

Evaluations should be prioritised for programmes/policies that are:
- Large (e.g. over R500 million – figure to be confirmed) or covering a large proportion of the population, and which have not had a major evaluation for 5 years (this threshold can diminish over time);
- Of strategic importance, where it is important that they succeed; if these have not been evaluated for 3 years or more, an implementation evaluation should be undertaken;
- Innovative, and from which learnings are needed, in which case an implementation evaluation should be conducted;
- Of significant public interest, e.g. key front-line services.

In addition, any programme for which there are real concerns about its design should have a design evaluation conducted.

For new programmes/policies

Internal/external: balancing ownership and credibility

Who does the evaluations

8. Institutionalising evaluation

- Legal framework
- Evaluation plan: a 3-year and annual evaluation plan developed by DPME (with partners), starting with 2012/13. It specifies from a national perspective what needs to be done; government institutions can choose to do additional evaluations.
- Roles and responsibilities: departments and public institutions have a responsibility to incorporate evaluation into their management functions as a way to continuously improve their performance. They need to:
  - Ensure there is an evaluation budget in all programmes (see 8.4) and a 3-5 year plan specifying which evaluations will be undertaken and the form of each evaluation;
  - Ensure there are specific structures within the organisation entrusted with the evaluation role, with the required skills. This could be an M&E unit, a research unit, or a policy unit;
  - Ensure that the results of evaluations are used to inform planning and budget decisions, as well as general decision-making processes. The results of evaluations must therefore be discussed in management forums and used to guide decision-making.

Other roles and responsibilities

- DPME: custodian of the evaluation function in Government. This includes standard setting, pooling of knowledge, quality assurance, capacity building and technical assistance, and promotion.
- National Treasury: assures value for money when allocating budgets. Sees that plans and budgets are informed by evidence, including from evaluations, and that cost-effectiveness analyses are undertaken using suitable methodologies.
- DPSA: sees that the results of evaluations which raise questions about the performance or structure of the public service are addressed.
- OPSC: a specific independent role in the evaluation process, reporting directly to Parliament, and a source of expertise in helping to build the evaluation system.
- Auditor-General: an independent body, and an important player through its performance audit role.
- PALAMA: responsible for developing capacity-building programmes around M&E across government.
- Universities: provide tertiary education, including on evaluation, and skills development; supply many of the evaluators, particularly where sophisticated research methodologies are needed; and undertake research closely allied to evaluation, which can help to inform evaluation processes.
- SAMEA (the South African M&E Association): supports the development of systems and capacities, and is an important forum for learning and sharing.

Other issues

- Budgeting 1-5% of programme budgets for evaluation
- Using standardised systems
- Donor-funded evaluations following the government system
- Optimising limited capacity:
  - technical capacity in DPME to support departments on methodology and quality;
  - outsourcing of evaluations to external evaluators using an accredited panel;
  - training through short courses, including via PALAMA, universities and private consultants.
- Building on international partnerships with similar countries (e.g. Mexico and Colombia) and international organisations, e.g. 3ie or the World Bank.

9. Management

- Champion: DPME, with a specific technical unit created to provide support
- An Evaluation Working Group to build on strengths in government and ensure commitment across government, including provincial experts
- Offices of the Premier (OoPs) to provide leadership and coordination at provincial level