South Africa's National Evaluation Policy Framework
National Evaluation Practices: Experience from South Africa
Nox Chitepo, Department of Planning, Monitoring and Evaluation
14 June 2016, Kampala, Uganda
Why evaluate?
- Improving policy or programme performance (evaluation for continuous improvement): this aims to provide feedback to programme managers.
- Improving decision-making: Should the intervention be continued? Should how it is implemented be changed? Should an increased budget be allocated?
- Improving accountability: Where is public spending going? Is this spending making a difference?
- Generating knowledge (for learning): increasing knowledge about what works and what does not with regard to a public policy, programme, function or organization.
Where are we at?
Total evaluations on the National Evaluation System (NES): 59
- Approved reports: 25
- Served at Cabinet: 13
- Evaluations underway: 16
- Preparation stage: 12
- Stuck: 1
- Dropped: 5
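A quick reconciliation of these figures, assuming "served at Cabinet" is a subset of the approved reports rather than a separate stage: 25 approved + 16 underway + 12 in preparation + 1 stuck + 5 dropped = 59, which matches the total on the NES.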
Use of evaluations
Programme evaluated and progress:
- ECD (Early Childhood Development): New policy gazetted responding to findings
- Grade R: DBE (Department of Basic Education) to address quality of provision, not just quantity
- CRDP: Substantial revisions to operations
- Recapitalisation (RADP): Substantial revisions to operations
- Nutrition interventions for children under 5: Nutrition plan integrated with Food Security; MTSF target to reduce stunting from 21% to 10%
- Restitution: Creating an independent Commission on Land Claims; substantial revisions to operations; impact evaluation
- Support Programme for Industrial Innovation: Changes to operation, including addition of a commercialisation stage; re-launched
- Urban Settlements Development Grant: Changes made to guidelines even before the evaluation was completed
- Export Marketing Incentive (EMIA): Changes to operation (removing duplication in operations)
- Policy on Community Colleges: This was a design evaluation, and significant changes were made as a result before the policy was released
Approach – ensuring use and ownership
Key issues to ensure use:
- Departments must own the evaluation concept and the process, so they must request the evaluation (it should not be imposed on them).
- The focus must be on learning rather than punishment, otherwise departments will simply game the system: punish people not for making mistakes, but for failing to learn from their mistakes.
- Broad government ownership: selection by the cross-government Evaluation Technical Working Group, based on importance (by scale, or because the intervention is strategic or innovative).
- Evaluations must be believed, i.e. seen as credible.
- There must be follow-up (hence improvement plans).
Priority interventions to evaluate
- Large (e.g. over R500 million for national programmes, R50 million for provincial) or covering a large proportion of the population, and without a major evaluation in the last 5 years. This threshold can diminish over time.
- Linked to the 14 outcomes in the Medium Term Strategic Framework (MTSF), the National Development Plan (NDP) or provincial strategic priorities.
- Of strategic importance, and for which it is important that they succeed.
- Innovative, from which learnings are needed.
- Of significant public interest, e.g. key front-line services.
Approach – credibility and transparency
To ensure credibility:
- Ensure independence: independent external service providers undertake the evaluation, reporting to the Steering Committee.
- Ensure quality: DPME plays an intensive role in supporting the evaluations, with quality assurance at the end of the evaluation; capacity-building initiatives.
- Ensure transparency: all evaluation reports go to Cabinet/EXCO/departmental management, then to Parliament, then onto the web on the Evaluation Repository.
Emerging findings
- Coordination across departments is a major problem; good-practice mechanisms need to be found (many evaluations).
- Poor administrative data or unavailability of data (many).
- Inadequate M&E, with targets sometimes not set in advance (many).
- Inadequate use of IT (paper-based systems still in use).
- Initiatives are sometimes not targeted enough and resources get spread too thinly (EMIA, CRDP, CASP).
- Frameworks are sometimes good but not enforced.
- Poor management of implementation and operational challenges (many).
Emerging findings (continued)
- Services are often generic and not targeted enough for different groups: one size fits all.
- Support for beneficiaries is sometimes inadequate (e.g. BPS, MAFISA, CASP).
- No explicit theory of change, and sometimes no consensus on programme design (many).
- Government is better at spending money and building infrastructure than at achieving behaviour change and outcomes.
- No post-support project tracking, making impact difficult to measure.
- In some instances, government's role is poorly performed or not clear.
Do better? Enabling conditions
- Key role of a powerful and capable central 'champion' with sustained political will.
- Resources: budget and a talented team to drive the system and to solve problems early and rigorously.
- Build a coalition across government.
- Promote integration of evaluations into the intervention cycle and the performance management cycle.
- Utilization seen as the measure of 'success'; promote ownership; ensure improvement plans are implemented.
Do better? Enabling conditions (continued)
- Strengthen the monitoring system, data availability and data management.
- Ensure strong design of interventions, with a proper theory of change.
- Incentives: co-funding; profiling of evaluations through evaluation networks; publishing and co-authoring; awards such as those from SAMEA (the South African Monitoring and Evaluation Association), etc.
How can we strengthen the system further?
- Improvement plans and follow-up on use.
- Draw out budget implications for National Treasury (NT).
- Communication of findings (policy briefs, social media, Evaluations Update, new website, etc.).
- Building a body of evidence across sectors: Human Settlements, 7 evaluations plus a synthesis; Rural Development, 5 evaluations plus a synthesis.
- Quick internal evaluations for problem-solving.
- Reflect on the evaluation system: it is slow, has too many evaluations and is administratively a burden.
Asanteni sana; Merci beaucoup; Thank you! (thank you very much, in Swahili and French)
Noqobo (Nox) Chitepo
Director: Evaluation and Research, DPME
nox@dpme.gov.za
www.dpme.gov.za