Building Capacity for Effective Government-wide Monitoring and Evaluation
Dr Mark Orkin and Mr Oliver Seale
SAMEA Conference, Friday, 30 March 2007
2 Presentation Structure
♦ Strategic Objectives and Outputs (Mark Orkin)
♦ Transformation to the Academy (Mark Orkin)
♦ Capacity building for Monitoring and Evaluation (Oliver Seale)
Objectives
♦ To provide an overview of SAMDI's current focus areas and plot its future course.
♦ To explore and engage with SAMDI's capacity-building mandate for government-wide monitoring and evaluation.
3 Strategic Objectives and Outputs: Governance & Administration (G&A) Cluster Priorities
♦ Good Governance: Anti-corruption, Gender and Disability, Batho Pele and Public Participation programmes.
♦ Capacity of the State: Local Government Strategic Agenda, Skills Assessment and Capacity Building programmes.
♦ Macro-organisation of the State: Single Public Service, Integrated Service Delivery and E-Government Service Delivery projects.
♦ Transversal Systems: Integration of Planning and Government-wide Monitoring and Evaluation System.
4 Strategic Objectives and Outputs: SAMDI 2007/08 to 2009/10
♦ Develop and administer a training framework for curricula and materials.
♦ Co-ordinate the provision of executive development programmes for the Senior Management Service.
♦ Develop and implement a quality management and monitoring system.
♦ Capacitate departments to identify their human resource development needs.
♦ Establish and maintain partnerships and links with national and international institutes and training providers.
♦ Arrange customised training programmes to support foreign policy on the African Union (AU) and the New Partnership for Africa's Development (NEPAD).
5 Strategic Objectives and Outputs: Training Statistics, 1 Apr. – 28 Feb. 2007
6 Strategic Objectives and Outputs: PTDs* delivered in provinces (to end-Jan. 2007)
Charts (not reproduced): Distribution of Government Employees per Province; Distribution of PTDs per Province.
* PTDs = Person Training Days
7 Strategic Objectives and Outputs: Focus areas for 2007/08 and beyond
♦ Support JIPSA (Joint Initiative on Priority Skills Acquisition) policy formulation and training.
♦ Incubate AMDIN (African Management Development Institutes' Network) and the DRC (Democratic Republic of Congo).
♦ Contribute to ASGI-SA (Accelerated and Shared Growth Initiative for South Africa) through concentrated public sector human resource development activities and operations.
♦ Transformation to the Academy – see detail slides.
♦ Capacity building for Monitoring and Evaluation – see detail slides.
8 Transformation to the Academy: SAMDI – need for a paradigm shift
♦ Roughly R1 billion p.a. is spent on training in departments, yet 43% of staff in provincial departments reported receiving no training.
♦ International benchmarks suggest at least 5 days of training per annum: for approx. 250,000 middle and junior managers this requires 1.25 million PTDs p.a.; allowing for the 60% of training already occurring in departments still leaves 0.5 million PTDs p.a.
♦ For induction, staff turn-over of … people p.a. requires another 0.2 million PTDs.
♦ Thus the total demand-driven requirement is 0.7 million PTDs p.a.: nearly 10 times SAMDI's present output!
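In round figures, the demand arithmetic above works out as follows (a minimal sketch using only the numbers quoted on this slide; the 0.2 million induction PTDs are taken as given, since the underlying turnover figure is not shown):

    # Sketch of the PTD demand arithmetic from the slide above.
    managers = 250_000       # approx. middle and junior managers
    days_per_year = 5        # international benchmark: training days p.a.
    total_demand = managers * days_per_year           # 1.25 million PTDs p.a.
    unmet_demand = total_demand * (1 - 0.60)          # 60% already provided in departments -> 0.5 million
    induction_ptds = 200_000                          # induction demand quoted on the slide (0.2 million PTDs)
    requirement = unmet_demand + induction_ptds       # 0.7 million PTDs p.a.
    print(f"{total_demand:,.0f} total, {unmet_demand:,.0f} unmet, {requirement:,.0f} required")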
9 Transformation to the Academy: Vision and activities
♦ Three "mantras":
  - Provision to facilitation.
  - Competition to collaboration.
  - Selective coverage to massification.
♦ First main stream of activity: executive development programmes for the SMS.
  - Entrant, lower and upper SMS: programmes, courses and events.
  - In collaboration with universities and counterparts.
♦ Second main stream of activity: "massified" management training for junior and middle managers.
  - Training frameworks of curricula and materials, in conjunction with provincial academies and the DPLG.
  - Monitoring and evaluation to regulate providers.
  - The induction programme for new entrants at all levels.
10 Transformation to the Academy: 2006/07 SAMDI outputs – basis of new approach
Chart legend: MDT – Management Development Training; ELD – Executive Leadership Development; HRDT – Human Resource Development & Training; SCM – Supply Chain Management; HRMT – Human Resource Management Training; FPMT – Finance & Project Management Training; SDT – Service Delivery Training; ID – Institutional Development.
11 Transformation to the Academy: ENE (Estimates of National Expenditure) training spend in national departments
12 Transformation to the Academy: Learning framework – tentative harmonised modules
Performance levels: Senior, Middle, Junior, Supervisor.
Generic competencies: Finance, People, Projects, Culture, Information, Other.
Functional competencies: Finance, Human Resources, Supply Chain, Information, Other.
Sectoral competencies: Immigration, Pensions.
Induction.
13 Transformation to the Academy: Projects for Internal Task Teams
1. Audits of junior and middle management courses
2. Planning and implementing the training framework
3. Enhancing the monitoring and evaluation system
4. Streamlining accreditation processes
5. Planning for the massified induction programme
6. Preparing for service provider mobilisation
7. Planning executive programmes for SMS level
8. Recasting delegations and policies
9. Scoping an operational system for outsourced training
10. Conceiving a knowledge management system
14 Transformation to the Academy: Recap and way forward
♦ Executive development programmes.
♦ Learning framework for massified middle and junior management learning.
♦ Curricula and materials development, quality assurance and accreditation.
♦ Provider and user relations; M&E of large-scale provision.
♦ Provincial infrastructure.
♦ Research capacity and networking.
♦ Continental support for Management Development Institutes; international relations.
♦ Impending restructuring process.
15 Capacity building for Monitoring & Evaluation: Background
Aims and objectives
♦ The aim of the system is to contribute to improved governance and enhanced effectiveness of public sector institutions.
♦ The system aims to collect, collate, analyse and disseminate information on the progress and impact of programmes.
Result areas
♦ Accurate information on progress in the implementation of public sector programmes is updated on an ongoing basis.
♦ Information on the outcomes and impact achieved by government is periodically collected and presented.
♦ The quality of monitoring and evaluation (M&E) practices in government and public bodies is continuously improved.
16 Capacity building for Monitoring & Evaluation: Provincial needs analysis – example of feedback (A)
1. Little coherent or articulated strategy in the provinces, despite expenditure on expensive systems to collate M&E data.
  i. What would a coherent strategy need to contain?
  ii. What is an articulated strategy? What type of links are we looking for?
  iii. What systems are there? What do we mean by a system?
  iv. What data are there? How were they obtained? What is the quality of these data?
  v. What collation is taking place? How? Can a system collate data?
What are the implications for training?
17 Capacity building for Monitoring & Evaluation: Provincial needs analysis – example of feedback (B)
2. Monitoring programmes are just about collecting data; very little analysis and feedback is given.
  i. What data are being collected, and why and how?
3. Alignment of plans doesn't exist.
  i. What planning does take place, and how?
  ii. How is M&E incorporated into planning?
  iii. What do we mean by alignment, and why do we need it?
  iv. Is alignment always possible and necessary?
What are the implications for training?
18 Capacity building for Monitoring & Evaluation: Provincial needs analysis – example of feedback (C)
4. Planning without indicators.
  i. What type of indicators do we mean, and how should these be developed?
  ii. What indicators do exist, and how are they measured?
  iii. How are they decided on?
5. Lack of, or poor, baseline data.
  i. What baseline data do exist, and do we evaluate them?
  ii. What type of baseline data are required?
  iii. How are these presently obtained?
What are the implications for training?
19 Capacity building for Monitoring & Evaluation: Conceptual framework for training
Existing situation: 1. Description 2. Existing databases 3. Data collection methods 4. Baseline data
New project or programme – Planning: 1. What will be done (strategy) 2. Why will it be done (policy) 3. How will it be done (operations) 4. Indicators and criteria (how to measure) 5. When (timeframes)
Monitoring: 1. System to be used (MIS) 2. Indicators 3. Methods 4. Baseline data 5. Inputs 6. Tracking (i. processes, ii. activities) 7. Interventions and modifications 8. Outputs 9. Outcomes
Evaluation: 1. System to be used (EIS) 2. Indicators 3. Methods 4. Baseline data 5. Criteria 6. Assessment 7. Process 8. Impact 9. Lessons learned 10. Feedback
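For training purposes, the planning, monitoring and evaluation elements above can be thought of as one linked record per programme. The sketch below is purely illustrative; the class and field names are assumptions for discussion, not part of the GWM&E system specification, and the example values are made up:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Indicator:
        """One measure carried through planning, monitoring and evaluation."""
        name: str
        baseline: Optional[float] = None   # baseline data (existing situation)
        target: Optional[float] = None     # criterion set at the planning stage
        actuals: List[float] = field(default_factory=list)  # tracked during monitoring

        def progress(self) -> Optional[float]:
            """Latest actual as a share of the target, where both are known."""
            if self.target and self.actuals:
                return self.actuals[-1] / self.target
            return None

    @dataclass
    class Programme:
        """A programme holding its planning decisions and its M&E elements."""
        name: str
        strategy: str        # what will be done
        policy: str          # why it will be done
        operations: str      # how it will be done
        timeframe: str       # when
        indicators: List[Indicator] = field(default_factory=list)
        lessons_learned: List[str] = field(default_factory=list)  # evaluation feedback

    # Illustrative usage
    trained = Indicator("Officials trained", baseline=0, target=1000, actuals=[340])
    prog = Programme("M&E training roll-out", "train officials", "build GWM&E capacity",
                     "workshops and courses", "2007/08", indicators=[trained])
    print(prog.name, trained.progress())   # prints 0.34: share of target reached so far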
20 Capacity building for Monitoring & Evaluation: Training principles for various levels
1. Understanding the basic principles of M&E – basic level, for general users of information.
2. Applying the principles to a specific project – basic level, for project managers.
3. Applying the principles to a programme – intermediate level, for programme managers.
4. Applying the principles to overall management in departments – advanced level, for executive managers.
5. Applying the principles across departments/provinces – advanced level, for CFOs and DDGs.
6. Actually performing evaluations – specialist technical training, for M&E staff.
21 Capacity building for Monitoring & Evaluation: Target audiences
1. Users
  i. Political heads and parliamentarians (incorporated into report-backs to portfolio committees)
  ii. Accounting officers (DGs)
  iii. Executive managers and managers in govt departments
  iv. Users of the service or the information outside government
2. Producers
  i. Programme managers
  ii. Project managers
  iii. Operations staff
  iv. Participants
3. M&E staff in national and provincial departments
22 Capacity building for Monitoring & Evaluation: Examples of current provision (institution – programme – duration – level – content)
♦ University of Stellenbosch – Diploma in Monitoring and Evaluation Methods – 1 year – Postgraduate. Content: general principles & paradigms; clarificatory evaluation; process evaluation & programme monitoring; data collection methods; statistical and qualitative methods; impact assessment designs.
♦ University of Cape Town – Workshop on Monitoring and Evaluation – 1 week – All levels. Content: general principles; measuring public projects.
♦ Regenesys – Short course in Monitoring and Evaluation – 3 days – NQF 4. Content: monitoring and evaluation concepts; results-based management; concepts of outcomes; project objectives and indicators; monitoring and evaluation system; team performance improvements; performance standards; risks and the impact thereof; success and failure factors; project evaluation reports.
23 Capacity building for Monitoring & Evaluation: Strategy and plan of action
♦ Progress report
  - Terms of reference developed for the Task Team.
  - 15 workshops on M&E for programme and project management (340 officials).
  - Initial needs analysis of provincial M&E capacity.
  - Consultation with key internal and external stakeholders.
♦ Plan of action
  - Research on M&E training needs – SMS, MMS, JMS, practitioners (March '07).
  - Determine current providers of M&E training – HEIs, private providers, NGOs, etc. (March '07).
  - Undertake a training needs analysis for M&E (May '07).
  - Develop the M&E training programme (Sept. '07); roll out (Nov. '07).
24 Siyabonga Thank you Rolivhuwa Dankie Nakhensa Re a leboga