1
REFLECTIONS ON THE EVOLUTION OF M&E IN UGANDA
- Uganda's Location and Profile
- Pre-NIMES
- NIMES 2004
- Revised NIMES
- Conclusion
2
UGANDA
Source: http://www.fco.gov.uk/en/about-the-fco/country-profiles/sub-saharan-africa/uganda
3
Uganda's Profile
- Area: 197,058 sq km (93,104 sq miles)
- Population: 28.9 million (2006 estimate)
- Capital city: Kampala (population 1.3 million)
- GDP: US$11.7 billion (2007 est.)
- Annual growth: 6% (2006)
- Inflation: 5.7% (2007 est.)
- 80% of the population is engaged in agriculture.
- The 2007 World Development Report placed Uganda 154th out of 177 countries, among the world's most impoverished.
- The share of the population living below the poverty line has fallen from 56% to 31%.
Source: http://www.fco.gov.uk/en/about-the-fco/country-profiles/sub-saharan-africa/uganda
4
National Integrated Monitoring and Evaluation System (NIMES)
Prior to NIMES
- M&E activities were carried out independently by each ministry/sector/institution/agency.
- Each ministry/sector/institution/agency was merely required to submit a quarterly implementation report to the Office of the Prime Minister (OPM).
- The reports were supposed to be analyzed and feedback given to improve performance.
Challenges
- The culture of reporting was weak and most institutions were not reporting.
- The reports that were available were largely lacking in value; outputs and measurable indicators were poor.
5
National Integrated Monitoring and Evaluation System (NIMES) (2004)
Objectives
- A strategy to develop an appropriate M&E system for government.
- Build capacity in ministries, sectors, institutions and agencies to manage M&E frameworks.
- Improve M&E information flow and its use in decision making.
6
IMPLEMENTATION OF NIMES
- NIMES was built upon the Poverty Eradication Action Plan (PEAP) policy-making process.
- It used a set of indicators agreed upon and used by all government institutions.
- OPM led an assessment based on these indicators (the PEAP policy matrix):
  – Reporting to OPM was annual.
  – OPM prepared the status report to government.
  – The report informed Parliament and Cabinet and provided feedback on progress and on the support required by government institutions.
  – It was intended to influence budgeting.
7
IMPLEMENTATION OF NIMES (cont'd)
Challenges
- The PEAP matrix was too big and had too many indicators.
- It was complex and tedious.
- It was undifferentiated across government institutions performing at different rates.
- It assumed that once a national indicator existed, the contribution of local governments would be reflected in its assessment.
- It did not provide information frequently, on a quarterly basis, so the intended influence on budgeting did not materialize. Why?
  – Sectors were conducting reviews only once a year.
  – Ministries were doing assessments at the time of preparing policy statements.
  – Many activities were not funded or implemented due to bureaucracy and/or procurement procedures.
8
REVISION OF NIMES
- Monitoring and Evaluation Strategy for the National Development Plan (NDP).
- Results and performance frameworks at three levels:
  o Sector (e.g. water and sanitation)
  o Ministry (departments and agencies)
  o Local governments
- Monitoring indicators and targets are being developed at each of these levels.
9
Logical Description of the Strategy
- How outputs at the local government level link to and contribute to the objectives of the ministry.
- How outputs of the ministry link to and contribute to the outcomes of the sector.
- How sector outcomes contribute to the overall National Development Plan objectives and impact.
10
IMPLEMENTATION
- Each ministry and sector is to set its own indicators with support from NIMES.
- Agree on a few indicators for monitoring (3-5).
- Quarterly reports are analyzed using a computer-based system.
- Policy briefs are prepared by NIMES for higher-level decision making (Parliament and Cabinet).
- Feedback is provided to sectors, ministries and local governments.
- Budget allocations are to be linked to this monitoring strategy.
- The NTMWG works with the secretariat at OPM/NIMES to ensure that the system works.
11
Reporting
- Each lower level reports to the immediate higher level, i.e. local governments to the ministry, ministries to the sector, and sectors to NIMES.
- Local governments report monthly.
- Ministries report to sectors quarterly.
- Sectors (16) report to OPM/NIMES quarterly.
12
Evaluation Strategy
- Set up an evaluation database.
- Minimum guidelines and standards.
- Resource persons (qualified and competent).
- Documentation of all evaluation reports for synthesis, for information and learning.
13
Conclusion
- Commendable effort and involvement of stakeholders.
- It is not clear how the issue of institutions performing at different rates is being addressed.
- The revision of NIMES needs to strategize on the weak and poor culture of reporting (issues of value addition, incentives, etc.).
- The burden shifts to local governments:
  i) They face the big and varied number of monitoring indicators for ministries and sectors, in addition to their own.
  ii) They have the shortest reporting interval (monthly) of all the actors.
  iii) They are pressed with the greatest demand amidst almost non-existent human resource capacity and a poor resource envelope.