Slide 1: Results-Based Management for Monitoring & Evaluation
M Junaid Akhtar / Unber Khan
Slide 2: Contents
- Monitoring & Evaluation
- What is Results-Based Management (RBM)?
- Key Principles of RBM
- Why is Results-Based Management necessary?
- What is a Result?
- Results Chain
- Results Framework
- M&E Process of NDRMF
- Reporting Requirements of NDRMF
- Automated M&E System (PMRS)
- M&E Requirements for Proposal Submission
Slide 3: Monitoring & Evaluation
A cloud of M&E terminology: efficiency, effectiveness, appropriateness, quantitative indicators, impact, qualitative indicators, targets, performance assessment, outputs, coverage, logframes, "do no harm", inputs, outcomes, accountability, connectedness, timeliness. A wasp nest...?
Slide 4: Monitoring & Evaluation
"What gets measured gets managed." "If you can't measure it, you can't improve it." (Peter F. Drucker)
Monitoring asks what is happening; evaluation asks why it happened.
Monitoring is an internal, repetitive, operations-and-management function. Evaluation is often external and periodic (a snapshot), goes into greater depth, and asks different questions.
Slide 5: Why M&E?
Slide 6: Why M&E?
Compare the actual state against the expected results.
Slide 8: What is Results-Based Management?
A management strategy focused on achieving results:
- Processes & inputs directed at desired results
- Accountability for results
- Monitoring progress towards results
- Assessment and reporting on performance
In other words, RBM is a management approach aimed at improving management effectiveness and accountability in achieving results. RBM focuses on the results chain: outputs, outcomes, and impact.
Slide 9: Key Principles of RBM
- Define expected results first and activities later
- Foster the active participation of stakeholders
- Ensure that all stakeholders work towards achieving the expected results
- Appraise your work critically and learn the lessons
Slide 10: Why Results-Based Management?
- Improved focus on results instead of activities
- Improved transparency
- Improved accountability
- Enhanced performance orientation
- Improved measurement of achievements
- Enhanced strategic focus
- No choice: it is the standard
Slide 11: What is a Result?
Slides 12-14: Results Chain (diagram slides)
Slide 15: M&E Process of NDRMF
The NDRMF Monitoring and Evaluation framework follows a simplified logframe, results-based model for the implementation and administration of interventions/projects.
The NDRMF Results Framework is finalized on the basis of:
- The Project Administration Manual of ADB
- The National Disaster Management Plan
- The National Flood Protection Plan
The NDRMF Results Framework covers the major outcome along with key outputs and relevant sub-outputs, each with performance indicators.
Slide 16: M&E Process of NDRMF – Performance Indicators
- FIPs must select relevant indicators from the NDRMF Results Framework
- Customized indicators will be mutually agreed
- Each indicator carries a baseline & target and means of verification
The Indicators Monitoring Framework records, per indicator:
- Sr #
- Indicator
- Type
- Definition
- Unit of measure
- Data source
- Data collection method
- Frequency / timing
- Segregation (gender / geographic)
- Responsibility (organisation / person)
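Each row of the Indicators Monitoring Framework is, in effect, a structured record. A minimal Python sketch: the field names follow the framework columns on the slide, while the example values are entirely hypothetical and only illustrate how a FIP might populate one row.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One row of the Indicators Monitoring Framework (fields per the slide)."""
    sr_no: int
    name: str
    indicator_type: str      # e.g. "Output" or "Outcome"
    definition: str
    unit_of_measure: str
    data_source: str
    collection_method: str
    frequency: str           # frequency / timing of collection
    segregation: str         # gender / geographic disaggregation
    responsibility: str      # responsible organisation / person

# Hypothetical example row (not an actual NDRMF indicator)
flood_walls = Indicator(
    sr_no=1,
    name="Protective flood walls constructed",
    indicator_type="Output",
    definition="Number of flood walls completed and certified",
    unit_of_measure="count",
    data_source="Site completion reports",
    collection_method="Field verification",
    frequency="Quarterly",
    segregation="Geographic (district)",
    responsibility="FIP M&E Officer",
)
print(flood_walls.name, "-", flood_walls.frequency)
```

Keeping each row as a typed record rather than free text makes it straightforward to validate that every proposed indicator actually has a data source, a collection method, and an assigned responsibility before submission.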
Slide 17: M&E Process of NDRMF – Field Visits & Data Quality Assessments (DQA)
- Field visits serve the purpose of validation: they validate the results reported by grants and projects.
- Data Quality Assessments (DQA): reported data will be verified by checking where it originated, who was responsible for its collection, and the methods and standards used to collect it.
- Third-party evaluations / verifications verify and evaluate the programme achievements claimed in progress reports.
Slide 18: M&E Process of NDRMF – Monitoring of Activities & Results (diagram slide)
Slide 19: Reporting
"When performance is measured, performance improves. When performance is measured and reported back, the rate of improvement accelerates." (Pearson's Law)
Slide 20: Reporting Requirements of NDRMF
Standardized reporting formats & tools have been developed:
- Quarterly Progress Report (QPR): within 5 days after the end of each calendar quarter
- Semi-Annual ESMS Report: within 5 days after every alternate quarter, along with the QPR
- Annual Progress Report (APR): within 5 days after the end of each calendar year
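The deadlines above (five days after the reporting period ends) can be computed mechanically. A small sketch, assuming standard calendar quarters ending 31 Mar / 30 Jun / 30 Sep / 31 Dec; the function names are illustrative, not part of any NDRMF tool:

```python
from datetime import date, timedelta

# Month and day on which each calendar quarter ends
QUARTER_ENDS = [(3, 31), (6, 30), (9, 30), (12, 31)]

def quarter_end(year: int, quarter: int) -> date:
    """Last day of the given calendar quarter (1-4)."""
    month, day = QUARTER_ENDS[quarter - 1]
    return date(year, month, day)

def qpr_due_date(period_end: date) -> date:
    """QPR is due within 5 days after the quarter ends."""
    return period_end + timedelta(days=5)

print(qpr_due_date(quarter_end(2023, 1)))  # 2023-04-05
```

The same five-day rule applies to the APR, with the period end being 31 December of the reporting year.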
Slide 21: Reporting Requirements of NDRMF
RAG (Red / Amber / Green) report for the implementation / work plan
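A RAG report classifies each work-plan item as Red, Amber, or Green based on actual progress against plan. A minimal sketch of that classification; the 90% / 70% thresholds are assumptions for illustration, not NDRMF's official cut-offs:

```python
def rag_status(actual: float, planned: float,
               green_at: float = 0.90, amber_at: float = 0.70) -> str:
    """Classify progress against plan as Green / Amber / Red.

    Thresholds are illustrative: >=90% of plan is Green,
    70-90% is Amber, below 70% is Red.
    """
    if planned <= 0:
        return "Green"  # nothing was due in this period yet
    ratio = actual / planned
    if ratio >= green_at:
        return "Amber" if False else "Green"  # ratio at/above green threshold
    if ratio >= amber_at:
        return "Amber"
    return "Red"

for actual, planned in [(10, 10), (8, 10), (5, 10)]:
    print(actual, "/", planned, "->", rag_status(actual, planned))
```

In practice the same rule is applied row by row across the work plan, so slipping activities surface as Amber or Red at a glance.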
Slide 22: Automated M&E System (PMRS)
Systematically manage and monitor progress:
- Real-time and rigorous data collection
- Allow stakeholders tailored access to data, analysis, and reports
- Enable online entry, collection, intelligent analysis, and efficient reporting of real-time, activity-based data
- Measure project / programme results (especially output and outcome indicators) against targets, milestones, etc.
- Track progress of the implementation plan / work plan
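Measuring indicator results against targets, as the system above does, reduces to a percent-achievement calculation over the baseline-to-target distance. A sketch under that assumption; the function name and example numbers are hypothetical:

```python
def achievement_pct(baseline: float, actual: float, target: float) -> float:
    """Percent of the baseline-to-target distance achieved so far."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return 100.0 * (actual - baseline) / (target - baseline)

# Hypothetical indicator: baseline 20, target 100, current actual 60
print(achievement_pct(baseline=20, actual=60, target=100))  # 50.0
```

Computing achievement this way (rather than as actual/target) correctly credits only progress made beyond the baseline, which matters whenever an indicator does not start from zero.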
Slide 23: M&E Requirements for Proposal Submission
Results framework template (columns: Results | Performance Indicators | Means of Verification | Risks / Assumptions | Baselines | Targets), with rows for:
- Outcome
- Output 1 (sub-outputs 1a, 1b, 1c)
- Output 2 (sub-outputs 2a, 2b, 2c)
Slide 24: M&E Requirements for Proposal Submission
Indicators monitoring template (columns: Indicators | Indicator Definition / Unit | Data Collection Method / Sources | Frequency of Monitoring), with rows for:
- Outcome
- Output 1
- Output 2
Slide 25: M&E Requirements for Proposal Submission
- How will the performance of the project be tracked in terms of achievement of the milestones set forth in the results framework?
- How will the impact of the project be assessed in terms of achieving the project's objectives?
- How will the mid-term assessment, and the related adjustment of the project design and plans, be managed?
- How will the inclusiveness of stakeholders / community members (including gender and vulnerable groups) in the project's M&E processes be achieved?
Slide 26: Key Issues in Proposal Evaluations
- Most results frameworks are filled in for the outcome and outputs, but without performance indicators and/or means of verification and risks/assumptions.
- Issues with indicators: performance indicators are not separately mentioned; indicator definitions need improvement; indicators need to be more specific.
- Improvements required in M&E narratives: project evaluations for assessing project impact; tools and/or processes to ensure programmatic improvements; inclusiveness of stakeholders/community groups in M&E processes.
Slide 27: Thanks