
USAID’s Experience and Lessons Learned in Approaches Used in Monitoring and Evaluating Capacity Building Activities. Duane Muller, USAID. November 7, 2008.


1 USAID’s Experience and Lessons Learned in Approaches Used in Monitoring and Evaluating Capacity Building Activities
Duane Muller, USAID
UNFCCC Meeting on Experiences with Performance Indicators for Monitoring and Evaluation of Capacity Building
Rio de Janeiro, Brazil, November 7, 2008

2 USG COMMITMENT TO CAPACITY BUILDING
–Integral to development programs
–Country-driven approach
–Useful lessons at the project level

3 Managing for Results: USAID’s Experiences

4 PERFORMANCE MANAGEMENT

5 The systematic process of:
–Monitoring the results of activities
–Collecting and analyzing performance information
–Evaluating program performance
–Using performance information
–Communicating results

6 STEPS IN DEVELOPING A PERFORMANCE MANAGEMENT PLAN (PMP)

7 MANAGING FOR RESULTS
Performance Indicators
–Scale or dimension
Standard Indicators
–Combination of output and outcome indicators
–Measure direct, intended results
Custom Indicators
–Meaningful outcome measures

8 CHARACTERISTICS OF GOOD INDICATORS
–Direct measures
–Objective
–Plausible attribution
–Practical
–Disaggregated
–Quantitative

9 Monitoring & Evaluation: different but complementary roles at USAID

10 MONITORING AND EVALUATION
MONITORING
–Clarify program objectives
–Link project activities to their resources/objectives
–Translate into measurable indicators/set targets
–Collect data on indicators
–Report on progress
EVALUATION
–Analyzes why and how intended results were/were not achieved
–Assesses contributions of activities to results
–Examines results not easily measured
–Explores unintended results
–Provides lessons learned/recommendations

11 EXPERIENCES WITH MONITORING USAID’s lessons learned

12 8-step process to collect monitoring data
1) Indicators and definitions
2) Data source
3) Method of data collection
4) Frequency of data collection
5) Responsibilities for acquiring data
6) Data analysis plans
7) Plans for evaluations
8) Plans for reporting/using performance information
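The eight planning elements above amount to one record per indicator in a Performance Management Plan. A minimal sketch in Python (the field names and the sample indicator are illustrative assumptions, not USAID's actual PMP schema):

```python
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    """One PMP row: how monitoring data for a single indicator
    will be defined, collected, analyzed, and used."""
    indicator: str          # 1) indicator and its definition
    data_source: str        # 2) where the data come from
    collection_method: str  # 3) how the data are collected
    frequency: str          # 4) how often they are collected
    responsible_party: str  # 5) who acquires the data
    analysis_plan: str      # 6) how the data will be analyzed
    evaluation_plan: str    # 7) planned evaluations
    reporting_plan: str     # 8) how results are reported/used

# Hypothetical entry for a capacity-building training indicator
plan = IndicatorPlan(
    indicator="Number of officials trained in GHG inventory methods",
    data_source="Training attendance records",
    collection_method="Workshop sign-in sheets",
    frequency="Quarterly",
    responsible_party="Implementing partner M&E officer",
    analysis_plan="Disaggregate by institution and gender",
    evaluation_plan="Mid-term participatory evaluation",
    reporting_plan="Annual performance report to the mission",
)
print(plan.frequency)  # Quarterly
```

Keeping all eight elements in one structure makes it easy to spot gaps, e.g. an indicator with no reporting plan.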

13 EXPERIENCES WITH EVALUATION USAID’s Lessons Learned

14 EVALUATION = POWERFUL LEARNING TOOL
–Identifies lessons learned
–Improves quality of capacity building efforts
–Critical to understanding performance
–Retrospective

15 ANALYTICAL SIDE OF PROJECT MANAGEMENT
–Analyzes why and how intended results were/were not achieved
–Assesses contributions of activities to results
–Examines results not easily measured
–Explores unintended results
–Provides lessons learned/recommendations

16 TYPES OF EVALUATION USED BY USAID

17 TRADITIONAL EVALUATION
–Donor focus and ownership of the evaluation
–Stakeholders often don’t participate
–Focus is on accountability
–Predetermined design
–Formal evaluation methods
–Independent/third-party evaluators

18 PARTICIPATORY EVALUATION
–Participant focus and ownership
–Broad range of stakeholders participate
–Design is flexible
–Focus on learning

19 ASSESSMENTS
–Quick and flexible
–Trends and dynamics
–Broader than evaluations

20 METHODOLOGIES FOR EVALUATIONS
–Scope of Work (SOW)
–Interviews
–Documentation reviews
–Field visits
–Key informant interviews
–Focus group interviews
–Community group interviews
–Direct observation
–Mini surveys
–Case studies
–Village imaging

21 SUCCESSFUL EVALUATIONS = LESSONS LEARNED
–Making the decision to evaluate
–Ensuring the Scope of Work is well thought out
–Finding the appropriate team
–Ensuring the results are used

22 PROGRAM ASSESSMENT RATING TOOL (PART)

23 PART: Goals, Procedures and Results
Reviews performance of US government programs:
–Program purpose and design
–Strategic planning
–Program management
–Results
Standard questionnaire called PART
Results in an assessment and plan for improvement

24 PART RATINGS
Performing:
–Effective
–Moderately effective
–Adequate
Not Performing:
–Ineffective
–Results not demonstrated
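The five PART ratings split into the two groups shown above. A small sketch encoding that taxonomy (the function name and structure are ours, not OMB's):

```python
# The five PART ratings, grouped as on the slide
PERFORMING = {"Effective", "Moderately effective", "Adequate"}
NOT_PERFORMING = {"Ineffective", "Results not demonstrated"}

def is_performing(rating: str) -> bool:
    """Classify a PART rating into the Performing / Not Performing groups."""
    if rating in PERFORMING:
        return True
    if rating in NOT_PERFORMING:
        return False
    raise ValueError(f"Unknown PART rating: {rating}")

print(is_performing("Adequate"))                  # True
print(is_performing("Results not demonstrated"))  # False
```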

25 Conclusions
Lessons learned/best practices for M&E
Project-level experiences
–Cost effective
–Timely
–Ensure data are used
National experience: PART
Country-driven approach to capacity building
–Paris Declaration on Aid Effectiveness

26 ADDITIONAL RESOURCES
Development Experience Clearinghouse
http://dec.usaid.gov
Performance Management: A Guide to Developing and Implementing Performance Management Plans
http://www.usaid.gov/policy/ads/200/200sbn.doc
Evaluation documents:
–Preparing an Evaluation Scope of Work
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby207.pdf
–Conducting a Participatory Evaluation
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs539.pdf
–Constructing an Evaluation Report
http://pdf.usaid.gov/pdf_docs/PNADI500.pdf
–Conducting Key Informant Interviews
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf
PART
http://www.whitehouse.gov/omb/part/
http://www.whitehouse.gov/omb/expectmore/

27 For further information:
Duane Muller
USAID EGAT/ESP/GCC
Tel: 1-202-712-5304
Fax: 1-202-216-3174
Email: dmuller@usaid.gov
Website: www.usaid.gov (keyword: climate change)

