KarRKb;RKghirBaØvtßúsaFarN³ sRmab;GPivDÆn_vis½yCnbT ¬ EpñkTI1¦ CMnYy\tsMNgrbs;FnaKarGPivD Æn_GasuI elx 0133- CAM fvikatamkmµviFIfvikatamkmµviFI éf¶TI7.

Presentation transcript:

Public Financial Management for Rural Development (Part 1). ADB Grant No. 0133-CAM. Program Budgeting. Day 7, April 2010. Foundations of program budgeting.

ADB Grant No. 0133-CAM / Component 1: PFMRD. 2 Monitoring and Evaluation: Some Tools, Methods, and Approaches for Monitoring and Evaluation

Performance indicators. The logical framework approach. Theory-based evaluation. Formal surveys. 3 Some Tools, Methods, and Approaches

Rapid appraisal methods. Public expenditure tracking surveys. Cost-benefit and cost-effectiveness analysis. Impact evaluation. 4 Monitoring and Evaluation: Tools, Methods, and Approaches

The choice of method depends on: Who needs the information, and for what purpose? How quickly is the information needed? How much will it cost to collect the information? Etc. 5 Monitoring and Evaluation: Methods, Tools, and Approaches (cont.)

For strategies, programs, and projects. WHAT CAN THEY BE USED FOR? Setting targets and assessing progress toward them. Identifying problems and correcting activities. Deciding whether an in-depth review or evaluation is needed. 6 Performance Indicators

7 Performance Indicators (cont.) ADVANTAGES: An effective means of measuring progress toward objectives. Facilitates benchmarking comparisons between institutions, organizational units, and districts, and over time.

Performance Indicators (cont.) DISADVANTAGES: Poorly defined indicators are not good measures of success. A tendency to define too many indicators, or indicators without accessible data sources, makes the information system costly, prone to distortion, and likely to be underused.

Performance Indicators (cont.): Often, indicators end up being accepted simply because they can be measured using data that already exist.

COST: Can range from low to high, depending on the number of indicators collected, the frequency and quality of the data gathered, and how comprehensive the information system is. 10 Performance Indicators (cont.)
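
To make the target-tracking idea on the performance-indicator slides concrete, here is a minimal Python sketch that compares indicators against their baselines and targets. The indicator names and all values are hypothetical, invented purely for illustration.

# Hypothetical indicators, each with a baseline, a target, and the latest actual value
indicators = [
    {"name": "Districts submitting program budgets on time (%)", "baseline": 40, "target": 90, "actual": 65},
    {"name": "Provincial staff trained in M&E (persons)", "baseline": 0, "target": 200, "actual": 150},
]

for ind in indicators:
    # Share of the distance from baseline to target covered so far
    progress = (ind["actual"] - ind["baseline"]) / (ind["target"] - ind["baseline"])
    print(f'{ind["name"]}: {progress:.0%} of the way from baseline to target')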

WHAT CAN WE USE THE LOGICAL FRAMEWORK APPROACH FOR? Improving the quality of project and program designs by requiring clear, specific objectives, the use of performance indicators, and an assessment of risks. 11 Logical Framework Approach

Logical Framework Approach (cont.): Summarizing the design of complex activities. Assisting the preparation of detailed operational plans. Providing an objective basis for the review, monitoring, and evaluation of activities.

ADVANTAGES: – Ensures that decision-makers ask fundamental questions and analyze assumptions and risks. – Engages stakeholders in the planning and monitoring process. – An effective management tool to guide implementation, monitoring and evaluation. DISADVANTAGES: – If managed rigidly, stifles creativity and innovation. – If not updated during implementation, it can be a static tool that does not reflect changing conditions. 14 Logical Framework Approach

COST: – Low to medium, depending on extent and depth of participatory process used to support the approach. 15 Logical Framework approach
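
As an illustration of what a logical framework captures, the following Python sketch represents two hypothetical rows of a logframe matrix (objective summary, indicators, means of verification, assumptions). The entries are invented examples, not content from the project.

# Two hypothetical rows of a logical framework matrix
logframe = [
    {
        "level": "Purpose",
        "summary": "Rural districts manage their budgets by program",
        "indicators": ["share of districts submitting program budgets on time"],
        "means_of_verification": ["ministry budget-submission records"],
        "assumptions": ["district budget allocations are released on schedule"],
    },
    {
        "level": "Output",
        "summary": "District staff trained in program budgeting and M&E",
        "indicators": ["number of staff completing the training course"],
        "means_of_verification": ["training attendance records"],
        "assumptions": ["trained staff remain in post"],
    },
]

for row in logframe:
    print(f'{row["level"]}: {row["summary"]}')
    print(f'  indicator: {row["indicators"][0]} | assumption: {row["assumptions"][0]}')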

16 Theory-Based Evaluation. WHAT IS THEORY-BASED EVALUATION? It maps out the factors judged critical to success and monitors them against the program theory, or program logic. Example: a government program to improve literacy levels by increasing the number of teachers; success is believed to depend on several major factors.

Theory-Based Evaluation (cont.): The availability of classrooms and textbooks. The reactions of families, school principals, and students. The skills and morale of teachers. How the additional teachers are allocated across districts. Confidence that the government will provide the additional funds.

WHAT IS THEORY-BASED EVALUATION? (cont.) By mapping out the factors judged critical to success, and how they might interact, it can be decided which steps should be monitored as the program develops. 18 Theory-Based Evaluation (cont.)

It can then be seen how well each step is actually borne out in practice. If the data show that critical factors are not being achieved, a reasonable conclusion is that the program is unlikely to achieve its objectives.

Theory-Based Evaluation (cont.) ADVANTAGES: Provides early feedback about what is or is not working, and why. Allows early correction of problems as soon as they emerge. Helps identify unintended side effects of the program.

Theory-Based Evaluation (cont.): Helps prioritize which issues to investigate in greater depth, perhaps using more focused data collection or more sophisticated M&E techniques. Provides a basis for assessing the likely impacts of the program.

Theory-Based Evaluation (cont.) DISADVANTAGES: Can easily become overly complex if the scale of activities is large or if an exhaustive list of factors and assumptions is assembled. Stakeholders may disagree about which determining factors they judge important, and resolving this can be time-consuming.

Theory-Based Evaluation (cont.) COST: Medium, depending on the depth of the analysis and, in particular, the depth of the data collection undertaken to investigate how the program works.

ADVANTAGES: – Provides early feedback about what is or is not working, and why. – Allows early correction of problems as soon as they emerge. – Assists identification of unintended side-effects of the program. – Helps in prioritizing which issues to investigate in greater depth, perhaps using more focused data collection or more sophisticated M&E techniques. – Provides basis to assess the likely impacts of programs. 24 Theory Based Evaluation

DISADVANTAGES: Can easily become overly complex if the scale of activities is large or if an exhaustive list of factors and assumptions is assembled. Stakeholders might disagree about which determining factors they judge important, which can be time-consuming to address. COST: Medium—depends on the depth of analysis and especially the depth of data collection undertaken to investigate the workings of the program. 25 Theory Based Evaluation
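
As a small sketch of how the literacy example above might be monitored in practice, the Python fragment below records whether each determining factor is borne out by the data and draws the conclusion described on the earlier slides. The factor list and their true/false status are hypothetical.

# Hypothetical status of the determining factors in the literacy example
program_theory = {
    "classrooms and textbooks are available": True,
    "families, principals, and students respond well": True,
    "teachers have adequate skills and morale": False,
    "extra teachers are deployed to the target districts": True,
    "government provides the additional funds": False,
}

not_achieved = [factor for factor, holds in program_theory.items() if not holds]
if not_achieved:
    print("Factors not borne out by the data:", "; ".join(not_achieved))
    print("Reasonable conclusion: the program is less likely to achieve its literacy objective.")
else:
    print("All critical factors hold so far.")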

WHAT ARE THEY? Formal surveys can be used to collect standardized information from a carefully selected sample of people or households. Surveys often collect comparable information for a relatively large number of people in particular target groups. 26 Formal surveys

WHAT CAN WE USE THEM FOR? – Providing baseline data against which the performance of the strategy, program, or project can be compared. – Comparing different groups at a given point in time. – Comparing changes over time in the same group. – Comparing actual conditions with the targets established in a program or project design. – Describing conditions in a particular community or group. – Providing a key input to a formal evaluation of the impact of a program or project. 27 Formal surveys

ADVANTAGES: Findings from the sample of people interviewed can be applied to the wider target group or the population as a whole. DISADVANTAGES: Surveys are expensive and time-consuming. The processing and analysis of data can take a long time, even where computers are available. Many kinds of information are difficult to obtain through formal interviews. COST: High 29 Formal surveys
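
One practical question behind "a carefully selected sample" is how many people to interview. The sketch below applies the standard sample-size formula for estimating a proportion, with a finite-population correction for small target groups; the 95% confidence level, 5% margin of error, and population of 2,000 are illustrative assumptions, not project figures.

import math

def sample_size(p=0.5, margin_of_error=0.05, z=1.96):
    """Sample needed to estimate a proportion p within the given margin of error;
    z = 1.96 corresponds to a 95% confidence level, p = 0.5 is the most conservative case."""
    return math.ceil((z ** 2) * p * (1 - p) / margin_of_error ** 2)

def finite_population_correction(n, population):
    """Reduce the required sample when the target group itself is small."""
    return math.ceil(n / (1 + (n - 1) / population))

n = sample_size()                                # about 385 respondents
print(n, finite_population_correction(n, 2000))  # about 323 in a target group of 2,000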

WHAT ARE THEY? Quick, low-cost ways to gather the views and feedback of beneficiaries and other stakeholders, in order to respond to decision-makers’ needs for information. – Key informant interview – Focus group discussion – Community group interview – Direct observation – Mini-survey 30 Rapid Appraisal Methods

WHAT CAN WE USE THEM FOR? Providing rapid information for management decision-making, especially at the project or program level. Providing qualitative understanding of people’s values, motivations, and reactions. Providing context and interpretation for quantitative data collected by more formal methods. 31 Rapid Appraisal Methods

ADVANTAGES: Lower cost than formal survey. Can be conducted quickly. Provides flexibility to explore new ideas. DISADVANTAGES: Findings usually relate to specific communities or localities—thus difficult to generalize from findings. Less valid, reliable, and credible than formal surveys. COST: Low to medium 32 Rapid Appraisal Methods

WHAT ARE THEY? Participatory methods provide active involvement in decision-making for those with a stake in a project, program, or strategy and generate a sense of ownership in the M&E results and recommendations. – Stakeholder analysis – Participatory rural appraisal – Beneficiary assessment – Participatory monitoring and evaluation 34 Participatory Methods

WHAT CAN WE USE THEM FOR? Learning about local conditions and local people’s perspectives and priorities to design more responsive and sustainable interventions. Identifying problems and trouble-shooting problems during implementation. Evaluating a project, program, or policy. Providing knowledge and skills to empower poor people. 35 Participatory Methods

ADVANTAGES: Examines relevant issues by involving key players in the design process. Establishes partnerships and local ownership of projects. Enhances local learning, management capacity, and skills. Provides timely, reliable information for management decision-making. DISADVANTAGES: Sometimes regarded as less objective. Time-consuming. Potential for domination and misuse by some stakeholders to further their own interests. 36 Participatory Methods

COST: Low to medium. Costs vary greatly, depending on scope and depth of application and on how local resource contributions are valued. 37 Participatory Methods

WHAT ARE THEY? Public expenditure tracking surveys (PETS) track the flow of public funds and determine the extent to which resources actually reach the target groups. The surveys examine the manner, quantity, and timing of releases of resources to different levels of government, particularly to the units responsible for the delivery of social services. PETS are often implemented as part of larger service delivery and facility surveys which focus on the quality of service, characteristics of the facilities, their management, incentive structures, etc. WHAT CAN WE USE THEM FOR? ■ Diagnosing problems in service delivery quantitatively. ■ Providing evidence on delays, “leakage,” and corruption. ADVANTAGES: ■ Supports the pursuit of accountability when little financial information is available. ■ Improves management by pinpointing bureaucratic bottlenecks in the flow of funds for service delivery. DISADVANTAGES: ■ Government agencies may be reluctant to open their accounting books. ■ Cost is substantial. COST: Can be high until national capacities to conduct them have been established. For example, the first PETS in Uganda cost $60,000 for the education sector and $100,000 for the health sector. 38 Public Expenditure Tracking Surveys

WHAT CAN WE USE THEM FOR? Diagnosing problems in service delivery quantitatively. Providing evidence on delays, “leakage,” and corruption. ADVANTAGES: Supports the pursuit of accountability when little financial information is available. Improves management by pinpointing bureaucratic bottlenecks in the flow of funds for service delivery. DISADVANTAGES: ■ Government agencies may be reluctant to open their accounting books. COST: High 39 Public Expenditure Tracking Surveys
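
A minimal sketch of the leakage calculation at the heart of a PETS, using invented figures for the amounts released and received at each level; a real survey reconciles budget releases against district and facility records.

# Hypothetical amounts for one district (same currency unit throughout)
released_by_ministry = 100_000_000    # released by the central ministry
received_by_district = 82_000_000     # recorded at the district treasury
received_by_facilities = 61_000_000   # total reported by the facilities surveyed

def leakage(sent, received):
    """Share of funds that did not reach the next level."""
    return (sent - received) / sent

print(f"Ministry-to-district leakage:  {leakage(released_by_ministry, received_by_district):.0%}")
print(f"District-to-facility leakage:  {leakage(received_by_district, received_by_facilities):.0%}")
print(f"Share reaching facilities:     {received_by_facilities / released_by_ministry:.0%}")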

WHAT ARE THEY? Tools for assessing whether or not the costs of an activity can be justified by the outcomes and impacts. Cost-benefit analysis measures both inputs and outputs in monetary terms. Cost-effectiveness analysis estimates inputs in monetary terms and outcomes in non-monetary quantitative terms (such as improvements in student reading scores). 40 Cost-Benefit and Cost-Effectiveness Analysis

WHAT CAN WE USE THEM FOR? Informing decisions about the most efficient allocation of resources. Identifying projects that offer the highest rate of return on investment. ADVANTAGES: Good quality approach for estimating the efficiency of programs and projects. Makes explicit the economic assumptions that might otherwise remain implicit or overlooked at the design stage. Useful for convincing policy-makers and funders that the benefits justify the activity. 41 Cost-Benefit and Cost-Effectiveness Analysis

DISADVANTAGES: Fairly technical, requiring adequate financial and human resources. Requisite data for cost-benefit calculations may not be available, and projected results may be highly dependent on assumptions made. Results must be interpreted with care, particularly in projects where benefits are difficult to quantify. COST: Varies greatly, depending on scope of analysis and availability of data. 42 Cost-Benefit and Cost-Effectiveness Analysis
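
The arithmetic behind the two techniques fits in a few lines. The sketch below discounts hypothetical cost and benefit streams to obtain a net present value and a benefit-cost ratio (cost-benefit analysis), and then a cost per unit of a non-monetary outcome (cost-effectiveness analysis). All amounts, the discount rate, and the reading-score gain are illustrative assumptions.

def present_value(cash_flows, rate):
    """Discount a stream of yearly amounts (year 0 first) to today's value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: investment in year 0, benefits in years 1-3
costs = [100_000, 10_000, 10_000, 10_000]
benefits = [0, 50_000, 60_000, 70_000]
rate = 0.10  # assumed discount rate

pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)
print("Net present value:", round(pv_benefits - pv_costs))       # cost-benefit analysis
print("Benefit-cost ratio:", round(pv_benefits / pv_costs, 2))

# Cost-effectiveness: cost per unit of a non-monetary outcome,
# e.g. per point of improvement in average student reading scores (hypothetical)
reading_score_gain = 8
print("Cost per reading-score point:", round(pv_costs / reading_score_gain))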

WHAT IS IT? Impact evaluation is the systematic identification of the effects – positive or negative, intended or not – on individual households, institutions, and the environment, caused by a given development activity such as a program or project. Impact evaluation helps us better understand the extent to which activities reach the poor and the magnitude of their effects on people’s welfare. 43 Impact Evaluation

Impact evaluations can range from: – large scale sample surveys in which project populations and control groups are compared before, during and after the program; – to small-scale rapid assessment appraisals where estimates of impact are obtained from combining group interviews, key informants, case studies and available secondary data. 44 Impact Evaluation
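
A common way to separate program effects from external influences in such before-and-after comparisons is a difference-in-differences calculation, sketched below with invented average outcomes for a project group and a comparison group.

# Hypothetical average outcome (e.g. household income) before and after the program
project_before, project_after = 120.0, 150.0   # households covered by the program
control_before, control_after = 118.0, 130.0   # comparable households not covered

# The change in the project group minus the change in the comparison group
# nets out external factors that affected both groups alike.
impact_estimate = (project_after - project_before) - (control_after - control_before)
print("Estimated program impact:", impact_estimate)   # 30 - 12 = 18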

WHAT CAN WE USE IT FOR? Measuring outcomes and impacts of an activity and distinguishing these from the influence of other, external factors. Helping to clarify whether costs for an activity are justified. Informing decisions on whether to expand, modify or eliminate projects, programs or policies. Drawing lessons for improving the design and management of future activities. Comparing the effectiveness of alternative interventions. Strengthening accountability for results. 45 Impact Evaluation

ADVANTAGES: Provides estimates of the magnitude of outcomes and impacts for different demographic groups, regions or over time. Provides answers to some of the most central development questions – to what extent are we making a difference? What are the results on the ground? How can we do better? Systematic analysis and rigor can give managers and policy-makers added confidence in decision-making. 46 Impact Evaluation

DISADVANTAGES: Some approaches are very expensive and time-consuming, although faster and more economical approaches are also used. Reduced utility when decision-makers need information quickly. COST: High 47 Impact Evaluation