CARE INTERNATIONAL MEL COMMUNITY OF PRACTICE QUALITY OF EVALUATIONS & brief “what’s new” on the global MEL agenda November 27, 2018.


1 CARE INTERNATIONAL MEL COMMUNITY OF PRACTICE QUALITY OF EVALUATIONS & brief “what’s new” on the global MEL agenda November 27, 2018

2 “what’s new” on the global MEL agenda
August 29, 2019

3

4 PIIRS FY18 data available in the MEL wiki
STAFF data FY15-FY18 in Power BI
REACH data: pre-defined summaries by project, country, region, CMP, Lead Member and Global
REACH data: “do your own analysis” summary
Video tutorials in English and Spanish; French coming soon
Impact data is being reviewed/validated: we may get in touch with you!
Forms can be submitted any time!
Evaluation Library reminder: evaluation documents need to be published in the Evaluation Library

What the data answers:
REACH: How many projects do we implement around the world? What sectors do we focus on? How many people do we reach? To what extent do we incorporate key programmatic elements?
IMPACT/OUTCOMES (WHAT): What changes (impacts/outcomes) do we contribute to? (Global indicators)
IMPACT/OUTCOMES (HOW): What are the most successful strategies that drive that change? What are we learning about CARE’s contribution to change?
STAFF DATA

5 MEL minimum package
We are starting the development of a MEL minimum package. The idea is not to reinvent resources but rather to package them in connection with our MEL standards. Please share any guidance that you are currently using.

MEL standards:
❶ Design your MEL system based on a clear theory of change and evidence needs.
❷ Have a clear definition of participants: direct/indirect participants and target/impact groups.
❸ Define a meaningful and manageable set of quantitative and qualitative indicators and/or questions for impact, outcomes and outputs in each participant group, and the methods to track them.
❹ Define the monitoring and evaluation moments and methods that best ensure robust and comparable tracking of outputs, outcomes and impact.
❺ Ensure your evidence can be translated into learning and supports the identification of potential for scale.
❻ Make your evidence accessible, and ensure your MEL practices are participative and responsive to feedback.
❼ Use your MEL system to continuously read the context and adapt to it.

Modules and topics:
General: intro to MEL; MEL capacity mapping
Planning for MEL: Theory of Change and MEL; defining participants and stakeholders; defining indicators; generating a MEL plan (methods, data collection, budget, roles and responsibilities, protocol of data privacy, markers); commissioning evaluations (TOR, report template, managing consultants); quality of evaluations (quality assurance checklists, inception reports vs. final reports, sampling, etc.)
Data collection for MEL: how to apply/operationalize monitoring and evaluation tools (compilation of tools, pros and cons); how to operationalize feedback mechanisms
Data storage/organization: MIS selection (compilation of options, pros and cons); data safety/security
Data analysis: how to analyze qualitative data; how to analyze quantitative data; how to triangulate data
Sensemaking of data: sensemaking practice
Use of data: how to do data viz/products for different audiences; how to create knowledge products for different audiences; PIIRS reporting, analysis, and data use; adaptive management and MEL; reporting and presenting data or findings
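A quality-of-evaluations module of this kind typically scores each evaluation against a quality-assurance checklist and then summarizes the scores. As a minimal sketch of that aggregation step (the scores are made up and the "good = 5 or above" threshold is an assumption, not taken from the deck):

```python
# Summarize hypothetical per-evaluation quality scores on a 0-10 scale.
from statistics import mean

scores = [7, 4, 6, 5, 8, 3, 6, 5]  # made-up checklist scores, one per evaluation

average = mean(scores)  # overall quality headline
share_good = sum(s >= 5 for s in scores) / len(scores)  # assumed "good" cut-off

print(f"Average score: {average:.1f} / 10")           # prints "Average score: 5.5 / 10"
print(f"Share with good quality: {share_good:.0%}")   # prints "Share with good quality: 75%"
```

Headline figures like the "Average Score = 5.5 out of 10" and "82% have good data quality" slides later in the deck are aggregates of exactly this shape.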

6

7 Mapping of MEL capacities: we currently have 252 colleagues mapped, but at least 25 of them had left by the last data update. Please continue inviting others to participate.
Have a topic to recommend? Want to present any work you are currently doing?
PIIRS FY18: exploring the data and what we've learned this year
Data security
MIS (part II): KoboToolbox
RCTs: when they work and when they don't
Qualitative methods: when they work and when they don't
Value for Money
MEL experiences in humanitarian work

8 Evaluation Quality

9 Where Do We Get Our Impact Data?
PIIRS, evaluations, focus groups, MEL systems, MIS, surveys

10 How good is our data?

11 Average Score = 5.5 out of 10

12 82% have good data quality

13 Honest about failure: 65% discuss failure

14 We talk about impact late…
On average, impact first appears on page 12

15 1 in 3 evaluations does not talk about impact
…or never

16 We don’t share very well
Only 25% of evaluations are in our evaluation library

17 30% disaggregate by gender

18 CARE Madagascar's experience with evaluation quality

19 Summary of the project and the evaluation results

20 Project name: Pathways to Quality Education Madagascar
Donor: Lyreco for Education
Period: July to June 2018 (4 years)
Main objective: improve access to quality primary education for at least 17,000 children aged between 5 and 15 in 47 public primary schools in the district of Vatomandry, with particular attention to girls' education, following Hurricane Giovanna in February 2012.

21 The purpose of the final evaluation: to analyze and assess the level of achievement against the project's objectives, the dynamics of change among target beneficiaries, and the viability of the activities developed, after four years of intervention.

22 Results of the evaluation
Effects of the project:
Number of children accessing quality education: 19,138 against 17,000 expected
Number of girls accessing quality education: 9,568 against 8,449 expected
Access rate: increased from 83% to 131%
School retention rate: increased from 73% to 85%, exceeding the target by 5 points
Academic success rate: increased from 62% to 83%
School enrollment rate: increased from 82.5% to 90%, exceeding the target by 5 points
Drop-out rate for girls and boys at the end of each school year: reduced to 11%, against a planned 15%

Failure:
Average admission rate per level: 58% achieved against 75% expected (with reasons for non-achievement and recommendations for future actions)
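The "against expected" figures above are achievement ratios (actual divided by target). A quick check of the two headcount targets, using only the numbers reported on the slide:

```python
# Achievement ratio = actual / target, shown as a percentage of target.
targets = {
    "children accessing quality education": (19_138, 17_000),
    "girls accessing quality education": (9_568, 8_449),
}

for name, (actual, target) in targets.items():
    print(f"{name}: {actual / target:.0%} of target")
```

Both headcount targets come out at roughly 113%, i.e. each was exceeded by about 13%.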

23 Tools used for a successful evaluation

24 Database of consultants
TOR: purpose of the evaluation; evaluative questions
Report templates
Database of consultants: specializations, recommendations, blacklist
BID analysis
Report quality checklist

25 Questions? Other experiences you'd like to share?

