Adding value to project implementation through a learning/CLA approach


Adding value to project implementation through a learning/CLA approach
October 16, 2018

The evolution of M&E: the early days
Our projects were doing a good job of collecting and reporting data, but there was little emphasis on data use, applied research, learning, or the feedback loops needed to turn learning into action. We started focusing on learning and on “telling the story” of projects, and began integrating our M&E approaches with applied research and USAID’s CLA guidance.

What does the MERLA approach look like?

What is MERLA?
MERLA is the intentional application of results-focused monitoring, evaluation, and research to inform continuous learning and adaptation, with the goal of improving program effectiveness and policy decision making.

MERLA in relation to M&E
MERLA does not replace monitoring and evaluation (M&E); it adds value to and augments it. MERLA helps projects move away from M&E as a data reporting requirement and toward a dynamic, collaborative, and adaptive process that increases program efficiency and impact. By integrating research and CLA (collaborating, learning, and adapting) into M&E, MERLA makes M&E stronger.

What does MERLA do for a project?
MERLA improves program implementation by ensuring that:
- Data collected through M&E systems are timely, of high quality, and analyzed and used continuously
- M&E data are complemented and strengthened by filling evidence gaps through operations research (OR)
- Key results and evidence from M&E and research are synthesized as learning
- Learning is continuously incorporated into program re-design to improve implementation, document and communicate lessons, and inform policy decisions

Examples of MERLA products developed for proposals and at project start-up
- Performance Monitoring Plan (PMP) with indicators, results framework, and logframes
- Project DHIS2 platforms
- Learning agenda and communications plan
- Operations research questions and protocols
- Approaches and tools for ensuring CLA

DHIS2 platforms

Learning and adapting approaches and tools
- Internal and external pause-and-reflect sessions
- Establishment of learning platforms and communities of practice
- Data-to-action guides
- Virtual cross-country roundtable discussions

Learning agenda key questions
- What results is a given project achieving?
- Why and how are we achieving, or not achieving, results?
- At what cost are we achieving these results?
- How can we use learning to make rapid programmatic course corrections, influence policy discussions, and communicate effectively?

Example 1: how learning helps improve program implementation
RTI’s StopPalu+ malaria prevention project in Guinea aims to motivate pregnant women to seek malaria rapid diagnostic testing (mRDT). The project implemented an SMS program to reach out to pregnant women. As part of MERLA, we set up intervention and comparison groups, along with feedback loops to ensure continuous program learning and adapting.

Learning and adapting is key
[Image: Counseling, testing, and enrollment of women in the StopPalu program. Photo: Patrick Adams, RTI International]
We are already seeing findings that are helping us tweak the SMS program as it progresses.
Learning: Women with no education, and those registering 1-4 weeks into their first trimester, were less likely to seek a second antenatal care visit.
Adapting: The SMS program is now targeting messages to women early in pregnancy and to those likely to become pregnant. In terms of study design, this will be a new arm, so it is important to track results and costs for each aspect of the program.
Lastly, let’s remember why, and for whom, we are learning and adapting: not for the sake of learning itself, but for the benefit of mothers and children.

Example 1: how learning helps improve program implementation
- The SMS intervention group had 8% more follow-up antenatal care visits than the comparison group
- Cost per woman enrolled dropped from $20.50 in the first month to $2.95 by the end of the first year
- The SMS group saw 23 fewer neonatal deaths over a one-year period than the comparison group
- The program’s cost was $650 per neonatal death averted
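The cost-effectiveness figure above is a simple ratio: incremental program cost divided by deaths averted. The slides do not state the total incremental cost, so the sketch below backs it out from the reported numbers purely for illustration:

```python
# Sketch of the cost-effectiveness arithmetic behind the slide's figures.
# The slide reports 23 neonatal deaths averted and $650 per death averted;
# the total incremental cost is not given, so we derive the implied value.

def cost_per_outcome(incremental_cost, outcomes_averted):
    """Standard cost-effectiveness ratio: cost per unit of outcome averted."""
    return incremental_cost / outcomes_averted

deaths_averted = 23      # from the StopPalu+ SMS comparison
cost_per_death = 650.0   # USD per neonatal death averted, as reported

implied_incremental_cost = cost_per_death * deaths_averted
print(f"Implied incremental cost: ${implied_incremental_cost:,.0f}")  # $14,950

# Sanity check: the ratio recovers the reported per-death figure
assert cost_per_outcome(implied_incremental_cost, deaths_averted) == cost_per_death
```

Tracking this ratio per program arm, as the slides suggest, lets the team compare the cost of each adaptation against its results.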

Example 1: how learning helps improve program implementation
Feedback loops and reflection sessions enabled course correction on a continuous basis. Women with little or no education were less likely to seek follow-up antenatal care, so, based on this learning, we developed targeted, visual messages for these women and saw increased uptake of antenatal care following these refinements. The government is using these lessons to provide guidance for other SMS programs.

Example 2: how learning helps improve program implementation
RTI’s LuzonHealth project in the Philippines aims to prevent adolescent pregnancies in Luzon. The project was tasked with implementing a government-approved teen pregnancy prevention program in schools, but there were concerns that the program wasn’t really promoting pregnancy prevention.

Example 2: how learning helps improve program implementation
We put together a MERLA plan and protocol:
- A comprehensive desk review of the curriculum
- Pre- and post-tests in intervention and comparison schools
- A qualitative study exploring perceptions and benefits of the curriculum, barriers to implementation, etc.
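A pre/post test in intervention and comparison schools is commonly summarized as a difference-in-differences: the change in the intervention group minus the change in the comparison group. A minimal sketch, using made-up scores (the project's actual test data are not shown on the slides):

```python
# Minimal difference-in-differences sketch for a pre/post design with
# intervention and comparison schools. All scores are hypothetical.

def diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Change in the intervention group minus change in the comparison group."""
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# Hypothetical mean knowledge scores (0-100 scale)
effect = diff_in_diff(pre_treat=52.0, post_treat=71.0,
                      pre_ctrl=53.0, post_ctrl=58.0)
print(f"Estimated program effect: {effect:.1f} points")  # 14.0
```

Subtracting the comparison group's change helps separate the curriculum's effect from whatever knowledge gains would have happened anyway.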

Example 2: how learning helps improve program implementation
The teen pregnancy prevention module had no information on teen pregnancy prevention; its message was abstinence. The HIV prevention and responsible parenthood modules, however, covered condom use and pills. Intervention schools saw a significant increase in teen pregnancy prevention knowledge and behavioral intentions, and we identified concrete ideas for curriculum improvements. The Department of Education is now partnering with the Department of Health to mainstream the curriculum.

Learning from learning
CLA/MERLA is not rocket science; we’ve all been doing it in various ways for decades. But it is important to have concrete approaches and tools to ensure deliberate learning and adapting. CLA is so much easier on paper! The soft skills related to CLA (patience, diplomacy, time for engagement) are as important as the hard skills and results. The rewards are worth it: strong local buy-in and ownership, and a greater likelihood of turning learning into action.

Moving to cross-sectoral learning
CLA is helping us break down M&E/MERLA silos within RTI. We established a cross-RTI MERLA community of practice in 2017, are engaging closely with USAID Learning Lab to share our CLA experiences, and are focusing on communicating our CLA stories through in-country and cross-country learning platforms, learning events, conferences, and publications.