Outcome Monitoring and Evaluation Using LQAS. Module 1: Program Planning Cycle (Role of Surveys and Linking Indicators to Plans)

Slide 1. Module 1: Objectives
1. Articulate how M&E fits within program planning and implementation processes.
2. Describe the role of population-based surveys within program planning and M&E.
3. Select or develop indicators to address knowledge, practice, and coverage outputs.
4. Examine examples of survey questions and questionnaire resources.

Outcome Monitoring and Evaluation

Slide 2. Monitoring: Progress Tracking
"The regular collection and analysis of information. Monitoring assists timely decision-making, ensures accountability, and provides the basis for evaluation and learning."
IFAD. (2002). A guide for project M&E: Managing for impact in rural development. (emphasis mine)
"Monitoring assesses progress against set objectives/outputs, supervises implementation, and assesses the effectiveness of implementation strategies."
The Applied Nutrition Programme, University of Nairobi, and School of Nutrition Science and Policy, Tufts University. (2000). Monitoring and evaluation of nutrition and nutrition-related programmes: A training manual for programme managers and implementors. (emphasis mine)

Slide 3. Evaluation
"A systematic (and as objective as possible) examination of a planned, ongoing or completed project. It aims to answer specific management questions and to judge the overall value of an endeavor and supply lessons to improve future actions, planning and decision-making."
IFAD. (2002). A guide for project M&E: Managing for impact in rural development. (emphasis mine)
"The process of determining the worth or significance of an activity, policy or program."
Kusek, J.Z. (2004). Ten steps to a results-based monitoring and evaluation system: A handbook for development practitioners. The World Bank. (emphasis mine)

Slide 4. Process, Outcome, and Impact
Process: Focuses on outputs (number of trainings, services offered, brochures distributed, visits made).
Outcome: Focuses on changes in knowledge, practice, and service coverage.
Impact: Focuses on attributing changes seen in a population to the program.

Slide 5. Research
Whether you are doing monitoring or evaluation, you rely on the tools of research.
"Research is the systematic process of collecting, analyzing, and interpreting information (data) in order to increase our understanding of the phenomenon about which we are interested or concerned. (W)e intentionally set out to enhance our understanding of a phenomenon and expect to communicate what we discover."
Leedy, P.D., & Ormrod, J.E. (2005). Practical research: Planning and design. Upper Saddle River, NJ: Prentice Hall. (p. 2; emphasis mine)

Slide 6. Output M&E: What Really Matters
1. That you rigorously assemble evidence to:
   a. track your progress in an ongoing way (monitoring)
   b. assess the value of your work periodically (evaluation)
2. That you settle upon a consistent and intelligent method of assessing your output results, and then track your trajectory with rigor.
Collins, J. (2005). Good to great and the social sectors. Boulder, Colorado: Jim Collins. (emphasis in the original)

Slide 7. Program Planning and Implementation Cycle (Key Planning, Learning, and Management Tasks)

Slide 8. Activity
1. Take a pause (10 minutes) to review the definitions shared in this session.
2. Put them into your own words, in a way you could explain them to someone else.

Population-Based Surveys

Slide 9. Population-Based Surveys
What are "population-based surveys"? They are surveys that assess changes in the population by administering questions to people in the general population.
The "population" can be a subset of a larger population that the program/project is interested in, for example:
- Young people age 15–24
- Pregnant women
- People living with HIV
- Orphans
- Others
These differ from service provider surveys (e.g., HFA, SPA), which are not the focus of this workshop.

Slide 10. Activity
Discuss the following in groups of three:
- Your experiences with population-based surveys
- What do those surveys help you to do?
- What (more) would you have liked the surveys to help you to do?
Individually:
- Can you define in your own words what population-based surveys are?
- What are your ideas on collecting and using data locally and for the entire program?
- Write your responses to the above two questions on a notecard and post it on the wall.

Linking Objectives, Indicators, and Questions

Slide 11. From Objectives to Indicators
Objective(s) → Indicator(s) → Structured Questions

Slide 12. From Objectives to Indicators
Before writing an objective, you should have a broader goal, for example, "reducing the transmission of HIV among young people or from mothers to their children." The goal then leads to objectives, which are clear statements of the results you aim to achieve in order to contribute to the broader goal.
Indicators are like clues, signs, or markers that tell us whether the program is achieving its results or objectives:
a. An indicator measures one aspect of a program or project that is directly related to the program's results or objectives.
b. The value of an indicator changes from baseline to the time of the evaluation.
c. An indicator presents this change in a meaningful way, such as a percentage or number.
d. Indicators should be measurable, precise, valid, and reliable.
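A minimal sketch, in Python with invented numbers, of points (b) and (c): an indicator is usually reported as a percentage (a numerator over a denominator), and what you track is the change in that percentage between the baseline survey and the evaluation. Nothing here comes from the workshop materials; the figures and the helper function are hypothetical.

# Hypothetical illustration: an indicator reported as a percentage and
# compared between baseline and evaluation (all numbers are invented).

def indicator_value(numerator, denominator):
    """Return the indicator as a percentage of eligible respondents."""
    if denominator == 0:
        raise ValueError("No eligible respondents in the denominator.")
    return 100.0 * numerator / denominator

# Baseline survey: 38 of 95 eligible respondents met the indicator definition.
baseline = indicator_value(numerator=38, denominator=95)

# Evaluation survey: 61 of 92 eligible respondents met the definition.
evaluation = indicator_value(numerator=61, denominator=92)

print(f"Baseline:   {baseline:.1f}%")
print(f"Evaluation: {evaluation:.1f}%")
print(f"Change:     {evaluation - baseline:+.1f} percentage points")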

Slide 13. From Objectives to Indicators
Goal: Reduce the transmission of HIV among young people.
Objective: Increase the knowledge among young people of how HIV is transmitted in the next 5 years. (You could state the amount of change.)
Possible Indicator (Knowledge): The percentage of respondents age 15–24 who, in response to a prompted question, say that people can protect themselves from contracting HIV by not having penetrative sex, by using condoms, or by having sex with only one faithful, uninfected partner.
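One way this kind of knowledge indicator might be tabulated from individual survey records, sketched in Python. The respondent data and field names are invented, and the sketch counts anyone who names at least one of the prompted protective behaviors; some indicator definitions instead require all of them, so check the guidance you are following.

# Hypothetical records from a prompted knowledge question (data invented).
respondents = [
    {"age": 19, "abstinence": True,  "condoms": True,  "one_faithful_partner": False},
    {"age": 23, "abstinence": False, "condoms": False, "one_faithful_partner": False},
    {"age": 31, "abstinence": True,  "condoms": True,  "one_faithful_partner": True},
    {"age": 16, "abstinence": False, "condoms": True,  "one_faithful_partner": False},
]

# Denominator: respondents age 15-24.
eligible = [r for r in respondents if 15 <= r["age"] <= 24]

# Numerator: eligible respondents naming at least one protective behavior
# (switch `any` to `all` if your indicator definition requires all of them).
knowers = [
    r for r in eligible
    if any([r["abstinence"], r["condoms"], r["one_faithful_partner"]])
]

print(f"Knowledge indicator: {100 * len(knowers) / len(eligible):.0f}% "
      f"({len(knowers)} of {len(eligible)} respondents age 15-24)")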

Slide 14. From Objectives to Indicators
Goal: Reduce the transmission of HIV among young people.
Objective: Increase the use of condoms during sex among unmarried young people in the next 5 years.
Possible Indicator (Practice/Behavior): The percentage of respondents age 15–24 who say that they used a condom the last time they had sex with a non-marital, non-cohabiting partner, of those who have had sex with such a partner in the last 12 months.
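The detail worth noticing in this indicator is the restricted denominator: only respondents who report sex with a non-marital, non-cohabiting partner in the last 12 months are counted at all. A hypothetical Python sketch (records and field names invented) of how that restriction plays out:

# Hypothetical records (invented) for the condom-use indicator.
respondents = [
    {"age": 20, "nonmarital_partner_12m": True,  "condom_last_sex": True},
    {"age": 24, "nonmarital_partner_12m": True,  "condom_last_sex": False},
    {"age": 22, "nonmarital_partner_12m": False, "condom_last_sex": None},  # excluded from denominator
    {"age": 35, "nonmarital_partner_12m": True,  "condom_last_sex": True},  # outside the age range
]

# Denominator: age 15-24 AND reported sex with a non-marital,
# non-cohabiting partner in the last 12 months.
denominator = [
    r for r in respondents
    if 15 <= r["age"] <= 24 and r["nonmarital_partner_12m"]
]

# Numerator: of those, respondents who used a condom at last such sex.
numerator = [r for r in denominator if r["condom_last_sex"]]

print(f"Condom-use indicator: {100 * len(numerator) / len(denominator):.0f}% "
      f"({len(numerator)} of {len(denominator)} eligible respondents)")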

Slide 15. From Objectives to Indicators
Goal: Reduce the transmission of HIV from mothers to their children.
Objective: Increase the proportion of pregnant women who are counseled about and tested for HIV in the next 5 years.
Possible Indicator (Coverage): The percentage of women who were counseled and offered voluntary HIV testing during antenatal care for their most recent pregnancy, accepted the offer of testing, and received their test results, of all women who were pregnant at any time in the (1 or) 2 years preceding the survey.
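This coverage indicator chains several conditions in the numerator (counseled, offered testing, accepted it, received the results) over a denominator of women pregnant in the preceding one or two years. A hypothetical Python sketch, again with invented records and field names:

# Hypothetical records (invented) for the counseling-and-testing coverage indicator.
women = [
    {"pregnant_last_2y": True,  "counseled": True,  "offered_test": True,
     "accepted_test": True,  "received_results": True},
    {"pregnant_last_2y": True,  "counseled": True,  "offered_test": True,
     "accepted_test": False, "received_results": False},
    {"pregnant_last_2y": False, "counseled": False, "offered_test": False,
     "accepted_test": False, "received_results": False},  # excluded from denominator
]

# Denominator: women pregnant at any time in the 2 years preceding the survey.
denominator = [w for w in women if w["pregnant_last_2y"]]

# Numerator: of those, women who were counseled, offered voluntary testing
# during antenatal care, accepted the test, and received their results.
numerator = [
    w for w in denominator
    if w["counseled"] and w["offered_test"]
    and w["accepted_test"] and w["received_results"]
]

print(f"Coverage indicator: {100 * len(numerator) / len(denominator):.0f}% "
      f"({len(numerator)} of {len(denominator)} women pregnant in the last 2 years)")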

Slide 16. Activity
Get together with colleagues from your organization and do the following:
- Look at the objectives of your program.
- From the list of indicators provided, identify those that can best be used to measure progress toward your program objectives.
- From the indicators selected, choose questions that can be asked to collect the necessary information from the population. (Questionnaires from which questions may be picked will be provided.)
Share with the plenary how you moved from objectives to indicators, and then to questions.