Complementing Routine Data with Qualitative Data for Decision Making: Understanding the "Why" Behind Program Data Day 1 - Session 1 Note to Facilitator:


Complementing Routine Data with Qualitative Data for Decision Making: Understanding the "Why" Behind Program Data Day 1 - Session 1 Note to Facilitator: If the group is unfamiliar with MEASURE Evaluation, briefly explain it to the group – "USAID's primary vehicle for supporting monitoring and evaluation of public health programs worldwide" – and briefly describe its focus on improving the use of that information in programmatic decision making to improve services offered to beneficiaries. This is also the time to acknowledge anyone who has helped coordinate the workshop logistics, as well as the organization sponsoring the workshop and/or the workshop participants. This workshop was developed based on field experiences where decisions are often made on anecdotal evidence or "gut" instinct. In some cases, no data at all are reviewed; in other cases, M&E data are reviewed in light of other data, but questions often remain about why the program is not meeting its objectives. This workshop will help individuals know when they need to collect additional data, particularly qualitative data, and how to collect, analyze, and use those data for programmatic improvements. The workshop is geared toward public health/clinic staff working at the national or district level, in government, NGOs, or the private sector. It is appropriate for program officers, leaders, M&E officers, and others involved in public health programs.

Introductions Let’s take a minute to introduce ourselves – please say your name, position, organizational affiliation, and one thing you hope to get out of this workshop.

Workshop Objectives By the end of this session, the learner will be able to: Understand when to consider other data to further understand program performance; Identify qualitative data collection methods appropriate for answering questions about program performance; Build skills in collecting, analyzing, and interpreting primary qualitative data, with a specific focus on in-depth interviews; Build skills in applying qualitative findings to program improvements.

Workshop Design The workshop covers key steps for collecting and analyzing information to understand program performance. It provides an overview of qualitative methods and hands-on experience developing questions, collecting primary data, and conducting analysis. It is NOT a comprehensive course in qualitative research methods. Note to Facilitator: Review the workshop design.

References Aujoulat, I., Luminet, O., & Deccache, A.; Jack, S.M.; MacLean, L., Meyer, M., & Estable, A.; Maman, S.; Ryan, G.W., & Bernard, H.R.; Sandelowski, M.; Spradley, J.; Ulin, P.R., Robinson, E.T., & Tolley, E.E. Note to Facilitator: Mention the references that were used in the development of the material.

Agenda Overview
Day 1 – Session 1: Overview; Session 2: Using Data to Improve Programs; Session 3: Conduct Further Research
Day 2 – Session 3 (continued): Conduct Further Research; Session 4: Planning for Qualitative Data Collection; Session 5: Study Implementation
Day 3 – Session 6: Data Management and Analysis
Day 4 – Session 7: Making Data-informed Programmatic Decisions
Note to Facilitator: Go over the agenda and make any necessary logistics announcements.

Workshop Methods Plenary presentations Group discussions Small group work The workshop will use a variety of methods to engage participants – it is heavily dependent on learner participation and the sharing of examples and perspectives. For a successful workshop, we need a commitment from everyone to participate and share the wealth of experience you have in your field.

Housekeeping and Ground Rules Breaks Keeping to the agenda Cell phones turned off Full participation Others? Note to Facilitator: Ask participants to set the ground rules for the training (i.e., no cell phones, take turns speaking, etc.). Ask participants to list their expectations for the training. List these expectations on flip chart paper. When finished, post expectations on the wall. Make sure the expectations remain posted throughout the duration of the workshop.

Activity 1: Individual Pre-test Please fill out Activity 1: Individual Pre-test and return it to the facilitator. Note to Facilitator: Pass out a copy of Activity 1: The Individual Pre-test. Ask participants to take a few minutes to fill it out. Note that each participant will be asked to complete an evaluation form at the end of the training, but that comments, thoughts, and suggestions are welcome throughout the workshop — either one-on-one during a break or at the beginning or end of the day during feedback sessions.

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) and the U.S. President's Emergency Plan for AIDS Relief (PEPFAR). Views expressed in this presentation do not necessarily represent the views of USAID, PEPFAR, or the U.S. government. MEASURE Evaluation is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University.