Communicating Findings & Linking Data with Action (Module 5)


Communicating Findings & Linking Data with Action (Module 5)

PART 1: PROVIDING FEEDBACK ON DATA COLLECTED AND ANALYZED

Part 1: Session Objectives
 Understand the importance of feedback in program improvement and management
 Consider how to improve feedback mechanisms in your own work

“We are always giving patient forms and data to our M&E Unit, who then gives data to donors and the government. I am the head doctor and I never have the chance to look through the data before they go up. We just keep giving data up and up, and we never hear back about it…” Head of ART facility, Nigeria

Importance of Feedback
 Information needs to be shared
 At timely and regular intervals
 Within, between, up, and down
 Paves a path between data collectors and users at all levels of the health system

Importance of Feedback
 Leads to greater appreciation of data:
 Improved data quality
 Influencing collection of appropriate data
 Important element of management and supervision
 Creates opportunity to monitor & improve program services
 Incentive for staff

Examples of Feedback
 Sharing information within a facility or organization
 Sharing aggregated service provision data from facilities within a district or between provinces
 Meetings between facility and supervising agency to review and discuss information
 Meetings between donor and NGO to review information and discuss challenges and opportunities

Working Toward a Culture of Information Use
 Information becomes an integral part of decision-making processes, including planning, problem solving, choosing alternatives, feedback, etc.
 Empowers people to ask questions, seek improvement, learn, and improve quality

Information Flow
[Diagram: clinical histories and service statistics at the service delivery point are compiled into reports that flow up to higher levels (district, province, national) and on to analysts, evaluators, managers, government, and donors; feedback flows back down to the service delivery point.]

Variety of Formats
 Narratives
 Summaries, bulleted items, graphs, charts
 In-person discussion
 One-on-one
 Staff meetings, district meetings
 Speeches to staff
 Supervision visits

Quarterly Performance Indicators

ART
1. % of eligible clients placed on ART = # of new clients on ART ÷ (# of new clients on ART + # of clients on the ART waiting list) = 39/39 = 100%
2. % of current ART clients = # of active clients on ART ÷ # of cumulative clients on ART = 92%
3. % of ART clients in 6-month cohort undergoing repeat CD4 testing = # of clients for whom repeat CD4 testing was done at 6 months ÷ total # of active ART clients in the 6-month cohort = 94%

Pediatric ART
1. % of children current on ART = # of active children on ART ÷ # of cumulative children on ART = 45/58 = 78%

ART Care Follow-up
1. % of non-active ART patients who stopped ART = # of patients who stopped ART ÷ # of non-active ART patients = 0%
2. % of non-active ART patients who transferred out = # of patients who transferred out ÷ # of non-active ART patients = 6%
3. % of non-active ART patients who died = # of patients who died ÷ # of non-active ART patients = 73%
4. % of non-active ART patients lost to follow-up = # of patients lost to follow-up ÷ # of non-active ART patients = 23/145 = 16%
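Each indicator above is a simple numerator/denominator percentage. As a minimal sketch of the arithmetic (the `indicator_pct` helper and the rounding-to-whole-percent convention are assumptions, and the example counts are read from the slide's fused figures):

```python
def indicator_pct(numerator: int, denominator: int) -> int:
    """Return numerator/denominator as a whole-number percentage."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return round(100 * numerator / denominator)

# ART Care Follow-up row: 23 of 145 non-active ART patients
# were lost to follow-up.
print(indicator_pct(23, 145))  # 16
# Pediatric ART row: 45 of 58 cumulative children remain active on ART.
print(indicator_pct(45, 58))   # 78
```

The same helper reproduces every percentage column in the table from its numerator and denominator, which is one way facility staff can cross-check a quarterly report before it is sent up.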

When developing a feedback mechanism, consider…
 The information being shared
 Who will benefit from the feedback
 The format of the feedback mechanism
 The forum in which the feedback will be shared
 How often the feedback will be provided
 How the feedback will move to the next level
 How the process will be documented
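The checklist above lends itself to a simple documentation record, so every feedback mechanism is described the same way. The field names and sample values below are illustrative assumptions, not a prescribed MEASURE Evaluation schema:

```python
# Illustrative record documenting one feedback mechanism; the
# field names mirror the checklist items but are not a standard.
feedback_mechanism = {
    "information": "Quarterly ART performance indicators",
    "audience": ["Facility head", "District M&E officer"],
    "format": "One-page summary with charts",
    "forum": "Monthly staff meeting",
    "frequency": "Quarterly",
    "next_level": "District compiles facility summaries for the province",
}

# A completeness check: every checklist item should be filled in
# before the mechanism is put into practice.
required_fields = {"information", "audience", "format",
                   "forum", "frequency", "next_level"}
assert required_fields.issubset(feedback_mechanism)
```

Keeping such records per mechanism also satisfies the final checklist item: the process itself is documented.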

Potential barriers to providing feedback
 Hierarchy
 Unclear roles (e.g., data clerk vs. M&E officer)
 Approval requirements to distribute data
 Lack of knowledge of what information stakeholders need

Group Participation
 Discuss barriers to providing feedback that you have experienced in your work
 Discuss the benefits of feedback that you have experienced in your work
 Identify:
 Two stakeholder groups that would benefit from receiving feedback
 The ideal mechanism to provide feedback to them

PART 2: LINKING DECISIONS/QUESTIONS WITH POTENTIAL DATA SOURCES

Part 2: Session Objectives
 Identify priority decisions and programmatic questions
 Link decisions/questions with potential data sources
 Create a time-bound plan for using data in decision making (Framework for Linking Data with Action)

Building Data Use into Your Work
 PLAN, PLAN, PLAN!
 Regularly review your data; schedule time for it
 Use the Framework for Linking Data with Action
 Engage in dialogue with stakeholders to fully understand:
 the decisions they make
 the information they need
 the best way to present that information

Elements of the Framework
 Decision makers and stakeholders with potential interest in your data
 Decisions/actions that the stakeholder makes (possible uses of data)
 Questions to which the stakeholder requires answers
 When the decision will be made

Elements of the Framework (cont'd)
 Indicators and/or data of interest (to respond to stakeholder need)
 Source of the data
 How the data will be presented (what types of analyses, graphs, formats)

Framework for Linking Data with Action

Framework for Linking Data with Action: column headings
 Decision/Action
 Program/Policy Question
 Decision Maker (DM), Other Stakeholders (OS)
 Indicator/Data
 Data Source
 Timeline (Analysis) (Decision)
 Communication Channel

What Are Decisions?
 Choices that lead to action
 All decisions are informed by questions
 All questions should be based on data

Decisions
 Allocate resources across IPs/states/districts/facilities
 Revise OVC program approaches to emphasize fostering and adoption
 Develop and institute workplace policies on HIV/AIDS in all institutions in state X
 Hire and allocate staff to facilities

Programmatic Questions
 What percentage of HIV-positive pregnant women in care actually deliver in health facilities?
 What percentage of clients starting ART are lost to follow-up?
 Is the number of family planning clients decreasing?
 What percentage of pregnant patients who are HIV-positive actually receive ART?

Framework for Linking Data with Action: completed example (built up column by column over several slides)

Decision/Action: Hire more PMTCT counselors
Program/Policy Question: Are we reaching testing targets in PMTCT? Do we have sufficient test kits? What is the nurse:client ratio?
Decision Maker (DM): Head of Regional Health Committee
Other Stakeholders (OS): Other providers; Division of Clinical Training
Indicator/Data: 711 form indicators K41, B73, B91
Data Source: Service statistics; logistics management system
Timeline: Analyses in Dec. 2010, March 2011, June 2011, September 2011, and December 2011; decision in Dec. 2010
Communication Channel: Short summary presented to the facility manager at the weekly clinic meeting
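Because the framework is meant to be a time-bound plan, its rows can also be held as structured data so that upcoming analysis dates are easy to check. The class, field names, and dates below restate the worked example and are an illustrative sketch, not part of the official tool:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FrameworkRow:
    """One row of a Framework for Linking Data with Action (illustrative)."""
    decision_action: str
    questions: list[str]
    decision_maker: str
    other_stakeholders: list[str]
    indicators: list[str]
    data_sources: list[str]
    analysis_dates: list[date]
    decision_date: date
    communication_channel: str

    def due_analyses(self, today: date) -> list[date]:
        """Analysis dates on or after the given day, i.e. still upcoming."""
        return [d for d in self.analysis_dates if d >= today]

row = FrameworkRow(
    decision_action="Hire more PMTCT counselors",
    questions=["Are we reaching testing targets in PMTCT?",
               "Do we have sufficient test kits?",
               "What is the nurse:client ratio?"],
    decision_maker="Head of Regional Health Committee",
    other_stakeholders=["Other providers", "Division of Clinical Training"],
    indicators=["711 form indicators K41, B73, B91"],
    data_sources=["Service statistics", "Logistics management system"],
    # First-of-month dates are assumed; the slide gives only month and year.
    analysis_dates=[date(2010, 12, 1), date(2011, 3, 1), date(2011, 6, 1),
                    date(2011, 9, 1), date(2011, 12, 1)],
    decision_date=date(2010, 12, 1),
    communication_channel="Short summary to facility manager at weekly clinic meeting",
)
```

Stored this way, a program team can list the analyses still pending at any point in the year and confirm each one actually fed the planned decision.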

Framework for Linking Data with Action
 Creates a time-bound plan for data-informed decision making
 Encourages greater use of existing information
 Monitors the use of information in decision making

Small Group Activity 6 – Instructions
 Select a note taker
 On flip chart paper, create the Framework table
 Brainstorm three decisions or questions in columns 1 & 2
 Complete the remaining columns
 Time: 1 hour

Small Group Activity – Report Back
 Each group will have 10 minutes to present its completed Framework
 Group discussion (10 minutes): Are there other data sources that might have been used in this decision? Were there other stakeholders who should have been considered?

Building Data Use into Your Work
 PLAN, PLAN, PLAN!
 Regularly review your data; schedule time for it
 Use the Framework for Linking Data with Action
 Engage in dialogue with stakeholders
 Consider other tools or methods related to data demand and use

Improving Data Demand & Use: A Multifaceted Approach
Applying a combination of:
 Assessment of current data use, capacity-building needs, and barriers to data use
 Capacity-building initiatives around data use concepts, use of tools, data analysis…
 Tool application
 Organization development (e.g., leadership, systems improvement)
 Collaborative efforts between data users and producers

Multifaceted Approach in Nigeria
 Large amount of data collected, feeding NNRIMS
 Data were not being used effectively at sites or within the project
 Pervasive mistrust of data
 Lack of understanding of how RHIS data could be used
 Lack of understanding of how indicators were calculated and used for program improvement

Multifaceted Approach in Nigeria

Multifaceted Approach in Nigeria: Results
 86% of respondents implemented solutions to identified barriers to data use
 76% reported assisting decision makers with data interpretation

THANK YOU!

MEASURE Evaluation is a MEASURE project funded by the U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health, and nutrition worldwide. Visit us online at