Session 6: Data Flow, Data Management, and Data Quality


Session Overview
 Surveillance Project Example
 Data Management: Six Essential Steps
 Small-Group Exercise: Data Flow and Use Table

Session Objectives
By the end of the session, participants will be able to:
 describe what a data management system is and how it can help with data quality assurance; and
 identify ways to ensure data quality at each step of a data management system.

Surveillance Project: Objectives
Grant: To strengthen animal and human health surveillance for novel influenza viruses. Year 1 program objectives for the grant are:
 Objective 1: Strengthen case-based reporting for influenza-like illness (ILI) in humans.
 Objective 2: Establish market-based active surveillance of poultry.

Surveillance Project: Selecting Indicators
Objective 1: Strengthen case-based reporting for influenza-like illness in humans.
 Capacity building: Number of hospitals with at least two clinicians trained in case reporting for influenza-like illness in humans.
 Case-based reporting: Number of patients diagnosed with ILI for which a complete set of case-based data has been recorded.
Objective 2: Establish market-based active surveillance of poultry.
 Animal health surveillance: Number of targeted markets in which samples were collected as part of an active surveillance plan within the past six months.
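The capacity-building indicator is ultimately a simple count derived from training records. A minimal sketch of how it might be computed, assuming a flat record per trained clinician (the data and field names here are hypothetical, not from the actual project):

```python
# Hypothetical training records: one row per clinician trained.
training_records = [
    {"hospital": "District A Hospital", "clinician": "Dr. K", "trained": True},
    {"hospital": "District A Hospital", "clinician": "Dr. L", "trained": True},
    {"hospital": "District B Hospital", "clinician": "Dr. M", "trained": True},
]

def hospitals_meeting_target(records, minimum=2):
    """Count hospitals with at least `minimum` trained clinicians."""
    counts = {}
    for row in records:
        if row["trained"]:
            counts[row["hospital"]] = counts.get(row["hospital"], 0) + 1
    return sum(1 for n in counts.values() if n >= minimum)

# District A has two trained clinicians, District B only one,
# so the indicator value for this toy dataset is 1.
indicator_value = hospitals_meeting_target(training_records)
```

The point of the sketch is that the indicator definition ("at least two clinicians") has to be encoded somewhere, and the aggregation (clinician to hospital) is a data management step where errors can enter.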

Surveillance Project: Capacity Building
Indicator: Number of hospitals with at least two clinicians trained in case reporting for influenza-like illness in humans.
How are these data collected?
 How are clinicians selected for training?
 Training records
 How are data aggregated at the training-organization level?
 How are data forwarded to the next levels (national, international)?

Surveillance Project: ILI Case-Based Reporting
Indicator: Number of patients diagnosed with ILI for which a complete set of case-based data has been recorded.
 How is a person diagnosed with ILI?
 Clinician completes case reports.
 How are data recorded in facility records?
 How are data aggregated at each level (facility, district, national, international)?

Surveillance Project: Market-Based Active Surveillance
Indicator: Number of targeted markets in which samples were collected as part of an active surveillance plan within the past six months.
 How is a market targeted as part of an active surveillance plan?
 Samples are collected by the surveillance team. What data are collected?
 Samples are sent to the laboratory along with associated data; the lab conducts testing.
 How are data aggregated at the next level (e.g., from the surveillance team or laboratory)? How are data aggregated at the national and international levels?

Six Key Steps
Source → Collection → Collation → Analysis → Reporting → Use
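The six steps can be pictured as a chain in which each step consumes the output of the previous one. A toy illustration, with district case counts standing in for real records (in practice each step involves people, tools, and procedures, not just functions):

```python
# A toy chain of the middle four steps, from raw records to a report.
def collect(source):
    """Gather raw records from a source."""
    return list(source)

def collate(records):
    """Summarize (district, cases) records into a count per district."""
    summary = {}
    for district, cases in records:
        summary[district] = summary.get(district, 0) + cases
    return summary

def analyze(summary):
    """Derive a national total from district summaries."""
    return {"national_total": sum(summary.values())}

def report(analysis):
    """Present the result as readable text."""
    return f"National total: {analysis['national_total']} cases"

source = [("North", 4), ("South", 7), ("North", 2)]
summary_text = report(analyze(collate(collect(source))))
```

Because each step depends on the one before it, an error introduced early (say, at collection) propagates through every later step, which is why the rest of the session treats quality assurance step by step.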

Sources
Definition: Data/facts may originate from primary, secondary, or tertiary data sources. Examples include:
 Primary sources: training sign-in sheets, talking to training participants.
 Secondary sources: district spreadsheets that list hospitals with trained clinicians.
 Tertiary sources: national census data.
Data can be routine or non-routine.

Collection
Definition: The process of gathering data that are generated from the activities an organization implements and that are relevant to its M&E framework. This involves obtaining data from original sources and using tools (paper or electronic) to record them for later collation, analysis, and reporting. Data can be collected using:
 questionnaires
 interviews
 observations
 existing records

Collation
Definition: Combining data into summarized (often standardized) formats. This can be done:
 electronically or manually; and
 at different levels (district, national).
Example: Monthly surveillance summary sheets from all districts are sent to the national ministry of health/agriculture, where they are combined to produce the national total.
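The district-to-national example can be sketched in a few lines. Assuming each monthly summary sheet is a record with the same fields (the field names `ili_cases` and `markets_sampled` are illustrative, not from a real form):

```python
# Hypothetical monthly summary sheets from two districts.
district_sheets = [
    {"district": "East", "ili_cases": 12, "markets_sampled": 3},
    {"district": "West", "ili_cases": 8,  "markets_sampled": 5},
]

def collate_national(sheets):
    """Combine district summary sheets into national totals per field."""
    totals = {"ili_cases": 0, "markets_sampled": 0}
    for sheet in sheets:
        for key in totals:
            totals[key] += sheet[key]
    return totals

national = collate_national(district_sheets)
```

Even in a sketch this small, the standardized format matters: a district sheet with a misspelled field name or a missing column would raise an error rather than silently distort the national total.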

Analysis
Definition: Analysis is the review and manipulation of data. Depending on the type of data and the purpose, this might include:
 applying statistical methods;
 selecting or discarding subsets based on specific criteria; and
 other techniques.
Analysis enables data users to understand and interpret the results.
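Both activities named above, subset selection and a statistical summary, fit in a few lines. A minimal sketch using hypothetical case records (the fields and values are invented for illustration):

```python
# Illustrative case records.
cases = [
    {"age": 34, "district": "East", "complete": True},
    {"age": 71, "district": "West", "complete": True},
    {"age": 5,  "district": "East", "complete": False},
]

# Select a subset based on a specific criterion: complete records only.
complete_cases = [c for c in cases if c["complete"]]

# Apply a simple statistical method: mean age of the complete cases.
mean_age = sum(c["age"] for c in complete_cases) / len(complete_cases)
```

Note that the selection criterion itself is an analysis decision that should be disclosed in reporting, since discarding incomplete records can bias the summary.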

Reporting
Definition: Reporting is the description and presentation of raw and processed data as useful knowledge. It gives program implementers and stakeholders an opportunity to learn about progress, problems, difficulties encountered, successes, and lessons learned during implementation. Reporting can take many forms, such as:
 narratives
 graphics

Use
Definition: Applying information to make timely and appropriate decisions.
 Different users require different types of information for different purposes.
 Use is the ultimate measure of whether collecting the various data was worthwhile.

Data Flow and Use Table
Each column of the table is completed for both data elements and indicators (transformed data):
 Source: What are we collecting?
 Collection: Who collects these data, from where, and how often?
 Collation & Storage: How are data aggregated? Where are the data stored?
 Analysis: List any opportunities to transform the data into more meaningful information for further review. Are there other pieces of information available?
 Reporting: To whom will this information be reported?
 Use: How can this information be used to make informed decisions? List specific opportunities for use. (Link to Data Use Template 4.2)

Exercise: Data Flow and Use Table
In your groups: Using your M&E plan project, map the flow of your data from collection to use. Differentiate between data elements and indicators (transformed data) at each step:
 source
 collection
 collation
 analysis
 reporting
 use

Data Quality Issues: Source
Poor recording of the data: incomplete information, illegible notes, etc. Examples:
 Data may be incomplete (e.g., incomplete physician notes).
 Information may be recorded inconsistently by different staff.
 Data may not be available or feasible to collect.
It is also important to consider data quality issues with secondary and tertiary data sources.
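Incompleteness at the source is one of the few issues here that can be detected automatically. A minimal completeness check, assuming case records are dictionaries (the required field names are illustrative, not from an actual case-report form):

```python
# Flag case records that are missing required information at the source.
REQUIRED_FIELDS = ["patient_id", "onset_date", "diagnosis"]

def incomplete_records(records):
    """Return records missing any required field or containing blank values."""
    return [r for r in records
            if any(not r.get(field) for field in REQUIRED_FIELDS)]

records = [
    {"patient_id": "P01", "onset_date": "2024-03-01", "diagnosis": "ILI"},
    {"patient_id": "P02", "onset_date": "", "diagnosis": "ILI"},  # blank date
]
flagged = incomplete_records(records)
```

A check like this only catches missing values; illegible notes and inconsistent recording practices still require human review and training.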

Ensuring Data Quality: Source
 Design instruments carefully and correctly.
 Include data providers and data processors in decisions about what is feasible to collect, review, and process, and in drafting the instruments.
 Develop detailed instructions for the data collection process.
 Ensure all personnel are trained in their assigned tasks.

Data Quality Issues: Collection
 Different instruments used to collect the same data.
 Data entered incorrectly or in the wrong fields in a database.
 Inconsistent data entry by different data capturers.

Ensuring Data Quality: Collection
 Develop specific instructions for data collection and routinely check that they are followed.
 Identify procedures for making changes (if necessary) to the data collection process and for reporting problems during data collection.
 Develop standard operating procedures for the collection and management of data, for revising collection tools, and for reporting problems.
 Check that implemented changes to the process are followed by conducting on-site reviews.
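Some of the "routinely check" work can be built into the entry tool itself as validation rules. A sketch of per-entry checks, where the fields and the allowed values are illustrative assumptions rather than rules from a real SOP:

```python
# Validate one data-collection entry against simple rules.
def validate_entry(entry):
    """Return a list of problems found in one entry (empty list = valid)."""
    problems = []
    age = entry.get("age")
    if not isinstance(age, int) or not (0 <= age <= 120):
        problems.append("age missing, out of range, or wrong type")
    if entry.get("sex") not in ("M", "F"):
        problems.append("sex not a recognized code")
    return problems

ok = validate_entry({"age": 34, "sex": "F"})        # no problems
bad = validate_entry({"age": "34", "sex": "X"})     # age typed as text, bad code
```

Running every entry through such rules at capture time catches wrong-field and wrong-type errors immediately, when the data capturer can still correct them against the source document.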

Data Quality Issues: Collation
Collation issues include:
 data inconsistently collated;
 data entry errors or other errors associated with manual collation;
 formula errors;
 problematic sampling or estimations;
 no verification or other quality control mechanisms; and
 data not kept secure.

Ensuring Data Quality: Collation
 Develop checklists and approval procedures for key steps.
 Conduct reviews during the entry process.
 Create an electronic or manual collation tool that includes review of the data by a second individual who is not entering them.
 Randomly sample data and verify them.
 Ensure problems are reported and documented, corrected and communicated, and tracked back to the source.
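The second-individual review is often implemented as double entry: two people enter the same source documents independently, and any disagreement is traced back to the source. A minimal sketch of the comparison step (record IDs and fields are hypothetical):

```python
# Double-entry verification: flag fields where two independent entries disagree.
def compare_entries(first_entry, second_entry):
    """Return (record_id, field) pairs where the two entries disagree."""
    mismatches = []
    for record_id, first in first_entry.items():
        second = second_entry.get(record_id, {})
        for field, value in first.items():
            if second.get(field) != value:
                mismatches.append((record_id, field))
    return mismatches

entry_a = {"P01": {"ili_cases": 12}, "P02": {"ili_cases": 8}}
entry_b = {"P01": {"ili_cases": 12}, "P02": {"ili_cases": 9}}
discrepancies = compare_entries(entry_a, entry_b)
```

Each flagged pair is then resolved against the original paper form, which is exactly the "tracked back to the source" step in the list above.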

Data Quality Issues: Analysis
Data quality issues during analysis include:
 incorrect analysis;
 inconsistent analysis; and
 misrepresentation of results.

Ensuring Data Quality: Analysis
Ensuring quality during analysis includes:
 making sure analysis techniques meet the requirements for proper use;
 disclosing all conditions and assumptions affecting interpretation of the data; and
 having experts review reports to ensure the analysis is reasonable.

Data Quality Issues: Reporting
Data quality issues during reporting include:
 too little reporting or feedback;
 not presenting information in an accessible manner; and
 misrepresentation of results.

Ensuring Data Quality: Reporting
 Maintain integrity in reporting: do not leave out any key information.
 Synthesize results for the appropriate audience.
 Have multiple reviewers within the organization.
 Protect confidentiality in reports and communication tools.
 Review data and provide feedback to those who have a stake in the results.
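Synthesizing results for an audience can be as simple as turning collated totals into a short narrative. A sketch, assuming district totals have already been collated (the wording and data are illustrative):

```python
# Turn collated district totals into a short narrative report.
def narrative_report(district_totals, period):
    """Build a plain-text summary for program stakeholders."""
    total = sum(district_totals.values())
    lines = [f"ILI surveillance summary, {period}: {total} cases reported."]
    for district, n in sorted(district_totals.items()):
        lines.append(f"  {district}: {n} cases")
    return "\n".join(lines)

report_text = narrative_report({"East": 12, "West": 8}, "March 2024")
```

Keeping both the national total and the per-district breakdown in the same report supports the integrity point above: readers see the summary without the underlying figures being left out.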

Data Quality Issues: Use
Quality issues during use may involve:
 not having complete data for programmatic decisions; and
 not using data to inform programmatic decisions.

Ensuring Data Quality: Use
Make timely, data-driven decisions for your program.

Exercise: Data Flow, Data Management, and Data Quality
In your groups: Looking back at your data flow and use table, identify potential data quality issues at each of the six steps and the management steps you would take to address them.