

Data Processing & Data Quality

A virtuous cycle
The implicit assumptions underlying information systems are twofold: first, that good data, once available, will be transformed into useful information which, in turn, will influence decisions; second, that such information-based decisions will lead to a more effective and appropriate use of scarce resources through better procedures, programmes, and policies, the execution of which will lead to a new set of data which will then stimulate further decisions, and so forth in a spiral fashion. (Sauerborn 2000, in Lippeveld et al., Design and Implementation of Health Information Systems)

The Information Cycle
- What do we collect? (data sources and collection tools)
- How do we process it? (data quality checks and analysis)
- How do we present it? (timely, reliable information)
- How do we use it?
Each stage has its own tools and outputs; the goal is quality data turned into reliable, timely information.

Ensuring data accuracy
Once data have been collected, they should be checked for inaccuracies and obvious errors, ideally as close to the point of data collection as possible.
- Identify the cause of each error
- Prevent future errors
(Remember Johan's little investigation.)

Why is checking data vital?
Use of inaccurate data leads to:
- Wrong priorities (focus on the wrong data)
- Wrong decisions (not applying the right actions)
- Garbage in = garbage out
Producing data is EXPENSIVE: collecting poor data wastes both resources and time.

To be useful, data should be:
- RELIABLE: correct, complete, consistent
- TIMELY: fixed deadlines for reporting
- AVAILABLE: who reports to whom? feedback mechanisms
- ACTIONABLE: no action = throw the data away
- COMPARABLE: same numerator and denominator definitions used by all (e.g. geography vs. org. unit function)

Complete data?
- Geography: submission by all (or most) reporting facilities
- Time: can you do analysis over time? Is reporting consistent from period to period?
- Population: do your services cover the full population? Many indicators depend on population figures as denominators.
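Completeness by geography can be checked mechanically by comparing the set of facilities expected to report against the set that actually submitted. A minimal sketch, using hypothetical facility names:

```python
# Sketch: reporting completeness for one period, assuming a known list of
# expected facilities and the set that actually submitted (hypothetical data).

expected = {"Clinic A", "Clinic B", "Clinic C", "Clinic D", "Clinic E"}
received = {"Clinic A", "Clinic C", "Clinic D"}

# Share of expected facilities that reported
completeness = len(received & expected) / len(expected) * 100
missing = sorted(expected - received)

print(f"Reporting rate: {completeness:.0f}%")  # 3 of 5 facilities = 60%
print(f"Missing reports: {missing}")
```

The missing list matters as much as the percentage: it tells the district office exactly whom to follow up with.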

Correct data?
- Are we even collecting the right data?
- Does the data seem sensible and plausible?
- Is the same definition applied uniformly?
- Is the handwriting legible?
- Are there any preferential end-digits in use?

Preferential end-digits
[Chart: reported values by month, January through July, illustrating end-digit preference]
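End-digit preference (values clustering on 0 and 5 because staff round their counts instead of tallying them) can be detected by tallying the last digit of reported values. A minimal sketch with hypothetical figures and an illustrative threshold:

```python
from collections import Counter

# Sketch: detecting preferential end-digits in reported counts.
# Values are hypothetical; rounded figures cluster on 0 and 5.
reported = [120, 85, 90, 115, 60, 75, 132, 110, 95, 80, 150, 65]

end_digits = Counter(v % 10 for v in reported)
n = len(reported)

# With no digit preference, each end-digit should appear ~10% of the time.
for digit in range(10):
    share = end_digits[digit] / n
    if share > 0.3:  # crude threshold for a small sample
        print(f"End-digit {digit} appears in {share:.0%} of values - possible rounding")
```

In this sample, 0 and 5 together account for over 90% of end-digits, a strong hint that the figures are estimates rather than counts.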

Consistent data?
- Values in a similar range to the same period last year, or to comparable reporting organisation units
- No large gaps or missing data
- No multiplicity of data (the same data from multiple sources: which one to trust?)
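A simple consistency check compares each value against the same period last year and flags large deviations for follow-up. A sketch with hypothetical monthly figures and an illustrative 50% threshold:

```python
# Sketch: flagging monthly values that deviate sharply from the same
# month last year (hypothetical figures for one facility).
this_year = {"Jan": 210, "Feb": 195, "Mar": 640, "Apr": 205}
last_year = {"Jan": 200, "Feb": 190, "Mar": 208, "Apr": 199}

THRESHOLD = 0.5  # flag changes of more than +/-50%

for month, value in this_year.items():
    baseline = last_year[month]
    change = (value - baseline) / baseline
    if abs(change) > THRESHOLD:
        print(f"{month}: {value} vs {baseline} last year ({change:+.0%}) - check the source")
```

A flagged value is not necessarily wrong (an outbreak or a campaign can triple a count); the flag is a prompt to verify against the source document, not to delete the value.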

Timely data?
Some data need to be acted upon immediately. Late reports weaken the potential for comparison, and action may come too late; late data remain useful mainly for documenting trends.
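Timeliness can be tracked by comparing each submission date against the reporting deadline. A sketch with a hypothetical deadline and submission dates:

```python
from datetime import date

# Sketch: measuring report timeliness against a fixed monthly deadline
# (hypothetical deadline and submission dates).
deadline = date(2024, 2, 7)  # e.g. monthly reports due by the 7th
submissions = {
    "Clinic A": date(2024, 2, 5),
    "Clinic B": date(2024, 2, 7),
    "Clinic C": date(2024, 2, 21),
}

late = {name: (d - deadline).days
        for name, d in submissions.items() if d > deadline}

on_time_rate = (len(submissions) - len(late)) / len(submissions) * 100
print(f"On-time reporting: {on_time_rate:.0f}%")
for name, days in late.items():
    print(f"{name}: {days} days late - still usable for trend analysis")
```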

Accuracy-enhancing principles
- Capacity building through training (90% of HISP activities)
- User-friendly collection and collation tools
- Feedback on data errors (but not only on errors!)
- Feedback of analysed information
- Local use of information

Controlling quality with DHIS2
- Maximum/minimum values
- Validation rules
- Validation checks and reminders
- Completeness and timeliness reports (input for a league table?)
(Will be covered in the lab session.)
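The min/max and validation-rule ideas can be illustrated outside DHIS2. The sketch below only mimics the concepts; the element names, ranges, and function signatures are hypothetical and are not the DHIS2 API:

```python
# Sketch of the min/max and validation-rule concepts DHIS2 applies at
# data entry. All names and thresholds here are illustrative.

# Min/max: an expected range per data element (per facility, per period)
min_max = {"measles_doses": (0, 300)}

def check_min_max(element, value):
    """True if the value falls inside the expected range for the element."""
    lo, hi = min_max[element]
    return lo <= value <= hi

# Validation rule: a logical relation between two values, e.g. children
# fully immunised cannot exceed measles doses given.
def check_rule(left, right):
    """True if the rule 'left <= right' holds."""
    return left <= right

print(check_min_max("measles_doses", 250))  # True: within range
print(check_min_max("measles_doses", 900))  # False: outside min/max range
print(check_rule(180, 250))                 # True: rule holds
print(check_rule(400, 250))                 # False: rule violated
```

In DHIS2 these checks run at data entry (min/max warnings) and in scheduled validation analysis (rule violations), so errors are caught close to the point of collection, as argued above.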

Good data quality: 10 steps to achieve it

1. Small, essential data set (EDS)
2. Use of data locally by the collectors
3. Clear definitions and standards
4. Careful collection and collation of data, with good tools
5. Sharing of information
6. Regular feedback
7. Supportive supervision at all levels
8. Ongoing capacity building through training and support
9. Regular discussion of information at facility team meetings
10. Monitoring and rewarding good information (league table)

or else…

A vicious cycle
Poor data quality → data not trusted → decisions not evidence-based → using evidence not perceived as a winning strategy → weak demand for data → limited investment in the HIS → limited capacity to manage or analyse data → weak HIS → poor data quality. Meanwhile, donors collect their own data in parallel systems, adding fragmentation.