Data Quality: Quality Data Collection and Management



Learning Objectives
By the end of this session, you should be able to:
1. Describe the importance of training data collection and management
2. Describe and appropriately complete the various data fields (including PEPFAR categories and training levels)
3. Demonstrate appropriate use of the forms
4. Identify the necessary steps to ensure data quality at each stage of the data collection and management process
5. Distinguish between data that is complete and correct and data that needs cleaning

Group Discussion: Why Data?

Programme Data
- Measures programme inputs and outputs
- Helps determine programme outcomes
- Use programme data to:
  - Assess whether the programme is meeting its established targets
  - Identify and improve problem areas in a programme
  - Improve the efficiency of programme resource use
  - Inform reporting to partners and funders

Data Management
- The systems, policies, practices, and procedures that manage and organize data for specific needs

TrainSMART
- TrainSMART is I-TECH's open-source, web-based training data collection system
- Allows users to accurately track data, including:
  - training programmes
  - trainers
  - trainees
- Also enables users to better evaluate programmes and report activities to stakeholders

TrainSMART Tool

Data Collection
- Forms:
  - Participant Registration Form
  - Trainer Registration Form
  - Course Form
- The data entry pages mirror the data collection forms completed in the field, making data entry much easier.

Training Levels
- Level 1: Didactic (seminar, lecture)
- Level 2: Skills-building (group-based workshop)
- Level 3: Clinical training/preceptorship (trainer-led)
- Level 4: Clinical consultation (trainee-directed)
- Level 5: Technical assistance (other than direct care)

PEPFAR Categories
- ART
- Counseling & Testing
- Laboratory
- Orphans and Vulnerable Children
- Palliative Care (OI, TB/HIV, etc.)
- TB/HIV
- PMTCT
- Policy Analysis & System Strengthening
- Prevention
- Strategic Information
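The training levels and PEPFAR categories above are controlled vocabularies, which lend themselves to a simple automated check at data entry. The sketch below is purely illustrative (it is not TrainSMART's actual code, and the field names `training_level` and `pepfar_category` are assumptions); it flags any value outside the allowed lists:

```python
# Illustrative validation sketch -- not TrainSMART's actual code.
# Field names ("training_level", "pepfar_category") are assumptions.

TRAINING_LEVELS = {1, 2, 3, 4, 5}

PEPFAR_CATEGORIES = {
    "ART", "Counseling & Testing", "Laboratory",
    "Orphans and Vulnerable Children", "Palliative Care", "TB/HIV",
    "PMTCT", "Policy Analysis & System Strengthening",
    "Prevention", "Strategic Information",
}

def validate_record(record):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if record.get("training_level") not in TRAINING_LEVELS:
        errors.append("unknown training level: %r" % record.get("training_level"))
    if record.get("pepfar_category") not in PEPFAR_CATEGORIES:
        errors.append("unknown PEPFAR category: %r" % record.get("pepfar_category"))
    return errors

# Example: a valid record yields no errors; an invalid one yields two.
print(validate_record({"training_level": 2, "pepfar_category": "PMTCT"}))
print(validate_record({"training_level": 9, "pepfar_category": "Nutrition"}))
```

Catching such errors at entry time is far cheaper than cleaning them out of reports later.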

Exercise: Data Collection

Demonstration: Entering Data in TrainSMART

Data Quality
- What is quality data?
  - Complete
  - Consistent
  - Timely
  - Accurate
- What influences the quality of data?
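Two of the quality dimensions above, completeness and timeliness, can be checked mechanically. A minimal sketch, assuming hypothetical field names and a hypothetical 30-day timeliness rule (neither is specified in this session):

```python
# Hypothetical quality-check sketch: the field names and the 30-day
# timeliness threshold are assumptions, not a TrainSMART rule.
from datetime import date

REQUIRED_FIELDS = ("participant_name", "course_title", "training_level", "start_date")

def quality_issues(record, entry_date):
    """Flag completeness and timeliness problems in one training record."""
    issues = []
    # Complete: every required field must be present and non-empty
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append("missing field: " + field)
    # Timely: data should be entered soon after the training starts
    start = record.get("start_date")
    if start and (entry_date - start).days > 30:
        issues.append("entered more than 30 days after training start")
    return issues
```

Consistency and accuracy are harder to automate; they usually require cross-checking against the paper forms or other data sources.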

Need for Quality Data
- The quality of the analysis and interpretation of data can only be as good as the data itself
- Ensure data is accurate, specific, and complete

Small Group Activity

Data: "Clean" vs. "Dirty"
- Data cleaning: identifying incomplete, incorrect, inaccurate, or irrelevant parts of the data, then replacing, modifying, or deleting this "dirty" data
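The replace/modify/delete steps above can be sketched as a small cleaning pass. This is an illustration only (the field names and the specific rules are assumptions, not TrainSMART's actual logic):

```python
# Illustrative cleaning pass -- field names and rules are assumptions,
# not TrainSMART's actual logic.

def clean_records(records):
    """Trim whitespace, drop incomplete records, and remove exact duplicates."""
    cleaned = []
    seen = set()
    for rec in records:
        # Modify: strip stray whitespace from every text field
        rec = {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}
        # Delete: a record with no participant name is incomplete
        if not rec.get("participant_name"):
            continue
        # Delete: an exact duplicate of a record we have already kept
        key = tuple(sorted(rec.items()))
        if key in seen:
            continue
        seen.add(key)
        cleaned.append(rec)
    return cleaned
```

In practice, deletions should be logged rather than silent, so that problems can be traced back to the paper forms.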

Discussion: Clean or Dirty Data?

Data Flow
- Data collection and entry should be done in a methodical, well-defined way
- Specific individuals must be identified as responsible for each step of the process

Key Points
- Good-quality training data collection and management is essential for accurate and complete reporting
- PEPFAR categories and training levels must be understood so they are entered correctly into the training database
- Training forms correspond to the data entry screens in TrainSMART
- Data entry processes need to be defined at each training centre, with responsible persons identified to manage the data