Ensuring the quality of Qualitative Data Presented by Lorie Broomhall, Ph.D. Senior HIV/AIDS Advisor Nigeria Monitoring and Evaluation Management Services.

Presentation transcript:

Ensuring the quality of Qualitative Data Presented by Lorie Broomhall, Ph.D. Senior HIV/AIDS Advisor Nigeria Monitoring and Evaluation Management Services The Mitchell Group Abuja, Nigeria

Mini Pop Quiz Please indicate T if the statement is true and F if the statement is false:
1. Quantitative methods are more accurate than qualitative methods.
2. Qualitative methods are not as credible as quantitative methods because they cannot be replicated.
3. Quantitative methods are more generalizable than qualitative methods.
4. Qualitative evaluations are more costly and time-consuming than quantitative evaluations – less bang for the buck.
5. Quantitative evaluations are less biased than qualitative evaluations.
6. Qualitative methods are useful for explaining findings derived from quantitative methods – telling the story – but not the other way around.
7. Qualitative data analysis is not systematic.
8. Focus group discussions are the most useful and reliable qualitative evaluation method.

Workshop Objectives
- Definition of high quality qualitative data
- Uses of qualitative data in M&E
- Quality standards
- Qualitative method designs
- Analysis and reporting

High Quality Data
- Ensure that program and budget decisions are as well informed as practically possible
- Support efficient use of resources
- Are credible
- Meet reasonable standards of accuracy, objectivity, and consistency

“In principle, the same quality standards for quantitative data apply to qualitative data”
– USAID Functional Series 200 (Programming Policy), ADS 203 – Assessing and Learning

High quality can be achieved using any qualitative method (as long as it is used appropriately):
- Interviews
- Mapping/Walk-throughs
- Directed observations
- Participant observation
- Focus group discussions
- Elicitation techniques
  - Pile sorts
  - Ranking
  - Free lists

High quality data have:
- Validity (internal and external)
- Reliability
- Timeliness
- Precision
- Integrity

Validity
“Data are valid to the extent that they clearly, directly, and adequately represent the result that was intended to be measured.”*
* Quotations in this presentation are from USAID ADS 203 and Tips #12

Validity increases with:
- Clear goals and objectives
- Sound method design and data collection plan – triangulation
- Adequate training and supervision
- Systematic data analysis
- Method triangulation

Validity is compromised by:
- Badly designed data collection instruments
- Imprecise definitions
- Inappropriate sampling techniques
- Inadequately trained data collectors
- Transcription error

A Sound Sampling Plan is:
- Representative: Are all people or groups affected by the program represented in the sample? Are they represented in proportion to their size?
- Inclusive: Are all the different views and perspectives represented in the sample?
- Appropriate: Are the right people being sampled?
- Consistent: Is the sampling strategy applied to all sites over the length of the evaluation?
- Documented and justified

Examples of sampling strategies for qualitative evaluations:
- Convenience/purposive sampling
- Stratified sampling (significant variation)
- Random sampling (people, time, location, etc.)
- Snowball sampling (social networks)
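Two of these strategies can be made concrete in a few lines. The following is a minimal Python sketch (the respondent roster, site labels, and sample sizes are hypothetical, not from the presentation) contrasting a simple random sample with a stratified sample drawn separately within each site:

import random

# Hypothetical respondent roster; "site" is the stratification variable.
respondents = [
    {"id": "R01", "site": "urban"}, {"id": "R02", "site": "urban"},
    {"id": "R03", "site": "urban"}, {"id": "R04", "site": "rural"},
    {"id": "R05", "site": "rural"}, {"id": "R06", "site": "rural"},
]

# Simple random sample: every respondent has an equal chance of selection.
random_sample = random.sample(respondents, k=3)

# Stratified sample: draw within each site so that every site is represented.
def stratified_sample(people, key, per_stratum):
    strata = {}
    for p in people:
        strata.setdefault(p[key], []).append(p)
    return [p for group in strata.values()
            for p in random.sample(group, min(per_stratum, len(group)))]

print([p["id"] for p in stratified_sample(respondents, key="site", per_stratum=2)])

The same logic applies whether the strata are sites, time periods, or respondent categories showing significant variation.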

Reliability
“Data should reflect stable and consistent data collection processes and analysis methods over time.”

Reliability is assured with:
- Consistent data collection processes
- Field supervision and observation
- Checks for bias and ‘drift’
- Data replicability

Precision
“Data should be sufficiently accurate (precise) to present a fair picture of performance...”
Precision is improved with:
- Direct, understandable data collection instruments
- Concise and operationalized definitions
- Clear, descriptive, objective and decipherable documentation (e.g. field notes, summaries)
- Scrutinize!
- Frequent checks of databases and inter-coder reliability
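The inter-coder reliability check mentioned above can be expressed as a small computation. The sketch below is hypothetical (the theme codes and the helper function are illustrative, not part of the presentation): it computes Cohen's kappa for two coders who labeled the same ten interview excerpts, where values near 1 indicate strong agreement beyond chance.

from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes assigned independently by two coders.
coder1 = ["stigma", "access", "stigma", "cost", "access",
          "stigma", "cost", "access", "stigma", "cost"]
coder2 = ["stigma", "access", "cost", "cost", "access",
          "stigma", "cost", "stigma", "stigma", "cost"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.7 here, from 80% raw agreement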

Timeliness
“Data should be current and available frequently enough to permit management decisions to be made.”
Integrity
Data should be free from manipulation for professional, political or personal purposes.

Timeliness can be achieved through:
- Building an appropriate data collection strategy – minimum methods for the maximum return
- Well-planned data management/processing
- Adequate training and supervision
- Effective communication
- Efficient analysis
- Contingency plans for unforeseen events (e.g. strikes)

Maintaining Integrity
- Secure data storage: names on folders, electronic files, registers and logs
- Supervision – scrutinizers
- Awareness of potential data manipulation
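One way to act on the secure-storage point is to keep participant names out of file, folder, and register labels by using stable pseudonyms instead. The sketch below is a hypothetical illustration only (the names, salt, and helper function are assumptions, not from the presentation):

import hashlib

def pseudonym(name, secret_salt):
    """Derive a short, stable code so files and logs never carry the real name."""
    digest = hashlib.sha256((secret_salt + name).encode("utf-8")).hexdigest()
    return "P-" + digest[:8]

# The salt is stored separately from the data (e.g. held by the study lead),
# so the codes cannot easily be mapped back to the original names.
print(pseudonym("Jane Doe", secret_salt="study-key-2011"))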

Thank you