
Validity and Reliability
Dr. Voranuch Wangsuphachart
Dept. of Social & Environmental Medicine, Faculty of Tropical Medicine, Mahidol University
420/6 Rajvithi Road, Bangkok 10400, THAILAND

Objectives: at the end of the lecture, students will be able to:
1. Know the concepts & definitions of validity & reliability
2. List the importance and impact of validity & reliability
3. Specify strategies to assess validity & reliability
4. List strategies to enhance validity & reliability
5. Describe the major types of bias

Contents:
1. Validity
2. Reliability
   - definition and synonyms
   - important points
   - assessing validity & reliability
   - strategies to enhance validity & reliability
3. Major types of bias

In a medical or epidemiological study, the major consideration is to obtain:
- valid measurements
- reliable measurements
on the exposure factors and outcomes of interest in the study population, "WITHOUT BIAS AND ERRORS", or to minimize them as far as possible.

To achieve a high-standard, quality study:
- ensure the right answers to the study questions
- use a good study design
- make the measurements valid and reliable
- control for any possible bias
- maintain good cooperation between the research group and the study population

Screening for fasting blood cholesterol profile among people
[Figure: six subjects (x1 ... x6), each with three repeated measurements (x11, x12, x13 ... x61, x62, x63)]

Screening for fasting blood cholesterol profile among people
[Figure: six subjects (x1 ... x6), each with a single measurement (x11 ... x61)]

Instrument or Research Tool
"hardware" - a red blood cell counter, a pH meter, an electronic weighing machine
"paperware" - a questionnaire, a weekly diet diary
"peopleware" - observers/investigators, technicians

How good is the instrument or tool? The instrument or tool yields a measurement that should be without bias or error (or with bias minimized) relative to the true value (the "truth"): the measurement should be valid/accurate and precise/reliable.

What are accuracy & precision? What do you think of first when talking about validity & reliability? What is the difference between validity & reliability? Why are validity & reliability important in conducting any medical research, both in laboratory & field settings?

PRECISION
DEFINITION: A precise measurement is one that has nearly the same value each time it is measured.
SYNONYMS: reliability, repeatability, reproducibility, consistency, agreement

IMPORTANT POINTS
- precision depends on: sample size, efficiency of the study
- it has a very important influence on the power of a study
- precision, reliability and consistency are affected by RANDOM ERROR

ASSESSING PRECISION
- using the S.D. (σ) or variance (σ²)
- using the coefficient of variation = S.D. / mean (often expressed as a percentage)
- using the Kappa statistic
- using Cronbach's alpha
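These quantities can be computed directly. The sketch below (illustrative data, not from the lecture) computes the S.D., the coefficient of variation for repeated readings on one subject, and Cohen's kappa for agreement between two observers, using only the standard library.

```python
from statistics import mean, stdev

# Hypothetical repeated cholesterol readings (mg/dL) from one subject
readings = [198.0, 202.0, 200.0, 199.0, 201.0]

sd = stdev(readings)               # standard deviation of the repeats
cv = sd / mean(readings) * 100     # coefficient of variation, in percent

# Cohen's kappa: two observers classify 10 subjects as 1 = "high" / 0 = "normal"
obs_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
obs_b = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]

n = len(obs_a)
p_observed = sum(a == b for a, b in zip(obs_a, obs_b)) / n   # raw agreement
p_a1, p_b1 = sum(obs_a) / n, sum(obs_b) / n
p_expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)           # chance agreement
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"SD = {sd:.2f}, CV = {cv:.2f}%")
print(f"kappa = {kappa:.2f}")
```

A kappa near 1 indicates agreement well beyond chance; a kappa near 0 indicates agreement no better than chance.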

Strategies for enhancing precision
1. Standardizing measurement methods
   - preparing study protocols
   - preparing an operations manual
   - writing specific guidelines or instructions for making each measurement
   - serving as the basis for describing the methods when results are reported

Strategies for enhancing precision
- preparing the operations manual: write down precisely
  - how to prepare the environment and the subject
  - how to carry out and record the interview
  - how to calibrate the instrument

Strategies for enhancing precision
- writing specific guidelines or instructions for making the measurement ensures uniform performance over the duration of the study

Strategies for enhancing precision
2. Training and certifying the observers
   - improves consistency of measurement techniques (especially with several observers)
   - performing a pilot study to test the techniques specified in the operations manual

Strategies for enhancing precision
3. Refining the instruments
   - writing or spelling out questionnaires and interviews to increase clarity
4. Automating the instruments
   - using automatic mechanical devices

Strategies for enhancing precision
5. Repeating the measurement
   - the impact of random error from any source can be reduced by repeating the measurement and using the mean of two or more readings
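The effect of repeating the measurement can be shown with a small simulation (the true value and error S.D. below are hypothetical): averaging three readings shrinks the spread of the result by roughly a factor of √3.

```python
import random
from statistics import mean, stdev

random.seed(42)
TRUE_VALUE = 200.0   # subject's true cholesterol (mg/dL), assumed for illustration
NOISE_SD = 5.0       # S.D. of the random measurement error, assumed

def reading():
    """One noisy measurement of the same true value."""
    return random.gauss(TRUE_VALUE, NOISE_SD)

# Spread of single readings vs. spread of means of three readings
singles = [reading() for _ in range(5000)]
means_of_3 = [mean(reading() for _ in range(3)) for _ in range(5000)]

print(f"SD of single readings: {stdev(singles):.2f}")     # close to 5.0
print(f"SD of means of three : {stdev(means_of_3):.2f}")  # close to 5.0 / sqrt(3)
```

Note that averaging reduces only random error; a systematic error (bias) would shift every reading by the same amount and survive the averaging untouched.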

ACCURACY
DEFINITION: The degree to which the results of a measurement correspond to the true state or truth.
SYNONYMS: validity, conformity

IMPORTANT POINTS
- accuracy is a function of SYSTEMATIC ERROR
- it has a very important influence on the internal and external validity of the study
- the greater the systematic error, the less accurate the variable

IMPORTANT POINTS
Systematic error is attributed to:
- methodological aspects of the study design or analysis
- selection of subjects
- quality of the information obtained
- confounding
- effect modification
- misclassification

ASSESSING ACCURACY
- comparison with reference techniques ("gold standards")
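A minimal sketch of assessing accuracy against a gold standard, using made-up paired readings: the mean of the paired differences estimates the systematic error (bias) of the field instrument relative to the reference.

```python
from statistics import mean

# Hypothetical paired cholesterol readings (mg/dL):
# a field instrument vs. a gold-standard laboratory assay, same subjects
field = [202, 210, 195, 221, 199, 207]
gold  = [198, 205, 192, 215, 196, 203]

diffs = [f - g for f, g in zip(field, gold)]
bias = mean(diffs)   # positive: the field instrument reads high on average

print(f"mean bias = {bias:+.1f} mg/dL")
```

In practice this comparison is often extended to a Bland-Altman analysis, which also examines whether the difference varies with the magnitude of the measurement.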

Strategies for enhancing accuracy
1. Standardizing measurement methods
2. Training and certifying the observers
3. Refining the instruments
4. Automating the instruments
5. Making unobtrusive measurements
6. Blinding
7. Calibrating the instrument

MAJOR TYPES OF BIAS
- observer bias
- subject bias
- instrument bias
- information bias
- selection bias

MAJOR TYPES OF BIAS
Observer bias: a consistent distortion in reporting a measurement by the observer
- more intensive measurements in certain subjects
- asking questions about specific exposures several times of cases but only once of controls

MAJOR TYPES OF BIAS
Observer bias
Ex: a tendency to underestimate blood pressure in cases known to be receiving treatment
Ex: a more persistent search of medical records for a history of cigarette smoking in patients known to have lung cancer

MAJOR TYPES OF BIAS
Subject bias: a consistent distortion of a measurement by the study subject
- selective recall or reporting of an event (respondent bias or recall bias)

MAJOR TYPES OF BIAS
Instrument bias
- may result from faulty functioning of a mechanical instrument
- may result from use of a technique or tool inappropriate to the objective of the measurement (e.g. leading questions on a questionnaire)

MAJOR TYPES OF BIAS
Information bias: a distortion in the estimate of effect or of a variable due to:
- measurement error
- misclassification of subjects on the measured variable
- invalid measurement

MAJOR TYPES OF BIAS
Information bias
- incorrect diagnostic criteria
- inadequacies in previously recorded data
- unequal diagnostic surveillance among exposure groups in follow-up studies

MAJOR TYPES OF BIAS
Selection bias: a distortion in the estimate of effect resulting from how subjects are selected into the study population (e.g. "self-selection bias")

MAJOR TYPES OF BIAS
Selection bias can result from:
- the choice of groups to be compared (in all types of studies)
- the choice of sampling frame
- loss to follow-up or NON-RESPONSE during data collection (in follow-up studies)

MAJOR TYPES OF BIAS
Selection bias can result from:
- selective surveillance (diagnostic surveillance that varies with exposure status)
- more intensive measurements in certain subjects

SUMMARY
1. Reliability: precision, reproducibility; threatened by RANDOM ERROR
2. Validity: accuracy, conformity; threatened by SYSTEMATIC ERROR (bias)

MAJOR TYPES OF BIAS
- observer bias
- subject bias (recall bias, respondent bias)
- instrument bias
- information bias
- selection bias

Reliability and validity of measurement

Reliability
- Definition: the degree to which a variable has nearly the same value when measured several times
- Best way to assess: comparison among repeated measures

Validity
- Definition: the degree to which a variable actually represents what it is supposed to represent
- Best way to assess: comparison with a reference standard

Reliability
- Value to study: increases the power to detect effects
- Threatened by: random error (variance), contributed by the observer, the subject, and the instrument

Validity
- Value to study: increases the validity of conclusions
- Threatened by: systematic error (bias), contributed by the observer, the subject, and the instrument

Illustration of the difference between Precision and Accuracy

Illustration of the difference between Precision and Accuracy
[Figure: four targets illustrating each combination of good/poor precision with good/poor accuracy]
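The four panels can also be mimicked numerically: systematic error (bias) shifts the mean away from the true value, while random error (noise) widens the spread. All numbers below are illustrative.

```python
import random
from statistics import mean, stdev

random.seed(0)
TRUE = 100.0   # assumed true value of the quantity being measured

def simulate(bias, noise_sd, n=2000):
    """Simulate n readings with a fixed systematic error (bias)
    and a random error with standard deviation noise_sd."""
    xs = [random.gauss(TRUE + bias, noise_sd) for _ in range(n)]
    return mean(xs), stdev(xs)

scenarios = {
    "good precision, good accuracy": (0.0, 1.0),
    "good precision, poor accuracy": (10.0, 1.0),
    "poor precision, good accuracy": (0.0, 10.0),
    "poor precision, poor accuracy": (10.0, 10.0),
}
for name, (bias, noise_sd) in scenarios.items():
    m, s = simulate(bias, noise_sd)
    print(f"{name:30s} mean = {m:7.2f}   SD = {s:5.2f}")
```

Reading the output against the figure: the mean reveals accuracy (how far from the true value) and the SD reveals precision (how tightly the readings cluster), and the two vary independently.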


DIFFERENCES BETWEEN VALIDITY AND RELIABILITY
[Figure: frequency distributions of measurements around the true value for four cases A, B, C, D]
A - valid and reliable
B - valid but not reliable
C - not valid but reliable
D - not valid and not reliable