1 Teaching Registrars Research Methods: Variable definition and quality control of measurements. Prof. Rodney Ehrlich

2 Learning objectives
1. Scales of measurement
2. Conceptual vs operational variables
3. Precision, accuracy and validity of measurements: understand, maximise, assess

3 Measurement scales
Categorical:
- Nominal (no natural order), e.g. blood group
- Ordinal, e.g. cancer staging
Continuous:
- Discrete (counts), e.g. outpatient attendance
- "True" continuous, e.g. haemoglobin

4 Defining your variables
- Conceptual variable = the everyday term or, alternatively, the theoretical construct
- Operational variable = what is actually measured

5 Defining your variables: examples
- Renal function
- Alcoholism
- Risk-taking behaviour
- Chronic pain
- Obesity

6 Quality control of measurement
"Measuring instrument" = questionnaire, laboratory test, clinical judgement.
- Precision = reliability, repeatability or reproducibility
- Accuracy = proximity to the true value
- Validity = a subset of accuracy

7 Precision
- Repetition across occasions, testers or instruments gives the same result.
- Lack of reliability may also indicate an accuracy or validity problem, but precision and accuracy are separable, at least in theory.
- Precision cannot be determined from a single measurement.

8 Example (dichotomous variable): precision of BP measurement

                     First measurement
2nd measurement      High    Normal    Total
High                   12         2       14
Normal                  8       178      186
Total                  20       180      200

What is the precision (reliability) of the BP measurement?

9 Measuring precision
Categorical:
- Percent agreement (concordance)
- Kappa statistic (takes chance agreement into account); worked sketch below
Continuous:
- Various (see Hulley, Ch. 12); NOT correlation coefficients
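
A minimal Python sketch (not part of the original slides) of both categorical measures, applied to the slide-8 BP table; reading that table as rows = second measurement and columns = first measurement is my interpretation of the flattened layout.

```python
# Percent agreement and Cohen's kappa for the slide-8 BP table.
# Counts are from slide 8; the row/column assignment is assumed.
table = [
    [12, 2],    # 2nd measurement High:   1st High, 1st Normal
    [8, 178],   # 2nd measurement Normal: 1st High, 1st Normal
]

n = sum(sum(row) for row in table)                       # 200 paired readings
observed = (table[0][0] + table[1][1]) / n               # proportion of pairs that agree

row_totals = [sum(row) for row in table]                 # margins for the 2nd measurement
col_totals = [sum(col) for col in zip(*table)]           # margins for the 1st measurement
expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n**2

kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.1%}")              # 95.0%
print(f"Agreement expected by chance: {expected:.1%}")   # 84.4%
print(f"Cohen's kappa: {kappa:.2f}")                     # about 0.68
```

The gap between the raw 95% agreement and a kappa of roughly 0.68 illustrates the second bullet: much of the agreement is what chance alone would produce when nearly everyone is normotensive.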

10 Accuracy
- Measurement agrees with another measurement accepted as the truth, the so-called "gold standard".
- Most intuitive for physical and physiological measurements.
Validity
- Used where the variable being measured is abstract, subjective or complex, and where a gold standard is debatable or not available.

11 Types of validity
- Face validity: the measurement, or question, makes sense to you, interviewers, experts, subjects, et al.
- Construct validity: the measurement agrees with other operational measurements of the same concept (example: depression).
- Criterion validity: the measurement agrees with a "gold standard".

12 Measuring criterion validity
Categorical:
- Sensitivity: proportion of true positives testing positive on the instrument
- Specificity: proportion of true negatives testing negative on the instrument
Continuous:
- More complex, but often involves choosing cutpoints, i.e. categorising as positive/negative

13 Example of criterion validity of recall (categorical)

Has your child ever          Chicken pox antibodies in blood?
had chicken pox?             Yes      No     Total
Yes                           60      20        80
No                            90      30       120
Total                        150      50       200

Sensitivity, specificity; implications? (Worked sketch below.)
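
One way to work through the slide-13 questions in code (again, not part of the original slides); the orientation of the table, with parental recall as the instrument and blood antibodies as the gold standard, is my reading of the flattened layout.

```python
# Sensitivity and specificity of parental recall against the antibody gold standard.
# counts[recall answer][antibody status], using the slide-13 numbers.
counts = {
    "yes": {"yes": 60, "no": 20},
    "no":  {"yes": 90, "no": 30},
}

true_positives  = counts["yes"]["yes"]    # antibody-positive, recalled as "yes"
false_negatives = counts["no"]["yes"]     # antibody-positive, recalled as "no"
true_negatives  = counts["no"]["no"]      # antibody-negative, recalled as "no"
false_positives = counts["yes"]["no"]     # antibody-negative, recalled as "yes"

sensitivity = true_positives / (true_positives + false_negatives)   # 60 / 150
specificity = true_negatives / (true_negatives + false_positives)   # 30 / 50

print(f"Sensitivity of recall: {sensitivity:.0%}")   # 40%
print(f"Specificity of recall: {specificity:.0%}")   # 60%
```

On this reading of the table, recall misses most antibody-positive children and mislabels many antibody-negative ones, which is the sort of implication the slide invites the class to discuss.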

14 Lack of precision/reliability = "random error"
Descriptive study, e.g. prevalence of hypertension:
- Wider confidence interval (reduced power)
- Need a greater sample size (or repeated measurements) for the same power
Comparative study, e.g. does nurse home visiting improve hypertension control?
- Same as for a descriptive study (see the simulation sketch below)
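
A small simulation of the slide-14 point, with entirely invented numbers (true systolic BP ~ N(130, 15), measurement error SD of 10 mmHg): random error inflates the spread of observed values and so widens the confidence interval, and either repeated measurements or a larger sample is needed to recover the original width.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def mean_ci_width(n, error_sd, repeats=1, sims=2000):
    """Average 95% CI width for mean systolic BP measured with random error."""
    widths = []
    for _ in range(sims):
        true_sbp = rng.normal(130, 15, size=n)                # subjects' true values
        noise = rng.normal(0, error_sd, size=(repeats, n))    # device/observer error
        measured = (true_sbp + noise).mean(axis=0)            # mean of repeated readings
        se = measured.std(ddof=1) / np.sqrt(n)
        widths.append(2 * 1.96 * se)
    return round(float(np.mean(widths)), 2)

print("n=100, no measurement error:   ", mean_ci_width(100, 0))             # ~5.9 mmHg
print("n=100, error SD 10, 1 reading: ", mean_ci_width(100, 10))            # ~7.1 mmHg
print("n=100, error SD 10, 3 readings:", mean_ci_width(100, 10, repeats=3)) # ~6.3 mmHg
print("n=200, error SD 10, 1 reading: ", mean_ci_width(200, 10))            # ~5.0 mmHg
```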

15 Lack of accuracy/validity = "systematic error"
Descriptive study:
- Biased estimate
- Cannot be removed by increasing sample size (see the simulation sketch below)
Comparative study:
- If it affects both groups equally, it will mask a true difference or association
- If it affects the two groups differently, it could mask a true difference or create a spurious difference or association (e.g. recall bias)
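
A companion sketch for slide 15, again with invented numbers: a cuff that systematically reads 10 mmHg too high overstates the prevalence of hypertension (SBP >= 140), and the bias does not shrink as the sample grows.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

def estimated_prevalence(n, cuff_offset):
    """Estimated prevalence of hypertension (SBP >= 140) from one survey."""
    true_sbp = rng.normal(130, 15, size=n)     # true values; about 25% are >= 140
    measured = true_sbp + cuff_offset          # same offset for everyone: systematic error
    return np.mean(measured >= 140)

for n in (100, 1_000, 100_000):
    accurate = estimated_prevalence(n, cuff_offset=0)
    biased = estimated_prevalence(n, cuff_offset=10)
    print(f"n={n:>6}: accurate cuff ~ {accurate:.2f}, cuff reading 10 high ~ {biased:.2f}")

# The biased estimate settles near 0.50 rather than the true ~0.25: a larger
# sample makes the wrong answer more precise, not more accurate.
```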

