Measurement & Data Collection
The Pennsylvania State University College of Nursing, Nursing 200W
What are the concepts of measurement theory? Directness, error, level of measurement, reliability, and validity.
What is directness of measurement?
Direct measures are concrete measures such as oxygen saturation, temperature, weight, blood pressure, and blood glucose.
Indirect measures are abstract measures such as pain, depression, anxiety, self-care, and self-esteem.
What is a measurement error?
Error is the difference between the true measure and what is actually measured.
Systematic error: variation in measurement in the same direction (e.g., a blood pressure cuff that consistently measures 5 mmHg above the true value on every person, every time).
Random error: differences in measurements without a pattern (e.g., a blood pressure cuff reading 15 mmHg below the true value because of a mechanical issue, until the cuff is fixed).
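The distinction is easier to see with numbers. Below is a minimal sketch in Python (NumPy) that contrasts a systematic +5 mmHg calibration error with random scatter; the "true" blood pressure values and error sizes are invented for illustration and are not from the card.

```python
# Hypothetical illustration of systematic vs. random measurement error.
# All blood pressure values and error sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
true_sbp = np.array([118, 124, 131, 142, 110], dtype=float)  # true systolic BP (mmHg)

# Systematic error: every reading is shifted in the same direction by the same amount.
miscalibrated_cuff = true_sbp + 5          # cuff reads 5 mmHg high on every person, every time

# Random error: readings scatter around the true value with no consistent pattern.
noisy_cuff = true_sbp + rng.normal(0, 4, size=true_sbp.size)

print("Mean error, systematic:", np.mean(miscalibrated_cuff - true_sbp))  # stays near +5, never cancels out
print("Mean error, random:    ", np.mean(noisy_cuff - true_sbp))          # near 0, tends to average out
```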
What are the levels of measurement? Nominal, ordinal, interval, and ratio, from lowest to highest (a worked example follows the Ratio card below).
Nominal
Lowest level of measurement; ranking is not possible; categories are mutually exclusive and exhaustive.
Example: political affiliation (1 = Republican, 2 = Democrat, 3 = Independent, 4 = Other).
Ordinal
Exclusive and exhaustive categories with an order or ranking imposed on them; intervals between the categories may not always be equal.
Example: ADL scale (1 = completely dependent, 2 = needs assistance, 3 = completely independent).
Interval
Equal numerical distance between intervals but no absolute zero; supports mathematical operations and higher-level statistics.
Examples (there are not many!): temperature measured in Fahrenheit or Celsius; Likert-scale scores.
Ratio
Considered the highest level of measurement; a continuum of values with an absolute zero.
Examples: weight; percentile test scores.
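As a rough illustration (not from the slides), the level of measurement determines which summary statistics are meaningful. The sketch below pairs each level with a typical summary, using invented data that mirrors the examples on the cards above.

```python
# Hypothetical sketch: which summary statistic fits each level of measurement.
# All data values are invented for illustration.
import statistics

# Nominal: categories only -> report counts or the mode
party = ["Republican", "Democrat", "Independent", "Democrat", "Other"]
print("Nominal (mode):", statistics.mode(party))

# Ordinal: ranked categories, possibly unequal intervals -> report the median
adl = [1, 2, 2, 3, 3]  # 1 = completely dependent ... 3 = completely independent
print("Ordinal (median):", statistics.median(adl))

# Interval: equal intervals, no absolute zero -> means are meaningful, ratios are not
temps_f = [98.2, 99.1, 100.4, 98.6]
print("Interval (mean):", statistics.mean(temps_f))

# Ratio: equal intervals plus an absolute zero -> ratios are meaningful
weights_kg = [60.0, 75.5, 90.2]
print("Ratio: 90.2 kg is", 90.2 / 60.0, "times 60.0 kg")
```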
What are reliability and validity?
Reliability is similar to the concept of consistency (precision): a reliable instrument gives the same result under the same conditions.
Validity is similar to accuracy: the extent to which an instrument actually measures the concept it is intended to measure.
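Reliability is often reported as an internal-consistency coefficient such as Cronbach's alpha, which is not named on this card but is common in nursing research reports. The sketch below computes alpha from the standard formula, alpha = k/(k-1) x (1 - sum of item variances / variance of the total score), using an invented 5-respondent, 4-item questionnaire.

```python
# Minimal sketch of Cronbach's alpha, a common reliability (internal consistency) coefficient.
# The 5 x 4 response matrix is invented: 5 respondents x 4 Likert items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")  # values around 0.70 or higher are often considered acceptable
```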
How would you critically appraise reliability and validity in a research study?
What reliability and validity information is provided?
Does the author include reports of the validity of the instrument from previous studies?
Did the author perform pilot studies to examine the validity of the instrument?
Did the researcher report use of data from the present study to examine instrument validity in the discussion section of the report?
How would you critically appraise methods of measurement?
Were the measurement strategies clearly identified and described?
What level of measurement was achieved by each instrument?
Was the reliability of each measurement method adequate?
Was the validity of each measurement method adequate?
If physiological instruments were used, were they accurate and precise?
What types of measurement strategies are used in nursing? Physiological measurements, observational measurements, interviews, and focus groups.
What are examples of physiological measures?
Physical measurement methods: ECG, BP, I&O.
Microbiological measurement methods: cultures, sensitivities, smears.
What are observational measures?
Unstructured observations: subjective observation.
Structured observations: objective observation using category systems, checklists, and rating scales.
How would you critically appraise observational measurements?
Is the object of observation clearly identified and defined?
Is interrater reliability described?
Are the techniques for recording observations described?
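Interrater reliability for structured observations is often summarized with Cohen's kappa, which corrects simple percent agreement for agreement expected by chance. The statistic itself is standard, but the two raters and their codes below are invented for illustration.

```python
# Minimal sketch of Cohen's kappa for two observers (interrater reliability).
# Ratings are invented: two raters coding the same 10 observations as "yes"/"no".
from collections import Counter

rater_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # raw percent agreement

# Chance agreement: probability both raters pick the same category at random,
# given each rater's own category frequencies.
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```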
How are self-report data collected?
With this type of strategy, the subjects or participants provide the data themselves. This can be done using interviews, questionnaires or surveys, and scales.
What do unstructured interviews look like?
Unstructured interview questions may look like these:
"Tell me about..."
"What has been your experience with..."
"What was it like to hear you have cancer?"
What are structured interviews?
Structured interviews include: describing the interview questions, pretesting the interview protocol, training the interviewers, preparing for the interview, probing, and recording interview data.
How would you critically appraise interview methods?
Do the interview questions address concerns expressed in the research problem?
Are the interview questions relevant for the research purpose and objectives, questions, or hypotheses?
Does the design of the questions tend to bias subjects' responses?
Does the sequence of questions tend to bias subjects' responses?
What do questionnaires or surveys look like?
Many factors need to be considered in developing questionnaires or surveys, including language, reading level, length, and clarity; this is why it is generally recommended that a researcher use an existing tool when possible.
Questionnaires may be distributed or administered in person, by phone, by mail, or over the internet or e-mail (e.g., SurveyMonkey).
What are the reasons for using (or not using) questionnaires or surveys?
Advantages: large samples, affordable, efficient.
Disadvantages: high potential for bias; may have very low response rates, which affects how well the sample represents the population of interest.
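A quick arithmetic sketch of the response-rate concern (all numbers invented): a low response rate leaves the achieved sample much smaller, and possibly less representative, than the group that was approached.

```python
# Hypothetical response-rate arithmetic for a mailed questionnaire.
distributed = 500          # questionnaires mailed (invented)
returned = 135             # usable questionnaires returned (invented)

response_rate = returned / distributed
print(f"Response rate: {response_rate:.0%}")   # 27% -> raises questions about representativeness
```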
What are some types of scales? Rating scales, the Likert scale, and visual analog scales.
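As an invented illustration of how scale responses become numbers: Likert items are usually summed after reverse-scoring any negatively worded items, and a visual analog scale (VAS) is typically scored as the distance in millimeters of the respondent's mark from the left anchor of a 100 mm line. The item wording, responses, and measurements below are hypothetical.

```python
# Hypothetical scoring sketch for a Likert scale and a visual analog scale (VAS).
# Item responses and measurements are invented.

# Likert: 1 = strongly disagree ... 5 = strongly agree; the third item is negatively worded.
responses = [4, 5, 2, 4]          # one respondent's answers to a 4-item scale
reverse_items = {2}               # 0-based index of the negatively worded item
scored = [(6 - r) if i in reverse_items else r for i, r in enumerate(responses)]
print("Likert total score:", sum(scored))

# VAS: "no pain" ---------X------- "worst pain imaginable" on a 100 mm line.
mark_mm = 62.0                    # distance of the respondent's mark from the left anchor
print("VAS pain score:", mark_mm, "of 100 mm")
```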
What questions should you consider when critically appraising the data collection process?
Were the recruitment and selection of study participants or subjects clearly described and appropriate?
Were the data collected in a consistent way?
Were the study controls maintained as indicated by the design?
Did the design include an intervention that was consistently implemented?
Was the integrity of the study protected, and how were any problems resolved?
Did the researchers obtain data from an existing database? If so, did the data obtained address the study problem and objectives, questions, or hypotheses? Was the reliability and validity of the database addressed in the research report?
Questions? Comments? The end!