Measurement & Data Collection
The Pennsylvania State University College of Nursing, Nursing 200W
What are the concepts of measurement theory?
Directness
Error
Level
Reliability
Validity
What is directness of measurement?
Direct measures: concrete measures such as:
- Oxygen saturation
- Temperature
- Weight
- Blood pressure
- Blood glucose

Indirect measures: abstract measures such as:
- Pain
- Depression
- Anxiety
- Self-care
- Self-esteem
What is a measurement error?
Error is the difference between the true value and what is actually measured.

Systematic error: variation in measurement in the same direction every time (e.g., a blood pressure cuff that consistently reads 5 mmHg above the true value on every person, every time).

Random error: differences in measurements without a pattern (e.g., a malfunctioning blood pressure cuff that reads 15 mmHg below the true value on one patient before the problem is noticed and fixed).
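The distinction can be illustrated with a short simulation. This is only a sketch: the 120 mmHg "true" pressure, the error sizes, and the cuff scenario are hypothetical values chosen for illustration.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

TRUE_BP = 120.0  # hypothetical true systolic blood pressure, mmHg

# Systematic error: every reading is shifted in the same direction,
# like a cuff that always reads 5 mmHg high.
systematic = [TRUE_BP + 5.0 for _ in range(100)]

# Random error: readings scatter around the true value with no pattern,
# e.g., from patient movement or observer variation.
randomized = [TRUE_BP + random.gauss(0, 5.0) for _ in range(100)]

# Averaging does not remove systematic error (the bias persists),
# but random errors tend to cancel out over many measurements.
print(round(statistics.mean(systematic), 1))  # 125.0
print(round(statistics.mean(randomized), 1))  # close to 120
```

Note how more measurements help only against random error; a biased instrument stays biased no matter how often it is used.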
What are the levels of measurement?
Nominal
Ordinal
Interval
Ratio
Nominal
- Lowest level of measurement
- Ranking not possible
- Categories are exclusive of one another and exhaustive

Example: Political affiliation
1 = Republican
2 = Democrat
3 = Independent
4 = Other
Ordinal
- Exclusive and exhaustive categories
- Order or ranking imposed on categories
- Intervals between the categories may not always be equal

Example: ADL scale
1 = Completely dependent
2 = Needs assistance
3 = Completely independent
Interval
- Equal numerical distance between intervals
- NO absolute zero
- Often used in math and higher-level statistics

Examples (not many!):
- Temperature as measured in Fahrenheit or Celsius
- Likert-scale scores
Ratio
- Considered the highest level of measurement
- Continuum of values
- Absolute zero

Examples:
- Weight
- Percentile test scores
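One way to keep the four levels straight is that each level keeps the properties of the level below it and adds one more, and those properties determine which summary statistics are meaningful. A minimal sketch (the property table and the `highest_statistic` helper are illustrative, not part of the original slides):

```python
# Illustrative property table: each level keeps the properties of the
# level below it and adds one more.
LEVELS = {
    #            categories, order, equal intervals, absolute zero
    "nominal":  (True, False, False, False),
    "ordinal":  (True, True,  False, False),
    "interval": (True, True,  True,  False),
    "ratio":    (True, True,  True,  True),
}

def highest_statistic(level: str) -> str:
    """Most informative central-tendency statistic each level supports."""
    _, order, equal_intervals, _ = LEVELS[level]
    if equal_intervals:
        return "mean"    # interval and ratio data support means
    if order:
        return "median"  # ordinal data can be ranked, so medians work
    return "mode"        # nominal data supports only the mode

print(highest_statistic("nominal"))   # mode
print(highest_statistic("ordinal"))   # median
print(highest_statistic("interval"))  # mean
print(highest_statistic("ratio"))     # mean
```

This is why, for example, it makes no sense to report a "mean political affiliation" (nominal), while a mean weight (ratio) is perfectly meaningful.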
What are reliability and validity?
Reliability refers to the consistency (precision) of a measure across repeated use. Validity refers to its accuracy: whether the instrument actually measures the concept it is intended to measure.
How would you critically appraise reliability in a research study?
- What reliability information is provided?
- Does the author include reports of the reliability of the instrument from previous studies?
- Did the author perform pilot studies to examine the reliability of the instrument?
- Did the researcher report use of data from the present study to examine instrument reliability in the discussion section of the report?
How would you critically appraise methods of measurement?
- Were the measurement strategies clearly identified and described?
- What level of measurement was achieved by each instrument?
- Was the reliability of each measurement method adequate?
- Was the validity of each measurement method adequate?
- If physiological instruments were used, were they accurate and precise?
What types of measurement strategies are used in nursing?
Physiological measurements
Observational measurements
Interviews
Focus groups
What are examples of physiological measures?
Physical measurement methods:
- ECG
- BP
- I&O

Microbiological measurement methods:
- Cultures
- Sensitivities
- Smears
What are observational measures?
Unstructured observations: subjective observation

Structured observations: objective observation using:
- Category systems
- Checklists
- Rating scales
How would you critically appraise observational measurements?
- Is the object of observation clearly identified and defined?
- Is interrater reliability described?
- Are the techniques for recording observations described?
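Interrater reliability for structured observations is often first checked with simple percent agreement. A minimal sketch using made-up checklist codes (real studies usually also report a chance-corrected statistic such as Cohen's kappa):

```python
# Hypothetical structured observation: two raters code the same ten
# patient behaviors with a checklist (1 = behavior present, 0 = absent).
rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

# Percent agreement: the proportion of observations both raters
# coded identically.
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)
print(percent_agreement)  # 0.8
```

In this invented example the raters agree on 8 of 10 observations; whether 80% agreement is adequate depends on the study and on how much agreement chance alone would produce.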
How is self-reported data collected?
With this type of strategy, the subjects or participants provide the data themselves. This can be done using:
- Interviews
- Questionnaires or surveys
- Scales
What do unstructured interviews look like?
Unstructured interview questions may look like these:
- "Tell me about..."
- "What has been your experience with..."
- "What was it like to hear you have cancer?"
What are structured interviews?
Structured interviews include:
- Describing interview questions
- Pretesting the interview protocol
- Training the interviewers
- Preparing for the interview
- Probing
- Recording interview data
How would you critically appraise interview methods?
- Do the interview questions address concerns expressed in the research problem?
- Are the interview questions relevant to the research purpose and objectives, questions, or hypotheses?
- Does the design of the questions tend to bias subjects' responses?
- Does the sequence of questions tend to bias subjects' responses?
What do questionnaires or surveys look like?
Many factors* need to be considered in developing questionnaires or surveys, including language, reading level, length, and clarity.

Questionnaires may be distributed or administered:
- In person or by phone
- Online (e.g., Survey Monkey)
- By mail

*This is why it is generally recommended that a researcher use an existing tool, if possible!
What are the reasons for using (or not using) questionnaires or surveys?
Advantages:
- Large samples
- Affordable
- Efficient

Disadvantages:
- High bias
- May have very low response rates, which limits how well the sample represents the population of interest
What are some types of scales?
Rating scales
The Likert scale
Visual analog scales
What questions should you consider when critically appraising the data collection process?
- Were the recruitment and selection of study participants or subjects clearly described and appropriate?
- Were the data collected in a consistent way?
- Were the study controls maintained as indicated by the design?
- Did the design include an intervention that was consistently implemented?
- Was the integrity of the study protected, and how were any problems resolved?
- Did the researchers obtain data from an existing database? If so, did the data obtained address the study problem and objectives, questions, or hypotheses? Were the reliability and validity of the database addressed in the research report?
Questions? Comments? The end!