Published by Peter Hunt. Modified over 9 years ago.
Chapter 23: How to collect data
This chapter is about the tools and techniques used to collect data. Hang on, let's review: what are we collecting? What's being measured?
What are we collecting?
Classify the type of info we want:
- Subjective: satisfaction, qualitative or quantitative (e.g., Likert scales)
- Objective: performance measures: efficiency (speed) and effectiveness (accuracy / error rates)
- Objective: process measures: eye movements, brain waves, physiological data (heart rate, skin response, etc.)
Timing and logging
Performance measures, e.g., time to task completion:
- Start / stop times: stopwatch
- Specific events (e.g., the time Help was clicked): time stamps (best if the code is retooled to provide these)
- Maybe use logging software?
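The "retool the code to provide time stamps" idea can be sketched in a few lines. This is a hypothetical illustration, not from the chapter; the names (SessionTimer, log_event, the "Help clicked" label) are made up for the example.

```python
# Hypothetical sketch: timestamping specific UI events during a session.
import time

class SessionTimer:
    """Records elapsed time to task completion and specific events."""

    def __init__(self):
        self.start = time.monotonic()   # monotonic clock: unaffected by wall-clock changes
        self.events = []                # list of (seconds_since_start, label)

    def log_event(self, label):
        """Time-stamp a specific event, e.g., the moment Help was clicked."""
        self.events.append((time.monotonic() - self.start, label))

    def task_time(self):
        """Total elapsed time so far, i.e., time to task completion."""
        return time.monotonic() - self.start

timer = SessionTimer()
# ... user works on the task; the instrumented app calls log_event ...
timer.log_event("Help clicked")
for elapsed, label in timer.events:
    print(f"{elapsed:8.3f}s  {label}")
```

A stopwatch only gives you start/stop times; instrumenting the code like this gives every event a timestamp on the same clock for free.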
Logging software
- Can be expensive
- Ovo Logger appears to be free… does it come with source? (doesn't look like it)
- Does it provide a mechanism for aligning notes or comments? (You may want to add comments at specific points in time)
- Does it interfere with the running program? (e.g., ClearView samples at 15 fps, so it slows the PC app down a bit)
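The "aligning notes or comments" requirement boils down to this: if the app log and the observer's comments are timestamped against the same clock with the same origin, the two streams can be merged into one timeline afterwards. A minimal sketch (all data here is invented for illustration):

```python
# Hypothetical sketch: merging an app event log with observer comments.
# Both streams are (seconds_since_start, text) lists sorted by time,
# recorded against the same time origin.
import heapq

def merge_streams(app_log, observer_notes):
    """Interleave two sorted, timestamped streams into one chronological record."""
    return list(heapq.merge(
        ((t, "LOG", msg) for t, msg in app_log),
        ((t, "NOTE", msg) for t, msg in observer_notes),
    ))

app_log = [(1.2, "app started"), (7.5, "Help clicked")]
observer_notes = [(6.9, "user looks confused"), (8.0, "found the answer")]
for t, kind, msg in merge_streams(app_log, observer_notes):
    print(f"{t:6.1f}s [{kind}] {msg}")
```

This is why logging tools that don't support comment alignment are a problem: without a shared clock, reconciling the observer's notes with the log is manual work.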
Talk aloud
Subjective process measure: the user says what s/he's thinking while doing the task.
Talk-aloud protocols:
- good: gives glimpses of what the user is thinking
- bad: may influence performance (probably slows the user down and reduces errors), and may be distracting
Alternative: record (log) the session, then ask questions in review ("what did you do here?"); use cognitive walkthrough questions.
Note taking
Take good notes. Bring paper and pens. A sample log sheet:

  Task no.:        Date:
  Subject no.:     Evaluator name:
  Start time:      End time:

  Action | User's remarks | Observer's comments
  -------+----------------+--------------------

Don't use the subject's name: data should be anonymous.
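If you log sessions digitally instead of on paper, the same form translates directly to a CSV layout. A sketch under assumed field names that mirror the log sheet above (note: subject number, not name, to keep the data anonymous):

```python
# Hypothetical sketch: the paper log sheet as CSV rows.
import csv
import io

# Field names are assumptions mirroring the sample log sheet, not a standard.
FIELDS = ["task_no", "date", "subject_no", "evaluator",
          "start_time", "end_time", "action", "user_remarks", "observer_comments"]

def write_rows(rows, out):
    """Write note rows (dicts keyed by FIELDS) as CSV with a header line."""
    writer = csv.DictWriter(out, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)

buf = io.StringIO()
write_rows([{
    "task_no": 1, "date": "2024-01-15", "subject_no": 7, "evaluator": "E2",
    "start_time": "10:02", "end_time": "10:11",
    "action": "clicked Help", "user_remarks": "where is search?",
    "observer_comments": "hesitated ~5 s first",
}], buf)
print(buf.getvalue())
```

One row per observed action keeps the observer's comments paired with the user's remarks, just like the columns on the paper form.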
Debrief
Following the session, you may want or need to:
- ask the user more questions
- explain in more detail what was going on (participants may be curious about what was really being tested)
- be sensitive to users who blame themselves for problems during the test; this may sound silly, but you need to watch for it, and if in doubt, blame the machine, the software, etc.
Questionnaires
May be difficult to design. Consider "did this program meet your expectations?":
- some users expect a crappy program, so they answer "yes"
- others expect a good program, and also answer "yes"
- a badly formed question…
Supplement questionnaires with interviews, or use pre-designed questionnaires:
- SUMI: Software Usability Measurement Inventory
- WAMMI: Website Analysis and MeasureMent Inventory
SUMI
Questions such as:
- "I'll never learn all the features of this software"
- "The instructions are helpful"
- "I sometimes don't know what to do next"
- "I would recommend this software to my Mom"
Of course, you may want to adapt the questions to your study.
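Notice that items like "I'll never learn all the features" are negatively worded: agreeing indicates dissatisfaction. SUMI's actual scoring is proprietary, so the following is only a sketch of the usual Likert mechanics, with invented item names: negative items are reverse-scored so a higher total always means higher satisfaction.

```python
# Hypothetical Likert scoring sketch (not SUMI's real scoring scheme).
def score_responses(responses, negative_items, scale_max=5):
    """Sum Likert answers (1..scale_max), reverse-scoring negative items."""
    total = 0
    for item, answer in responses.items():
        if item in negative_items:
            answer = scale_max + 1 - answer   # e.g., on a 1-5 scale, 2 becomes 4
        total += answer
    return total

responses = {
    "never_learn_all_features": 2,   # negative item: disagreeing is good
    "instructions_helpful": 4,
    "sometimes_lost": 1,             # negative item
    "would_recommend": 5,
}
negative = {"never_learn_all_features", "sometimes_lost"}
print(score_responses(responses, negative))   # 4 + 4 + 5 + 5 = 18
```

Mixing positive and negative wording like this also helps catch participants who tick the same box all the way down the page.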
Recording technologies
Video and audio recording: a good idea, but what kind of data is it recording?
- satisfaction (maybe)
- efficiency (only if you look at the length of the video)
- effectiveness (only if you can tell an error was made)
It doesn't seem to offer any other data recording, so what's the point? Process measures, perhaps… And don't forget to ask for permission to record.