Chapter 23 How to collect data

This chapter is about the tools and techniques used to collect data. Hang on, let's review first: what are we collecting? What's being measured?

What are we collecting?

Classify the type of information we want:
- Subjective, satisfaction measures: qualitative or quantitative (e.g., Likert scales)
- Objective, performance measures: efficiency (speed), effectiveness (accuracy, i.e., error rates)
- Objective, process measures: eye movements, brain waves, physiological data (heart rate, skin response, etc.)
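To make the performance measures concrete, here is a minimal Python sketch (my own, not from the chapter) that computes both from a handful of made-up trial records:

```python
# Each record is (task_time_seconds, error_count); all values are illustrative.
trials = [(85.2, 1), (61.0, 0), (112.4, 3), (74.9, 0)]

mean_time = sum(t for t, _ in trials) / len(trials)    # efficiency: speed
error_rate = sum(e for _, e in trials) / len(trials)   # effectiveness: errors
print(f"Efficiency:    mean time to completion = {mean_time:.1f} s")
print(f"Effectiveness: mean errors per task    = {error_rate:.2f}")
```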

Timing and logging

Performance measures, e.g., time to task completion:
- Start / stop times: stopwatch
- Specific events (e.g., the moment Help was clicked): time stamps (best if the code is retooled to provide these)
- Maybe use logging software? (a minimal timestamping sketch follows below)
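As a rough illustration (a sketch, not from the chapter), this is what instrumented timing can look like: stamp named events as they happen, then derive time to task completion. The event labels are hypothetical:

```python
import time

events = []  # (timestamp, label) pairs for the whole session

def log_event(label):
    """Stamp a named event with the current wall-clock time."""
    events.append((time.time(), label))

# A hypothetical session:
log_event("task_start")
# ... participant works on the task ...
log_event("help_clicked")          # a specific event of interest
log_event("task_end")

start = next(t for t, label in events if label == "task_start")
end = next(t for t, label in events if label == "task_end")
print(f"Time to task completion: {end - start:.2f} s")
```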

Logging software

Can be expensive.
- Ovo Logger appears to be free... does it come with source? (doesn't look like it)
- Does it provide a mechanism for aligning notes or comments? (You may want to add comments at specific points in time; see the merging sketch below.)
- Does it interfere with the running program? (e.g., ClearView samples at 15 fps and therefore slows the PC application down a bit)
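If the tool has no built-in way to align notes, one workaround (a sketch, assuming both streams carry timestamps relative to session start) is to merge observer comments into the event log by time:

```python
import heapq

# Both streams are (seconds_since_session_start, text); values are made up.
log = [(0.0, "task_start"), (42.3, "help_clicked"), (97.8, "task_end")]
notes = [(40.1, "user hesitates over the toolbar"),
         (95.0, "sighs, looks relieved")]

# heapq.merge interleaves the two already-sorted streams by timestamp.
merged = heapq.merge(((t, "LOG ", msg) for t, msg in log),
                     ((t, "NOTE", msg) for t, msg in notes))
for t, kind, msg in merged:
    print(f"{t:6.1f}s  {kind}  {msg}")
```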

Talk aloud

Subjective process measure: the user says what he or she is thinking while doing the task.

Talk-aloud protocols:
- Good: gives glimpses of what the user is thinking
- Bad: may influence performance (probably slows the user down and reduces errors), and may be distracting

Alternative: record (log) the session and ask questions in a review afterwards ("what did you do here?"); use cognitive walkthrough questions.

Note taking

Take good notes. Bring paper and pens.

A sample log sheet:

  Task no.: ____   Date: ____   Subject no.: ____   Evaluator name: ____
  Start time: ____   End time: ____

  Action              | User's remarks      | Observer's comments
  --------------------|---------------------|--------------------
                      |                     |

Don't use the subject's name: data should be anonymous.
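If you would rather type, the same sheet keeps well as a CSV file. A sketch with invented values (the column names mirror the form above, and the subject is a number, never a name):

```python
import csv

header = ["task_no", "date", "subject_no", "evaluator",
          "start_time", "end_time",
          "action", "users_remarks", "observers_comments"]

with open("observation_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    # One row per observed action; all values here are illustrative.
    writer.writerow([1, "2008-04-02", "S03", "DB", "14:00", "14:25",
                     "clicked Help", "where is undo?",
                     "expected undo under the Edit menu"])
```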

Debrief

Following the session, you may want or need to:
- ask the user more questions
- explain in more detail what was going on (participants may be curious about what was really being tested)
- be sensitive to users who blame themselves for problems during the test; this may sound silly, but it happens, and if in doubt, blame the machine, the software, etc.

Questionnaires

May be difficult to design:
- "Did this program meet your expectations?"
- Some expect a crappy program, so "yes"; others expect a good program, also "yes"
- A badly formed question...

Supplement questionnaires with interviews.

Or use pre-designed questionnaires:
- SUMI: Software Usability Measurement Inventory
- WAMMI: Website Analysis and MeasureMent Inventory

SUMI

Questions such as:
- "I'll never learn all the features of this software"
- "The instructions are helpful"
- "I sometimes don't know what to do next"
- "I would recommend this software to my Mom"

Of course, you may want to adapt the questions to your study.
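Note that inventories like SUMI mix positively and negatively worded statements, so negative items must be reverse-coded before you average anything. A scoring sketch on a 5-point Likert scale, with invented item keys and answers:

```python
# Reverse-coding maps 1<->5 and 2<->4 so that higher is always better.
responses = {
    "never_learn_features": 2,   # negative item
    "instructions_helpful": 4,   # positive item
    "dont_know_what_next":  1,   # negative item
    "would_recommend":      5,   # positive item
}
negative_items = {"never_learn_features", "dont_know_what_next"}

scores = [6 - v if k in negative_items else v for k, v in responses.items()]
print(f"Mean score: {sum(scores) / len(scores):.2f} / 5")
```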

Recording technologies

Video and audio recording: a good idea. But what kind of data is it recording?
- satisfaction (maybe)
- efficiency (only if you look at the length of the video)
- effectiveness (if you can tell when an error is made)

Recording alone doesn't produce measurements directly, so what's the point?
- process measures, perhaps...
- and don't forget to ask for permission to record...