
Observation Watch, listen, and learn…

Agenda  Observation exercise Come back at 3:40.  Questions?  Observation

Observing Users  Not as easy as you think  One of the best ways to gather feedback about your interface  Watch, listen and learn as a person interacts with your system  Qualitative & quantitative, end users, experimental or naturalistic

Observation  Direct In same room Can be intrusive Users aware of your presence Only see it one time May use 1-way mirror to reduce intrusiveness  Indirect Video or app recording Reduces intrusiveness, but doesn’t eliminate it Cameras focused on screen, face & keyboard Gives archival record, but can spend a lot of time reviewing it

Location  Observations may be In lab - Maybe a specially built usability lab  Easier to control  Can have user complete set of tasks In field  Watch their everyday actions  More realistic  Harder to control other factors

Usability Lab  Large viewing area behind this one-way mirror, which includes an angled sheet of glass that improves light capture and prevents sound transmission between rooms. Doors for the participant room and observation room are located such that participants are unaware of observers' movements in and out of the observation room.

Observation Room  State-of-the-art observation room equipped with three monitors to view participant, participant's monitor, and composite picture in picture.  One-way mirror plus angled glass captures light and isolates sound between rooms.  Comfortable and spacious for three people, but room enough for six seated observers.  Digital mixer for unlimited mixing of input images and recording to VHS, SVHS, or MiniDV recorders.

Task Selection  What tasks are people performing? Representative and realistic? Tasks dealing with specific parts of the interface you want to test? Problematic tasks?  Don’t forget to pilot your entire evaluation!!

Engaging Users in Evaluation  What’s going on in the user’s head?  Use verbal protocol where users describe their thoughts  Qualitative techniques Think-aloud - can be very helpful Post-hoc verbal protocol - review video Critical incident logging - positive & negative Structured interviews - good questions  “What did you like best/least?”  “How would you change..?”

Think Aloud  User describes verbally what s/he is thinking and doing What they believe is happening Why they take an action What they are trying to do  Widely used, popular protocol  Potential problems: Can be awkward for participant Thinking aloud can modify way user performs task

Cooperative approach  Another technique: Co-discovery learning (constructive interaction) Join pairs of participants to work together Use think aloud Perhaps have one person be a semi-expert (coach) and one a novice More natural (like conversation), so it removes some of the awkwardness of individual think aloud  Variant: let the coach be from the design team (cooperative evaluation)

Alternative  What if thinking aloud during the session would be too disruptive?  Can use a post-event protocol User performs the session, then watches the video afterwards and describes what s/he was thinking Sometimes difficult to recall Opens the door to interpretation

What if a user gets stuck?  Decide ahead of time what you will do. Offer assistance or not? What kind of assistance?  You can ask (in cooperative evaluation) “What are you trying to do..?” “What made you think..?” “How would you like to perform..?” “What would make this easier to accomplish..?” Maybe offer hints This is why cooperative approaches are used

Inputs / Outcomes  What you need: an operational prototype; could use a Wizard of Oz simulation  What you get out: "process" or "how-to" information Errors, problems with the interface Comparison of the user's (verbalized) mental model to the designer's intended model

Capturing a Session  1. Paper & pencil Can be slow May miss things Is definitely cheap and easy Time 10:00 10:03 10:08 10:22 Task 1 Task 2 Task 3 … SeSe SeSe

Capturing a Session  2. Recording (audio and/or video) Good for think-aloud Hard to tie to interface Multiple cameras may be needed Good, rich record of session Can be intrusive Can be painful to transcribe and analyze

Capturing a Session  3. Software logging Modify software to log user actions Can give time-stamped key press or mouse event Two problems:  Too low-level, want higher level events  Massive amount of data, need analysis tools

Example logs
|hrichter| |MV|START|
|hrichter| |MV|QUESTION|false|false|false|false|false|false|
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|SEEK|AGENDA|566|149613|
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION|566|315796|
|hrichter| |MV|PLAY|566|
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION|566|271191|
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|AV
|hrichter| |MV|STOP|566|
|hrichter| |MV|END

Analysis  Many approaches  Task based How do users approach the problem What problems do users have Need not be exhaustive, look for interesting cases  Performance based Frequency and timing of actions, errors, task completion, etc.  Very time consuming!!

Example: Heather’s study  Software: MeetingViewer interface fully functional  Criteria – learnability, efficiency, see what aspects of interface get used, what might be missing  Resources – subjects were students in a research group, just me as evaluator, plenty of time  Wanted completely authentic experience

Heather’s evaluation  Task: answer questions from a recorded meeting, use my software as desired  Think-aloud  Video taped, software logs  Also had post questionnaire  Wrote my own code for log analysis  Watched video and matched behavior to software logs

Example: Analysis  Printed logs  Watched video and marked actions, thoughts, and video timing info on the logs  Wrote down interesting quotes from the think aloud  Took 2 to 3 times the session time just to do this analysis alone!