CS5714 Usability Engineering
Formative Evaluation of User Interaction: During Evaluation Session
Copyright © 2003 H. Rex Hartson and Deborah Hix.

Eval during 2: Topics
– Generating and collecting data
– Quantitative techniques
– Qualitative techniques
– Observational techniques
– Forms, variations, attitudes, tools

Eval during 3: Generating and Collecting Data
Preliminaries with participants:
– Explain the protocol to the participant, including any compensation
– Show the participant the lab and experimental set-up if they are interested
– Have the participant sign the informed consent form and NDA

Eval during 4: Quantitative Techniques
Collecting quantitative data (to assess usability levels):
– Benchmark tasks: measuring time on task, number of errors, etc. (a tabulation sketch follows this slide)
  – Quantitative measures such as timing can be valuable even with a paper prototype, though not very precise
– User satisfaction scores
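To make these measures concrete, here is a minimal tabulation sketch, not taken from the slides: it computes time on task and error counts from per-task records. All names and sample values (TaskRecord, the participant IDs) are illustrative assumptions.

```python
# Illustrative sketch only: tabulating quantitative usability measures.
# All names and sample values are hypothetical, not from the slides.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    participant_id: str
    task_name: str
    start_s: float   # task start time, seconds into the session
    end_s: float     # task end time
    errors: int      # errors observed by the evaluator

    @property
    def time_on_task_s(self) -> float:
        return self.end_s - self.start_s

records = [
    TaskRecord("P01", "add appointment", 12.0, 95.0, 2),
    TaskRecord("P02", "add appointment", 10.0, 61.0, 0),
]

print("mean time on task (s):", mean(r.time_on_task_s for r in records))
print("mean errors per task:", mean(r.errors for r in records))
```

With a paper prototype the same tabulation applies; the times are simply coarser.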

Eval during 5: Qualitative Techniques
Collecting qualitative data (to identify usability problems):
– Verbal protocol taking: participants think aloud, talking while performing tasks
  – Can be intrusive, but effective
  – Some participants are not good at talking aloud
  – Evaluator/facilitator sits in the room with the participant to collect this kind of data
  – Can be used for both timed and untimed tasks; studies show it can be done with minimal effect on performance time

Eval during 6: Qualitative Techniques
Collecting qualitative data (to identify usability problems):
– Verbal protocol taking (continued)
  – Facilitator may need to prod a participant who stops talking, but don't get into a discussion during timed tasks
  – Answer questions about what to do with a hint, not a direct answer

Eval during 7: Qualitative Techniques
Collecting qualitative data (to identify usability problems):
– Critical incident taking
  – Critical incident: something that happens while the participant is working that has a significant effect on task performance or user satisfaction
  – Although the participant may indicate a critical incident, it is the evaluator's responsibility to identify and record critical incidents

Eval during 8: Qualitative Techniques
Collecting qualitative data (to identify usability problems):
– Critical incident taking (continued)
  – Arguably the single most important kind of formative evaluation data
  – Critical incidents are indicators of usability problems
  – Later, analyze the problem and its cause within the interaction design

Eval during 9: Qualitative Techniques
Collecting qualitative data (to identify usability problems):
– Critical incident taking (continued; a record-format sketch follows this slide)
  – Pay attention to detailed participant behavior. IMPORTANT: critical incidents are easy to miss! Spotting them is a skill; it takes experience
  – Example: a user wasn't sure what the alarm clock icon meant; it could have been taken as showing the time of day. Solution: show the icon "ringing" to emphasize the alarm part
  – Example: a user was confused by the word "Cancel" on a button in a dialogue box showing appointments. Subtle: users normally understand Cancel, but in the calendar domain, with an appointment showing, "cancel" takes on the meaning of removing the appointment
CI Training
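Because critical incidents must be captured quickly and analyzed later, a simple structured record helps. Below is a minimal sketch of one possible record format; the field names are assumptions for illustration, not the schema of any tool mentioned in these slides.

```python
# Illustrative critical-incident record; field names are assumptions,
# not the schema of Ideal or any other tool named in the slides.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CriticalIncident:
    participant_id: str
    task_name: str
    timestamp_s: float                  # seconds into the session
    description: str                    # what the evaluator observed
    effect: str                         # impact on performance or satisfaction
    evaluator_comments: str = ""
    tape_counter: Optional[str] = None  # tape/clip reference, if recorded

incident = CriticalIncident(
    participant_id="P01",
    task_name="cancel appointment",
    timestamp_s=142.0,
    description='Hesitated over the "Cancel" button, unsure whether '
                "it would remove the appointment.",
    effect="Task stalled; participant expressed confusion.",
)
print(incident)
```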

Eval during 10: Observational Techniques
Some observational data collection techniques:
– Structured interviews
  – Post-session questioning
  – Typically obtain general information
– Co-discovery
  – More than one participant, using the system together, thinking aloud together
  – Can lead to rich verbal protocol from conversations among participants

Eval during 11: Observational Techniques
Some observational data collection techniques:
– Software tools for critical incident recording (e.g., Ideal)
– Note taking – the primary technique
  – Most important: real-time notes (e.g., pencil and paper, on-line)
  – Nothing beats this for effective data gathering

Eval during 12: Observational Techniques
Some observational data collection techniques:
– Audio recording can be useful
  – Effective if used selectively for note taking, and if not too distracting
  – Can be used to capture continuous dialogue with the participant (more agile than videotaping)

Eval during 13: Observational Techniques
Some observational data collection techniques:
– Videotaping
  – Used primarily as backup
  – Captures every detail, but tedious to analyze
  – Generally one camera on hands/keyboard/mouse/screen; if a second camera, on the user's face
  – Scan converters were once used to capture screen action, but their resolution was too low to read

Eval during 14: Observational Techniques
Some observational data collection techniques:
– The role of videotaping today
  – Screen-capture software (e.g., Camtasia) gives much higher resolution screen images
  – Typically no video camera is used (unless it is important – e.g., for VR)
  – Video can be used with paper prototypes, too
Camtasia demo
Dilbert demo: evaluating a lo-fi prototype, from M. Rettig, CACM, April 1994

Eval during 15: Data Collection Forms
Form for collecting both quantitative and qualitative data during the session (a sketch for persisting these entries follows this slide):

DATA COLLECTION FORM
TASK NAME: ______________   PARTICIPANT ID: ________   Date: ________
Task start time: ________   Task end time: ________   Time to perform task: ________
No. of errors: ________

Critical Incident Description   |   Tape Counter   |   Evaluator's Comments
--------------------------------+------------------+------------------------
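One way to keep such forms consistent across participants is to persist each completed form as a row in a CSV file. The sketch below does that; the column names mirror the form above, while the file name and sample values are hypothetical.

```python
# Illustrative sketch: saving data-collection-form entries as CSV rows.
# Columns mirror the form above; file name and values are hypothetical.
import csv

FIELDS = ["task_name", "participant_id", "date", "task_start", "task_end",
          "time_to_perform_s", "num_errors", "critical_incident",
          "tape_counter", "evaluator_comments"]

row = {
    "task_name": "add appointment",
    "participant_id": "P01",
    "date": "2003-10-16",
    "task_start": "00:01:12",
    "task_end": "00:02:35",
    "time_to_perform_s": 83,  # derived: end time minus start time
    "num_errors": 2,
    "critical_incident": "Unsure what the alarm clock icon meant",
    "tape_counter": "0142",
    "evaluator_comments": "Show the icon ringing to emphasize the alarm",
}

with open("session_data.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if f.tell() == 0:  # empty file: write the header once
        writer.writeheader()
    writer.writerow(row)
```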

Eval during 16: Adopt the Right Attitude
Evolution of developers' attitudes as they watch a user evaluating their product:
"Stupid user!" → "Let me at him/her!" → "It's his/her (another developer's) fault!" → "I'm mud!" → "Let's fix it!" → Feel better

Eval during 17: Variations
Variations on the theme:
– Major point: there are no rules; do what works best in your situation
– Evaluator sitting with the participant (cf. in a separate room)
– Abandon verbal protocol if it doesn't work for a participant
– Try co-discovery with two participants
Video clip: formative evaluation of Envision. PLEASE ignore the production quality problems; the content is good!

Eval during 18: Data Collection Tools
Ideal – a software tool for critical incident recording (a hedged clip-tagging sketch follows this slide):
– Usability engineer uses it to capture raw usability data
– Quantitative: timing, error counts
– Critical incidents: most important
  – Tags each incident with a video (Camtasia) clip for later review
  – Gathers full critical incident records in a database
  – Shares the database with other UE tools
  – Feeds critical incident descriptions to usability problem analysis tools
Ideal demo
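The slides do not specify Ideal's internal format, so the sketch below only illustrates the general idea: store critical incidents in a shared database, each tagged with an offset into a screen recording for later review. This is not Ideal's actual API or schema; every name here is hypothetical.

```python
# Sketch in the spirit of Ideal: store critical incidents in a small
# database, each tagged with a clip file and offset for later review.
# NOT Ideal's actual schema; all names are hypothetical.
import sqlite3

conn = sqlite3.connect("incidents.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS incident (
        id INTEGER PRIMARY KEY,
        participant_id TEXT,
        description TEXT,
        clip_file TEXT,       -- e.g., a Camtasia screen recording
        clip_offset_s REAL    -- where in the clip the incident occurs
    )
""")
conn.execute(
    "INSERT INTO incident (participant_id, description, clip_file, clip_offset_s)"
    " VALUES (?, ?, ?, ?)",
    ("P01", "Confused by Cancel button in appointment dialogue",
     "session_p01_screen.avi", 142.0),
)
conn.commit()

# Later review: pull each incident with its clip reference.
for desc, clip, offset in conn.execute(
        "SELECT description, clip_file, clip_offset_s FROM incident"):
    print(f"{desc} -> {clip} @ {offset:.0f}s")
```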

Eval during 19: Team Exercise – Formative Usability Evaluation
Goal: to perform the data collection part of a very simple formative usability evaluation
Activities:
– Assemble in teams

Eval during 20: Team Exercise – Formative Usability Evaluation
– Decide roles for team members:
  – Prototype executor: moves transparencies and provides feedback (the person who knows the design best and can "play computer")
  – Evaluation facilitator: keeps the experiment moving, interacts with participants, and records critical incidents (qualitative data)
  – User performance timer: times participants performing tasks and/or counts errors (quantitative data)
  – Two participants, to trade to another team
  – Anyone else can help record critical incidents

Eval during 21: Team Exercise – Formative Usability Evaluation
– Now make the switch:
  – Trade your two participants to another team, getting two new participants from a different team (we'll help make this work in a "cycle" among the teams)
  – Your new participants are now permanently on your team (for this exercise)
  – Newly formed teams sit together in groups now
  – We will now pass out and explain the forms you will use

Eval during 22: Team Exercise – Formative Usability Evaluation
– Getting ready:
  – Have your new participants leave the room temporarily
  – Get your prototype "booted up"
  – Bring the first participant into the "lab", greet them, and explain the evaluation session to them

Eval during 23: Team Exercise – Formative Usability Evaluation
– Cautions and restrictions:
  – Team members must not coach participants as they perform tasks
  – The person playing computer must not anticipate user actions; in particular, do not give the correct computer response to a wrong user action! Respond only to what the user actually does!
  – The person playing computer may not speak, make gestures, etc.
  – You may not change the design on the fly!

Eval during 24: Team Exercise – Formative Usability Evaluation
– Run the experiment:
  – Have the first participant use your prototype to perform the benchmark tasks for your objective usability specifications; that is, have the participant read the first task aloud, then perform it while thinking aloud
  – As the executor moves transparencies in response to participant actions, the timer records times and/or counts errors while the user performs the task
  – The evaluation facilitator records critical incidents
  – Don't count the reading of the task in the task timing

Eval during 25: Team Exercise – Formative Usability Evaluation
– Run the experiment (continued):
  – Next, have the participant read the second task aloud and perform it while thinking aloud
  – Have this participant complete the questionnaire, and then give them their "reward"
  – Now have the second participant perform the tasks and complete the questionnaire; the first participant should stay and help with observations