CS/PSY 6750, Fall 2002
Predictive Evaluation (Evaluation Without Users)
Evaluation: gathering data about the usability of a design by a specified group of users, for a particular activity, within a specified environment

Goals
1. Assess extent of system's functionality
2. Assess effect of interface on user
3. Identify specific problems with system

Forms
Formative
- As the project is forming; all through the lifecycle
- Early, continuous, iterative
Summative
- After a system has been finished
- Make judgments about the final item

Approaches
Experimental (lab studies, quantitative)
- Typically in a closed lab setting
- Manipulate independent variables to see their effect on dependent variables
Naturalistic (field studies, qualitative)
- Observation occurs in a "real life" setting
- Watch the process over time

Tradeoffs
Experimental
+ Replicable
+ More "objective"
- Expensive; requires real users and a lab
- Realistic?
Naturalistic
+ "Ecologically valid"
+ Cheap, quick
- Not reproducible; user-specific results
- Not quantitative (how much better?)

Evaluation Methods
1. Experimental/observational evaluation
   a. Collecting user opinions
   b. Observing usage
   c. Experiments (usability specifications)
2. Predictive evaluation
3. Interpretive evaluation

Predictive Evaluation
Basis:
- Observing users can be time-consuming and expensive
- Try to predict usage rather than observing it directly
- Conserve resources (quick & low cost)

Approach
Expert reviews (frequently used)
- HCI experts interact with the system, try to find potential problems, and give prescriptive feedback
Best if the experts:
- Haven't used an earlier prototype
- Are familiar with the domain or task
- Understand user perspectives

Predictive Evaluation Methods
1. Heuristic evaluation
2. Discount usability testing
3. Cognitive walkthrough
4. User modeling

Heuristic Evaluation
- Developed by Jakob Nielsen
- Several expert usability evaluators assess the system based on simple and general heuristics (principles or rules of thumb)

Procedure
1. Gather inputs
2. Evaluate system
3. Debriefing and collection
4. Severity rating

Gather Inputs
Who are the evaluators?
- They need to learn about the domain and its practices
Get the prototype to be studied
- May vary from mock-ups and storyboards to a working system

Evaluation Method
Reviewers evaluate the system against high-level heuristics:
- Use simple and natural dialog
- Speak the user's language
- Minimize memory load
- Be consistent
- Provide feedback
- Provide clearly marked exits
- Provide shortcuts
- Provide good error messages
- Prevent errors

Updated Heuristics
Nielsen's revised set stresses:
- Visibility of system status
- Match between system and real world
- User control and freedom
- Consistency and standards
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Recognition, diagnosis, and recovery from errors
- Help and documentation

Process
- Perform two or more passes through the system, inspecting:
  - Flow from screen to screen
  - Each screen
- Evaluate against the heuristics
- Find "problems" (a minimal logbook sketch follows below)
  - Subjective (if you think it is, it is)
  - Don't dwell on whether it is or isn't
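
To make the bookkeeping concrete, here is a minimal sketch of a reviewer's problem log in Python; the screen names, heuristics, and notes below are invented for illustration, not taken from the slides:

# A minimal log for heuristic-evaluation passes: each reviewer walks
# the screens against the heuristics and records suspected problems.
# The entries below are invented examples.
log = []

def log_problem(reviewer, screen, heuristic, note):
    """Record one suspected problem; judging it is subjective by design."""
    log.append({"reviewer": reviewer, "screen": screen,
                "heuristic": heuristic, "note": note})

log_problem("reviewer-1", "login screen", "visibility of system status",
            "no feedback while credentials are being checked")
log_problem("reviewer-2", "settings screen", "user control and freedom",
            "no way to cancel out of the dialog")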

Debriefing
- Organize all problems found by the different reviewers
- At this point, decide what are and aren't problems
- Group and structure them

Severity Rating
- 0-4 rating scale
- Based on:
  - Frequency
  - Impact
  - Persistence
  - Market impact
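
Nielsen also recommends having several evaluators rate each problem independently and combining the ratings (he suggests taking the mean), since individual ratings are noisy. A small sketch under that assumption, with invented problems and scores:

from statistics import mean

# 0 = not a problem ... 4 = usability catastrophe (invented examples).
ratings = {
    "exit from wizard not clearly marked": [3, 4, 3],
    "error message uses internal jargon":  [2, 1, 2],
}

# Average the independent ratings and rank the worst problems first.
for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    print(f"{mean(scores):.1f}  {problem}")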

Advantages
- Cheap; good for small companies that can't afford more
- Having someone practiced in the method is valuable

Application
- Nielsen found that about 5 evaluations uncover roughly 75% of the problems
- Beyond that you find more, but at decreasing efficiency
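
This diminishing-returns curve is usually modeled with the Nielsen-Landauer formula, found(i) = N(1 - (1 - L)^i), where L is the probability that one evaluator finds a given problem (published estimates vary, roughly 0.2-0.4). The slide's ~75% at five evaluations corresponds to L of about 0.24; a quick check:

# Proportion of problems found by i evaluators: 1 - (1 - L)**i
L = 0.24   # per-evaluator detection rate implied by the 75%-at-5 figure
for i in range(1, 11):
    print(i, round(1 - (1 - L) ** i, 2))
# i = 5 gives ~0.75; each additional evaluator adds less than the last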

Somewhat Controversial
- Very subjective assessment of problems
  - Depends on the expertise of the reviewers
- Why are these the right heuristics?
  - Others have been suggested
- How to determine what is a true usability problem?
  - Some recent papers suggest that many identified "problems" really aren't

Discount Usability Testing
- Hybrid of empirical usability testing and heuristic evaluation
- Have 2 or 3 think-aloud user sessions with paper or prototype-produced mock-ups

Cognitive Walkthrough
- Assess learnability and usability through simulation of the way users explore and become familiar with an interactive system
- A usability "thought experiment"
- Like a code walkthrough (software engineering)
- From Polson, Lewis, et al. at the University of Colorado, Boulder

CW Process
- Construct carefully designed tasks from the system spec or a screen mock-up
- Walk through the (cognitive and operational) activities required to go from one screen to another
- Review the actions needed for the task; attempt to predict how users would behave and what problems they'll encounter

Requirements
- Description of the users and their backgrounds
- Description of the task the user is to perform
- Complete list of the actions required to complete the task
- Prototype or description of the system

Assumptions
- The user has a rough plan
- The user explores the system, looking for actions that will contribute to performing the task
- The user selects the action that seems best for the desired goal
- The user interprets the system's response and assesses whether progress has been made toward completing the task

Methodology
- Step through the action sequence
  - Action 1
    - Response A, B, ...
  - Action 2
    - Response A
  - ...
- For each one, ask the four questions and try to construct a believable "success story"

CW Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available?
3. Once found, will they know it's the right one for the desired effect?
4. Will users understand the feedback after the action?

Answering the Questions
1. Will the user be trying to produce the effect?
- Typical supporting evidence:
  - It is part of their original task
  - They have experience using the system
  - The system tells them to do it
- No evidence? Construct a failure scenario; explain and back up your opinion

Next Question
2. Will the user notice the action is available?
- Typical supporting evidence:
  - Experience
  - A visible device, such as a button
  - A perceivable representation of an action, such as a menu item

Next Questions
3. Will the user know it's the right one for the effect?
- Typical supporting evidence:
  - Experience
  - The interface provides a visual item (such as a prompt) to connect the action to the resulting effect
  - All other actions look wrong
4. Will the user understand the feedback?
- Typical supporting evidence:
  - Experience
  - Recognizing a connection between the system response and what the user was trying to do

Example
Program a VCR
- List the actions
- Ask the four questions for each action (a sketch of this bookkeeping follows below)
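
A minimal sketch of that exercise; the VCR action sequence below is invented to illustrate the bookkeeping, and in a real walkthrough each question would be answered with supporting evidence or a failure story:

# The four walkthrough questions, asked of every action in the task.
QUESTIONS = [
    "Will the user be trying to produce this effect?",
    "Will the user notice that the action is available?",
    "Will the user know it is the right action for the effect?",
    "Will the user understand the feedback?",
]

# Invented action sequence for "record a program at 8pm".
actions = [
    "press the TIMER button",
    "enter the start time on the number keys",
    "enter the channel",
    "press OK to confirm",
]

for action in actions:
    print(action)
    for question in QUESTIONS:
        # Record "yes" plus evidence, or construct a failure story.
        print("  -", question)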

User/Cognitive Modeling
Build a model of the user in order to predict usage
- User-as-processor models
  - GOMS & the keystroke-level model
- Contextual models
  - Activity theory, distributed cognition, ...
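
As a taste of the user-as-processor approach, here is a keystroke-level-model sketch using the standard Card, Moran & Newell operator times; the task breakdown at the end is an invented example:

# Keystroke-Level Model operator times in seconds (Card, Moran & Newell);
# K varies with typing skill (~0.2 s for an average skilled typist).
OPERATORS = {
    "K": 0.20,  # keystroke or button press
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Predict expert execution time for an operator string, e.g. 'MPKK'."""
    return sum(OPERATORS[op] for op in sequence)

# Invented example: think, point at a field, click, type a 4-digit time.
print(klm_time("MPK" + "KKKK"))  # 1.35 + 1.1 + 0.2 + 4*0.2 = 3.45 s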