Predictive Evaluation

Presentation transcript:

Predictive Evaluation: Evaluation without users
Overview:
• Heuristic evaluation
• Discount usability testing
• Cognitive walkthrough
• (User modeling)

Evaluation
• Gathering data about the usability of a design by a specified group of users, for a particular activity, within a specified environment
• Goals:
  1. Assess the extent of the system's functionality
  2. Assess the effect of the interface on the user
  3. Identify specific problems with the system

Evaluation …
• Forms
  – Formative: done as the project is forming, all through the lifecycle. Early, continuous, iterative.
  – Summative: done after the system has been finished, to make judgments about the final product.
• Approaches
  – Experimental (lab studies, quantitative): typically in a closed lab setting; manipulate independent variables to see their effect on dependent variables.
  – Naturalistic (field studies, qualitative): observation occurs in a "real-life" setting; watch the process over time.

Evaluation Methods
1. Experimental/Observational evaluation
   a. Collecting user opinions
   b. Observing usage
   c. Experiments (usability specifications)
2. Predictive evaluation
3. Interpretive evaluation

Predictive Evaluation
• Basis
  – Observing users can be time-consuming and expensive
  – Try to predict usage rather than observing it directly
  – Conserves resources (quick and low cost)
• Approach: expert reviews (frequently used)
  – HCI experts interact with the system, try to find potential problems, and give prescriptive feedback
  – Works best if the experts:
    • Haven't used an earlier prototype
    • Are familiar with the domain or task
    • Understand the users' perspectives

1. Heuristic Evaluation
• Developed by Jakob Nielsen
• Several expert usability evaluators assess the system against simple and general heuristics (principles or rules of thumb)
• Procedure:
  1. Gather inputs
  2. Evaluate the system
  3. Debrief and collect the findings
  4. Assign severity ratings
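The last two steps amount to merging the evaluators' independent findings and attaching a severity rating to each problem (Nielsen uses a 0–4 scale, where 0 means "not a problem" and 4 means "usability catastrophe"). A minimal sketch of that debriefing bookkeeping; the problem names and ratings here are made-up examples, not from the slides:

```python
from statistics import mean

# Hypothetical findings: each evaluator maps a problem ID to a severity
# rating on Nielsen's 0-4 scale. Not every evaluator finds every problem.
findings = {
    "evaluator_1": {"no-undo": 4, "jargon-labels": 2},
    "evaluator_2": {"no-undo": 3, "hidden-search": 3},
    "evaluator_3": {"no-undo": 4, "jargon-labels": 1, "hidden-search": 2},
}

def aggregate(findings):
    """Merge per-evaluator ratings into a mean severity per problem,
    sorted from most to least severe."""
    ratings = {}
    for report in findings.values():
        for problem, severity in report.items():
            ratings.setdefault(problem, []).append(severity)
    merged = {p: mean(s) for p, s in ratings.items()}
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)

for problem, severity in aggregate(findings):
    print(f"{problem}: {severity:.2f}")
```

Averaging across evaluators is the usual move because single-evaluator severity judgments are unreliable, which is also why the method calls for several evaluators in the first place.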

1. Heuristic Evaluation …
• Advantages
  – Cheap; good for small companies that can't afford more
  – Having someone practiced in the method is valuable
• Somewhat controversial
  – Very subjective assessment of problems: depends on the expertise of the reviewers
  – Why are these the right heuristics? Others have been suggested.
  – How do we determine what is a true usability problem? Some recent papers suggest that many identified "problems" really aren't.

2. Discount Usability Testing
• A hybrid of empirical usability testing and heuristic evaluation
• Run two or three think-aloud user sessions with paper or prototype-produced mock-ups

3. Cognitive Walkthrough
• Assesses learnability and usability by simulating the way users explore and become familiar with an interactive system
• A usability "thought experiment"
• Like a code walkthrough in software engineering
• From Polson, Lewis, et al. at the University of Colorado Boulder

3. Cognitive Walkthrough …
• CW process
  – Construct carefully designed tasks from the system spec or screen mock-ups
  – Walk through the (cognitive and operational) activities required to go from one screen to another
  – Review the actions needed for the task; attempt to predict how users would behave and what problems they will encounter
• Requirements
  – Description of the users and their backgrounds
  – Description of the task the user is to perform
  – Complete list of the actions required to complete the task
  – Prototype or description of the system

3. Cognitive Walkthrough …
• Assumptions
  – The user has a rough plan
  – The user explores the system, looking for actions that contribute to performing the task
  – The user selects the action that seems best for the desired goal
  – The user interprets the response and assesses whether progress has been made toward completing the task
• Methodology: step through the action sequence
  – Action 1 → Response A, B, …
  – Action 2 → Response A
  – …
  – For each step, ask four questions and try to construct a believable story
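The stepping-through can be kept honest with a simple record per action: each action gets a yes/no answer plus a justification for each of the four questions, and every "no" flags a spot where a failure scenario must be written up. A sketch of that bookkeeping, with an illustrative step borrowed from the VCR example later in the slides (the wording of the step and justifications is an assumption):

```python
from dataclasses import dataclass

# The four cognitive walkthrough questions, asked at every action.
QUESTIONS = (
    "Will the user be trying to produce the effect?",
    "Will the user notice the action is available?",
    "Will the user know it is the right action for the effect?",
    "Will the user understand the feedback?",
)

@dataclass
class ActionStep:
    action: str
    answers: list  # one (ok, justification) pair per question, in order

    def failures(self):
        """Questions answered 'no' for this action; each needs a
        failure scenario written up during the walkthrough."""
        return [q for q, (ok, _) in zip(QUESTIONS, self.answers) if not ok]

# One hypothetical step from a "program the VCR" walkthrough.
step = ActionStep(
    action="Press the 'Program' button on the remote",
    answers=[
        (True, "it is part of the original task"),
        (False, "the button is unlabeled and hidden under a flap"),
        (True, "no other control looks plausible"),
        (True, "the timer menu appears immediately"),
    ],
)
for q in step.failures():
    print(f"FAIL at '{step.action}': {q}")
```

The point of recording a justification even for "yes" answers is that the method asks for a believable story, not a bare verdict; the justifications are exactly that story.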

CW Questions & Answers
1. Will the user be trying to produce the effect?
   – Typical supporting evidence:
     • It is part of their original task
     • They have experience using the system
     • The system tells them to do it
   – No evidence? Construct a failure scenario; explain and back up your opinion.

CW Questions & Answers
2. Will the user notice that the action is available?
   – Typical supporting evidence:
     • Experience
     • A visible device, such as a button
     • A perceivable representation of an action, such as a menu item
3. Will the user know it is the right action for the effect?
   – Typical supporting evidence:
     • The interface provides a visual item (such as a prompt) connecting the action to its effect
     • All other actions look wrong

CW Questions & Answers
4. Will the user understand the feedback?
   – Typical supporting evidence:
     • Experience
     • The user recognizes a connection between the system's response and what they were trying to do
• Example: programming a VCR
  – List the actions
  – Ask the four questions at each step

4. User/Cognitive Modeling
• Build a model of the user in order to predict usage
  – User-as-processor models: GOMS and the keystroke-level model
  – Contextual models: activity theory, distributed cognition, …
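The keystroke-level model is the most concrete of these: it predicts an expert's error-free execution time for a task by summing fixed operator times. A minimal sketch using the standard operator values from Card, Moran & Newell; the example operator sequence is an illustrative assumption, not from the slides:

```python
# Standard KLM operator times in seconds (Card, Moran & Newell, 1983).
OPERATORS = {
    "K": 0.2,   # press a key or button (average skilled typist)
    "P": 1.1,   # point at a target with the mouse
    "H": 0.4,   # home the hand between keyboard and mouse
    "M": 1.35,  # mental preparation for the next action
}

def klm_time(sequence):
    """Predict expert execution time for a string of KLM operators,
    e.g. 'MHPK' = think, reach for mouse, point at a field, click."""
    return sum(OPERATORS[op] for op in sequence)

print(f"Predicted time: {klm_time('MHPK'):.2f} s")
```

A prediction like this is only as good as the operator placement rules (especially where the M operators go), but it lets designers compare alternative interface designs without running a single user session, which is the whole point of predictive evaluation.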