EVALUATION
PROfessional network of Master’s degrees in Informatics as a Second Competence – PROMIS (544319-TEMPUS-1-2013-1-FR-TEMPUS-JPCR)



What’s Human-Computer Interaction?

A Story
- In 1995, now-famous web guru Jakob Nielsen had less than 24 hours to recommend whether adding three new buttons to Sun’s home page was a good idea.
- Check out his “Alertbox” online column for good (and often fun) web design advice.
- He found that each new but unused button would cost visitors $0.5 million per year.
- 2 of the 3 new buttons were taken back out.
- The method he used for his estimate: GOMS.

Some Cognitive Psychology Basics
- The Model Human Processor
- Perception, Cognition, Motor System
- Fitts’ Law
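Fitts’ law predicts how long it takes to point at a target from its distance and width. A minimal sketch of the Shannon formulation, MT = a + b · log2(D/W + 1); the device constants a and b used below are hypothetical and would have to be fitted empirically for a real device:

```python
import math

def fitts_mt(a: float, b: float, distance: float, width: float) -> float:
    """Predicted movement time in seconds (Shannon formulation):
    MT = a + b * log2(D/W + 1)."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# Hypothetical mouse constants a = 0.1 s, b = 0.15 s/bit;
# target 512 px away and 32 px wide.
print(round(fitts_mt(0.1, 0.15, 512, 32), 3))
```

Note that doubling the target width removes roughly one bit of difficulty, which is why large click targets feel noticeably faster.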

Evaluation Techniques

Evaluation Techniques
- Evaluating without users
- Evaluating with users

Evaluation without users
- E1: Literature Review
- E2: Cognitive Walkthrough
- E3: Heuristic Evaluation
- E4: Model-Based Evaluation (GOMS, ...)

E1: Literature Review
- Many research results about user interface design have been published.
- Idea: search the literature for evidence for (or against) aspects of your design.
- Saves running your own experiments.
- Results only carry over reliably if the context (users, assumptions) is very similar.

E2: Cognitive Walkthrough
- Without users; the expert is the designer or a cognitive psychologist.
- Goal: judge learnability and ease of use.
- Step through each task and ask:
  - How does this interaction influence the user?
  - What cognitive processes will she need?
  - What problems could learning or performing this step cause?
  - Does the system help the user get from goals to intentions and actions?
- Requires an interface description, a task description, and a user profile.

E2: Cognitive Walkthrough (continued)
- What to do: choose a task, describe the goals, determine the actions.
- Analyze this decision process using the questions above.
- Question forms capture psychological knowledge and guide the tester.
- An analytical method for early designs or existing systems.
- Takes time.

E3: Heuristic Evaluation
- Variant of the cognitive walkthrough.
- Choose usability heuristics (general guidelines, e.g., the Nine Golden Rules).
- Step through the tasks and check whether the guidelines are followed.
- + Quick and cheap
- – Subjective, so better done by several independent designers
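Because single-evaluator results are subjective, findings from several independent evaluators are typically merged and ranked by severity. An illustrative sketch of that bookkeeping; the problem names and the 0–4 severity scale below are made up for the example:

```python
from collections import defaultdict
from statistics import mean

# Each evaluator independently records (problem, severity 0-4) findings.
findings = {
    "evaluator_A": [("no undo on delete", 4), ("inconsistent labels", 2)],
    "evaluator_B": [("no undo on delete", 3), ("tiny click targets", 3)],
    "evaluator_C": [("inconsistent labels", 2)],
}

# Merge the reports: collect all severity ratings per problem.
merged = defaultdict(list)
for ratings in findings.values():
    for problem, severity in ratings:
        merged[problem].append(severity)

# Rank by average severity; also show how many evaluators hit each problem.
report = sorted(
    ((mean(sev), len(sev), problem) for problem, sev in merged.items()),
    reverse=True,
)
for avg, hits, problem in report:
    print(f"severity {avg:.1f} ({hits}/{len(findings)} evaluators): {problem}")
```

A problem found by several evaluators with a high average severity goes to the top of the fix list; single-evaluator findings still get reviewed, since evaluators famously find largely disjoint problem sets.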

E4: Model-Based Evaluation
- Some models offer a framework for design and evaluation.
- Examples:
  - Information efficiency
  - GOMS, KLM
  - Design rationale (history of design decisions, with reasons and alternatives)
  - Design patterns
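The Keystroke-Level Model (KLM) is the simplest GOMS variant: a task is broken into a sequence of primitive operators, each with a published average time, and the predicted task time is their sum. A sketch using commonly cited operator times; the exact constants vary by source and user population, so treat the numbers as illustrative:

```python
# Commonly cited KLM operator times in seconds; exact values
# vary by source and by user skill level.
KLM_TIMES = {
    "K": 0.28,  # press a key (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(operators: str) -> float:
    """Sum the operator times for a sequence such as 'MHPBB'."""
    return sum(KLM_TIMES[op] for op in operators)

# Clicking a toolbar button: think (M), home to the mouse (H),
# point (P), press and release the button (BB).
print(round(klm_estimate("MHPBB"), 2))
```

This is the kind of back-of-the-envelope estimate behind Nielsen’s button analysis mentioned earlier: multiply a per-interaction time difference by millions of visits and small UI costs become large.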

Evaluating with Users
- Qualitative:
  - E5: Model Extraction
  - E6: Silent Observation
  - E7: Think Aloud
  - E8: Constructive Interaction
  - E9: Retrospective Testing
  - plus interviews, questionnaires, ...
- Quantitative:
  - E10: Controlled Experiments

Evaluating with Users
- E1–E4 evaluate designs without the user.
- As soon as implementations (prototypes) exist, they should also be tested with users, using the following methods.

E5: Model Extraction
- Designer shows the user a prototype or screen shots.
- User tries to explain the elements and their function.
- + Good for understanding the naïve user’s conceptual model of the system
- – Bad for understanding how the system is learned over time

E6: Silent Observation
- Designer watches the user in the lab or in the natural environment while she works on one of the tasks.
- No communication during the observation.
- + Helps discover big problems
- – No insight into the decision process (that led to problems) or the user’s mental model, opinions, or feelings

E7: Think Aloud
- As E6, but the user is asked to say aloud:
  - what she thinks is happening (state)
  - what she is trying to achieve (goals)
  - why she is doing something specific (actions)
- The most common method in industry.
- + Good for getting some insight into the user’s thinking, but:
- – Talking is hard while focusing on a task
- – Talking aloud feels weird to most users
- – Conscious talking can change behavior

E8: Constructive Interaction
- Two people work on a task together; their normal conversation is observed (and recorded).
- More comfortable than Think Aloud.
- Variant with different partners: a semi-expert as “trainer”, a newbie as “student”; the student uses the UI and asks questions, the trainer answers.
- + Gives insight into the mental models of beginner and advanced users at the same time

Recording Observations
- Paper + pencil: evaluator notes events, interpretations, and other observations. Cheap, but hard with many details (writing is slow); prepared forms can help.
- Audio recording: good for speech with Think Aloud and Constructive Interaction, but hard to connect to the interface state.
- Video: ideally two cameras (user + screen) composited into one picture. Best capture, but may be too intrusive initially.

Silverback

E9: Retrospective Testing
- An additional activity after an observation.
- Subject and evaluator watch the video recordings together; the user comments on his actions retrospectively.
- Good starting point for a subsequent interview; avoids false memories.
- Often results in concrete suggestions for improvement.

E10: Controlled Experiments
- Quantitative, empirical method.
- Steps:
  1. Formulate a hypothesis.
  2. Design the experiment; pick the variable and fixed parameters.
  3. Choose subjects.
  4. Run the experiment.
  5. Interpret the results to accept or reject the hypothesis.
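To illustrate the interpretation step: a common analysis when comparing task-completion times between two interface variants is a two-sample t-test. A self-contained sketch of Welch’s t statistic with made-up data; in practice you would also look up the corresponding p-value (e.g. via scipy.stats) before accepting or rejecting the hypothesis:

```python
from statistics import mean, variance

def welch_t(sample_a: list, sample_b: list) -> float:
    """Welch's t statistic for two independent samples;
    a large |t| suggests the group means differ."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Hypothetical task-completion times in seconds for two UI variants.
ui_a = [12.1, 11.4, 13.0, 12.6, 11.9]
ui_b = [14.2, 13.8, 15.1, 14.6, 13.9]
print(round(welch_t(ui_a, ui_b), 2))
```

Here the statistic comes out strongly negative, i.e. variant A was faster in this made-up sample; whether that counts as significant depends on the chosen alpha level and the degrees of freedom.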

Other Evaluation Methods
- Before and during the design, with users:
  - Questionnaires
  - Personal interviews
- After completing a project:
  - Bug report forms
  - Hotlines
  - Retrospective interviews and questionnaires
  - Field observations (observing the running system in real use)