류 현 정 2005-09-01 Human Computer Interaction: Introducing evaluation.



The aims
- Discuss how developers cope with real-world constraints
- Explain the concepts and terms used to discuss evaluation
- Examine how different techniques are used at different stages of development

Two main types of evaluation
- Formative evaluation is done at different stages of development to check that the product meets users' needs
- Summative evaluation assesses the quality of a finished product
- Our focus is on formative evaluation

What to evaluate
- Iterative design & evaluation is a continuous process that examines:
  - early ideas for the conceptual model
  - early prototypes of the new system
  - later, more complete prototypes
- Designers need to check that they understand users' requirements

Bruce Tognazzini tells you why you need to evaluate
- "Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain."
- See AskTog.com for topical discussion about design and evaluation

When to evaluate
- Throughout design
- From the first descriptions, sketches, etc. of users' needs through to the final product
- Design proceeds through iterative cycles of 'design-test-redesign'
- Evaluation is a key ingredient of a successful design

Approaches: Naturalistic
- Naturalistic:
  - describes an ongoing process as it evolves over time
  - observation occurs in a realistic setting
  - ecologically valid: "real life"
- External validity
  - the degree to which research results apply to real situations

Approaches: Experimental
- Experimental
  - study relations by manipulating one or more independent variables
  - experimenter controls all environmental factors
  - observe the effect on one or more dependent variables
- Internal validity
  - the confidence we have in our explanation of experimental results
- Trade-off: naturalistic vs experimental
  - precision and direct control over experimental design, versus
  - desire for maximum generalizability to real-life situations
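To make the experimental approach concrete, here is a minimal sketch (not from the slides) of analyzing such an experiment in Python. The interface variant is the independent variable and task-completion time is the dependent variable; the data values and the variant names are hypothetical, and Welch's t statistic is one common way to compare the two conditions.

```python
import statistics
from math import sqrt

# Hypothetical task-completion times (seconds) for one task performed
# with two interface variants (the manipulated independent variable).
design_a = [38, 42, 35, 40, 37, 41]
design_b = [45, 48, 44, 50, 46, 47]

def welch_t(x, y):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)  # sample variances
    return (mx - my) / sqrt(vx / len(x) + vy / len(y))

t = welch_t(design_a, design_b)
print(f"A mean {statistics.mean(design_a):.1f}s, "
      f"B mean {statistics.mean(design_b):.1f}s, t = {t:.2f}")
```

A large-magnitude t (compared against the t distribution's critical value for the sample sizes) suggests the difference between variants is unlikely to be due to the individual differences discussed on the next slide.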

Approaches: Reliability Concerns
- Would the same results be achieved if the test were repeated?
- Problem: individual differences:
  - best user ~10x faster than slowest
  - best 25% of users ~2x faster than slowest 25%
- Partial solution
  - test a reasonable number and range of users
  - statistics provide confidence intervals for test results
  - "95% confident that mean time to perform task X is 4.5 +/- 0.2 minutes" means a 95% chance the true mean is between 4.3 and 4.7, and a 5% chance it's outside that range
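The confidence-interval arithmetic above can be sketched in a few lines of Python. The task times below are hypothetical, and the sketch uses the normal approximation (z = 1.96) for simplicity; for samples this small a t-distribution critical value would be slightly more accurate.

```python
import statistics
from math import sqrt

# Hypothetical task-completion times (minutes) from ten test users.
times = [4.2, 4.7, 4.4, 4.9, 4.3, 4.6, 4.5, 4.8, 4.1, 4.5]

mean = statistics.mean(times)
sd = statistics.stdev(times)  # sample standard deviation
n = len(times)

# 95% confidence interval via the normal approximation: mean +/- 1.96 * SE,
# where SE (the standard error) is sd / sqrt(n).
margin = 1.96 * sd / sqrt(n)
print(f"mean = {mean:.2f} min, "
      f"95% CI = [{mean - margin:.2f}, {mean + margin:.2f}]")
```

Testing more users shrinks the margin (it falls with the square root of n), which is why a reasonable number and range of users is the partial solution to individual differences.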

Approaches: Validity Concerns
- Does the test measure something relevant to the usability of real products in real use outside the lab?
- Some typical validity problems of testing vs real use:
  - non-typical users tested
  - tasks are not typical tasks
  - physical environment different (quiet lab vs very noisy open offices vs interruptions)
  - social influences different (motivation towards the experimenter vs motivation towards the boss)
- Partial solution
  - use real users
  - tasks from task-centered system design
  - environment similar to the real situation

Qualitative Evaluation Techniques

Qualitative methods for usability evaluation
- Qualitative:
  - produces a description, usually in non-numeric terms
  - may be subjective
- Methods
  - Introspection
  - Extracting the conceptual model
  - Direct observation: simple observation, think-aloud, constructive interaction
  - Query via interviews and questionnaires
  - Continuous evaluation via user feedback and field studies

Querying Users via Interviews
- Excellent for pursuing specific issues
  - vary questions to suit the context
  - probe more deeply on interesting issues as they arise
  - good for exploratory studies via open-ended questioning
  - often leads to specific constructive suggestions
- Problems:
  - accounts are subjective
  - time consuming
  - evaluator can easily bias the interview
  - prone to rationalization of events/thoughts by the user
  - user's reconstruction may be wrong

Evaluating the 1984 OMS
- Early tests of printed scenarios & user guides
- Early simulations of the telephone keypad
- An Olympian joined the team to provide feedback
- Interviews & demos with Olympians outside the US
- Overseas interface tests with friends and family
- Free coffee and donut tests
- Usability tests with 100 participants
- A 'try to destroy it' test
- Pre-Olympic field test at an international event
- Reliability of the system under heavy traffic

Development of HutchWorld
- Many informal meetings with patients, carers & medical staff early in design
- Early prototype was informally tested on site
- Designers learned a lot, e.g.:
  - the language of designers & users was different
  - asynchronous communication was also needed
- Redesigned to produce the portal version

Usability testing
- User tasks investigated:
  - how users' identity was represented
  - communication
  - information searching
  - entertainment
- User satisfaction questionnaire
- Triangulation to get different perspectives

Findings from the usability test
- The back button didn't always work
- Users didn't pay attention to navigation buttons
- Users expected all objects in the 3-D view to be clickable
- Users did not realize that there could be others in the 3-D world with whom to chat
- Users tried to chat to the participant list

Key points
- Evaluation & design are closely integrated in user-centered design
- Some of the same techniques are used in evaluation & requirements gathering, but they are used differently (e.g., interviews & questionnaires)
- Triangulation involves using a combination of techniques to gain different perspectives
- Dealing with constraints is an important skill for evaluators to develop