CPE/CSC 484: User-Centered Design and Development
Franz J. Kurfess
Computer Science Department, California Polytechnic State University, San Luis Obispo, CA, U.S.A.

© Franz J. Kurfess
Logistics
❖ Senate appearance by the instructor
 during the first half of the lab period
❖ Assignments
 A3 - Storyboards: deadline Thu, May 3
❖ Term Project
 mid-quarter project displays Thu, May 3
 allocation of display stations
 external visitors anticipated: don't reveal confidential information (e.g., names of collaboration partners)

Copyright Notice
These slides are a revised version of the originals provided with the book "Interaction Design" by Jennifer Preece, Yvonne Rogers, and Helen Sharp, Wiley. I added some material, made some minor modifications, and created a custom show to select a subset. Slides added or modified by me use a different style, with my name at the bottom.

Introducing Evaluation (Chapter 12)
©2011 www.id-book.com

FJK 2005
Motivation
 provide an overview of usability evaluation
 introduce the context, terms, and concepts used

Objectives
 Discuss how developers cope with real-world constraints.
 Explain the concepts and terms used to discuss evaluation.
 Examine how different techniques are used at different stages of development.

The aims
 Explain the key concepts used in evaluation.
 Introduce different evaluation methods.
 Show how different methods are used for different purposes at different stages of the design process and in different contexts.
 Show how evaluators mix and modify methods.
 Discuss the practical challenges.
 Illustrate how methods discussed in Chapters 7 and 8 are used in evaluation, and describe some methods that are specific to evaluation.

Why, what, where, and when to evaluate
Iterative design & evaluation is a continuous process that examines:
 Why: to check users' requirements and that users can use the product and they like it.
 What: a conceptual model, early prototypes of a new system and, later, more complete prototypes.
 Where: in natural and laboratory settings.
 When: throughout design; finished products can be evaluated to collect information to inform new products.

Bruce Tognazzini tells you why you need to evaluate
"Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain."
See AskTog.com for topical discussions about design and evaluation.

Types of evaluation
 Controlled settings involving users, e.g., usability testing & experiments in laboratories and living labs.
 Natural settings involving users, e.g., field studies to see how the product is used in the real world.
 Any settings not involving users, e.g., consultants' critiques, and the use of models & analytics to predict and analyze aspects of the interface.

Usability lab

Living labs
 People's use of technology in their everyday lives can be evaluated in living labs.
 Such evaluations are too difficult to do in a usability lab.
 E.g., the Aware Home was embedded with a complex network of sensors and audio/video recording devices (Abowd et al., 2000).

Usability testing & field studies

Evaluation case studies
 Experiment to investigate a computer game
 In-the-wild field study of skiers
 Crowdsourcing

Challenge & engagement in a collaborative immersive game
 Physiological measures were used.
 Players were more engaged when playing against another person than when playing against a computer.
 What precautionary measures did the evaluators take?
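A finding like the one above is typically checked by summarizing the physiological measure per condition before any statistical test. A minimal sketch, assuming invented normalized engagement scores (placeholder numbers, not data from the study) and Python's standard statistics module:

```python
from statistics import mean, stdev

# Hypothetical per-player engagement scores for the two play
# conditions; the numbers are invented placeholders, not data
# from the study described above.
vs_person   = [0.72, 0.81, 0.69, 0.77, 0.84, 0.75]
vs_computer = [0.54, 0.61, 0.58, 0.49, 0.63, 0.57]

# Summarize each condition; a real analysis would follow up with a
# significance test plus the precautionary checks the slide asks about.
for label, scores in [("vs. person", vs_person),
                      ("vs. computer", vs_computer)]:
    print(f"{label:>12}: mean={mean(scores):.2f}  sd={stdev(scores):.2f}")
```

With real data, the per-condition means are what a plot of the results would show, and the standard deviations indicate how much individual players varied.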

What does this data tell you?

Why study skiers in the wild?
Jambon et al. (2009) User experience in the wild. In: Proceedings of CHI '09, ACM Press, New York, p.

e-skiing system components
Jambon et al. (2009) User experience in the wild. In: Proceedings of CHI '09, ACM Press, New York, p.

Crowdsourcing: when might you use it?

Evaluating an ambient system
 The Hello Wall is a new kind of system that is designed to explore how people react to its presence.
 What are the challenges of evaluating systems like this?

Evaluation methods

Method           Controlled settings   Natural settings   Without users
Observing                 x                    x
Asking users              x                    x
Asking experts                                 x                 x
Testing                   x
Modeling                                                         x
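The matrix can also be written down as a small lookup table so a study plan can be checked in code. This is a sketch of the table as reconstructed here; the dictionary, the exact marks, and the function name are my own, not the book's:

```python
# Which evaluation methods apply in which setting, mirroring the
# method-by-setting table above (rows: methods; columns: settings).
APPLICABLE = {
    "observing":      {"controlled", "natural"},
    "asking users":   {"controlled", "natural"},
    "asking experts": {"natural", "without users"},
    "testing":        {"controlled"},
    "modeling":       {"without users"},
}

def methods_for(setting: str) -> list[str]:
    """Return the evaluation methods usable in the given setting."""
    return sorted(m for m, settings in APPLICABLE.items()
                  if setting in settings)

print(methods_for("controlled"))   # → ['asking users', 'observing', 'testing']
```

The sets make the point of the slide explicit: most methods span more than one setting, which is why evaluators mix and modify them.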

The language of evaluation
 Analytics
 Analytical evaluation
 Controlled experiment
 Expert review or crit
 Field study
 Formative evaluation
 Heuristic evaluation
 In-the-wild evaluation
 Living laboratory
 Predictive evaluation
 Summative evaluation
 Usability laboratory
 User studies
 Usability testing
 Users or participants

Key points
 Evaluation & design are closely integrated in user-centered design.
 Some of the same techniques are used in evaluation as for establishing requirements, but they are used differently (e.g., observation, interviews & questionnaires).
 Three types of evaluation: laboratory-based with users, in the field with users, and studies that do not involve users.
 The main methods are: observing, asking users, asking experts, user testing, inspection, modeling users' task performance, and analytics.
 Dealing with constraints is an important skill for evaluators to develop.

A project for you …
"The Butterfly Ballot: Anatomy of a disaster" was written by Bruce Tognazzini, and you can find it by going to AskTog.com and looking through the 2001 columns. Alternatively, go directly to the ButterflyBallot.html column there.

A project for you … continued
 Read Tog's account and look at the picture of the ballot card.
 Make a similar ballot card for a class election and ask 10 of your friends to vote using the card.
 After each person has voted, ask who they intended to vote for and whether the card was confusing. Note down their comments.
 Redesign the card and perform the same test with 10 different people.
 Report your findings.
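The two voting rounds produce intended-versus-recorded pairs, and the natural summary for the report is a mis-vote rate per card design. A sketch with invented placeholder records (candidate names and counts are illustrative only, not real findings):

```python
def error_rate(votes):
    """Fraction of voters whose recorded vote differs from their intent.

    Each vote is an (intended, recorded) pair collected in the
    post-vote interview described above.
    """
    errors = sum(1 for intended, recorded in votes if intended != recorded)
    return errors / len(votes)

# Invented results for 10 voters per round, purely to show the
# shape of the data.
original_card = [("A", "A"), ("A", "B"), ("B", "B"), ("A", "A"),
                 ("B", "A"), ("A", "A"), ("B", "B"), ("A", "B"),
                 ("A", "A"), ("B", "B")]
redesigned_card = [("A", "A")] * 6 + [("B", "B")] * 4

print(f"original card:   {error_rate(original_card):.0%} mis-votes")
print(f"redesigned card: {error_rate(redesigned_card):.0%} mis-votes")
```

Pairing the quantitative rate with the voters' comments gives both the summative (did the redesign reduce errors?) and formative (why were people confused?) sides of the evaluation.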