WHAT IS INTERACTION DESIGN?

Chapter 13: Introducing Evaluation

The aims
- Explain the key concepts and terms used in evaluation.
- Introduce different types of evaluation methods.
- Show how different evaluation methods are used for different purposes at different stages of the design process and in different contexts of use.
- Show how evaluators mix and modify methods to meet the demands of evaluating novel systems.
- Discuss some of the challenges that evaluators have to consider when doing evaluation.
- Illustrate how methods discussed in Chapters 7 and 8 are used in evaluation, and describe some methods that are specific to evaluation.

Why, what, where, and when to evaluate
Iterative design and evaluation is a continuous process that examines:
- Why: to check that the product meets users' requirements, that users can use it, and that they like it.
- What: a conceptual model, early prototypes of a new system, and later, more complete prototypes.
- Where: in natural and laboratory settings.
- When: throughout design; finished products can also be evaluated to collect information that informs new products.

Bruce Tognazzini tells you why you need to evaluate
"Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain."
See AskTog.com for topical discussions about design and evaluation.

Types of evaluation
- Controlled settings involving users, e.g., usability testing and experiments in laboratories and living labs.
- Natural settings involving users, e.g., field studies and in-the-wild studies, to see how the product is used in the real world.
- Settings not involving users, e.g., methods that predict, analyze, and model aspects of the interface, such as analytics (sketched below).
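The analytics mentioned in the last bullet can be as simple as summarizing interaction logs the product already collects, with no users present at evaluation time. A minimal sketch in Python, assuming a hypothetical CSV log named interaction_log.csv with columns user_id, page, and timestamp_s; none of these names come from the chapter.

```python
# Minimal sketch of interface analytics: evaluating without users present,
# by summarizing interaction logs the product already collects.
# Assumes a hypothetical CSV log with columns: user_id, page, timestamp_s.
import csv
from collections import Counter, defaultdict

def summarize_log(path):
    views = Counter()              # page -> number of views
    sessions = defaultdict(list)   # user_id -> event times (seconds)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            views[row["page"]] += 1
            sessions[row["user_id"]].append(float(row["timestamp_s"]))
    # Average session length: last event minus first event, per user.
    lengths = [max(t) - min(t) for t in sessions.values()]
    avg_session = sum(lengths) / len(lengths) if lengths else 0.0
    return views.most_common(5), avg_session

top_pages, avg_session = summarize_log("interaction_log.csv")
print("Most viewed pages:", top_pages)
print(f"Average session length: {avg_session:.1f} s")
```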

Living labs
People's use of technology in their everyday lives can be evaluated in living labs; such evaluations are too difficult to do in a usability lab. For example, the Aware Home was embedded with a complex network of sensors and audio/video recording devices (Abowd et al., 2000).

Usability testing and field studies can complement each other.

Evaluation case studies
- An experiment to investigate a computer game.
- An in-the-wild field study of skiers.
- A crowdsourcing study.

Challenge and engagement in a collaborative immersive game
- Physiological measures were used.
- Players were more engaged when playing against another person than when playing against a computer (see the analysis sketch below).
- What precautionary measures did the evaluators take?
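The engagement finding above comes from a within-subjects comparison: each player's physiological measures were taken in both conditions. A hedged sketch of how such paired data might be analyzed, using SciPy's paired t-test; the scores below are invented for illustration and are not the study's data.

```python
# Sketch of a within-subjects comparison: one engagement score per
# participant per condition. A paired t-test is one common choice when
# every participant experiences both conditions. Scores are invented.
from scipy import stats

engagement_vs_human =    [0.72, 0.65, 0.80, 0.77, 0.69, 0.74]
engagement_vs_computer = [0.61, 0.58, 0.70, 0.66, 0.60, 0.63]

t_stat, p_value = stats.ttest_rel(engagement_vs_human, engagement_vs_computer)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```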

[Figure slide: challenge and engagement data from the collaborative game study]

What does this data tell you?

Why study skiers in the wild?

[Figure slide: e-skiing system components]

What did we learn from the case studies?
- How to observe users in natural settings.
- Unexpected findings can result from in-the-wild studies.
- Different data collection and analysis techniques had to be developed to evaluate user experience goals such as challenge and engagement.
- Experiments can be run on the Internet quickly and inexpensively using crowdsourcing.
- How to recruit a large number of participants using Mechanical Turk (see the sketch below).
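To make the Mechanical Turk point concrete, here is a hedged sketch of posting a task (a "HIT") with the real boto3 MTurk client. The survey URL, wording, reward, and participant count are placeholders, the sandbox endpoint avoids spending real money, and valid AWS credentials are assumed; an actual study would also add worker qualifications and an informed consent step.

```python
# Sketch: recruiting participants at scale by posting a HIT on Amazon
# Mechanical Turk via boto3. Uses the requester sandbox, so no real
# payments are made; assumes AWS credentials are configured.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion points workers at a survey hosted elsewhere
# (the URL below is a placeholder).
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/evaluation-survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Short usability evaluation (about 10 minutes)",
    Description="Try a prototype and answer a few questions about it.",
    Reward="1.50",                     # USD, passed as a string
    MaxAssignments=100,                # number of participants to recruit
    LifetimeInSeconds=3 * 24 * 3600,   # how long the HIT stays available
    AssignmentDurationInSeconds=30 * 60,
    Question=question_xml,
)
print("HIT id:", hit["HIT"]["HITId"])
```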

Evaluation methods

Method         | Controlled settings | Natural settings | Without users
---------------|---------------------|------------------|--------------
Observing      | x                   | x                |
Asking users   | x                   | x                |
Asking experts |                     |                  | x
Testing        | x                   |                  |
Modeling       |                     |                  | x

The language of evaluation
Analytics, analytical evaluation, biases, controlled experiment, crowdsourcing, ecological validity, expert review or crit, field study, formative evaluation, heuristic evaluation, informed consent form, in-the-wild evaluation, living laboratory, predictive evaluation, reliability, scope, summative evaluation, usability laboratory, user studies, usability testing, users or participants, validity.

Participants' rights and getting their consent
Participants need to be told why the evaluation is being done, what they will be asked to do, and what their rights are. Informed consent forms provide this information. The design of the informed consent form, the evaluation process, and the data analysis and storage methods are typically approved by a higher authority, e.g., an Institutional Review Board (IRB).

Things to consider when interpreting data
- Reliability: does the method produce the same results on separate occasions? (One way to quantify this is sketched below.)
- Validity: does the method measure what it is intended to measure?
- Ecological validity: does the environment of the evaluation distort the results?
- Biases: are there biases that distort the results?
- Scope: how generalizable are the results?
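As a concrete example of the first point, test-retest reliability is often quantified as the correlation between scores obtained from two sessions of the same method with the same participants. A minimal sketch, with invented scores; requires Python 3.10+ for statistics.correlation.

```python
# Sketch: test-retest reliability as the correlation between two runs of
# the same evaluation method with the same participants (scores invented).
from statistics import correlation  # available in Python 3.10+

session_1 = [4.0, 3.5, 5.0, 2.5, 4.5, 3.0]
session_2 = [4.2, 3.4, 4.8, 2.9, 4.4, 3.2]

r = correlation(session_1, session_2)
print(f"Test-retest correlation: r = {r:.2f}")  # near 1.0 suggests reliability
```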

Key points
- Evaluation and design are very closely integrated.
- Some of the same data gathering methods are used in evaluation as for establishing requirements and identifying users' needs, e.g., observation, interviews, and questionnaires.
- Evaluations can be done in controlled settings such as laboratories, in less controlled field settings, or where users are not present.
- Usability testing and experiments enable the evaluator to have a high level of control over what gets tested, whereas evaluators typically impose little or no control on participants in field studies.