Introducing Evaluation

Introducing Evaluation (Chapter 13)

The aims
- Explain the key concepts and terms used in evaluation.
- Introduce different types of evaluation methods.
- Show how different evaluation methods are used for different purposes at different stages of the design process and in different contexts of use.
- Show how evaluators mix and modify methods to meet the demands of evaluating novel systems.
- Discuss some of the challenges that evaluators have to consider when doing evaluation.
- Illustrate how the methods discussed in Chapters 7 and 8 are used in evaluation, and describe some methods that are specific to evaluation.

Why, what, where, and when to evaluate
Iterative design and evaluation is a continuous process that examines:
- Why: to check users' requirements and to confirm that users can use the product and that they like it.
- What: a conceptual model, early prototypes of a new system, and later, more complete prototypes.
- Where: in natural and laboratory settings.
- When: throughout design; finished products can also be evaluated to collect information that informs new products.

Types of evaluation
- Controlled settings involving users, e.g. usability testing and experiments in laboratories.
- Natural settings involving users, e.g. field studies that examine how the product is used in the real world.
- Settings not involving users, e.g. consultants who predict, analyze, and model aspects of the interface in order to identify the most obvious usability problems.

Usability testing and field studies can complement each other.

Evaluation case studies
- An experiment to investigate a computer game
- An "in the wild" field study of skiers
- Crowdsourcing

An experiment investigating a computer game
Physiological measures were used. Players were more engaged when playing against another person than when playing against a computer.
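A minimal sketch of how such a within-subjects comparison might be analyzed, assuming one engagement score per participant in each condition; the numbers, variable names, and the choice of a paired t-test are illustrative and are not taken from the case study.

```python
# Hypothetical analysis sketch: comparing an engagement measure
# (e.g., averaged physiological readings) across two play conditions.
# All values below are invented for illustration.
from scipy import stats

# One score per participant in each condition (same participants in both)
vs_computer = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3]
vs_friend = [3.0, 2.8, 3.2, 2.9, 3.1, 2.7]

# Paired t-test, since each participant contributes a score to both conditions
t_stat, p_value = stats.ttest_rel(vs_friend, vs_computer)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```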

What does this data tell you?

What did we learn from the case studies?
- How to observe users in natural settings.
- The need to develop different data collection and analysis techniques to evaluate user experience goals such as challenge and engagement.

Evaluation methods
The main methods are observing, asking users, asking experts, testing, and modeling. Each method can be applied in one or more of three settings: controlled settings, natural settings, and settings without users.

The language of evaluation
Analytics, analytical evaluation, biases, controlled experiment, crowdsourcing, ecological validity, expert review or crit, field study, formative evaluation, heuristic evaluation, informed consent form, "in the wild" evaluation, living laboratory, predictive evaluation, reliability, scope, summative evaluation, usability laboratory, user studies, usability testing, users or participants, validity.

The language of evaluation
- Analytics: Data analytics refers to examining large volumes of raw data in order to draw inferences from that information. Web analytics is commonly used to measure website traffic by analyzing users' click data.
- Analytical evaluation: Evaluation methods that model and predict user behavior. The term has been used to refer to heuristic evaluation, walkthroughs, modeling, and analytics.
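As a small illustration of the kind of aggregation web analytics performs, the Python sketch below counts page views and unique visitors from raw click data; the log structure and field names are assumptions made for the example, not any particular analytics tool's API.

```python
# Illustrative sketch: turning raw click data into simple traffic metrics.
# The log structure here is hypothetical.
from collections import Counter

click_log = [
    {"user": "u1", "page": "/home"},
    {"user": "u2", "page": "/home"},
    {"user": "u1", "page": "/products"},
    {"user": "u3", "page": "/home"},
]

page_views = Counter(event["page"] for event in click_log)      # views per page
unique_visitors = len({event["user"] for event in click_log})   # distinct users

print(page_views.most_common())        # [('/home', 3), ('/products', 1)]
print("Unique visitors:", unique_visitors)
```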

The language of evaluation
- Bias: The results of an evaluation are distorted. This can happen for several reasons; for example, selecting a population of users who already have experience with the new system and describing their performance as if they were new users.
- Field study: An evaluation study that is carried out in a natural environment, such as a person's home or a work or leisure place.

The language of evaluation
- Formative evaluation: An evaluation that is done during design to check that the product fulfills requirements and continues to meet users' needs.
- Heuristic evaluation: An evaluation method in which knowledge of typical users is applied, often guided by heuristics, to identify usability problems.
- Predictive evaluation: Evaluation methods in which theoretically based models are used to predict user performance.
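One well-known example of a predictive evaluation method is the Keystroke-Level Model, which predicts expert task time by summing standard operator times. The Python sketch below is a rough illustration of that idea: the operator times are commonly cited approximations and the task breakdown is hypothetical, so treat it as a sketch rather than a calibrated model.

```python
# Rough sketch of a Keystroke-Level-Model-style prediction.
# Operator times are approximate textbook values; the task is made up.
OPERATOR_TIMES = {
    "K": 0.2,   # keystroke (skilled typist)
    "P": 1.1,   # point at a target with a mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_task_time(operators):
    """Predicted completion time in seconds for a sequence of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical task: think, point at a text field, move to keyboard, type "hello"
task = ["M", "P", "H"] + ["K"] * 5
print(f"Predicted time: {predict_task_time(task):.2f} s")
```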

The language of evaluation
- Summative evaluation: An evaluation that is done when the design is complete.
- Usability laboratory: A laboratory that is specially designed for usability testing.
- Usability testing: Involves measuring users' performance on various tasks.
- User studies: A generic term that covers a range of evaluations involving users, including field studies and experiments.
- Users or participants: These terms are used interchangeably to refer to the people who take part in evaluation studies.
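Since usability testing centers on measuring users' performance on tasks, a brief sketch of summarizing typical measures (success rate, completion time, errors) may help; the data structure and all numbers below are invented for illustration.

```python
# Minimal sketch: summarizing made-up usability-test results.
test_results = [
    {"participant": "P1", "time_s": 42.0, "errors": 1, "completed": True},
    {"participant": "P2", "time_s": 55.5, "errors": 3, "completed": True},
    {"participant": "P3", "time_s": 61.2, "errors": 2, "completed": False},
]

completed = [r for r in test_results if r["completed"]]
success_rate = len(completed) / len(test_results)
mean_time = sum(r["time_s"] for r in completed) / len(completed)
mean_errors = sum(r["errors"] for r in test_results) / len(test_results)

print(f"Success rate: {success_rate:.0%}")
print(f"Mean time on successful attempts: {mean_time:.1f} s")
print(f"Mean errors per participant: {mean_errors:.1f}")
```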

Participants' rights and getting their consent
Participants need to be told why the evaluation is being done, what they will be asked to do, and what their rights are. Informed consent forms provide this information.

Things to consider when interpreting data
- Reliability: Does the method produce the same results on separate occasions?
- Validity: Does the method measure what it is intended to measure?
- Ecological validity: Does the environment of the evaluation distort the results?
- Biases: Are there biases that distort the results?
- Scope: How generalizable are the results?
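As a concrete, hypothetical illustration of reliability, one common check is test-retest agreement: measure the same participants on two occasions and correlate the scores. The numbers below are invented, and `statistics.correlation` requires Python 3.10 or later.

```python
# Illustrative test-retest reliability check on invented data.
from statistics import correlation  # Python 3.10+

session_1 = [12.0, 15.5, 9.8, 20.1, 14.2]   # e.g., task times, first session
session_2 = [12.6, 14.9, 10.4, 19.5, 15.0]  # same participants, second session

r = correlation(session_1, session_2)
print(f"Test-retest correlation: r = {r:.2f}")  # values near 1.0 suggest high reliability
```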

Key points
- Evaluation and design are very closely integrated.
- Some of the same data gathering methods are used in evaluation as for establishing requirements and identifying users' needs, e.g. observation, interviews, and questionnaires.
- Evaluations can be done in controlled settings such as laboratories, in less controlled field settings, or in settings where users are not present.
- Usability testing and experiments enable the evaluator to have a high level of control over what gets tested, whereas evaluators typically impose little or no control on participants in field studies.