User-Centered Design and Development Instructor: Franz J. Kurfess Computer Science Dept. Cal Poly San Luis Obispo FJK 2005

Copyright Notice These slides are a revised version of the originals provided with the book “Interaction Design” by Jennifer Preece, Yvonne Rogers, and Helen Sharp (Wiley). I added some material, made some minor modifications, and created a custom show to select a subset. – Slides added or modified by me are marked with my initials (FJK), unless I forgot it … FJK 2005

484-W07 Quarter The slides I use in class are in the Custom Show “484-W07”. It is a subset of the whole collection in this file. Week 6 contains slides from Chapters 10 and 11 of the textbook. The original slides for this chapter were in much better shape, and I mainly made changes to better match our emphasis. FJK 2005

Chapter 11: Usability Evaluation Framework FJK 2005

Chapter Overview Evaluation paradigms and techniques; the DECIDE evaluation framework. FJK 2005

Motivation Since it can be difficult to put together a customized evaluation plan for each project, it is often useful to follow a template or framework. Evaluation paradigms have been identified over time as practical combinations of methods.

FJK 2005 Objectives Explain key evaluation concepts & terms. Describe the evaluation paradigms & techniques used in interaction design. Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations. Introduce the DECIDE framework.

FJK 2005 Evaluation paradigm Any kind of evaluation is guided, explicitly or implicitly, by a set of beliefs – these beliefs are often supported by theory. The beliefs and the methods associated with them are known as an ‘evaluation paradigm’.

User studies Looking at how people behave – in their natural environments, – or in the laboratory, – with old technologies and with new ones.

Four evaluation paradigms: ‘quick and dirty’; usability testing; field studies; predictive evaluation.

Quick and dirty ‘Quick & dirty’ evaluation describes a common practice – designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

Usability Testing Recording the performance of typical users on typical tasks in controlled settings; field observations may also be used. Users are watched and recorded on video, and their activities are logged (mouse movements, key presses). Evaluation involves calculation of performance times, identification of errors, and explanation of why the users did what they did. User satisfaction: questionnaires and interviews are used to elicit the opinions of users.
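
The logging and timing described above can be sketched in a few lines of code. Below is a minimal, hypothetical Python sketch (the event names, the log format, and the notion of an “error” event are assumptions for illustration, not from the slides): it timestamps low-level events during a test session, then derives a task completion time and an error count for the evaluation step.

```python
import time

class SessionLog:
    """Minimal, hypothetical interaction logger for a usability test session."""

    def __init__(self):
        self.events = []  # list of (timestamp, event_type, detail)

    def record(self, event_type, detail=""):
        self.events.append((time.time(), event_type, detail))

    def task_time(self):
        # Completion time: first recorded event to last, in seconds.
        if len(self.events) < 2:
            return 0.0
        return self.events[-1][0] - self.events[0][0]

    def error_count(self):
        # Count events flagged as errors by the observer or the system.
        return sum(1 for _, etype, _ in self.events if etype == "error")

log = SessionLog()
log.record("task_start", "book a flight")
log.record("keypress", "J")
log.record("mouse_click", "search button")
log.record("error", "selected wrong date")
log.record("task_end")
print(f"task time: {log.task_time():.1f}s, errors: {log.error_count()}")
```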

Field Studies Done in natural settings to understand what users do naturally and how technology impacts them. In product design, field studies can be used to: - identify opportunities for new technology - determine design requirements - decide how best to introduce new technology - evaluate technology in use

Predictive Evaluation Experts apply their knowledge of typical users to predict usability problems, often guided by heuristics. Another approach involves theoretical models. Users need not be present, so predictive evaluation is relatively quick & inexpensive.
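
One concrete example of a theoretical model used in predictive evaluation is the Keystroke-Level Model (KLM) of Card, Moran, and Newell, which predicts expert execution time by summing standard operator times. The sketch below uses commonly cited approximate operator values; exact figures vary between sources, and the operator sequence is a made-up example.

```python
# Keystroke-Level Model (KLM) sketch: predict expert execution time
# by summing operator times. Values are commonly cited approximations
# (Card, Moran & Newell); exact figures vary between sources.
OPERATOR_TIMES = {
    "K": 0.2,   # keystroke (skilled typist)
    "P": 1.1,   # point with a mouse
    "B": 0.1,   # press or release mouse button
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Return predicted execution time in seconds for a string of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: think, point at a field, click, type a 4-letter word.
print(f"{klm_estimate('MPBKKKK'):.2f} s")  # 1.35 + 1.1 + 0.1 + 4*0.2 = 3.35 s
```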

Overview of Techniques Observing users; asking users about their opinions; asking experts about their opinions; testing the performance of users; modeling the task performance of users.

DECIDE: A framework to guide evaluation Determine the goals the evaluation addresses. Explore the specific questions to be answered. Choose the evaluation paradigm and techniques to answer the questions. Identify the practical issues. Decide how to deal with the ethical issues. Evaluate, interpret and present the data.
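
One way to keep an evaluation plan honest is to write it down with one field per DECIDE step, so gaps become visible. A minimal sketch in Python, assuming a simple record structure (the field names and example values are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field

# A DECIDE evaluation plan as a structured record: one field per step.
# The example values below are hypothetical, for illustration only.
@dataclass
class DecidePlan:
    goals: list = field(default_factory=list)              # Determine
    questions: list = field(default_factory=list)          # Explore
    paradigm: str = ""                                     # Choose
    techniques: list = field(default_factory=list)         # Choose
    practical_issues: list = field(default_factory=list)   # Identify
    ethical_issues: list = field(default_factory=list)     # Decide
    analysis_notes: str = ""                               # Evaluate

plan = DecidePlan(
    goals=["Improve the usability of an existing product"],
    questions=["Are customers concerned about security?"],
    paradigm="usability testing",
    techniques=["observing users", "questionnaires"],
    practical_issues=["select users", "stay on budget"],
    ethical_issues=["informed consent form"],
    analysis_notes="check reliability, validity, biases, scope",
)
print(plan.paradigm, plan.techniques)
```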

Determine the Goals What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some examples of goals: Identify the best metaphor on which to base the design. Check to ensure that the final interface is consistent. Investigate how technology affects working practices. Improve the usability of an existing product.

Explore the Questions All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: - What are customers’ attitudes to these new tickets? - Are they concerned about security? - Is the interface for obtaining them poor? What questions might you ask about the design of a cell phone?

Choose the Evaluation Paradigm & Techniques The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented. – E.g., field studies do not involve testing or modeling.

Identify Practical Issues For example, how to: – select users – stay on budget – stay on schedule – find evaluators – select equipment

Decide on Ethical Issues Develop an informed consent form. Participants have a right to: - know the goals of the study - know what will happen to the findings - privacy of personal information - not be quoted without their agreement - leave when they wish - be treated politely

Evaluate, Interpret and Present Data How data is analyzed and presented depends on the paradigm and techniques used. The following also need to be considered: - Reliability: can the study be replicated? - Validity: is it measuring what you thought? - Biases: is the process creating biases? - Scope: can the findings be generalized? - Ecological validity: is the environment of the study influencing it? (e.g., the Hawthorne effect)
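
For quantitative data such as task-completion times, analysis typically starts with simple descriptive statistics before questions of reliability and validity are addressed. A minimal sketch with made-up numbers:

```python
import statistics

# Hypothetical task-completion times (seconds) from eight participants.
times = [42.1, 55.3, 47.8, 61.0, 39.4, 50.2, 58.7, 44.5]

mean = statistics.mean(times)
sd = statistics.stdev(times)  # sample standard deviation
print(f"n = {len(times)}, mean = {mean:.1f} s, sd = {sd:.1f} s")
```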

Pilot studies A small trial run of the main study. The aim is to make sure your plan is viable. Pilot studies check: – that you can conduct the procedure – that interview scripts, questionnaires, experiments, etc. work appropriately. It’s worth doing several to iron out problems before doing the main study. Ask colleagues if you don’t have access to real users – danger: bias.

Key points An evaluation paradigm is an approach that is influenced by particular theories and philosophies. Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. The DECIDE framework has six parts: - Determine the overall goals - Explore the questions that satisfy the goals - Choose the paradigm and techniques - Identify the practical issues - Decide on the ethical issues - Evaluate ways to analyze & present data. Do a pilot study.

Activity: DECIDE Framework Apply the DECIDE framework to one of the projects in this class: – design and development tools tutorial – Learning Commons storyboard

A project for you … Find an evaluation study from the list of URLs on this site or one of your own choice. Use the DECIDE framework to analyze it. Which paradigms are involved? Does the study report address each aspect of DECIDE? Is triangulation used? If so, which techniques? On a scale of 1-5, where 1 = poor and 5 = excellent, how would you rate this study?