Chapter 10: Introducing Evaluation
Group 4: Tony Masi, Sam Esswein, Brian Rood, Chris Troisi

Chapter Goals
–Explain the concepts and terms used to discuss evaluation.
–Walk through the HutchWorld case study.
–Examine how different techniques are used at different stages of development.
–Discuss how developers cope with real-world constraints.

Definition of Evaluation
–Dictionary definition: “To ascertain or fix the value or worth of.”
–Book’s definition: “The process of systematically collecting data that informs us about what it is like for a particular user or group of users to use a product for a particular task in a certain type of environment.”

Some Evaluation Companies
–META Group, Inc.
–Canadian Innovation Centre
–Hitachi Data Systems
How To Evaluate Your Software -

Iterative design & evaluation is a continuous process that examines: Early ideas for conceptual model Early prototypes of the new system Later, more complete prototypes Designers need to check that they understand users’ requirements. Designers need to check that they understand users’ requirements. What to evaluate

Why to evaluate
Designers should not presume everyone is like them or that following set guidelines guarantees usability. Evaluation is needed to check that users can use the product and like it.

Why to evaluate
“Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.” – Bruce Tognazzini

Why to evaluate
Tognazzini’s 5 reasons to evaluate:
–Problems are fixed before the product is shipped, not after.
–The team can concentrate on real problems, not imaginary ones.
–Engineers code instead of debating.
–Time to market is sharply reduced.
–Upon first release, the sales department has a rock-solid design it can sell without having to pepper pitches with how well the NEXT release will work.

Why to evaluate
USABILITY TESTING involves measuring the performance of typical users on typical tasks.
SATISFACTION can be evaluated through questionnaires and interviews.
The trend is towards evaluating more subjective user-experience goals as well, such as whether a product is emotionally satisfying, motivating, or fun.
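As a rough illustration of the performance side of usability testing, the sketch below summarizes task completion rate and time on task. It is only an assumed encoding; the session records and task name are hypothetical, not data from the chapter.

```python
from statistics import mean

# Hypothetical usability-test records: one entry per participant per task.
sessions = [
    {"participant": "P1", "task": "send a message", "completed": True,  "seconds": 74},
    {"participant": "P2", "task": "send a message", "completed": True,  "seconds": 121},
    {"participant": "P3", "task": "send a message", "completed": False, "seconds": 300},
]

def summarize(records):
    """Return the task completion rate and the mean time for completed attempts."""
    completed = [r for r in records if r["completed"]]
    return {
        "completion_rate": len(completed) / len(records),
        "mean_seconds_when_completed": mean(r["seconds"] for r in completed),
    }

# Prints a completion rate of 2/3 and a mean time of 97.5 seconds for the completed attempts.
print(summarize(sessions))
```

The same summary could be computed separately for an old and a new version of a design and then compared, which is the kind of evaluation described for product upgrades in the next slide.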

When to evaluate
Throughout design.
New product:
–Use mockups, sketches, and other low-fidelity prototyping techniques to elicit users’ opinions.
–The goal is to assess how well a design fulfills users’ needs and whether the users like it.
Upgrade of an existing product:
–Evaluations compare user performance and attitudes toward the new design with those of previous versions.
Evaluation is a key ingredient for a successful design.

Two main types of evaluation
Formative evaluation is done at different stages of development to check that the product meets users’ needs.
–Design proceeds through iterative cycles of ‘design-test-redesign’.
–Helps ensure the product’s success upon its first arrival in the market.
Summative evaluation assesses the quality of a finished product.
–Satisfies the sponsoring agency.
–Checks that a standard is being upheld.
Chapter 10 focuses on formative evaluation.

Story of the 1984 OMS (Box 10.1)
Background:
–A voice mail system through which Olympic Games contestants and their families could send and receive messages.
–Developed by IBM.
–Could be used from almost any push-button phone around the world.
Reasons for intense evaluation:
–IBM’s reputation was at stake.
–The Olympics are a high-profile event.

Story of the 1984 OMS (Box 10.1)
Evaluation activities:
–Use of printed scenarios
–Iterative testing of user guides
–Development of early simulations (keypads, reactions)
–An Olympian on the design team
–Interviews with other Olympians
–Overseas tests
–Free coffee and donut tests
–‘Try-and-destroy-it’ tests with computer science students
–Pre-Olympic field tests
–Heavy traffic tests

HutchWorld Case Study
A virtual community, a collaboration between:
–Microsoft’s Virtual Worlds Research Group
–Fred Hutchinson Cancer Research Center
Uses:
–chatting
–storytelling
–discussions
–emotional support
Why? Cancer patients face isolation issues.

How to design HutchWorld?
Needs:
–useful
–engaging
–easy to use
–emotionally satisfying
Early ideas:
–What is typical cancer treatment like?
–What resources are available to patients?
–What are the specific needs of the users?
–What kind of “world” should be the model?
–How will users interact within the virtual community?
–What should it look like?

No stone left unturned
–Interviews with patients, caregivers, family, friends, clinicians, social support groups, former patients, and experts.
–Reading of the latest research literature and the HutchWorld web pages.
–Visits to the Fred Hutch research facilities and to the Hutch school for pediatric patients and juvenile family members of patients.

Problem
Inadequate non-verbal feedback:
–potential for misunderstanding
–no facial expressions, body language, or tone of voice

Research
Studies indicate that social support helps cancer patients cope psychologically with their disease. Patients also benefit in their overall physical wellbeing. Example: women with breast cancer receiving therapy lived on average twice as long as those who did not.

Features of HutchWorld
Availability:
–anytime, day or night
–regardless of geographic location
Designed to resemble the outpatient facility:
–This real-world metaphor helped users infer the functionality.
A synchronous chat environment was selected for realism (vs. asynchronous).
3D photographic avatars (p. 326).

Before testing
Logistical issues:
–Who would provide training for patients and testers?
–How many systems were needed for testing?
–Where should these systems be placed?

Testing HutchWorld
Test 1:
–six computers
–a scaled-back prototype
–Microsoft specialists trained Hutch volunteers
–events were hosted in the prototype
Test 1 observations:
–general usage of the prototype
–usage of the space during unscheduled times

Testing HutchWorld
Test 1 results:
–small user community
–critical mass concept: not enough participants to fill the chat room for successful conversation
–lack of interest
–patient availability
–patients preferred asynchronous communication (via email, journals, etc.)
–the prototype did not support what patients originally used the computers for: playing games and searching the internet

Redesigning HutchWorld
A more “unified” product was desired that included a variety of communication, information, and entertainment tasks.
New support for:
–more asynchronous communication
–information-retrieval tools
–email, a bulletin board, text-chat
–games and other entertainment tasks
–a web page creation tool
–a way to check whether anyone is around to chat with

Usability Tests
Seven participants:
–four had used chat rooms
–all had browsed the web
–each was given five minutes to get familiar with the software
A running commentary was given by each participant during exploration (what each was looking at, thinking, or confused by).
After five minutes, a series of structured tasks was given, focusing on how the participants (p. 329):
–dealt with their virtual identity
–communicated with others
–retrieved desired information
–found entertainment
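For sessions like this, the structured tasks and the observer’s think-aloud notes can be kept in a small, explicit data structure. The sketch below is only an assumed encoding; the task prompts are invented for illustration and are not the study’s actual task list.

```python
from dataclasses import dataclass, field

@dataclass
class StructuredTask:
    focus: str                                       # one of the four focus areas listed above
    prompt: str                                      # what the participant is asked to do (hypothetical wording)
    notes: list[str] = field(default_factory=list)   # observer's notes on the running commentary

# Hypothetical session plan covering the four focus areas from the slide.
session_plan = [
    StructuredTask("virtual identity", "Change how your avatar appears to other people."),
    StructuredTask("communication", "Send a greeting to someone else in the world."),
    StructuredTask("information retrieval", "Find information about the outpatient facility."),
    StructuredTask("entertainment", "Start one of the available games."),
]

FREE_EXPLORATION_MINUTES = 5  # unstructured exploration with a running commentary comes first

for task in session_plan:
    print(f"[{task.focus}] {task.prompt}")
```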

Questionnaire
After the test, participants were asked to fill out a questionnaire about their experience with HutchWorld:
–What did you like about HutchWorld?
–What did you not like about HutchWorld?
–What did you find confusing or difficult to use in HutchWorld?
–How would you suggest improving HutchWorld?

Usability Findings
The back button did not always work.
Users ignored navigation buttons.
–more prominent buttons were needed
Users expected that objects in 3D would do something when clicked on.
–provide links to web pages when objects are clicked
Users did not realize other real people were interacting with them in the world.
–wording was changed in the overview description
Users did not notice the chat window and instead chatted with people on the participation list.
–instructions on where to chat were clarified

Follow-up
More rounds of observation and testing were conducted with new subjects.
HutchWorld was installed at the Fred Hutchinson Center.
Observation of the users continued:
–Which parts of the system are being used?
–When are they being used?
–Why are they being used?
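The first two observation questions map directly onto simple log analysis; the “why” question still requires interviews or direct observation rather than logs. A minimal sketch, assuming a hypothetical usage log of (feature, timestamp) pairs, not the study’s actual instrumentation:

```python
from collections import Counter
from datetime import datetime

# Hypothetical usage log captured by the installed system: (feature, timestamp) pairs.
usage_log = [
    ("bulletin board", datetime(2001, 5, 3, 14, 10)),
    ("text chat",      datetime(2001, 5, 3, 14, 25)),
    ("games",          datetime(2001, 5, 3, 20, 5)),
    ("bulletin board", datetime(2001, 5, 4, 9, 40)),
]

# Which parts of the system are being used?
uses_by_feature = Counter(feature for feature, _ in usage_log)

# When are they being used? (counts per hour of the day)
uses_by_hour = Counter(timestamp.hour for _, timestamp in usage_log)

print(uses_by_feature.most_common())  # [('bulletin board', 2), ('text chat', 1), ('games', 1)]
print(sorted(uses_by_hour.items()))   # [(9, 1), (14, 2), (20, 1)]
```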

Future of HutchWorld
Evaluation of the effects of the software at the Fred Hutchinson Center. The investigation will include:
–How do the computers and software impact the social wellbeing of the patients and their caregivers?
–What type of computer-based communication best supports this patient community?
–What are the general usage patterns of the system?
–How might any medical facility use computers and software like HutchWorld to provide social support for its patients and caregivers?