Web Content Development Dr. Komlodi Class 25: Evaluative testing

Web Design and Evaluation
– Information organization (Site map)
– User, content, context research (Site scope)
– Labeling and navigation design (Wireframes)
– User-system interaction design (Application flow)
– Graphics design and branding
– Content creation (Content inventory)
– Evaluate

The aims
– Introduction to the goals and methods of user interface evaluation
– Practice methods
– Focus on:
  – Usability evaluation
  – Expert reviews: heuristic evaluation

The need for evaluation
– Usable and useful user interfaces and information architectures need evaluation.
– Evaluation should not be carried out by designers.
– Two main types of evaluation:
  – Formative evaluation is done at different stages of development to check that the product meets users' needs.
  – Summative evaluation assesses the quality of a finished product.
– Our focus is on formative evaluation.

Bruce Tognazzini tells you why you need to evaluate
"Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain."
See AskTog.com for topical discussion about design and evaluation.

Steve Krug tells you why you need to evaluate

What to evaluate
Iterative design & evaluation is a continuous process that examines:
– Early ideas for the conceptual model
– Early prototypes of the new system
– Later, more complete prototypes
Designers need to check that they understand users' requirements.

What To Evaluate: Examples
– Evaluate a paper prototype: Paper prototype usability test
– Evaluate a mockup: Balsamiq Mockups Intro

When to evaluate
– Throughout design, from the first descriptions, sketches, etc. of users' needs through to the final product.
– Design proceeds through iterative cycles of 'design-test-redesign'.
– Evaluation is a key ingredient for a successful design.

Four evaluation paradigms
– 'Quick and dirty'
– Usability testing
– Field studies
– Expert reviews

Quick and dirty
'Quick & dirty' evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

Usability testing
– Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used.
– As the users perform these tasks they are watched and recorded on video, and their key presses are logged.
– This data is used to calculate performance times, identify errors, and help explain why the users did what they did (see the sketch below).
– User satisfaction questionnaires and interviews are used to elicit users' opinions.
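To make the analysis step concrete, here is a minimal sketch of turning a session log into performance times, error counts, and success rates. The file name (usability_log.csv) and its columns are hypothetical illustrations; the slides do not prescribe any particular log format or tooling.

```python
# Minimal sketch: summarize a usability-test log into performance metrics.
# Assumed (hypothetical) CSV format, one row per task attempt:
#   participant,task,seconds,errors,completed
import csv
from collections import defaultdict

def summarize(log_path):
    times = defaultdict(list)    # task -> completion times of successful attempts
    errors = defaultdict(int)    # task -> total error count
    attempts = defaultdict(int)  # task -> number of attempts
    successes = defaultdict(int) # task -> number of completed attempts

    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            task = row["task"]
            attempts[task] += 1
            errors[task] += int(row["errors"])
            if row["completed"] == "yes":
                successes[task] += 1
                times[task].append(float(row["seconds"]))

    for task in sorted(attempts):
        mean_time = (sum(times[task]) / len(times[task])) if times[task] else float("nan")
        rate = successes[task] / attempts[task]
        print(f"{task}: success {rate:.0%}, mean time {mean_time:.1f}s, errors {errors[task]}")

summarize("usability_log.csv")
```

Numbers like these identify which tasks need redesign; the think-aloud notes then explain why users struggled.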

Usability Testing Videos
– With adults: UIQ usability test
– With kids: Math Flash Game usability test

Evaluation: Observation methods
– Define typical user tasks
– Collect background information:
  – Demographic questionnaire
  – Skills questionnaire
– Define success metrics (see the sketch below)
– Collect performance and satisfaction data
– Do not interfere with the user
– Think aloud
  – Prompt: "What are you thinking? What are you doing?"
  – But ask follow-up questions about problems
– Analyze data
– Suggest improvements
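One way to make "define success metrics" operational is to write the targets down before the sessions and check each observed attempt against them. The tasks and thresholds below are hypothetical examples, not part of the course material.

```python
# Minimal sketch: predefined success metrics checked against observed attempts.
from dataclasses import dataclass

@dataclass
class SuccessMetric:
    task: str
    max_seconds: float  # target completion time
    max_errors: int     # acceptable error count

# Hypothetical targets, set before testing begins.
METRICS = [
    SuccessMetric("find a book by title", max_seconds=90, max_errors=1),
    SuccessMetric("renew a checked-out item", max_seconds=120, max_errors=2),
]

def meets_target(metric, observed_seconds, observed_errors):
    """Return True if one observed attempt satisfies the predefined metric."""
    return (observed_seconds <= metric.max_seconds
            and observed_errors <= metric.max_errors)

# Example: one participant's attempt at the first task.
print(meets_target(METRICS[0], observed_seconds=75, observed_errors=0))  # True
```

Fixing the thresholds in advance keeps the evaluation honest: the team decides what counts as success before seeing how users actually perform.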

Usability Testing Exercise
Teams of three:
– Participant
– Test administrator
– Note-taker
Test the following sites:
– USMAI catalog
– Research Port

Usability Testing Exercise: Procedure
1. Whole group: familiarize yourself with the site and try to figure out its goals and intended user group; the note-taker should take notes.
2. The test administrator and note-taker should read and modify the usability evaluation script, including devising two tasks.
3. Conduct the study.
4. Post your notes and lessons learned about the site and the usability evaluation process.

Next class: visit the Usability Lab

Field studies
– Field studies are done in natural settings.
– The aim is to understand what users do naturally and how technology impacts them.
– In product design, field studies can be used to:
  – identify opportunities for new technology
  – determine design requirements
  – decide how best to introduce new technology
  – evaluate technology in use

Expert reviews
– Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
– A key feature of predictive evaluation is that users need not be present.
– Relatively quick and inexpensive.
– Expert reviews entail one-half day to one week of effort, although a lengthy training period may sometimes be required to explain the task domain or operational procedures.
– There are a variety of expert review methods to choose from:
  – Heuristic evaluation
  – Guidelines review
  – Consistency inspection
  – Cognitive walkthrough
  – Formal usability inspection

Expert reviews (cont.)
– Expert reviews can be scheduled at several points in the development process, when experts are available and when the design team is ready for feedback.
– Different experts tend to find different problems in an interface, so 3-5 expert reviewers can be highly productive, as can complementary usability testing.
– The danger with expert reviews is that the experts may not have an adequate understanding of the task domain or user communities.
– Even experienced expert reviewers have great difficulty knowing how typical users, especially first-time users, will really behave.

Heuristic Evaluation Example
– Information visualization tool for intrusion detection
– Project sponsored by the Department of Defense
– Review created by Enrique Stanziola and Azfar Karimullah

Heuristics
We developed heuristics to evaluate the system effectively, examining the following criteria:
– Match between user tasks and the transitions provided on the interface
– Object grouping according to relatedness
– Color usage: accessibility evaluation
– Interface provides just enough information
– Speak the user's language
– User's conceptual model evaluation
– User memory load (design issues)
– Consistency evaluation
– User feedback
– Clearly marked exits
– Shortcuts
– Constructing error messages
– Error handling
– Help and documentation

Findings
The rest of this document presents the individual findings of each expert reviewer. We report each reviewer's comments as he completed the tasks.
Expert Reviewer A:
1. In the File menu, the user's language is not used. There is no term like "New" or "New Session" to indicate the initial step the user must take to start a session.
2. No help is provided.
3. Labels on the color bar in the graph window are too small. The font size is not consistent with the font size used in the 3D graph display.
4. User language: the 'Binding' term used in the menu is hard to understand. The window title 'dGUI' could also be made more meaningful.
5. No keyboard navigation functions are available to the user in the data configuration window.
6. No clue as to how to select a variable (double-clicking) or how to deselect a selected variable. The dragging function is not evident to the user. Balloon help could be useful. Buttons next to the Visualization attribute list have no labels.
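Since the earlier slide notes that 3-5 reviewers working independently are most productive, the findings eventually have to be merged. Here is a minimal sketch of one way to consolidate findings by heuristic and severity; the data structure and the example entries (adapted from Reviewer A's comments above) are illustrative assumptions, though the 0-4 severity scale is Nielsen's commonly used rating.

```python
# Minimal sketch: merge heuristic-evaluation findings from several reviewers.
from collections import defaultdict

findings = [
    # (reviewer, heuristic, severity 0-4, description) -- hypothetical entries
    ("A", "Speak the user's language", 3, "'Binding' menu term is unclear"),
    ("B", "Speak the user's language", 2, "Window title 'dGUI' is cryptic"),
    ("A", "Help and documentation", 4, "No help is provided"),
]

# Group findings by the heuristic they violate.
by_heuristic = defaultdict(list)
for reviewer, heuristic, severity, desc in findings:
    by_heuristic[heuristic].append((severity, reviewer, desc))

# Report heuristics with the worst problems first.
for heuristic, items in sorted(by_heuristic.items(),
                               key=lambda kv: -max(s for s, _, _ in kv[1])):
    print(heuristic)
    for severity, reviewer, desc in sorted(items, reverse=True):
        print(f"  sev {severity} (reviewer {reviewer}): {desc}")
```

Sorting by the worst severity per heuristic gives the design team a prioritized fix list rather than five disconnected reports.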

Heuristic Evaluation Exercise
– Louis Rosenfeld's IA heuristics (2004)
– Select an area of heuristics:
  – Main page
  – Search interface
  – Search results
  – Site-wide & contextual navigation
– Evaluate the UMBC library site in light of these
– Report your results to the class

Choose the evaluation paradigm & techniques
– Goals
– Budgets
– Participants
– Time limits
– Context

Evaluating the 1984 OMS (Olympic Message System)
– Early tests of printed scenarios & user guides
– Early simulations of the telephone keypad
– An Olympian joined the team to provide feedback
– Interviews & demos with Olympians outside the US
– Overseas interface tests with friends and family
– Free coffee and donut tests
– Usability tests with 100 participants
– A 'try to destroy it' test
– Pre-Olympic field test at an international event
– Reliability testing of the system under heavy traffic

Design Example Video
Allison Druin et al.: Designing with and for children
Video:
– Juan Pablo Hourcade, Allison Druin, Lisa Sherman, Benjamin B. Bederson, Glenda Revelle, Dana Campbell, Stacey Ochs & Beth Weinstein (2002). SearchKids: A Digital Library Interface for Young Children. ACM SIGCHI 2002 Conference.
Questions:
– Who: who are the designers, evaluators, and other participants?
– What & how: what evaluation methods are they applying, and how are they using them?