Evaluating the Usability of Web-based Applications: A Case Study of a Field Study
Sam J. Racine, PhD, Unisys Corporation

What does “Usability” mean?
The measure of how a given user operates a given interface in a given context: effectively, efficiently, and with satisfaction.
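These three measures can be made concrete per task. Below is a minimal sketch in Python, assuming hypothetical task records (a completion flag, time on task, and a 1–5 post-task rating); none of these names or numbers come from the study itself.

```python
from dataclasses import dataclass

@dataclass
class TaskRun:
    """One observed attempt at a task (illustrative record, not from the study)."""
    completed: bool   # effectiveness: did the user finish?
    seconds: float    # efficiency: time on task
    rating: int       # satisfaction: post-task score, 1 (worst) to 5 (best)

def usability_summary(runs: list[TaskRun]) -> dict[str, float]:
    """Roll effectiveness, efficiency, and satisfaction up across task runs."""
    done = [r for r in runs if r.completed]
    return {
        "effectiveness": len(done) / len(runs),  # completion rate
        "efficiency_s": (sum(r.seconds for r in done) / len(done)
                         if done else float("nan")),  # mean time on completed tasks
        "satisfaction": sum(r.rating for r in runs) / len(runs),  # mean rating
    }

print(usability_summary([TaskRun(True, 42.0, 4), TaskRun(True, 61.0, 5),
                         TaskRun(False, 95.0, 2)]))
```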

How do we measure usability?
- Goals determine techniques
- Techniques determine test designs
- Test designs determine results
- Results must be interpreted

What is “Contextual Inquiry”?
- Ethnographic observation of users in their environment
- Flexible methodology that produces open-ended results
- Technique that is excellent for beginning or revisiting a UI design process

What does contextual inquiry provide?
An understanding of what users are really doing day to day, in order to translate their tasks into an effective UI design.

Contextual inquiry applied
- Unisys LMS Enterprise Services (ES), a web-based cargo management application
- Basis for “sister” applications
- Basis for other web-based applications in similar environments and markets
- Desire to build on our successes (and failures)

Our goals
- Determine the real-world value that users (not customers) assign to our application
- Learn what users really do, not just what they say they do
- Separate the “lore” from the “reality”

Our technique
Field study methodology to learn about:
- users: their background, needs, and working environment
- interactions with the application: types of tasks, frequency, and completion time
- available support material and derived work-arounds
- ‘training’ and to whom users go for help
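The ‘interactions’ dimension lends itself to simple tallies once observation is over. A sketch, assuming a hypothetical log of (task type, completion time) pairs; the task names are invented, not taken from LMS ES.

```python
from collections import defaultdict

# Hypothetical observations: (task type, completion time in seconds)
observations = [
    ("book cargo", 180.0),
    ("book cargo", 150.0),
    ("trace shipment", 60.0),
    ("print manifest", 240.0),
]

# Tally frequency and mean completion time per task type.
times = defaultdict(list)
for task, seconds in observations:
    times[task].append(seconds)

for task, ts in sorted(times.items()):
    print(f"{task}: observed {len(ts)}x, mean completion {sum(ts) / len(ts):.0f}s")
```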

External factors affecting test design
- Customer site
- Management
- Employees
- Contractors
- Schedules
- Stakeholders
- Logistics

Internal factors affecting test design
- Expertise
- Personnel availability
- Budget
- More stakeholders
- More logistics

Our test design
- Two weeks, two evaluators
- First week: observation; second week: analysis
- First week: ‘note taking’; second week: discussion and response to users’ questions
- Open-ended details:
  - daily revamp of the test design
  - the intervening weekend finalized the second week’s details

“Rules” for data gathering
- Note comments verbatim
- For a survey, repeat questions exactly
- For information, customize your approach
- Note the response, the participant, and the context
- Respect each evaluator’s approach
- Note what users do as well as what they say
- Be flexible
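One way to honor “note comments verbatim” and “note the response, the participant, and the context” is to give every field note that metadata from the moment it is taken. A minimal sketch; all field names and values below are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FieldNote:
    """A single observation, captured with its full context (illustrative only)."""
    quote: str        # the user's words, verbatim -- no paraphrasing
    action: str       # what the user actually did, not just what they said
    participant: str  # who said or did it (anonymized ID)
    context: str      # where, and during which task
    evaluator: str    # which evaluator took the note
    when: datetime    # timestamp, so the order of discovery can be reconstructed

note = FieldNote(
    quote="I always print this screen before I close it.",
    action="printed the booking screen, then closed the session",
    participant="P07",
    context="morning shift, cargo booking desk",
    evaluator="evaluator A",
    when=datetime(2001, 5, 14, 9, 30),
)
print(note.participant, "-", note.quote)
```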

“Rules” for sorting data
- Pay attention to what users do, but don’t discount what they say
- Allow categories to emerge from the data
- Categorize after collecting data: no preconceptions
- Assign weight according to user needs and your own, separately
- Don’t dismiss anomalies: measure them against the participant and context
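Two of these rules translate directly into how findings might be scored: tag notes with categories only after collection, and keep user-assigned weight and evaluator-assigned weight as separate numbers rather than averaging them away. A sketch with invented categories and scores:

```python
from collections import defaultdict

# (emergent category, user-assigned weight, evaluator-assigned weight),
# tagged only AFTER all notes were collected -- hypothetical values.
tagged_notes = [
    ("navigation", 3, 1),
    ("navigation", 2, 2),
    ("terminology", 1, 3),
    ("printing workaround", 3, 3),
]

totals = defaultdict(lambda: {"count": 0, "user": 0, "evaluator": 0})
for category, user_w, eval_w in tagged_notes:
    bucket = totals[category]
    bucket["count"] += 1
    bucket["user"] += user_w       # the two weightings stay separate:
    bucket["evaluator"] += eval_w  # users and evaluators may disagree

for category, b in sorted(totals.items(), key=lambda kv: -kv[1]["count"]):
    print(f"{category}: n={b['count']}, "
          f"user weight={b['user']}, evaluator weight={b['evaluator']}")
```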

“Rules” for dealing with users and stakeholders
- Work according to their comfort level, not yours
- Treat each user as a CEO
- Know which forms of communication your stakeholders value
- Be prepared with multiple modes of communication, including nonverbal ones like diagramming or picture drawing; bring a camera
- Appreciate, don’t bribe

The test
Step 1: Preparation
- Establish climate and expectations
- Announcement to participants
- Confirmation of arrival logistics
- Practice observation
Step 2: Observation only
- Gather data: “Take a note”
- No “answers” or “corrections”
- Participant survey and identification
- Investigation of concerns expressed by managers and users
- Note everything!

The test
Step 3: Analysis
- Sort observations into organic categories:
  - sites of difficulty
  - clustered impressions
  - patterns
  - preliminary findings
  - training opportunities
Step 4: Directive activities
- Test categories and verify findings:
  - cognitive walkthroughs
  - directed activities
  - demonstrations

Findings: Data
- The Internet is a new creature: changed expectations, changed operation
- Training and orientation are two different things
- Users value accuracy
- Domain experts require a supportive UI
- “Old” eyes are everywhere

Findings: Methodology
- A ‘true’ source exists for most complaints
- Multiple sources don’t guarantee validity
- The order in which data is discovered affects interpretation
- Significance is assigned by both users and evaluators

What did we do well?
- Had patience, patience, patience
- Staggered across times and shifts
- Swallowed our pride and rejected seduction
- Asked for help
- Made the most of what we had
- Timed our exit
- Brought a camera

What could we have improved?
- Extended to multiple sites
- Sought a different quarter
- Better pre-testing, identification, and administration of surveys
- Brought a video camera

Recommendations
Contextual inquiry requires:
- observation
- communication
- detection
- a theoretical foundation
In other words, use a professional!

Questions?

Thank you!
Sam Racine, Unisys Corporation