1
Evaluating the Usability of Web-based Applications
A Case Study of a Field Study
Sam J. Racine, PhD
Unisys Corporation
2
What does “Usability” mean?
The measure of how a given user operates a given interface in a given context: effectively, efficiently, and with satisfaction
3
How do we measure usability?
Goals determine techniques
Techniques determine test designs
Test designs determine results
Results must be interpreted
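As a minimal illustration of the effectiveness, efficiency, and satisfaction framing from the previous slide (not part of the original study), the hypothetical sketch below rolls the three measures up from a list of observed task attempts. The record fields and sample data are assumptions for illustration only.

from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskAttempt:
    """One observed attempt at a task (hypothetical record format)."""
    participant: str
    task: str
    completed: bool      # feeds effectiveness (completion rate)
    seconds: float       # feeds efficiency (time on task)
    satisfaction: int    # e.g. a 1-5 post-task rating

def summarize(attempts):
    """Compute the three classic usability measures from raw attempts."""
    done = [a for a in attempts if a.completed]
    return {
        "effectiveness": len(done) / len(attempts),          # share of tasks completed
        "efficiency_s": mean(a.seconds for a in done),       # mean time on successful tasks
        "satisfaction": mean(a.satisfaction for a in attempts),
    }

if __name__ == "__main__":
    sample = [
        TaskAttempt("P1", "book cargo", True, 312.0, 4),
        TaskAttempt("P2", "book cargo", False, 540.0, 2),
        TaskAttempt("P3", "track shipment", True, 95.0, 5),
    ]
    print(summarize(sample))

How these numbers are interpreted still depends on the goals and context named above; the code only shows that the three measures are distinct quantities gathered from the same observations.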
4
What is “Contextual Inquiry”?
Ethnographic observation of users in their environment
Flexible methodology that produces open-ended results
Technique that is excellent for beginning or revisiting a UI design process
5
What does contextual inquiry provide?
An understanding of what users are really doing day to day, in order to translate their tasks into an effective UI design
6
Contextual inquiry applied
Unisys LMS Enterprise Services (ES), a web-based cargo management application
Basis for “sister” applications
Basis for other web-based applications in similar environments and markets
Desire to build on our successes (and failures)
7
Our goals
Determine the real-world value that users (not customers) assign to our application
Learn what users really do, not just what they say they do
Separate the “lore” from the “reality”
8
Our technique
Field study methodology to learn about:
users: their background, needs, and working environment
interactions with the application: types of tasks, frequency, and completion time
available support material and derived work-arounds
‘training’ and to whom users go for help
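One way to keep field notes consistent with the items above is to capture each observation in a fixed record: who, in what context, doing which task, for how long, with what verbatim comment and work-around. The sketch below is a hypothetical note-taking schema, not the format the study actually used; all field and file names are invented.

import csv
from dataclasses import dataclass, asdict, field
from datetime import datetime

@dataclass
class FieldNote:
    """A single contextual-inquiry observation (illustrative schema only)."""
    participant: str     # anonymized ID, e.g. "P4"
    context: str         # shift, desk, surrounding activity
    task: str            # what the user was trying to do
    duration_s: float    # rough completion time
    verbatim: str = ""   # the user's words, recorded exactly
    workaround: str = "" # support material or derived work-around observed
    timestamp: str = field(
        default_factory=lambda: datetime.now().isoformat(timespec="seconds"))

def save_notes(notes, path="field_notes.csv"):
    """Append notes to a CSV so both evaluators can merge their logs later."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(notes[0])))
        if f.tell() == 0:      # new file: write the header once
            writer.writeheader()
        for n in notes:
            writer.writerow(asdict(n))

save_notes([FieldNote("P4", "night shift, shared terminal",
                      "re-weigh shipment", 210.0,
                      verbatim="I always print this screen first.",
                      workaround="taped cheat sheet beside monitor")])

Keeping verbatim comments and context in the same record supports the data-gathering rules that follow: comments are noted word for word, and every response stays tied to its participant and situation.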
9
External factors affecting test design
Customer site
Management
Employees
Contractors
Schedules
Stakeholders
Logistics
10
Internal factors affecting test design
Expertise
Personnel availability
Budget
More stakeholders
More logistics
11
Our test design
Two weeks, two evaluators
First week observation; second week analysis
First week ‘note taking’; second week discussion and response to users’ questions
Open-ended details:
Daily revamp of test design
Intervening weekend finalized second week’s details
12
“Rules” for data gathering
Note comments verbatim
For a survey, repeat questions exactly
For information, customize the approach
Note the response, the participant, and the context
Respect each evaluator’s approach
Note what users do as well as what they say
Be flexible
13
“Rules” for sorting data
Pay attention to what users do, but don’t discount what they say
Allow categories to emerge from the data
Categorize after collecting data: no preconceptions
Assign weight according to user needs and your own, separately
Don’t dismiss anomalies: measure them against the participant and context
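To make the “allow categories to emerge” and “assign weight separately” rules concrete, here is a small hypothetical sketch: notes are tagged only after collection, grouped under whatever category labels emerged, and user-assigned and evaluator-assigned weights are kept side by side. The category names, weights, and note text are all invented for illustration.

from collections import defaultdict

# Notes are tagged after collection; categories are whatever labels emerged.
notes = [
    {"text": "Printed the screen before every booking", "category": "work-around",
     "user_weight": 3, "evaluator_weight": 5},
    {"text": "Couldn't find the re-weigh option", "category": "navigation",
     "user_weight": 5, "evaluator_weight": 4},
    {"text": "Font too small on the manifest view", "category": "readability",
     "user_weight": 4, "evaluator_weight": 2},   # an anomaly: kept, not dismissed
]

def cluster(notes):
    """Group notes by emergent category, keeping both weightings separate."""
    groups = defaultdict(lambda: {"notes": [], "user_weight": 0, "evaluator_weight": 0})
    for n in notes:
        g = groups[n["category"]]
        g["notes"].append(n["text"])
        g["user_weight"] += n["user_weight"]
        g["evaluator_weight"] += n["evaluator_weight"]
    return dict(groups)

for category, g in sorted(cluster(notes).items(),
                          key=lambda kv: kv[1]["user_weight"], reverse=True):
    print(category, g["user_weight"], g["evaluator_weight"], g["notes"])

Sorting by the user-assigned weight while still printing the evaluator weight keeps the two perspectives visible, rather than collapsing them into a single score too early.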
14
“Rules” for dealing with users and stakeholders
Work according to their comfort level, not yours
Treat each user as a CEO
Know the forms of communication valued by stakeholders
Be prepared with multiple modes of communication: nonverbal options such as diagramming or picture drawing; bring a camera
Appreciate, don’t bribe
15
The test
Step 1: Preparation
Establish climate and expectations
Announcement to participants
Confirmation of arrival logistics
Practice observation
Step 2: Observation only
Gather data: “Take a note”
No “answers” or “corrections”
Participation survey and identification
Investigation of concerns expressed by managers and users
Note everything!
16
The test
Step 3: Analysis
Sort observations according to organic categories:
Sites of difficulty
Clustered impressions
Patterns
Preliminary findings
Training opportunities
Step 4: Directive activities
Test categories and verify findings:
Cognitive walkthroughs
Directed activities
Demonstrations
17
Findings: Data
The Internet is a new creature: changed expectations, changed operation
Training and orientation are two different things
Users value accuracy
Domain experts require a supportive UI
“Old” eyes are everywhere
18
Findings: Methodology
A ‘true’ source exists for most complaints
Multiple sources don’t guarantee validity
The order in which data is discovered affects interpretation
Significance is assigned by users and by evaluators
19
What did we do well?
Had patience, patience, patience
Staggered observations across times and shifts
Swallowed our pride and rejected seduction
Asked for help
Made the most of what we had
Timed our exit
Brought a camera
20
What could we have improved?
Extended to multiple sites
Sought a different quarter
Better pre-testing, identification, and administration of surveys
Brought a video camera
21
Recommendations
Contextual inquiry requires:
Observation
Communication
Detection
Theoretical foundation
In other words, use a professional!
22
Questions?
23
Thank you!
Sam Racine, Unisys Corporation
sam.racine@unisys.com