Is it Worth the Hassle? Exploring the Added Value of Evaluating the Usability of Context-Aware Mobile Systems in the Field
Jesper Kjeldskov, Mikael B. Skov, Benedikte S. Als, and Rune T. Høegh
Presented by Mikael B. Skov
Department of Computer Science, Aalborg University, Denmark

Slide 2: Motivation
- We need to investigate the criteria, methods, and data collection techniques for usability evaluation of mobile systems (Johnson 1998)
- It is often assumed that usability evaluations of mobile devices should be done in the field:
  "… the scaling dimensions that characterize context-aware systems makes it impossible to use traditional, contained usability laboratories …" (Abowd and Mynatt 2000)
- Yet Kjeldskov and Graham (2003) found that 71% of mobile device evaluations were done as laboratory experiments

Slide 3: Motivation (figure only)

Slide 4: Aim
- To compare the outcome of evaluating the usability of a mobile system in a laboratory and in the field, respectively
- To describe techniques used for improving the realism of laboratory settings by including mobility and context, and for supporting high-quality video data collection in the field

Slide 5: System: MobileWARD
- Wireless access to the electronic patient record (EPR) on a handheld computer
- Information and functionality adapted to location, time, and the nurse's assignments

Slide 6: System: MobileWARD
- In the corridor: overview of all patients, assigned patients, and pending tasks; direct access to details about each individual patient's history
- Entering a ward: overview of the patients in the ward; scanning a patient's wristband gives access to entering new measurements
- General: buttons are sized to allow interaction with either finger or pen
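To make the context-aware behaviour on this slide concrete, here is a minimal sketch of location- and scan-driven view switching. It is an illustration under stated assumptions: the Context class, select_view function, and view names are hypothetical and are not MobileWARD's actual implementation.

```python
# Hypothetical sketch of context-driven view switching (not MobileWARD's code).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    location: str                          # e.g. "corridor" or "ward-1"
    scanned_patient: Optional[str] = None  # patient ID from a wristband scan, if any

def select_view(ctx: Context) -> str:
    """Pick which screen to show based on the nurse's current context."""
    if ctx.scanned_patient is not None:
        # A scanned wristband gives direct access to entering new measurements
        return f"measurement-entry:{ctx.scanned_patient}"
    if ctx.location == "corridor":
        # Corridor: overview of all/assigned patients and pending tasks
        return "patient-overview"
    # Inside a ward: overview of the patients in that ward
    return f"ward-overview:{ctx.location}"

print(select_view(Context(location="corridor")))                      # patient-overview
print(select_view(Context(location="ward-1")))                        # ward-overview:ward-1
print(select_view(Context(location="ward-1", scanned_patient="p42"))) # measurement-entry:p42
```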

Slide 7: Method
- Laboratory evaluation:
  - Usability lab at Aalborg University, Denmark
  - 6 test subjects (trained nurses)
  - Tasks derived from a user study
  - Laboratory furnished as a hospital, divided into two wards plus a corridor
- Field evaluation:
  - Frederikshavn Hospital, Denmark
  - 6 test subjects (trained nurses)
  - No specified tasks; involving real work activities

Slide 8: Method
- Mobile usability equipment enabled the capture of video and audio
- Usability problems were classified as cosmetic, serious, or critical (Molich 2000)
- All sessions were analyzed in random order by two teams of trained usability evaluators
- The two teams each produced a list of usability problems; these were merged into one complete list
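The slide does not say how the two lists were combined; purely as an illustration, the sketch below shows one way two problem lists could be merged and tallied by severity. The example problems and the rule of matching duplicates by identical description are assumptions, not the authors' actual procedure.

```python
# Illustrative merge of two evaluator teams' problem lists
# (assumed data and matching rule; not the authors' actual procedure).
from collections import Counter

team_a = [("Scan button too small", "serious"),
          ("Unclear label on ward overview", "cosmetic")]
team_b = [("Scan button too small", "serious"),
          ("Automatic update discards entered data", "critical")]

merged = {}  # description -> severity; first classification wins
for description, severity in team_a + team_b:
    merged.setdefault(description, severity)

print(len(merged), "unique problems")   # 3 unique problems
print(Counter(merged.values()))         # tally per severity class
```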

Slide 9: Findings (1)
- 37 different usability problems in total
- The lab evaluation resulted in 36 problems: 8 critical, 18 serious, and 10 cosmetic
- The field evaluation resulted in 23 problems: 7 critical, 10 serious, and 6 cosmetic
- The lab primarily identified more serious and cosmetic problems

Slide 10: Findings (2)
- The lab revealed more problems per session: 18.8 (2.0) versus 11.8 (3.3) problems (U=2.651, p<0.01)
- Critical: 5.3 (1.2) versus 4.5 (2.2) problems
- Serious: 7.5 (1.0) versus 4.5 (0.8) problems
- Cosmetic: 6.0 (0.9) versus 2.8 (1.0) problems
- The lab identified significantly more serious (U=2.79, p<0.01) and cosmetic (U=2.84, p<0.01) problems
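For readers unfamiliar with the test reported above, here is a minimal sketch of how a Mann-Whitney comparison of per-session problem counts can be run. The per-session counts are hypothetical placeholders, not the study's raw data, and SciPy returns the raw U statistic and p-value, which need not match the values printed on the slide.

```python
# Mann-Whitney comparison of per-session problem counts.
# The counts below are hypothetical placeholders, NOT the study's raw data.
from scipy.stats import mannwhitneyu

lab_counts   = [18, 17, 20, 19, 21, 18]  # six lab sessions (made up)
field_counts = [12, 9, 15, 11, 16, 8]    # six field sessions (made up)

result = mannwhitneyu(lab_counts, field_counts, alternative="two-sided")
print(f"U = {result.statistic}, p = {result.pvalue:.4f}")
```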

Slide 11: Field Evaluations Revisited (1)
- Little added value in taking the evaluation into the field: the same problems were found in the laboratory
- Field contribution: the validity of the data entered into the system
- The lack of control limited what the field evaluation could cover; none of the field subjects used the note-taking facility
- The higher number of problems identified in the lab condition could be a result of irrelevant usability problems

Slide 12: Field Evaluations Revisited (2)
- Both the lab and the field revealed context-awareness problems: all seven context-awareness-related problems were found in both conditions
- All field subjects became confused when the system automatically updated information or functionality according to their physical location
- The clip-on camera facilitated data collection of mobile use: the configuration allowed subjects to move freely in the environment while still providing a close-up view of the interaction
- However, there were problems with where to place the devices between uses

Slide 13: Conclusions
- Was it worth the hassle? Not really, at least not for usability problem identification
- However, the real use situation provided additional information on use
- Replicating the context: is it always possible?
  - Lab evaluation without context replication
  - Field evaluation with task assignments

Slide 14: Questions…