Instant Data Analysis (IDA): Evaluating Usability in a Day
Jesper Kjeldskov, Mikael B. Skov, Jan Stage

2 Overview
Experiences with conventional evaluations
Instant Data Analysis (IDA): basic idea
Participants and materials, procedure and roles
The IDA session
IDA facilitator: during and after the IDA session
Experiment
Findings: usability problems
Comparison with ad-hoc analysis
Conclusion
Trade-off: approach and resources

3 Experiences with Conventional Evaluations
The value of usability evaluations has become widely acknowledged in the software industry
However, the time and other resources available for evaluating usability are often highly constrained
Typical required effort: a substantial number of man-hours, a large share of which is spent on data analysis
Aim: allow usability evaluations to be conducted, analyzed, and documented in a day: Instant Data Analysis
… but without sacrificing a systematic and user-oriented approach

4 IDA: Basic Idea
Designed to be combined with user-based think-aloud testing
Exploits the fact that a typical think-aloud test already involves a test monitor and a data logger with a high level of usability expertise, who often gain insight into key usability problems quickly
Systematically captures a valuable moment of insight into the usability of a system that would otherwise have to be reconstructed during later video analysis (and is sometimes lost…)
Replaces video analysis and transcription of log files
Makes it possible to complete a usability evaluation in a day (using 4-6 test subjects)

5 Participants and Materials
4-6 test subjects
1 test monitor
1 data logger
1 IDA facilitator (not present during the tests)
1 software system
1 whiteboard or flip chart
Printed screenshots of the system (optional)

6 Procedure
Tests (4-6 hours)
Conduct 4-6 think-aloud sessions with the test monitor and the data logger (taking notes) present
Analysis (2-2½ hours)
Conduct a 1-hour brainstorming and data analysis session:
Articulate and discuss the most critical problems of the system
Rate the severity of the problems (e.g. as critical, serious, or cosmetic) and categorize them into themes as they emerge
The discussion is managed by the IDA facilitator, who asks clarifying questions and writes the problems on a whiteboard or flip chart
Use printed screenshots and written notes to maintain an overview
Spend 1-1½ hours writing up the content of the whiteboard as a ranked list of problems with clear references to the system
Review the problem list together to reach final consensus
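
A quick sanity check that this schedule fits a single working day; the Python sketch below uses assumed per-session timings (all numbers are illustrative choices within the ranges stated on the slide, not taken from the paper):

# Back-of-the-envelope check that a full IDA evaluation fits in one working day.
n_subjects = 5            # slide recommends 4-6 test subjects
session_min = 50          # one think-aloud session, assumed
turnaround_min = 10       # reset between subjects, assumed
ida_session_min = 60      # brainstorming/analysis session (1 hour)
write_up_min = 90         # ranked problem list (1-1.5 hours)

tests = n_subjects * session_min + (n_subjects - 1) * turnaround_min
total = tests + ida_session_min + write_up_min
print(f"tests: {tests / 60:.1f} h, whole evaluation: {total / 60:.1f} h")
# -> tests: 4.8 h, whole evaluation: 7.3 h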

7 Roles in IDA
There are three roles to be filled in IDA:
Test monitor: 1 person
Data logger: at least 1 person
IDA session facilitator: 1 person

8 Test Monitor and Data Logger
The test monitor's responsibility during the evaluation session covers the traditional test monitor duties, e.g.:
Ensures that the participants understand what will happen and are put at ease as much as possible
Administers the test
Makes sure data is gathered
Debriefs the participants
The data logger's responsibility:
Records incidents and problems, possibly according to a standard agreed upon up front
The logged data will be used in the following IDA session
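
As an illustration of such an agreed-upon logging standard, here is a minimal Python sketch of a possible log entry format; the field names and the placement of the severity rating are assumptions, not taken from the paper:

# A minimal sketch of a shared incident-logging format. The severity scale
# matches the one used later in the IDA session (critical/serious/cosmetic).
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Severity(Enum):
    CRITICAL = 1
    SERIOUS = 2
    COSMETIC = 3

@dataclass
class Incident:
    subject: int                         # test subject number (1-6)
    task: str                            # task the subject was solving
    time: str                            # time into the session, e.g. "00:12:40"
    description: str                     # what the subject did or said
    screen: str                          # GUI reference, e.g. "booking dialog"
    severity: Optional[Severity] = None  # usually rated later, in the IDA session

log = [
    Incident(subject=3, task="book a meeting room", time="00:12:40",
             description="could not find the confirm button",
             screen="booking dialog"),
]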

9 The IDA Session
The IDA session is a one-hour brainstorming and analysis session
The test monitor and data logger articulate and discuss the most critical usability problems identified in the evaluation sessions
Screenshots of the system are a good tool for jogging the memory
Usability problems should also be rated according to their severity
Goal: to identify the most critical usability problems, not to find as many problems as possible

10 IDA Facilitator: During the IDA Session
The IDA facilitator's responsibility is to support the brainstorming and analysis session by:
Asking for clarifications
Writing identified usability problems on a whiteboard
Categorizing problems into themes
The hard part is keeping track of all the information!

11 IDA Facilitator: After the IDA Session
After the IDA session it is the IDA facilitator's responsibility to go through the identified usability problems and write them up as a ranked list (1-1½ hours)
The list should include short descriptions of the problems and clear references to the system, such as pointers to specific parts of the GUI (like an ordinary problem list)
As the last step of the IDA method, the test monitor, the data logger, and the IDA facilitator run through the ranked list of usability problems to ensure consensus
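
Reusing the Incident and Severity sketch from the data logger slide, the facilitator's write-up step can be pictured as collapsing duplicate notes into single problems and ranking them, most severe first. This is only one plausible reading of the slide; the grouping key and the tie-breaking rule are assumptions:

# A sketch of the write-up step: collapse duplicate notes into one problem
# each, then rank by severity and, within a severity level, by how many
# subjects experienced the problem.
from collections import defaultdict

def ranked_problem_list(incidents):
    groups = defaultdict(list)
    for inc in incidents:
        groups[(inc.description, inc.screen)].append(inc)  # assumed duplicate key
    ranked = []
    for (description, screen), hits in groups.items():
        worst = min(h.severity.value for h in hits)  # assumes all rated by now
        subjects = {h.subject for h in hits}
        ranked.append((worst, -len(subjects), description, screen))
    ranked.sort()  # critical first; within a level, most widely experienced first
    return [(Severity(worst).name, -neg, desc, scr)
            for worst, neg, desc, scr in ranked]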

12 Experiment
We studied the use of Instant Data Analysis through an exploratory experiment
Purpose:
Gaining practical experience with the use of the technique
Comparing results produced “instantly” with results from traditional video data analysis
Identifying opportunities and challenges for improving IDA
The system: resource booking at a large hospital
Participants:
5 test subjects
1 test monitor
1 data logger
1 IDA facilitator
2 observers (developers from the software company)

13 Findings: Usability Problems

            Instant Data Analysis   Video Data Analysis   Total
Critical    11                      12                     13
Serious     15                      15                     22
Cosmetic    15                      19                     27
Total       41                      46                     62
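
The percentages quoted on the following slides follow directly from these counts; a quick arithmetic check in Python:

# Sanity check: the percentages on the next slides follow from the table above.
critical_ida, critical_vda, critical_total = 11, 12, 13
print(f"critical, IDA: {critical_ida / critical_total:.0%}")   # 85%
print(f"critical, VDA: {critical_vda / critical_total:.0%}")   # 92%

serious_each, serious_total = 15, 22
print(f"serious, either: {serious_each / serious_total:.0%}")  # 68%

cosmetic_ida, cosmetic_total = 15, 27
print(f"cosmetic, IDA: {cosmetic_ida / cosmetic_total:.0%}")   # 56%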

14-16 Findings: Distribution of Problems
(The accompanying figure is a grid in which a black square marks a usability problem identified by the corresponding technique.)
Critical problems:
Both approaches identified nearly all of the 13 critical problems (85% for IDA, 92% for VDA)
The two critical problems not identified by IDA were related to:
1. User frustration due to slow system responses
2. A software bug
Serious problems:
IDA and VDA each identified 68% of all experienced serious problems
8 problems were identified by both approaches
Cosmetic problems:
IDA identified 56% of all experienced cosmetic problems
7 problems were identified by both approaches
11 of the 12 cosmetic problems identified only by VDA were experienced by just one of the five test subjects (unique problems)

17 Compared to Ad-Hoc Analysis
The two developers who observed the tests made their own list of problems the day after the tests
They employed an ad-hoc approach (no structured method)
They identified 8 usability problems
When they read the report, they discovered several usability problems that they had forgotten or had not even noticed

18 Conclusion
Instant Data Analysis can…
Assist usability researchers in quickly identifying most of the critical and serious usability problems experienced by users in a think-aloud evaluation
Be conducted in 10% of the time required for traditional video data analysis (analysis: 4 man-hours compared to 40 man-hours)
Reduce the noise of unique (false?) usability problems
Provide closure for the evaluators by capturing an immediate response to a long day of evaluation
Qualitatively, the serious problems identified only by Instant Data Analysis were on a higher level of abstraction:
Often related to more general usability issues than the problems identified through video data analysis
May be attributed to the test monitor and data logger not having “direct” access to the data during analysis, which forces them to analyze at a higher level of abstraction

19 Trade-Off: Approach and Resources
Effort: the time spent on the evaluation
Structuration: the amount of explicit and systematic method elements used to guide the evaluation
(Figure: ad-hoc analysis, IDA, and video-based data analysis (VBDA) plotted against the two axes Effort and Structuration, with a question mark marking a yet-unexplored combination.)