ICSE2006 Far East Experience Track
Detecting Low Usability Web Pages using Quantitative Data of Users’ Behavior
Noboru Nakamichi (1), Makoto Sakai (2), Kazuyuki Shima (3), Ken’ichi Matsumoto (1)
(1) Nara Institute of Science and Technology
(2) SRA Key Technology Laboratory, Inc.
(3) Hiroshima City University

Background
Designing attractive Web sites is an essential issue in business
- Web sites directly reflect the image and sales of companies [1]
Examples of usability problems [2]
- Long scrolling pages
- Dead links
- Inconsistency

[1] Kelly Goto, Emily Cotler: “Web ReDesign,” Pearson Education.
[2] Jakob Nielsen: “Designing Web Usability,” New Riders Publishing, 1999.

Web usability evaluation
Goal: to find usability problems in a Web site
Usability testing
- Discovers problems based on users’ behavior recorded on video (VTR)
- Discovers serious problems that evaluators usually do not find otherwise
- Analyzing the recorded data takes time, because evaluators have to check every Web page that the users browsed
[Figure: Web pages → Evaluation]

Research goal and approach
Goal: to empirically verify a proposed quantitative usability evaluation method
- Various kinds of quantitative data on users’ behavior have been proposed for evaluation:
  - Gazing point
  - Browsing time
  - Mouse movement
- With these data, we hope to identify low usability Web pages efficiently
[Figure: Web pages → Evaluation → Low usability Web pages]

Experiment
Question: which quantitative data can discriminate low usability Web pages?
- Confirm the relationship between the users’ evaluation results and the various quantitative data
Users’ evaluation (5 levels):
- hard to use
- relatively hard to use
- relatively easy to use
- easy to use
- don’t know
Quantitative data:
- Browsing time
- Moving distance of mouse
- Moving speed of mouse
- Wheel rolling
- Moving distance of gazing points
- Moving speed of gazing points

Outline of the experiment
Subjects
- 10 users who are familiar with the Internet
- None of them had ever visited the sites used in the experiment
Tasks
- Target information: the starting salary for a new master’s degree graduate
- Web sites: the sites of 5 companies
- The order of the sites was randomized for each subject

Experimental setting
Quantitative data of users’ behavior
- Gazing point
- Browsing time
- Mouse movement
Integrated usability evaluation tool: WebTracer
- Uses eye tracking equipment, so we can monitor what the subject was actually looking at
[Screenshot: WebTracer showing the mouse cursor, eye mark, VCR-like replay functions, and page selector]

Experimental procedure
1. Task
- Quantitative data of the user’s behavior are recorded using WebTracer
- The task is carried out without interruptions for questions
2. Evaluation by subjects
- The Web pages that the subject visited are displayed
- The subject rates the ease of use of every visited Web page on the 5 levels
3. Interview
- Subjects comment on their search while the recording of their behavior is replayed

Analysis
Question: which quantitative data can discriminate low usability Web pages?
- We test the hypothesis that the quantitative data differ between “low usability” and “others” pages
Grouping of the evaluations by subjects:
- hard to use ⇒ “Low usability” pages
- relatively hard to use, relatively easy to use, easy to use, don’t know ⇒ “Others”
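A trivially small sketch of this grouping step, assuming each visited page carries the subject’s rating as a string (the names below are illustrative, not from the paper):

```python
# Sketch: map the 5-level subject ratings to the two analysis groups.
LOW_USABILITY_RATINGS = {"hard to use"}

def group(rating):
    """Return the analysis group for one page's rating."""
    return "low usability" if rating in LOW_USABILITY_RATINGS else "others"

ratings = ["hard to use", "easy to use", "don't know", "relatively hard to use"]
print([group(r) for r in ratings])  # ['low usability', 'others', 'others', 'others']
```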

t-test of “low usability” vs. “others” pages
The table compares the quantitative data per page for the “low usability” pages (18 pages) and the “others” pages (174 pages), giving the average and standard deviation for each group and marking the t-tests whose significance probability is P < 0.05.
Rows of the table (the numeric values are not reproduced in this transcript):
- Browsing time (sec)
- Moving distance of mouse (pixel)
- Moving speed of mouse (pixel/sec)
- Wheel rolling (delta)
- Moving distance of gazing points (pixel)
- Moving speed of gazing points (pixel/sec)
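A minimal sketch of how such a per-variable comparison could be run, assuming the per-page measurements of the two groups are available as plain lists; the numbers, the choice of Welch’s t-test, and the 0.05 threshold are illustrative assumptions, not the paper’s settings or data:

```python
# Sketch: compare one behavioral measure between "low usability" and "other" pages.
# The values below are made-up placeholders, not the experiment's data.
from scipy import stats

low_usability = [412.0, 380.5, 455.2, 390.1]          # e.g. gazing-point speed (pixel/sec)
others = [210.3, 198.7, 225.9, 240.0, 205.4, 231.8]   # same measure on the remaining pages

t_stat, p_value = stats.ttest_ind(low_usability, others, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("significant difference between the two groups (P < 0.05)")
```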

Analysis
Question: which quantitative data can discriminate low usability Web pages?
Differences between the quantitative data for “low usability” and “others” pages:
- The moving distance of the gazing point is longer
- The moving speed of the gazing point is higher

Analysis
Question: which quantitative data can discriminate low usability Web pages?
Differences between the quantitative data for “low usability” and “others” pages:
- The moving distance of the gazing point is longer
- The moving speed of the gazing point is higher
Next step: discriminant analysis of “low usability” pages

Discriminant functions
A discriminant function was built for each kind of quantitative data; if a page’s discriminant score is above the discriminant boundary, the page is discriminated as low usability.
The table lists, for each kind of quantitative data, the discriminant coefficient, the constant term, and the discriminant boundary (the numeric values are not reproduced in this transcript):
- Browsing time (sec)
- Moving distance of mouse (pixel)
- Moving speed of mouse (pixel/sec)
- Wheel rolling (delta)
- Moving distance of gazing points (pixel)
- Moving speed of gazing points (pixel/sec)
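As a rough illustration of how a one-variable discriminant function of this form could be applied, here is a small sketch; the coefficient, constant term, and boundary are placeholder values, since the slide’s actual numbers are not preserved here:

```python
# Sketch of a one-variable linear discriminant: score = coefficient * x + constant.
# A page whose score exceeds the discriminant boundary is discriminated as low usability.
# All numeric values are illustrative placeholders.

def classify_page(value, coefficient, constant, boundary):
    """Return True if the page is discriminated as low usability."""
    score = coefficient * value + constant
    return score > boundary

# Hypothetical example: moving speed of gazing points (pixel/sec) for one page.
gaze_speed = 350.0
is_low_usability = classify_page(gaze_speed, coefficient=0.004, constant=-1.2, boundary=0.0)
print("low usability" if is_low_usability else "others")
```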

Test of statistical hypotheses
The discrimination results are compared with the evaluation results by the subjects:
- Power of test (1 − β): the subject’s evaluation is low usability and the discrimination result is low usability
- Type II error β: the subject’s evaluation is low usability but the discrimination result is others
- Type I error α: the subject’s evaluation is others but the discrimination result is low usability
- Power of test (1 − α): the subject’s evaluation is others and the discrimination result is others

                                Evaluation result by subject
Discrimination result           Low Usability            Others
Low Usability                   Power of test (1 − β)    Type I error α
Others                          Type II error β          Power of test (1 − α)
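The four cells of this table can be computed directly from the paired labels; below is a small sketch under the assumption that the subjects’ evaluations and the discrimination results are available as parallel boolean lists (True = “low usability”), with invented example data:

```python
# Sketch: compute power of test (1 - beta), Type II error beta,
# Type I error alpha, and power of test (1 - alpha) from paired labels.
# True means "low usability", False means "others"; the example lists are invented.

def detection_rates(subject_labels, discrimination_results):
    pos = [d for s, d in zip(subject_labels, discrimination_results) if s]
    neg = [d for s, d in zip(subject_labels, discrimination_results) if not s]
    power_1_beta = sum(pos) / len(pos)   # low-usability pages that were detected
    beta = 1.0 - power_1_beta            # low-usability pages that were missed
    alpha = sum(neg) / len(neg)          # "others" pages wrongly flagged as low usability
    power_1_alpha = 1.0 - alpha          # "others" pages correctly classified as others
    return power_1_beta, beta, alpha, power_1_alpha

subjects = [True, True, True, False, False, False, False]
detector = [True, True, False, False, True, False, False]
print(detection_rates(subjects, detector))
```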

Discriminant analysis result
We focus on clarifying which quantitative data can detect low usability pages.
The table gives, for each kind of quantitative data, the power of test (1 − β) in pages and %, the Type II error β (%), the Type I error α (%), and the power of test (1 − α) (%). Rows (numeric values not reproduced in this transcript):
- Browsing time (sec)
- Moving distance of mouse (pixel)
- Moving speed of mouse (pixel/sec)
- Wheel rolling (delta)
- Moving distance of gazing points (pixel)
- Moving speed of gazing points (pixel/sec)

Discriminant analysis result
14 of 18 low usability pages are detectable.
[Same table as the previous slide, with the row for the moving speed of gazing points highlighted]

Analysis
Question: which quantitative data can discriminate low usability Web pages?
Differences between the quantitative data for “low usability” and “others” pages:
- The moving distance of the gazing point is longer
- The moving speed of the gazing point is higher
Discriminant analysis of “low usability” pages:
- The moving speed of the gazing point is the best detector
- 14 of 18 low usability pages are detectable (= 77.8%)

The moving speed of the gazing point is the most discriminating quantitative data.
[Scatter plot of low usability pages and other pages: browsing time (sec) versus moving distance of the gazing points (pixel), showing the discriminant boundary for the moving speed of the gazing point and the pages the subjects evaluated as low usability]

Discriminant analysis result
To make the power of test (1 − β) higher:
- Add wheel rolling as an indicator: a large amount of wheel rolling detects 3 of the 4 remaining low usability pages
[Same table as before, with the wheel rolling row highlighted]

Relationship between gazing point and wheel rolling
[Scatter plot of the pages by the subjects’ evaluation result: low usability pages versus others]

Relationship between gazing point and wheel rolling
[Same scatter plot, with the pages detected as low usability by the moving speed of gazing points marked]

Relationship between gazing point and wheel rolling
[Same scatter plot of the 192 pages, with the pages detected as low usability by the moving speed of gazing points and by wheel rolling marked]
- 17 of 18 low usability pages are detectable (= 94.4%)
- The pages requiring further evaluation are narrowed down by about half (89 of 192 = 46%)
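One way to read this combination is as a simple OR of two per-page detectors: a page is flagged when either its gazing-point speed or its wheel rolling exceeds the corresponding discriminant boundary. A sketch under that assumption, with placeholder boundary values:

```python
# Sketch: flag a page for further usability evaluation if either detector fires.
# Boundary values are illustrative placeholders, not the paper's numbers.

GAZE_SPEED_BOUNDARY = 300.0     # moving speed of gazing points (pixel/sec), hypothetical
WHEEL_ROLLING_BOUNDARY = 50.0   # wheel rolling (delta), hypothetical

def flag_page(gaze_speed, wheel_rolling):
    """Return True if the page should be examined as potentially low usability."""
    return gaze_speed > GAZE_SPEED_BOUNDARY or wheel_rolling > WHEEL_ROLLING_BOUNDARY

pages = [
    {"gaze_speed": 410.0, "wheel_rolling": 10.0},
    {"gaze_speed": 220.0, "wheel_rolling": 80.0},
    {"gaze_speed": 180.0, "wheel_rolling": 5.0},
]
flagged = [p for p in pages if flag_page(p["gaze_speed"], p["wheel_rolling"])]
print(f"{len(flagged)} of {len(pages)} pages flagged for further evaluation")
```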

Analysis
Question: which quantitative data can discriminate low usability Web pages?
Differences between the quantitative data for “low usability” and “others” pages:
- The moving distance of the gazing point is longer
- The moving speed of the gazing point is higher
Discriminant analysis of “low usability” pages:
- The moving speed of the gazing point is the best detector: 14 of 18 low usability pages are detectable (= 77.8%)
- Adding wheel rolling: 17 of 18 low usability pages are detectable (= 94.4%), and the pages requiring further evaluation are narrowed down by about half

Causes of false discrimination
Users’ behavior on the 18 low usability pages:
- 17 pages: subjects got lost because a link could not be found
  - “I couldn’t easily find the link that leads to the target information.”
  - “I got lost because the menu layout is bad.”
- 1 page (not detected): the subject did not stray and clicked the link smoothly, but could not find the target information under that link

Discussion
Why is there a difference only in the moving speed of the gazing point?
- Movement of the gazing point: while searching, all subjects keep moving their gaze
- Browsing time: a long browsing time does not necessarily indicate low usability
- Mouse movement: differs from one subject to another; some cursors follow the gazing point, others are parked on an empty area of the page
- Wheel rolling: some subjects do not use the mouse wheel for scrolling

Discussion
Why are these variables (moving speed of the gazing point and wheel rolling) the ones selected?
- While a subject is rolling the wheel, the screen scrolls but the gazing point does not move very much
- For heavy wheel users, the moving speed of the gazing point is therefore low

Future Work
More elaborate, large-scale experiments
- These will clarify the relationships between the various types of quantitative data and the various kinds of Web usability problems
Analyze users’ behavior within a Web page
- The present experiment focused on searching across Web pages within Web sites, so each Web page also needs to be analyzed to find further usability problems

Conclusion
- A significant difference between the quantitative data for “low usability” and “others” pages appears in the movement of the gazing points
- The most discriminating quantitative data is the moving speed of the gazing point: 14 of 18 low usability pages are detectable (= 77.8%)
- Detection improves further by combining wheel rolling: 17 of 18 low usability pages are detectable (= 94.4%), and the pages requiring further evaluation are narrowed down by about half

End
Thank you!