The Effects of Feedback on Targeting with Multiple Moving Targets David Mould and Carl Gutwin.

Overview ► Introduction / Motivation ► Research Question ► Experimental Design ► Results ► Conclusions

Targeting ► The user sees a target (a window, a menu item), moves the cursor to it, and selects it ► An extremely common task in modern mouse-based user interfaces

Targeting is easy ► In typical applications, targets are highly visible and stationary ► Feedback has been suggested as a way to improve targeting performance, but a study by Akamatsu et al. found that feedback does not help under these conditions

Virtual World applications Xu, Stewart, and Fiume GI 2002

Computer Game applications Warcraft 3 (Blizzard Entertainment)

Targeting can be hard ► In other applications, targeting can be more difficult: ► targets might move ► in a cluttered environment, targets might be hidden or the user might be distracted ► Does feedback improve targeting when the task is more difficult? ► (Fraser & Gutwin showed that it can; is this the kind of task where feedback will help?)

The Role of Feedback ► Feedback is used in many applications where targeting might seem difficult ► Games use several kinds of visual feedback: ► highlighting the target ► haloing the target ► writing text in a sidebar ► many others

Our study: Does feedback help? ► Maybe feedback helps in difficult tasks ► Pilot: a few graduate students performed targeting tasks with moving targets and clutter ► Pilot results: no effect on completion time, but a possible effect on error rate

Targeting Environment

Conditions ► Different conditions: speed, occluder count, feedback condition ► Each participant saw all conditions ► Feedback condition order varied, but other conditions were fixed in order of increasing difficulty

Speed condition ► Three different speeds: "slow", "medium", "fast" ► Slow: 45 pixels/second (~1.3 cm/s) ► Medium: 220 pixels/second (~6.4 cm/s) ► Fast: 400 pixels/second (~11.7 cm/s) ► Entire window: 600 pixels across, square.
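The pixel and physical speeds quoted above imply a particular screen resolution; this small sketch recovers it. The ~88 DPI figure is our inference from the slide's own numbers, not something the deck states:

```python
# Pixel speeds from the slides, and the physical speed quoted for "slow".
speeds_px = {"slow": 45, "medium": 220, "fast": 400}
px_per_cm = 45 / 1.3          # ~34.6 px/cm, i.e. roughly 88 DPI (inferred)

for name, px in speeds_px.items():
    # Convert each pixel speed back to an approximate physical speed.
    print(f"{name}: {px} px/s ~= {px / px_per_cm:.1f} cm/s")
```

The medium and fast conditions come out at roughly 6.4 and 11.6 cm/s under this assumption, matching the slide's figures to within rounding.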

Occluder count condition ► We varied the number of occluders: ► None (no occluders) ► Few (22 occluders) ► Medium (44 occluders) ► Many (88 occluders) ► We always showed the home base as well.

Increasing Occluder Count

Feedback condition ► All feedback was visual ► Three feedback conditions: ► None ► Target-only (target lit on mouse-over) ► All objects (all objects lit on mouse-over)
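The three feedback modes can be sketched as simple hit-test logic. The object and cursor representation here is hypothetical; the deck only describes the visible behaviour:

```python
def highlighted(objects, cursor_over, mode):
    """Return the set of object ids to light up under each feedback mode.

    objects: dict mapping object id -> kind ("target" or "occluder")
    cursor_over: set of object ids currently under the cursor
    mode: "none", "target-only", or "all-objects"
    """
    if mode == "none":
        return set()                                          # no visual feedback
    if mode == "target-only":
        return {o for o in cursor_over if objects[o] == "target"}  # light the target only
    if mode == "all-objects":
        return set(cursor_over)                               # light anything under the cursor
    raise ValueError(f"unknown feedback mode: {mode}")
```

For example, with the cursor over an occluder, target-only feedback shows nothing, while all-object feedback lights the occluder, signalling that a click there would miss.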

Summary of conditions ► All participants were exposed to all conditions: all combinations of the 3x4x3 factors ► Feedback type was fully counterbalanced, while occluder count and speed were presented in increasing order of difficulty
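The full factorial design can be enumerated directly. Condition labels are taken from the slides; the enumeration itself is only illustrative:

```python
from itertools import product

# The three within-subjects factors described in the deck.
speeds = ["slow", "medium", "fast"]
occluders = ["none", "few", "medium", "many"]
feedback = ["none", "target-only", "all-objects"]

# Every participant sees every combination: 3 x 4 x 3 = 36 conditions.
conditions = list(product(speeds, occluders, feedback))
print(len(conditions))  # 36
```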

Experimental procedure ► Six groups, for six different feedback orders ► Eighteen participants in total ► Participants were asked to click as quickly and accurately as possible ► After a short learning period, participants completed 16 targeting tasks in each of 36 condition combinations ► Following the study, participants completed a questionnaire
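The counterbalancing arithmetic above checks out in a few lines: six groups are exactly the 3! orderings of the three feedback types, and the per-participant trial count follows from 16 tasks x 36 condition combinations:

```python
from itertools import permutations

feedback = ["none", "target-only", "all-objects"]

# Full counterbalancing of feedback order: 3! = 6 orders -> six groups.
orders = list(permutations(feedback))
print(len(orders))            # 6

# Eighteen participants spread evenly over the six orders.
print(18 // len(orders))      # 3 participants per order

# 16 targeting tasks in each of 36 condition combinations.
print(16 * 36)                # 576 trials per participant
```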

Results ► Timing ► Error rate ► Target occlusion at moment of selection ► Preferences

Results (timing) Feedback had no effect on completion time. Completion time increased with higher speeds and with more occluders

Results (error rate) Overall error rate went down with increasing feedback

Results (error rate) Errors were reduced with more feedback. Errors increased with speed and with occluder count

Results (occlusion) Average target occlusion increased with occluder count

Results (preferences) Everyone liked target-only feedback best

Results Summary ► Both completion time and errors increase with occluder count and with speed ► Target feedback had no effect on completion time ► Target feedback reduced error rates ► All-object feedback helped more than target-only feedback ► Users preferred target-only feedback

Possible Explanations -- time ► We know that increased target speed increases targeting difficulty (Jagacinski et al. 1980) ► We were not surprised to find that increasing occluder count also increases targeting difficulty

Possible Explanations -- error ► Error rate increases with speed: targets can escape the cursor ► Error rate increases with occluder count: occluders are more likely to intercept a click ► Feedback reduces error rate: it signals whether or not a click will succeed ► All-object feedback provides the strongest signal

Possible explanations -- preference ► Users liked target-only feedback the best ► Six of eighteen participants complained of distraction with all-object feedback ► Target-only feedback gives only positive signals ► All-object feedback also highlights mistakes ► Positive feedback is welcome, but all feedback is useful

Implications ► Despite users' preferences, we cannot provide target-only feedback in a real application: the system does not know in advance which object is the user's target ► But all-object feedback is good too: it does not affect completion time, and it does reduce error rate

Future Work ► A less intrusive all-object feedback? ► By what mechanism was error reduced? ► Traditionally, feedback helps only the final stage of tracking. Can we devise feedback which is helpful at earlier stages?

Questions?