Two-Handed and Gaze Input
Stanford and Princeton Lecture, Nov 29, 1999
Shumin Zhai

The Present and Future of Computer Input
- Two-handed input
  - Example of present input research
- Gaze tracking
  - Example of (possible) future input

Two-Handed Input
- Buxton and Myers, CHI '86: left hand scrolling, right hand pointing
- Bier et al., CHI '93: left hand "tool glass", right hand pointing

Asymmetric Division of Labor in Bimanual Action
- The left hand sets the frame of reference for the action of the right hand
- The left hand operates at a coarser scale than the right
- Left-hand precedence in action
- Right (terminal) hand dominance
- The kinematic chain as a model
Yves Guiard (1987), Journal of Motor Behavior, 19(4)

Cognitive Benefits
- Physical: time-motion efficiency
- Cognitive benefits: visualization model
Leganchuk, Zhai, Buxton, ACM ToCHI, 1998

Natural Metaphor
- Navigation in 3D
- Bulldozer metaphor
Zhai, Kandogan, Smith, Selker, JVLC, 1999

Multi-handed input

Human-Computer "Communication"
- Asymmetry (R.J.K. Jacob)
- High bandwidth from computer to human: text, graphics, sound
- Low bandwidth from human to computer: mouse, keyboard

Enabling Multi-Modal Interaction
- Increasing computing power
- Speech recognition
- Low-cost ($10) cameras about to appear
- Computer vision / image processing
- Gaze tracking

What If the Computer Can "See"?
- More efficient and effective HCI?
- Can the computer know the user's "intention"?
- What if the computer can see the user's gaze?

IBM Almaden Eye-tracker

The Bright Pupil Effect
- On-axis IR illumination produces a bright pupil image

The Dark Pupil Effect
- Off-axis IR illumination produces a dark pupil image

Dual light source gaze tracking
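The bright/dark pupil slides suggest a simple detection scheme: subtract the off-axis (dark-pupil) frame from the on-axis (bright-pupil) frame, and the pupil is the region that stands out. A minimal pure-Python sketch — the function name, frame layout, and threshold are illustrative, not from the lecture:

```python
def pupil_mask(bright_frame, dark_frame, threshold=60):
    """Locate candidate pupil pixels by differencing the two IR frames.

    The pupil is the main region that changes between on-axis (bright
    pupil) and off-axis (dark pupil) illumination, so the difference
    image peaks there. Frames are lists of rows of 8-bit gray values;
    the threshold of 60 is an assumed value.
    """
    return [[(b - d) > threshold for b, d in zip(brow, drow)]
            for brow, drow in zip(bright_frame, dark_frame)]

# Toy 5x5 frames: only the pupil pixel brightens under on-axis IR.
dark = [[40] * 5 for _ in range(5)]
bright = [row[:] for row in dark]
bright[2][2] = 220  # bright-pupil response at the pupil
mask = pupil_mask(bright, dark)
```

A real tracker would difference interleaved video fields from the two light sources and then fit the pupil and corneal-reflection centers; this sketch only shows the differencing idea.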

Gaze for Pointing?
A classic topic: "What you look at is what you get!"
- J.L. Levine 1981
- C. Ware and H.H. Mikaelian 1987
- R.J.K. Jacob 1990
Why gaze pointing?
- Hand unavailable
- Eye faster than other motor organs: look first, manipulation follows
- Fatigue / injury in hand pointing

Difficulties with Gaze Pointing
- Eye tracking is not precise
  - Measurement error
  - Eye movement: saccades and fixations (1 degree, about twice a scrollbar width)
- Only large targets (0.5 in) work
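As a back-of-the-envelope check of the "1 degree, about twice a scrollbar width" claim, visual angle can be converted to on-screen pixels. The 60 cm viewing distance and 96 DPI below are assumed values, not from the lecture:

```python
import math

def visual_angle_to_pixels(angle_deg, viewing_distance_cm=60.0,
                           pixels_per_cm=96 / 2.54):
    """Convert a visual angle to on-screen pixels.

    Assumes a flat screen viewed head-on; 60 cm and 96 DPI are
    illustrative defaults.
    """
    size_cm = 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm * pixels_per_cm

# 1 degree of tracker error at 60 cm comes out to roughly 40 pixels,
# indeed about twice the width of a typical scrollbar.
error_px = visual_angle_to_pixels(1.0)
```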

Difficulties with Gaze Pointing
How to do buttons (click)?
- Blink: often subconscious
- Dwell time: continuous fixation for a set period (e.g., 200 ms)
  - False selections ("Midas touch")
  - Misses
- What about double or right click?
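Dwell-time selection can be sketched as a run-length check over gaze samples; any glance away resets the timer, which is exactly why both false selections (the "Midas touch") and misses occur. The function name, sampling rate, and target shape below are hypothetical:

```python
def dwell_select(samples, target, dwell_ms=200, sample_period_ms=20):
    """Return the time (ms) at which a dwell selection fires, or None.

    samples: list of (x, y) gaze points, one per sample period.
    target: (x, y, radius) circular target.
    The 200 ms threshold echoes the slide; the rest is illustrative.
    """
    tx, ty, r = target
    needed = dwell_ms // sample_period_ms
    run = 0
    for i, (x, y) in enumerate(samples):
        if (x - tx) ** 2 + (y - ty) ** 2 <= r * r:
            run += 1
            if run >= needed:
                return (i + 1) * sample_period_ms  # selection time
        else:
            run = 0  # any excursion outside the target resets the dwell
    return None
```

Shortening `dwell_ms` makes false "Midas touch" selections more likely; lengthening it makes genuine selections slower and easier to miss.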

Difficulties with Gaze Pointing
Unnatural model:
- Eye: a perception organ, driven by mind and world
- Hand: a manipulation (motor) organ
- Gaze pointing loads the perceptual channel with motor tasks
Dead end?

Utilize Eye Gaze Implicitly?
- Combine hand and eye movement
- Reposition (warp) the cursor by gaze
- The hand remains the control device (fine movement and selection)
- Defy Fitts' law?
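The "defy Fitts' law" idea can be made concrete: warping the cursor near the gaze position shrinks the remaining distance D, and with it the index of difficulty log2(D/W + 1) and the predicted movement time. A sketch using the Shannon formulation of Fitts' law — the constants a and b and the residual warp distance are assumed, not fitted data:

```python
import math

def fitts_mt(d, w, a=0.0, b=100.0):
    """Shannon form of Fitts' law: MT = a + b * log2(D/W + 1).

    a (ms) and b (ms/bit) are device-dependent constants;
    the values here are purely illustrative.
    """
    return a + b * math.log2(d / w + 1)

# Manual pointing: cursor starts 800 px from a 20 px target.
manual_mt = fitts_mt(800, 20)
# MAGIC-style warp: the cursor lands within ~120 px of the target,
# so only a short manual movement remains.
magic_mt = fitts_mt(120, 20)
```

The warp does not repeal Fitts' law; it reduces the effective D the hand must cover, which is why small targets remain hard (W is unchanged).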

MAGIC Pointing
- Manual And Gaze Input Cascaded Pointing
- Manual Acquisition with Gaze Initiated Cursor
Figure callouts:
- Gaze position reported by the eye tracker
- Eye-tracking boundary: the true target can be anywhere within the circle with 95% probability
- The cursor is warped to the eye-tracking position, on or near the true target
- Previous cursor position, far from the target (e.g., 200 pixels)
Zhai, Ihde, Morimoto, CHI '99

When to Warp?
- On every large saccade
  - Pre-intent, "liberal", proactive
  - Possible distraction
- When the input device is actuated
  - Post-intent, "conservative"
  - A new form of hand-eye coordination
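The two warping strategies can be sketched as a single decision function; the saccade threshold, function, and parameter names are hypothetical:

```python
import math

SACCADE_PX = 120  # assumed threshold for a "large" gaze jump, in pixels

def should_warp(gaze, cursor, mode, device_actuated=False,
                threshold=SACCADE_PX):
    """Decide whether to warp the cursor to the gaze position.

    'liberal' warps on every large gaze jump (pre-intent, proactive,
    risking distraction); 'conservative' warps only once the input
    device is actuated (post-intent). Illustrative sketch only.
    """
    jump = math.dist(gaze, cursor)
    if mode == "liberal":
        return jump > threshold
    if mode == "conservative":
        return device_actuated and jump > threshold
    raise ValueError(f"unknown mode: {mode}")
```

Under "conservative", the user looks at the target first and the warp happens at the moment the hand touches the device — the new form of hand-eye coordination the slide mentions.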

Experiment
- Difficult: everything has to go "right"
- Imperfect tracking system
  - Error: 1 degree or more
  - Delay: 66 ms or more

Task

Experimental Design
- Two target sizes: 20 vs. 60 pixels
- Three distances: 200, 500, 800 pixels
- Three pointing directions: horizontal, vertical, diagonal
- Three pointing techniques: two MAGIC, one manual
- 12 subjects
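Crossing the four factors gives the number of conditions each subject saw; the technique labels below are assumed names for the two MAGIC variants and the manual baseline:

```python
from itertools import product

sizes = (20, 60)                 # target size, pixels
distances = (200, 500, 800)      # pointing distance, pixels
directions = ("horizontal", "vertical", "diagonal")
techniques = ("magic-liberal", "magic-conservative", "manual")

# Full factorial: 2 x 3 x 3 x 3 = 54 conditions per subject.
conditions = list(product(sizes, distances, directions, techniques))
```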

Trial completion time

Other Observations
- $20 prize claimed with a MAGIC technique
- Users' subjective experience:
  - Rated both MAGIC techniques higher than manual (1.5 and 3.5 on a -5 to +5 scale)
  - The "liberal" technique was "easier"
  - Disappointed with pure manual pointing in subjective ease of operation (work done at will)

What Can We Conclude?
- Reduced fatigue (less manual work)
- More precise than traditional gaze pointing (works with smaller targets)
- More practical than traditional gaze pointing

Future Work
- Tracking system limitations
  - Frequency
  - Resolution
  - Calibration!
- MAGIC method limitations
- Experimental limitations

There is more…