Gaze-Based Human-Computer Interaction
Kari-Jouko Räihä
TAUCHI – Tampere Unit for Computer-Human Interaction

Gaze-based interaction
A possibility in hands-busy situations
An increasing number of computer users suffer from RSI (repetitive strain injury)
Eye movements
– are extremely fast
– are natural
– require little conscious effort
The direction of gaze implicitly indicates the focus of attention

Contents
Eye-tracking technology
Challenges
– technological
– interaction-related
Using eye movements for application development
– algorithms for processing eye-movement data
Examples of applications

Eye-tracking equipment
A rough taxonomy:
– Electronic methods
– Mechanical methods
– Video-based methods (single point or two point)

Electronic methods
The most common method is to place skin electrodes around the eyes and measure the potential differences as the eyes move
– does not constrain head movements
– poor accuracy
– better for relative than absolute eye movements
– mainly used in neurological diagnosis

Mechanical methods
Based on contact lenses with
– mirror planes reflecting IR light, or
– a coil in a magnetic field
Very accurate
Very uncomfortable for users who are not used to wearing lenses
– usable only for lab studies

Single-point video-based methods
Track one visible feature of the eyeball, usually the center of the pupil
A video camera is focused on one of the user's eyes
Image-processing software analyzes the video image and traces the tracked feature
Based on calibration, the system determines where the user is currently looking
Head movements are not allowed
– a bite bar or head rest is needed
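The slide does not specify the calibration model; a common choice is a second-order polynomial mapping from the tracked feature to screen coordinates, fitted while the user fixates a grid of known targets. A minimal sketch under that assumption (all names are illustrative):

```python
import numpy as np

def poly_features(pts):
    """Second-order polynomial terms of tracked feature positions (N, 2)."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def calibrate(eye_points, screen_points):
    """Least-squares fit from eye-feature coordinates to screen coordinates,
    using samples recorded while the user fixates known targets
    (at least six, typically a 3x3 grid)."""
    A = poly_features(np.asarray(eye_points, dtype=float))
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                 rcond=None)
    return coeffs                 # shape (6, 2): one column per screen axis

def gaze_point(eye_xy, coeffs):
    """Map one new eye-feature sample to an estimated screen position."""
    return (poly_features(np.asarray([eye_xy], dtype=float)) @ coeffs)[0]
```

The two-point method on the next slide fits the same kind of mapping to the difference vector between the two tracked features, which is what makes it tolerant of small head movements.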

Two-point video-based methods
Same basic principle as in the single-point method, but two features are tracked – the pupil and the corneal reflection – to allow for (restricted) head movements
Uses infrared light
Larger-scale head movements require head tracking

The Bright Pupil Effect
On-axis IR illumination produces a bright pupil image

The Dark Pupil Effect
Off-axis IR illumination produces a dark pupil image

ASL (Applied Science Laboratories)
– Head-mounted system
– Floor-mounted system

SensoMotoric Instruments
– EyeLink
– iViewX
– and many others…

Technological challenges
High cost of the equipment
– currently in the order of tens of thousands of euros
– mass production could bring the cost down to hundreds of euros
Usability of the equipment
– floor-mounted systems are convenient but restrict the user's movements
– head-mounted systems are reliable but uncomfortable
– size (getting smaller and smaller; could soon be embedded in eyeglasses)
– future: increased use of video analysis
Need for calibration
– for every user at the beginning of each session
– and also during use

Interaction challenges
Requires the development of new forms of interaction
Eyes are normally used for observation, not for control
– humans are not used to activating objects just by looking at them
– poorly implemented eye control can be extremely annoying
The device produces lots of noisy data
– the data stream needs to be compacted to make it suitable for input (fixations, input tokens, intentions)
– the physiological properties of the eye set limits on accuracy that cannot be overcome

Processing of eye-movement data
Experiment by Yarbus in 1967: gaze paths recorded while three different persons answered different questions about the same painting
The data contain jitter, inaccuracies, and tracking errors
Raw data must be filtered and the fixations computed in real time

Concepts
Fixation
– the gaze is (almost) still
– all information about the target is perceived during fixations
– duration varies, typically a few hundred ms
– no more than 3–4 per second
Saccade
– movement between fixations
– typically lasts for tens of ms
– very fast, therefore practically no information is perceived during saccades
– ballistic: the end point cannot be changed after the saccade has begun

Filtering the noisy data
A simple algorithm:
1) A fixation starts when the eye position stays within 0.5° for more than 100 ms (spatial and temporal thresholds filter the jitter)
2) The fixation continues as long as the position stays within 1°
3) Tracking failures of up to 200 ms do not terminate the fixation
(Illustrated on the slide with a plot of the x-coordinate of the eye gaze over time.)
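A minimal Python sketch of this algorithm as a streaming filter, assuming gaze samples arrive as (timestamp in ms, x, y in degrees); the candidate-window test is a simplification of "stays within 0.5°", and the names are illustrative:

```python
import math

class FixationFilter:
    """Streaming version of the simple algorithm above: a fixation starts
    once samples spanning MIN_DUR ms stay within START_R degrees of each
    other, continues while gaze stays within DRIFT_R degrees of the
    fixation centre, and survives tracking dropouts of up to MAX_GAP ms."""
    START_R, DRIFT_R = 0.5, 1.0       # degrees of visual angle
    MIN_DUR, MAX_GAP = 100.0, 200.0   # milliseconds

    def __init__(self):
        self.candidates = []    # recent (t, x, y) samples, no fixation yet
        self.fix = None         # current fixation dict, or None
        self.last_seen = None   # timestamp of the last valid sample

    def feed(self, t, x, y, valid=True):
        """Consume one sample; return ('start'|'move'|'end', fixation) or None."""
        if not valid:
            # Rule 3: short tracking failures do not terminate the fixation.
            if self.fix and t - self.last_seen > self.MAX_GAP:
                ended, self.fix = self.fix, None
                return ('end', ended)
            return None
        self.last_seen = t
        if self.fix:
            # Rule 2: the fixation continues while gaze stays within DRIFT_R.
            if math.hypot(x - self.fix['x'], y - self.fix['y']) <= self.DRIFT_R:
                self.fix['end'] = t
                return ('move', self.fix)
            ended, self.fix = self.fix, None
            self.candidates = [(t, x, y)]   # may start the next fixation
            return ('end', ended)
        # Rule 1: keep only samples within START_R of the newest one.
        self.candidates = [s for s in self.candidates
                           if math.hypot(x - s[1], y - s[2]) <= self.START_R]
        self.candidates.append((t, x, y))
        if t - self.candidates[0][0] >= self.MIN_DUR:
            xs = [s[1] for s in self.candidates]
            ys = [s[2] for s in self.candidates]
            self.fix = {'x': sum(xs) / len(xs), 'y': sum(ys) / len(ys),
                        'start': self.candidates[0][0], 'end': t}
            self.candidates = []
            return ('start', self.fix)
        return None
```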

Visualizing the fixations
– circles denote fixations (centered at the point of gaze)
– the radius corresponds to duration
– lines represent saccades
(Example: studies of gaze behaviour while driving.)
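A short matplotlib sketch of this visualization, assuming fixations in the dict format produced by the filter above; the radius scale factor is arbitrary:

```python
import matplotlib.pyplot as plt

def plot_gaze_path(fixations, image=None, scale=0.05):
    """Draw a gaze path as on the slide: one circle per fixation, centred
    at the point of gaze with radius proportional to duration, and lines
    for the saccades between them."""
    fig, ax = plt.subplots()
    if image is not None:
        ax.imshow(image)                      # e.g. the driving scene
    xs = [f['x'] for f in fixations]
    ys = [f['y'] for f in fixations]
    ax.plot(xs, ys, linewidth=1)              # saccades between fixations
    for f in fixations:
        dur = f['end'] - f['start']           # ms
        ax.add_patch(plt.Circle((f['x'], f['y']), radius=scale * dur,
                                fill=False))  # radius ~ duration
    ax.set_aspect('equal')
    plt.show()
```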

Gaze-based input tokens
The fixations are then turned into input tokens, e.g.
– start of fixation
– continuation of fixation (every 50 ms)
– end of fixation
– failure to locate eye position
– entering monitored regions
The tokens form eye events that are multiplexed into the event queue with other input events
The eye events can also carry information about the fixated screen object
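A sketch of this tokenization step, building on the FixationFilter above; the token names, the `state` dict, and the region hit-test interface are assumptions, and the failure token is omitted for brevity:

```python
import queue

CONTINUATION_INTERVAL = 50  # ms, as on the slide

def push_eye_events(result, events: queue.Queue, state, regions=None):
    """Translate FixationFilter output into input tokens and multiplex
    them into the shared event queue with other input events.
    `state` remembers the time of the last continuation token;
    `regions` maps region names to hit-test functions."""
    if result is None:
        return
    kind, fix = result
    if kind == 'start':
        events.put(('fixation_start', fix))
        state['last_tick'] = fix['start']
        # Tokens can also carry the fixated screen object or region.
        for name, contains in (regions or {}).items():
            if contains(fix['x'], fix['y']):
                events.put(('region_enter', name, fix))
    elif kind == 'move':
        # A continuation token for every 50 ms of ongoing fixation.
        if fix['end'] - state['last_tick'] >= CONTINUATION_INTERVAL:
            events.put(('fixation_continue', fix))
            state['last_tick'] = fix['end']
    elif kind == 'end':
        events.put(('fixation_end', fix))
```

A polling loop would call `push_eye_events(fix_filter.feed(t, x, y), events, state, regions)` for every sample delivered by the tracker.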

Inferring the user's intentions
Goals
– to refine the data further to recognize the user's intentions
– to implement a higher-level programming interface for gaze-aware applications
The Eye Interpretation Engine (Greg Edwards) claims to be able to identify behaviors such as
– the user is reading
– just "looking around"
– starting or stopping a search for an object (e.g. a button)
– wanting to select an object
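The engine's algorithms are not described on the slide; as a flavor of what such behavior recognition involves, here is a crude heuristic of my own (not Edwards' method) for the "reading" behavior, using the well-known pattern of small rightward saccades along a line of left-to-right text:

```python
def looks_like_reading(fixations, line_height=1.0):
    """Heuristic: reading shows up as runs of small rightward saccades on
    roughly the same line, with occasional long leftward return sweeps.
    The 0.6 threshold and line_height unit are guesses."""
    if len(fixations) < 2:
        return False
    rightward = sum(1 for prev, cur in zip(fixations, fixations[1:])
                    if cur['x'] > prev['x'] and
                       abs(cur['y'] - prev['y']) < line_height)
    return rightward / (len(fixations) - 1) > 0.6
```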

Eye as a control device
Gaze behaves very differently from the other channels used for controlling computers (hands, voice)
– intentional control of the eyes is difficult and stressful; the gaze is easily attracted by external events
– precise control of the eyes is difficult
The "Midas touch" problem
– most of the time the eyes are used for obtaining information with no intent to initiate commands
– users are easily afraid of looking at the "eye-active" objects or areas of the window

Command-based gaze interaction
Even though eye movements are an old research area, gaze-aware applications are practically nonexistent
Exception: applications for disabled users (© Erica, Inc.)

Object selection
The most common task
How is the selection activated (avoiding the Midas touch)?
– dwell time
– special on-screen buttons
– activation by the eyes (e.g. a blink or wink)
– hardware buttons
Empirical observations:
– selection by gaze can be faster than selection by mouse
– precision is a problem: targets must be large enough
– in general, dwell time seems to be the best option when carefully chosen (not too short, not too long)
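A minimal sketch of dwell-time selection on top of the input tokens above; the 500 ms value and the `hit_test` interface are assumptions:

```python
DWELL_TIME = 500  # ms; a "not too short, not too long" guess

class DwellSelector:
    """Activate the widget under the gaze once a fixation on it has lasted
    DWELL_TIME ms; looking away before that cancels, avoiding the Midas
    touch. `hit_test` maps a screen position to a widget or None."""

    def __init__(self, hit_test):
        self.hit_test = hit_test
        self.armed = None           # (widget, time the fixation started)

    def on_eye_event(self, token, fix):
        """Feed fixation tokens; returns the widget to activate, or None."""
        target = self.hit_test(fix['x'], fix['y'])
        if token == 'fixation_start':
            self.armed = (target, fix['start']) if target else None
        elif token == 'fixation_continue' and self.armed:
            widget, t0 = self.armed
            if target is not widget:
                self.armed = None   # gaze drifted off the widget
            elif fix['end'] - t0 >= DWELL_TIME:
                self.armed = None   # fire once; rearm on the next fixation
                return widget
        elif token == 'fixation_end':
            self.armed = None
        return None
```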

Gaze as mouse accelerator
MAGIC pointing (Zhai, 1999)
– gaze is used to warp the mouse pointer to the vicinity of the target object
– within a threshold circle, gaze no longer affects the mouse pointer
– fine-grain adjustment is done using the mouse
(Figure: the gaze position reported by the eye tracker, the threshold circle, and the mouse pointer warped to the gaze position once the target is sufficiently close.)
Two strategies for warping
– always when the point of gaze moves ("liberal")
– only after moving the mouse ("cautious")
Empirical results
– liked by the users
– interaction was slightly slowed down by the cautious strategy, but the liberal strategy was faster than using just the mouse
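A sketch of the liberal strategy; the threshold value is a guess (the slide gives none), and `get_pointer`/`set_pointer` stand in for platform cursor APIs:

```python
import math

THRESHOLD_RADIUS = 120  # px; an assumed threshold-circle size

class LiberalMagicPointer:
    """Liberal warping: whenever a new fixation lands outside the threshold
    circle around the pointer, warp the pointer to the gaze position;
    fine-grain adjustment is left to the mouse."""

    def __init__(self, get_pointer, set_pointer):
        self.get_pointer, self.set_pointer = get_pointer, set_pointer

    def on_fixation_start(self, fix):
        px, py = self.get_pointer()
        # Inside the threshold circle, gaze no longer affects the pointer.
        if math.hypot(fix['x'] - px, fix['y'] - py) > THRESHOLD_RADIUS:
            self.set_pointer(fix['x'], fix['y'])
```

The cautious strategy would instead remember the latest fixation and warp the pointer only when the mouse next moves.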

Natural interfaces
"Non-command user interfaces" (Nielsen, 1993)
Multimodal interfaces are developing towards task- and user-centered operation, instead of command-based operation
The computer monitors the user's actions instead of waiting for commands ("proactive computing")
The point of gaze can provide valuable additional information

Examples
Ship database (Jacob, 1993)

iEye
(iEye video)

Current research
European Conference on Eye Movements (ECEM)
– 11th conference in Turku, August 2001
Eye Tracking Research and Applications (ETRA)
– 2nd conference in New Orleans, March 2002

Wooding: Fixation Maps
1. The original image
2. Map of the fixations of 131 traces
3. The corresponding contour plot
4. The image redrawn, with areas receiving more fixations appearing brighter
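A sketch in the spirit of such fixation maps, accumulating duration-weighted Gaussian blobs into a 2-D map; the sigma value is an assumption:

```python
import numpy as np

def fixation_map(fixations, width, height, sigma=30.0):
    """Build a smoothed fixation map: each fixation contributes a Gaussian
    blob weighted by its duration; brighter values mark areas that
    received more fixations."""
    grid = np.zeros((height, width))
    ys, xs = np.mgrid[0:height, 0:width]
    for f in fixations:
        dur = f['end'] - f['start']
        grid += dur * np.exp(-((xs - f['x'])**2 + (ys - f['y'])**2)
                             / (2 * sigma**2))
    return grid / grid.max() if grid.any() else grid
```

From such a map, `plt.contour(grid)` gives the contour plot, and multiplying the image by the map yields the brightness-weighted redraw described on the slide.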

GAZE Groupware System
A multiparty teleconferencing and document-sharing system
– images rotate to show gaze direction (who is talking to whom)
– a document "lightspot" provides a "look at this" reference
(Fig. 53: GAZE Groupware display)

Eye-gaze correction for videoconferencing
Problem: the camera and the screen are not aligned, so eye contact is lost
Solution: track the eyes and automatically warp them in each frame to create the illusion of eye contact

Don't think while you drive

Is this desirable? Hard to predict
Clifford Nass and Byron Reeves, Stanford: CASA (Computers As Social Actors)
– humans tend to treat computers as fellow humans
The BlueEyes project at IBM: "In the future, ordinary household devices – such as televisions, refrigerators, and ovens – will do their jobs when we look at them and speak to them."

Thanks
For slides, thoughts and discussions:
– Antti Aaltonen
– Aulikki Hyrskykari
– Päivi Majaranta
– Shumin Zhai