
Gaze-Based Human-Computer Interaction
Kari-Jouko Räihä
TAUCHI – Tampere Unit for Computer-Human Interaction


3 Gaze-based interaction
A possibility in hands-busy situations
Increasing number of computer users suffer from RSI (repetitive strain injury)
Eye movements
– are extremely fast
– are natural
– require little conscious effort
Direction of gaze implicitly indicates the focus of attention

4 Contents
Eye-tracking technology
Challenges
– technological
– interaction-related
Using eye movements for application development
– algorithms for processing eye-movement data
Examples of applications

5 Eye-tracking equipment
Rough taxonomy
– electronic methods
– mechanical methods
– video-based methods
  – single point
  – two point

6 Electronic methods
The most common method is to place skin electrodes around the eyes and measure potential differences across the eye
Does not constrain head movements
Poor accuracy
Better for relative than absolute eye movements
Mainly used in neurological diagnosis

7 Mechanical methods
Based on contact lenses with
– mirror planes + reflecting IR light
– coil + magnetic field
Very accurate
Very uncomfortable for users who are not used to wearing lenses
– usable only for lab studies

8 Single-point video-based methods
Tracking one visible feature of the eyeball, usually the center of the pupil
A video camera is focused on one of the user's eyes
Image-processing software analyzes the video image and traces the tracked feature
Based on calibration, the system determines where the user is currently looking
Head movements are not allowed
– a bite bar or head rest is needed
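The calibration step above can be sketched as follows. This is a deliberately simple assumption: from a few samples where the user looked at known screen targets, fit an independent least-squares line per axis mapping tracked pupil coordinates to screen coordinates (real trackers use higher-order fits; all names here are illustrative, not from any real tracker API).

```python
# Minimal per-axis linear calibration sketch (an assumption, not a real
# tracker's method): screen = a * pupil + b, fitted by least squares.

def fit_axis(feature_vals, screen_vals):
    """Least-squares fit screen = a * feature + b for one axis."""
    n = len(feature_vals)
    mf = sum(feature_vals) / n
    ms = sum(screen_vals) / n
    var = sum((f - mf) ** 2 for f in feature_vals)
    cov = sum((f - mf) * (s - ms) for f, s in zip(feature_vals, screen_vals))
    a = cov / var
    return a, ms - a * mf

def calibrate(samples):
    """samples: list of ((pupil_x, pupil_y), (screen_x, screen_y)) pairs
    collected while the user looks at known calibration targets.
    Returns a function mapping pupil coordinates to screen coordinates."""
    ax, bx = fit_axis([p[0][0] for p in samples], [p[1][0] for p in samples])
    ay, by = fit_axis([p[0][1] for p in samples], [p[1][1] for p in samples])
    return lambda px, py: (ax * px + bx, ay * py + by)
```

After calibration, each new pupil position from the image-processing stage is passed through the returned function to obtain an on-screen gaze point.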

9 Two-point video-based methods
Same basic principle as in the single-point video-based method
Now two points are tracked to allow for (restricted) head movements
– the pupil
– the corneal reflection
Uses infrared light
Larger-scale head movements require head tracking

10 The Bright Pupil Effect
On-axis IR produces a bright pupil image

11 The Dark Pupil Effect
Off-axis IR produces a dark pupil image

12 ASL (Applied Science Laboratories)
Head-mounted system
Floor-mounted system

13 SensoMotoric Instruments
EyeLink
iViewX
…and many others

14 Technological challenges
High cost of the equipment
– currently on the order of 2,000–50,000 euros
– mass production could bring the cost down to hundreds of euros
Usability of the equipment
– floor-mounted systems are convenient but restrict the user's movements
– head-mounted systems are reliable but uncomfortable
– size (getting smaller and smaller; could soon be embedded in eyeglasses)
– future: increased use of video analysis
Need for calibration
– for all users at the beginning of each session
– also during use

15 Interaction challenges
Requires the development of new forms of interaction
Eyes are normally used for observation, not for control
– humans are not used to activating objects just by looking at them
– poorly implemented eye control can be extremely annoying
The device produces lots of noisy data
– the data stream needs to be compacted to make it suitable for input (fixations, input tokens, intentions)
– the physiological properties of the eye set limits on accuracy that cannot be overcome

16 Processing of eye-movement data
Experiment by Yarbus (1967): gaze paths when three different persons answered different questions about the same painting
The data contains jitter, inaccuracies, and tracking errors
Raw data must be filtered and fixations must be computed in real time

17 Concepts
Fixation
– The eye gaze is (almost) still
– All information about the target is perceived during fixations
– Duration varies: 120–1000 ms, typically 200–600 ms
– No more than 3–4 per second
Saccade
– Movement between fixations
– Typically lasts 40–120 ms
– Very fast; therefore practically no information is perceived during saccades
– Ballistic: the end point cannot be changed after the saccade has begun

18 Filtering the noisy data
Simple algorithm:
1) A fixation starts when the eye position stays within 0.5° for more than 100 ms (the spatial and temporal thresholds filter out the jitter)
2) The fixation continues as long as the position stays within 1°
3) Tracking failures of up to 200 ms do not terminate the fixation
(Figure: the x-coordinate of the eye gaze plotted over time)
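The three-step algorithm above can be sketched in code. The thresholds follow the slide; the sample format (timestamp in ms plus gaze coordinates in degrees, with None coordinates when the tracker loses the eye) and all function names are assumptions for illustration.

```python
# Sketch of the slide's simple fixation filter; thresholds per the slide,
# data format assumed: (t_ms, x_deg, y_deg), with x = y = None on tracking loss.
import math

START_RADIUS = 0.5     # deg: gaze must stay this close for a fixation to start
CONTINUE_RADIUS = 1.0  # deg: fixation continues while gaze stays this close
START_TIME = 100       # ms: how long the spatial threshold must hold
MAX_GAP = 200          # ms: tracking failures up to this long are tolerated

def detect_fixations(samples):
    """Return a list of (start_ms, end_ms, centre_x, centre_y) fixations."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        t0, x0, y0 = samples[i]
        if x0 is None:
            i += 1
            continue
        # 1) a fixation starts when the position stays within START_RADIUS
        #    of the anchor sample for at least START_TIME
        j = i
        while (j + 1 < n and samples[j + 1][1] is not None and
               math.hypot(samples[j + 1][1] - x0,
                          samples[j + 1][2] - y0) <= START_RADIUS):
            j += 1
        if samples[j][0] - t0 < START_TIME:
            i += 1
            continue
        # 2) it continues while the position stays within CONTINUE_RADIUS;
        # 3) gaps no longer than MAX_GAP do not terminate it
        end, k = j, j + 1
        while k < n:
            t, x, y = samples[k]
            if x is None:
                if t - samples[end][0] > MAX_GAP:
                    break
            elif math.hypot(x - x0, y - y0) <= CONTINUE_RADIUS:
                end = k
            else:
                break
            k += 1
        good = [s for s in samples[i:end + 1] if s[1] is not None]
        cx = sum(s[1] for s in good) / len(good)
        cy = sum(s[2] for s in good) / len(good)
        fixations.append((t0, samples[end][0], cx, cy))
        i = end + 1
    return fixations
```

Feeding this two clusters of samples separated by a large jump yields two fixations with the saccade between them discarded, which is exactly the compaction the previous slide asks for.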

19 Visualizing the fixations
Circles denote fixations (centered at the point of gaze)
– the radius corresponds to duration
Lines represent saccades
(Figure: studies of gaze behaviour while driving)

20 Gaze-based input tokens
The fixations are then turned into input tokens, e.g.
– start of fixation
– continuation of fixation (every 50 ms)
– end of fixation
– failure to locate eye position
– entering monitored regions
The tokens form eye events
– multiplexed into the event queue with other input events
Eye events can also carry information about the fixated screen object
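The tokenization and multiplexing described above can be sketched as follows. The 50 ms continuation interval is from the slide; the token names, the (start_ms, end_ms, x, y) fixation format, and the merge helper are illustrative assumptions, not any real toolkit's API.

```python
# Sketch: turn fixations into input tokens and multiplex them with other
# input events by timestamp. Token names are assumptions for illustration.
import heapq

CONTINUE_EVERY = 50  # ms, per the slide

def fixation_tokens(fixation):
    """Yield (token_name, t_ms) events for one (start_ms, end_ms, x, y) fixation."""
    start, end, _x, _y = fixation
    yield ("fixation_start", start)
    t = start + CONTINUE_EVERY
    while t < end:
        yield ("fixation_continue", t)
        t += CONTINUE_EVERY
    yield ("fixation_end", end)

def multiplex(*streams):
    """Merge time-ordered (event, t_ms) streams into one event queue."""
    return list(heapq.merge(*streams, key=lambda e: e[1]))
```

An application then consumes the merged queue exactly as it would consume mouse and keyboard events, optionally attaching the fixated screen object to each eye event.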

21 Inferring the user's intentions
Goals
– to refine the data further for recognizing the user's intentions
– to implement a higher-level programming interface for gaze-aware applications
Eye Interpretation Engine (Greg Edwards, http://eyetracking.stanford.edu/) claims to be able to identify behaviors such as
– the user is reading
– the user is just "looking around"
– the user starts or stops searching for an object (e.g. a button)
– the user wants to select an object

22 Eye as a control device
Gaze behaves very differently from other means of controlling computers (hands, voice)
– intentional control of the eyes is difficult and stressful; the gaze is easily attracted by external events
– precise control of the eyes is difficult
The "Midas touch" problem
– most of the time the eyes are used for obtaining information, with no intent to initiate commands
– users easily become afraid of looking at the "eye-active" objects or areas of the window

23 Command-based gaze interaction
Even though eye movements are an old research area, gaze-aware applications hardly exist
Exception: applications for the disabled
(Image © Erica, Inc., http://www.ericainc.com)

24 Object selection
The most common task
How is the selection activated (avoiding the Midas touch)?
– dwell time
– special on-screen buttons
– activation by the eyes (e.g. a blink or wink)
– hardware buttons
Empirical observations:
– selection by gaze can be faster than selection by mouse
– precision is a problem: targets must be large enough
– in general, dwell time seems to be the best option when carefully chosen (not too short, not too long)
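Dwell-time activation, the option the slide favors, can be sketched as a small state machine: a selection fires only when gaze has stayed on the same target for the whole dwell period. The 500 ms threshold, class name, and hit-test callback are illustrative assumptions, not values from the slides.

```python
# Sketch of dwell-time selection (avoids the Midas touch: merely glancing at
# a target never fires it). The 500 ms default is an assumption.

DWELL_MS = 500

class DwellSelector:
    """Fires a selection when gaze stays on one target long enough."""

    def __init__(self, hit_test, dwell_ms=DWELL_MS):
        self.hit_test = hit_test   # maps (x, y) -> target id, or None
        self.dwell_ms = dwell_ms
        self.target = None         # target currently under the gaze
        self.enter_t = None        # when the gaze entered that target

    def update(self, t_ms, x, y):
        """Feed one gaze sample; return the selected target id or None."""
        target = self.hit_test(x, y)
        if target != self.target:
            # gaze moved to a new target (or off all targets): restart the clock
            self.target, self.enter_t = target, t_ms
            return None
        if target is not None and t_ms - self.enter_t >= self.dwell_ms:
            self.enter_t = t_ms    # re-arm: another full dwell for a repeat
            return target
        return None
```

Tuning `dwell_ms` is the "not too short, not too long" trade-off above: too short and reading triggers selections, too long and the interaction feels sluggish.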

25 Gaze as mouse accelerator
MAGIC pointing (Zhai, 1999)
– gaze is used to warp the mouse pointer to the vicinity of the target object
– within a threshold circle, gaze no longer affects the mouse pointer
– fine-grained adjustment is done using the mouse
Two strategies for warping
– always when the point of gaze moves ("liberal")
– only after moving the mouse ("cautious")
Empirical results
– liked by the users
– interaction was slightly slowed down by the cautious strategy, but the liberal strategy was faster than using just the mouse
(Figure: when the gaze lands sufficiently close to a target, the mouse pointer is warped to the gaze position reported by the eye tracker; inside the threshold circle the pointer is unaffected)
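The "liberal" warping rule above reduces to a one-line distance check per gaze sample. This is a sketch of the idea only, not Zhai's implementation; the 120-pixel threshold is an illustrative assumption.

```python
# Sketch of liberal MAGIC pointing: warp the pointer to the gaze position
# whenever the gaze moves outside a threshold circle around the pointer.
import math

THRESHOLD_PX = 120  # assumed threshold-circle radius, in pixels

def magic_pointer(pointer, gaze, threshold=THRESHOLD_PX):
    """Return the new pointer position for one gaze sample."""
    px, py = pointer
    gx, gy = gaze
    if math.hypot(gx - px, gy - py) > threshold:
        return gaze      # warp to the vicinity of the new target
    return pointer       # inside the circle: leave fine adjustment to the mouse
```

The "cautious" strategy differs only in gating the same check on recent mouse motion, so the eye pre-positions the pointer and the hand finishes the selection.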

26 Natural interfaces
"Non-command user interfaces" (Nielsen, 1993)
Multimodal interfaces are developing towards task- and user-centered operation, instead of command-based operation
The computer monitors the user's actions instead of waiting for commands ("proactive computing")
The point of gaze can provide valuable additional information

27 Examples
Ship database (Jacob, 1993)

28 iEye
(video demo)

29 Current research
European Conference on Eye Movements (ECEM)
– 11th conference in Turku, August 2001
Eye Tracking Research and Applications (ETRA)
– 2nd conference in New Orleans, March 2002

30 Wooding: Fixation Maps
1. The original image
2. Map of fixations of 131 traces
3. Corresponding contour plot
4. The image redrawn, with areas receiving more fixations appearing brighter

31 GAZE Groupware System
A multiparty teleconferencing and document-sharing system
– images rotate to show gaze direction (who is talking to whom)
– document "lightspot" ("look at this" reference)
(Figure: GAZE Groupware display)

32 Eye-gaze correction for videoconferencing
Problem: the camera and screen are not aligned → eye contact is lost
Solution: track the eyes and automatically warp them in each frame to create the illusion of eye contact

33 Don't think while you drive

34 Is this desirable?
Hard to predict
Clifford Nass and Byron Reeves, Stanford: CASA (Computers As Social Actors)
– humans tend to treat computers as fellow humans
BlueEyes project at IBM: "In the future, ordinary household devices – such as televisions, refrigerators, and ovens – will do their jobs when we look at them and speak to them."

35 Thanks
For slides, thoughts and discussions
– Antti Aaltonen
– Aulikki Hyrskykari
– Päivi Majaranta
– Shumin Zhai


