1 Gaze-Controlled Human-Computer Interfaces Marc Pomplun Department of Computer Science University of Massachusetts at Boston E-mail: marc@cs.umb.edu Homepage: http://www.cs.umb.edu/~marc/

2 Gaze-Controlled Human-Computer Interfaces Overview:
- Using Eye Movements as a Response Modality in Psychophysics
- Typing by Eye with Dynamic Recentering
- An Advanced Typing Interface: Dasher
- A Gaze-Controlled Zooming Interface

3 Using Eye Movements as a Response Modality in Psychophysics (Stampe & Reingold, 1995) In psychophysical experiments, subjects typically respond to a stimulus by pressing one out of two or more buttons. The obtained response times are used as an indicator of how long it took the subject to process the stimulus. However, the measured duration also includes the time taken for initiating and executing the manual response.

4 Eye Movements as a Response Modality If subjects can indicate their response by moving their eyes instead of pressing a button, this response “overhead” should be reduced. The signal-to-noise ratio in the response-time data should therefore improve.

5 Eye Movements as a Response Modality

7 Advantages of Gaze-Controlled Interfaces
- Allow intuitive use of computer programs
- Operators can simultaneously use their hands for other tasks
- Enable handicapped people to control systems and communicate by means of eye movements (e.g. “typing by eye”)

8 Problems with Gaze-Controlled Interfaces The “Midas-Touch Problem”: Since eye movements are not completely under conscious control, functions may sometimes be triggered inadvertently. Researchers typically try to solve this problem by setting a minimum dwell time for triggering events.
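A minimal sketch of dwell-time triggering, assuming a fixed 750 ms threshold, a 4 ms sampling interval, and a hypothetical hit_test helper that maps a gaze position to the key under it; none of these names or values come from the original system.

```python
# Dwell-time triggering: a key fires only after the gaze has rested on it
# long enough. Threshold, sampling interval, and hit_test are assumptions.

DWELL_THRESHOLD_MS = 750   # assumed minimum dwell time before a key fires
SAMPLE_INTERVAL_MS = 4     # assumed time between successive gaze samples

def dwell_select(gaze_samples, hit_test):
    """Return the first key fixated for at least DWELL_THRESHOLD_MS.

    gaze_samples: iterable of (x, y) gaze positions in screen coordinates
    hit_test: hypothetical helper mapping (x, y) to a key label or None
    """
    current_key = None
    dwell_ms = 0
    for x, y in gaze_samples:
        key = hit_test(x, y)
        if key is not None and key == current_key:
            dwell_ms += SAMPLE_INTERVAL_MS
            if dwell_ms >= DWELL_THRESHOLD_MS:
                return key          # dwelled long enough: trigger this key
        else:
            current_key = key       # gaze moved to another key: restart timer
            dwell_ms = 0
    return None                     # no key reached the dwell threshold
```

Raising the threshold reduces accidental triggers but slows typing, which is why the system described on the next slide lets the threshold be varied during use.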

9 Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) This is a simple “typing by eye” application using a virtual keyboard. Keys are triggered using a dwell time threshold. This threshold can be varied while using the system. The authors also implemented a mechanism of dynamic recentering to avoid frequent recalibration of the eye tracker system.

10 Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995)

11 The dynamic recentering mechanism makes the (reasonable) assumption that users fixate only the keys on the screen. To compensate for drift in the gaze-position measurement (as caused, for example, by headset shift), the system measures the offset between the key centers and the fixation positions. If a fixation shows such an offset, subsequent measurements are shifted by about 10% of the offset distance in the opposite direction, i.e., toward the key center.
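As a rough illustration, this recentering rule can be written as a small correction filter applied once per detected fixation; the nearest-key lookup, the list-based state, and the exact 10% gain below are assumptions for the sketch, not the authors' code.

```python
# Dynamic recentering sketch: correct slow drift by nudging future gaze
# measurements toward the nearest key center by ~10% of the observed offset.

RECENTER_GAIN = 0.10  # assumed fraction of the offset corrected per fixation

def make_recentering_filter(key_centers):
    """key_centers: list of (x, y) key centers; assumes users fixate only keys."""
    correction = [0.0, 0.0]  # running correction added to every measurement

    def correct_fixation(raw_x, raw_y):
        # Apply the correction accumulated so far.
        x, y = raw_x + correction[0], raw_y + correction[1]
        # Offset between this fixation and the nearest key center.
        cx, cy = min(key_centers,
                     key=lambda k: (k[0] - x) ** 2 + (k[1] - y) ** 2)
        # Shift subsequent measurements by ~10% of the offset, opposite to
        # the drift, i.e. toward the key center.
        correction[0] += RECENTER_GAIN * (cx - x)
        correction[1] += RECENTER_GAIN * (cy - y)
        return x, y

    return correct_fixation
```

Because the filter only accumulates translational nudges toward the nearest key, the limitations noted on slide 13 follow directly: a large offset gets pulled toward the wrong key, and rotation is never modeled.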

12 Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) Dynamic recentering is able to reduce the average fixation error and the frequency of system recalibration.

13 Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) One disadvantage of the mechanism is that if the offset is larger than half the distance between neighboring keys, it will draw the measurements towards unwanted keys. Moreover, the system can only compensate for linear shifts, not for rotation or other distortions of the measurement.

14 Dasher - An Advanced Typing-by-Eye Interface (Ward & MacKay, 2002) The previously shown interface is the most basic and straightforward implementation of typing by eye. It is possible to make such interfaces more intelligent to allow faster and more convenient typing. One such approach is the Dasher system (freely available on the web).

15 Dasher The initial display of Dasher shows all letters of the alphabet in a column at the right edge of the screen.

16 Dasher The letters flow leftwards, each followed by a new alphabet in which the most likely continuation letters are drawn largest.
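The sizing rule, each letter's box scaled to the conditional probability of that letter given what has been typed so far, can be sketched with a toy bigram model; Dasher itself uses a more capable adaptive language model, so the corpus, smoothing, and function names below are purely illustrative assumptions.

```python
# Toy sketch of Dasher-style box sizing: divide the column height among the
# letters in proportion to P(next letter | previous letter). Dasher's real
# language model is far more sophisticated; this bigram model is assumed.

from collections import Counter
import string

ALPHABET = string.ascii_lowercase + " "

def bigram_counts(corpus):
    """Count letter-to-letter transitions in a training text."""
    counts = {c: Counter() for c in ALPHABET}
    text = "".join(ch for ch in corpus.lower() if ch in ALPHABET)
    for prev, nxt in zip(text, text[1:]):
        counts[prev][nxt] += 1
    return counts

def box_heights(prev_char, counts, total_height=1.0, floor=1):
    """Split the column height among letters by conditional probability.

    floor is simple add-one smoothing so every letter keeps a visible box.
    """
    c = counts[prev_char]
    weights = {ch: c[ch] + floor for ch in ALPHABET}
    z = sum(weights.values())
    return {ch: total_height * w / z for ch, w in weights.items()}

# Example: after typing "t", the box for "h" should be among the largest.
counts = bigram_counts("the quick brown fox jumps over the lazy dog " * 20)
heights = box_heights("t", counts)
```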

17 Dasher Use the mouse to control the “typing”:
- Left-right: controls the speed of the letter flow
- Up-down: selects the next letter
Video Demonstration
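A rough sketch, under assumed names and scaling, of how the mouse position described above could be mapped to Dasher's two control dimensions; this is not Dasher's actual code.

```python
# Assumed mapping: horizontal mouse position sets how fast letters flow
# (right of center = forward/typing, left of center = backing out), and
# vertical position steers the selection point up or down the letter column.

def dasher_control(mouse_x, mouse_y, screen_w, screen_h, max_speed=2.0):
    """Map a mouse position to (flow_speed, vertical_steering)."""
    flow_speed = max_speed * (mouse_x - screen_w / 2) / (screen_w / 2)
    vertical_steering = (screen_h / 2 - mouse_y) / (screen_h / 2)
    return flow_speed, vertical_steering
```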

18 A Gaze-Controlled Zooming Interface (Pomplun, Ivanovic, Reingold & Shen, 2001) We created a gaze-controlled interface that supports a common, important task (zooming in/out to inspect an image), can be used easily and intuitively, and minimizes the Midas-Touch Problem. Demonstration of the Zooming Interface

19 A Gaze-Controlled Zooming Interface We compared the efficiency and practice effects of gaze control vs. mouse control. Four subjects each completed six sessions, each including 50 gaze trials and 50 mouse trials. We measured response time, error rate, and the number of magnifications per trial across sessions one to six.

20 Response Time

21 Error Rate (figure: error rate plotted against session number, sessions 1 to 6)

22 Number of Magnifications per Trial

23 Conclusions The novel zooming interface is well-suited for efficient gaze control. With this interface, mouse control is only slightly more efficient than gaze control. Using gaze control can be learned as quickly as using a mouse.

