Gaze-Controlled Human-Computer Interfaces Marc Pomplun Department of Computer Science University of Massachusetts at Boston Homepage:

Gaze-Controlled Human-Computer Interfaces
Overview:
Using Eye Movements as a Response Modality in Psychophysics
Typing by Eye with Dynamic Recentering
An Advanced Typing Interface: Dasher
A Gaze-Controlled Zooming Interface

Using Eye Movements as a Response Modality in Psychophysics (Stampe & Reingold, 1995) In psychophysical experiments, subjects typically respond to a stimulus by pressing one of two or more buttons. The obtained response times are used as an indicator of how long it took the subject to process the stimulus. However, the measured duration also includes the time taken to initiate and execute the manual response.

Eye Movements as a Response Modality If subjects can indicate their response by moving their eyes instead of pressing a button, this response "overhead" should be reduced. Therefore, the signal-to-noise ratio in the reaction-time data should be improved.
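A minimal sketch (not from Stampe & Reingold, 1995) of how such a gaze-based response could be detected: the first gaze sample that enters one of two response regions determines the answer and the response time. The region coordinates, the sample format, and the function name are illustrative assumptions.

```python
# Sketch: classify a gaze-based response by checking which of two screen
# regions the gaze enters first after stimulus onset.

LEFT_REGION = (0, 200)      # x-range of the "response A" area, in pixels (assumed)
RIGHT_REGION = (824, 1024)  # x-range of the "response B" area, in pixels (assumed)

def classify_gaze_response(gaze_samples):
    """Return ('left' or 'right', response_time_ms) from a stream of
    (timestamp_ms, x, y) gaze samples recorded after stimulus onset."""
    for t, x, _y in gaze_samples:
        if LEFT_REGION[0] <= x <= LEFT_REGION[1]:
            return "left", t
        if RIGHT_REGION[0] <= x <= RIGHT_REGION[1]:
            return "right", t
    return None, None  # no response region was fixated
```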

Eye Movements as a Response Modality

Advantages of Gaze-Controlled Interfaces
Allow intuitive use of computer programs
Operators can simultaneously use their hands for other tasks
Enable handicapped people to control systems and communicate by means of eye movements (e.g., "typing by eye")

Problems with Gaze-Controlled Interfaces The "Midas-Touch Problem": since eye movements are not completely under conscious control, functions may sometimes be triggered inadvertently. Researchers typically address this problem by requiring a minimum dwell time before an event is triggered.
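A minimal sketch of dwell-time triggering under stated assumptions: a hypothetical 500 ms threshold and a caller that reports which interface element (if any) is currently under the gaze for each sample.

```python
# Sketch: trigger an element only after the gaze has rested on it for a
# minimum dwell time, reducing inadvertent (Midas-Touch) activations.

DWELL_THRESHOLD_MS = 500  # illustrative value, not from the cited work

def update_dwell(element_under_gaze, timestamp_ms, state):
    """Call once per gaze sample. `state` is a dict carrying the current
    dwell target; returns the element to activate, or None."""
    if element_under_gaze != state.get("target"):
        # Gaze moved to a new element (or off all elements): restart the timer.
        state["target"] = element_under_gaze
        state["start"] = timestamp_ms
        return None
    if element_under_gaze is not None and \
            timestamp_ms - state["start"] >= DWELL_THRESHOLD_MS:
        # Reset so the same element needs a fresh dwell before re-triggering.
        state["target"] = None
        return element_under_gaze
    return None
```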

Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) This is a simple "typing by eye" application using a virtual keyboard. Keys are triggered using a dwell-time threshold, which can be varied while the system is in use. The authors also implemented a dynamic recentering mechanism to avoid frequent recalibration of the eye-tracking system.

Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995)

The dynamic recentering mechanism makes the (reasonable) assumption that users fixate only the keys on the screen. To compensate for drift in the gaze-position measurement (caused, for example, by headset shift), the system measures the offset between the fixation positions and the centers of the keys. If a fixation shows such an offset, the following measurements are shifted by about 10% of the offset distance, in the opposite direction.
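A minimal sketch of this recentering idea, assuming a list of key-center coordinates and treating the 10% correction as a running offset added to later gaze samples; the accumulation scheme is an interpretation of the description above, not the authors' exact algorithm.

```python
# Sketch: nudge a running correction toward cancelling the offset between
# each fixation and the nearest key center, then apply it to later samples.

CORRECTION_GAIN = 0.1  # "about 10% of the offset distance"

def nearest_key_center(x, y, key_centers):
    """Return the (x, y) center of the key closest to the fixation."""
    return min(key_centers, key=lambda c: (c[0] - x) ** 2 + (c[1] - y) ** 2)

def update_offset_correction(fix_x, fix_y, key_centers, correction):
    """Update the (dx, dy) correction from one fixation, assuming the user
    was actually looking at the nearest key."""
    kx, ky = nearest_key_center(fix_x, fix_y, key_centers)
    dx, dy = correction
    # Step the correction toward cancelling the observed offset.
    dx += CORRECTION_GAIN * (kx - fix_x)
    dy += CORRECTION_GAIN * (ky - fix_y)
    return dx, dy

def apply_correction(raw_x, raw_y, correction):
    """Shift a raw gaze sample by the current correction."""
    return raw_x + correction[0], raw_y + correction[1]
```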

Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) Dynamic recentering is able to reduce the average fixation error and the frequency of system recalibration.

Typing by Eye with Dynamic Recentering (Stampe & Reingold, 1995) One disadvantage of the mechanism is that if the offset is larger than half the distance between neighboring keys, it will draw the measurements towards unwanted keys. Moreover, the system can only compensate for translational shifts, not for rotations or other distortions of the measurement.

Dasher - An Advanced Typing-by-Eye Interface (Ward & MacKay, 2002) The previously shown interface is the most basic and straightforward implementation of typing by eye. It is possible to make such interfaces more intelligent to allow faster and more convenient typing. One such approach is the Dasher system (freely available on the web).

Dasher The initial display of Dasher shows all letters of the alphabet in a column at the right edge of the screen:

Dasher The letters flow leftwards, each of them followed by a new alphabet in which the most likely continuation letters are shown largest.
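A minimal sketch (not Dasher's actual implementation) of the underlying layout idea: each candidate next letter gets a box whose height is proportional to its probability given the preceding text. The toy bigram model here is a stand-in for Dasher's language model.

```python
# Sketch: size the boxes of candidate next letters in proportion to their
# estimated conditional probability.

from collections import Counter

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def bigram_model(corpus):
    """Count letter bigrams in a training corpus."""
    counts = {c: Counter() for c in ALPHABET}
    for prev, nxt in zip(corpus, corpus[1:]):
        if prev in counts and nxt in ALPHABET:
            counts[prev][nxt] += 1
    return counts

def box_heights(prev_char, counts, total_height=1.0, smoothing=1.0):
    """Return {letter: height}, proportional to P(letter | prev_char)."""
    c = counts.get(prev_char, Counter())
    weights = {ch: c[ch] + smoothing for ch in ALPHABET}
    norm = sum(weights.values())
    return {ch: total_height * w / norm for ch, w in weights.items()}

# Example: likely continuations of 'q' (here, 'u') get the tallest boxes.
model = bigram_model("the quick brown fox quietly quits squeaking")
heights = box_heights("q", model)
```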

Dasher
Use the mouse to control the "typing":
Left-right: controls the speed of the letters
Up-down: selects the next letter
Video Demonstration

A Gaze-Controlled Zooming Interface (Pomplun, Ivanovic, Reingold & Shen, 2001) We created a gaze-controlled interface that supports a common, important task (zooming in and out to inspect an image), can be used easily and intuitively, and minimizes the Midas-Touch Problem. Demonstration of the Zooming Interface
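The slides do not detail the interaction scheme, so the following is only a rough sketch under assumptions: each zoom step rescales the viewed region around the current gaze position so the inspected detail stays under the eyes; the zoom factor, view representation, and update rule are illustrative, not the published method.

```python
# Sketch: one gaze-centered zoom step on a rectangular view of an image.

ZOOM_FACTOR = 1.5  # assumed magnification per step

def zoom_view(view, gaze_x, gaze_y, zoom_in=True):
    """Return a new (cx, cy, width, height) view rectangle, scaled around the
    current gaze position (in image coordinates)."""
    cx, cy, w, h = view
    f = 1 / ZOOM_FACTOR if zoom_in else ZOOM_FACTOR
    # Keep the gazed-at point at the same relative position in the new view.
    new_cx = gaze_x + (cx - gaze_x) * f
    new_cy = gaze_y + (cy - gaze_y) * f
    return new_cx, new_cy, w * f, h * f
```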

A Gaze-Controlled Zooming Interface We compared the efficiency and practice effects of gaze control vs. mouse control. Four subjects participated in six sessions, each session including 50 gaze and 50 mouse trials. We measured response time, error rate, and the number of magnifications per trial as functions of time (sessions one to six).

Response Time

Error Rate (plotted against session number)

Number of Magnifications per Trial

Conclusions
The novel zooming interface is well suited for efficient gaze control.
With this interface, mouse control is only slightly more efficient than gaze control.
Using gaze control can be learned as quickly as using a mouse.