
Jochen Triesch, UC San Diego

Slide 1: Eye Movements and Eye Tracking
Why move the eyes?
- to see the same thing better (stabilize the image, but not too well!)
- to see something else (overcome low peripheral resolution) [S. Anstis]
- only a small part of the visual field is sampled at full resolution

Slide 2: Cortical Magnification in V1 (Eric Schwartz)
(The slide's equations, the complex form and its approximation, were not captured in the transcript.)
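As a hedged reconstruction of the lost equations, the standard form of Schwartz's complex-logarithm model of the retino-cortical mapping is:

```latex
% z: retinal position as a complex number, w: cortical position,
% k, a: fitted constants of the model.
w = k \,\log(z + a)
% and, approximately, for eccentricities |z| \gg a:
w \approx k \,\log z
```

This is the commonly cited form of Schwartz's model, not a transcription of what the slide actually showed.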

Slide 3: Types of eye movements
Major distinctions: conjugate vs. disjunctive; abrupt vs. slow
Types:
- vestibulo-ocular reflex: counteracts head rotation
- optokinetic reflex: counteracts large-field retinal motion
- smooth pursuit: counteracts object motion (<30 deg/s)
- saccades: rapidly shift gaze direction (<600 deg/s)
- vergence movements: adjust the vergence angle
- accommodation: adjusts focus
- microsaccades: counteract drift during fixation (1-2 Hz, <5')
- microtremor: 0.5', ... Hz
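The velocity limits above suggest a crude way to label individual eye-velocity samples. A minimal Python sketch using only the speed thresholds from this slide (the 1 deg/s fixation cutoff is an assumption, and real classifiers also use duration and acceleration):

```python
def classify_velocity(v_deg_per_s):
    """Crude label for one eye-velocity sample, using the speed
    ranges from the slide; a sketch, not a real event classifier."""
    v = abs(v_deg_per_s)
    if v < 1.0:          # near-stationary: fixation (drift/tremor)
        return "fixation"
    if v < 30.0:         # slow tracking: pursuit / reflexive range
        return "slow (pursuit/reflexive)"
    return "saccade"     # rapid gaze shift, up to ~600 deg/s

print(classify_velocity(15.0))   # -> slow (pursuit/reflexive)
print(classify_velocity(400.0))  # -> saccade
```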

Slide 4: Saccades
Yarbus (1950s): viewing consists of alternations of fixations and saccades, about 3 per second.

Slide 5: Yarbus (1967), instructions given to observers:
- examine the picture at will
- estimate the economic level of the people
- estimate the people's ages
- guess what the people were doing before the arrival of the visitor
- remember the people's clothing
- remember the people's (and objects') positions in the room
- estimate the time since the guest's last visit

Slide 6: Saccade Metrics (from Becker, 1991)
Approximately linear relationship between saccade amplitude A and saccade duration D:
  D = D0 + d * A, with 20 ms < D0 < 30 ms and 2 ms/deg < d < 3 ms/deg
Example: a 20 deg saccade lasts about 25 ms + 2.5 ms/deg * 20 deg = 75 ms, so a gaze-contingent display change must be completed within ~75 ms of saccade onset.
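The linear main-sequence model above is easy to state in code. A small sketch using midpoint parameter values (D0 = 25 ms and d = 2.5 ms/deg, chosen from the stated ranges purely for illustration):

```python
def saccade_duration_ms(amplitude_deg, d0_ms=25.0, slope_ms_per_deg=2.5):
    """Linear main-sequence model, D = D0 + d * A (Becker, 1991).

    D0 lies between 20 and 30 ms and the slope d between 2 and
    3 ms/deg; the default midpoint values here are illustrative.
    """
    return d0_ms + slope_ms_per_deg * amplitude_deg

# A 20 deg saccade lasts roughly 75 ms, so any saccade-contingent
# display change must land within that window.
print(saccade_duration_ms(20.0))  # -> 75.0
```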

Slide 7: Anthropomorphic Robot Head

Slide 8: System overview
(Diagram: Gigabit router linking Desktop A, Desktop B, and Desktop C.)

Slide 9: Motor System
- 9 DoF (hobby-grade servo motors)
- 2 Mini SSC II interface boards
- four-bar linkage connection
- eye/neck system is redundant

Slide 10: Vision System
- 2 Point Grey Research Firefly cameras (FireWire)
- 640x480 resolution at up to 30 fps
- 2, 4, 6, and 8 mm focal-length lenses

Slide 11: Saccade Performance 1

Slide 12: Saccade Performance 2

Slide 13: Demo video

Slide 14: Why Eye Tracking?
Basic neuroscience:
- measuring brain output
- understanding the neural control architecture
- psychophysics: how do we use gaze during natural tasks?
Applications:
- user-interface design, human factors: e.g., eye tracking for the driver of a car
- advertising: do people look where I want them to look (in my web page, my newspaper ad, etc.)?
Notes:
- there are now several conferences devoted solely to eye tracking
- eye tracking can also be done in fMRI
- it is possible in the real world: "portable eye trackers"

Slide 15: Eye tracking techniques (review in Duchowski & Vertegaal, 2000)
- Contact lenses (with mirrors or induction loops): fast, accurate, uncomfortable (often used with bite bars)
- Video-based: temporal accuracy limited by the camera frame rate (60 Hz); less accurate (~1 deg); typically with infrared illumination of the eye (dark pupil vs. bright pupil); can be done remotely, or the camera can be attached to the head
- Electro-oculogram: exploits the electric-dipole property of the eyeball; fast but imprecise
- Limbus tracking: predecessor of the video-based tracker; imprecise; small operating range

Slide 16: Electro-oculogram
- the eye is an electric dipole; the field of the moving dipole induces measurable voltages at skin electrodes
- provides an analog voltage output that can be digitized and processed
- extremely fast, but the technique is not accurate
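Because the EOG yields an analog voltage roughly proportional to horizontal gaze angle, a two-point linear calibration is enough for a first sketch. All voltages and target angles below are hypothetical, not taken from any particular device:

```python
def eog_calibration(v_left, v_right, angle_left=-15.0, angle_right=15.0):
    """Fit a linear map angle = gain * V + offset from two fixation
    targets at known angles. A sketch: real EOG needs drift
    correction and frequent recalibration."""
    gain = (angle_right - angle_left) / (v_right - v_left)
    offset = angle_left - gain * v_left
    return gain, offset

# Hypothetical measurements: +-150 microvolts at +-15 deg targets.
gain, offset = eog_calibration(v_left=-150e-6, v_right=150e-6)

def gaze_angle(v):
    """Convert a digitized EOG voltage to a gaze angle in degrees."""
    return gain * v + offset

print(round(gaze_angle(100e-6), 1))  # -> 10.0 (degrees)
```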

Slide 17: Limbus Tracking
Utilizes the difference in reflective properties of the iris vs. the sclera.
(Figure labels: infrared LED, photodiode, sclera, limbus, iris.)
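The iris/sclera reflectance difference means two infrared photodiodes flanking the limbus see an intensity imbalance that tracks horizontal eye position. A sketch of the normalized differential signal (the function name and values are illustrative):

```python
def limbus_position(i_left, i_right):
    """Normalized horizontal eye position from two photodiode
    readings flanking the limbus: the sclera reflects more infrared
    than the iris, so the imbalance tracks the limbus. Returns a
    dimensionless value in [-1, 1]; mapping it to degrees requires
    calibration, and the small operating range limits large saccades."""
    return (i_right - i_left) / (i_right + i_left)

print(limbus_position(0.5, 0.5))  # -> 0.0 (eye centered)
print(limbus_position(0.3, 0.7))  # eye rotated toward the right diode
```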

Slide 18: Search Coil
Gold standard for speed and accuracy (5-10''), but quite uncomfortable, and head movement is restrained.

Slide 19: Video-based tracking
Most widely used today: good accuracy and speed, easy to use.
(Figures: dual-Purkinje tracker; bright-pupil image.)
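In the dark-pupil variant, the pupil is the darkest region of the infrared image, so a toy pupil locator can simply threshold the frame and average the dark-pixel coordinates. The threshold and the toy frame below are illustrative; production trackers refine this with ellipse fitting and corneal-reflection compensation:

```python
def pupil_center(image, threshold=40):
    """Rough pupil center of a grayscale frame (list of rows of
    intensities) by dark-pupil thresholding: average the coordinates
    of all pixels darker than the threshold."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no pupil found in this frame
    return xs / n, ys / n

# Toy 5x5 frame: bright background (200) with a dark 2x2 "pupil".
frame = [[200] * 5 for _ in range(5)]
for y in (2, 3):
    for x in (1, 2):
        frame[y][x] = 10
print(pupil_center(frame))  # -> (1.5, 2.5)
```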

Jochen Triesch, UC San Diego, 20 inside of head mounted display U. of Rochester

Slide 21: Gaze-contingent display changes
Idea: decide what is being displayed contingent on where the observer is looking.
Saccade-contingent updating (SCU), a special case of this: make display changes while a saccade is in progress (pioneering work by McConkie and colleagues).
- powerful technique for studying visual perception
- frequently used in, e.g., change-blindness studies
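Saccade-contingent updating hinges on detecting saccade onset quickly enough to change the display while the eye is still in flight. A minimal sketch using a velocity threshold on a sampled gaze trace (the threshold and trace values are hypothetical):

```python
def detect_saccade_onset(angles_deg, dt_s, velocity_threshold=100.0):
    """Index of the first sample whose instantaneous velocity exceeds
    the threshold (deg/s); a display change triggered here can run
    while the saccade is still in flight. A sketch: real detectors
    filter noise and confirm over several samples."""
    for i in range(1, len(angles_deg)):
        v = abs(angles_deg[i] - angles_deg[i - 1]) / dt_s
        if v >= velocity_threshold:
            return i
    return None

# Simulated 1 kHz gaze trace: steady fixation, then a rapid shift.
trace = [0.0, 0.01, 0.02, 0.03, 0.5, 1.5, 3.0, 5.0, 7.0, 8.0]
onset = detect_saccade_onset(trace, dt_s=0.001)
print(onset)  # -> 4: trigger the display change at this sample
```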

Slide 22: Combination of video and limbus tracker inside an HMD for gaze-contingent display
- analog limbus tracker's sensor with infrared LED and photodiodes (horizontal)
- video-based pupil tracker's sensor
- LCD displays inside the HMD
- video-based tracker for good accuracy; limbus tracker for low-latency saccade detection

Slide 23: System Overview
- 4-processor high-end graphics computer as the backbone
- images rendered in a V8 helmet (Virtual Research)
- three sensors: magnetic tracking device (Polhemus Fastrak); limbus tracker (ASL 210), sampled at 1.25 kHz; pupil tracker (ASL 501), sampled at 60 Hz
- sensors send data via separate serial lines

Slide 24: Latency Comparison
- in this example, the limbus tracker has a 26 ms advantage over the video tracker
- on average, the limbus tracker has a 37±13 ms advantage (15 trials)

Slide 25: Chess pieces (dis-)appear contingent on a saccade (~25-30 deg) with 50% probability.