Measuring Gaze Depth with an Eye Tracker During Stereoscopic Display

Presentation transcript:

Measuring Gaze Depth with an Eye Tracker During Stereoscopic Display
A. T. Duchowski, B. Pelfrey, D. H. House, R. Wang
School of Computing, Clemson University

Introduction
Stereo images have appeared in a variety of forms since their introduction by Wheatstone (1838).
How can gaze depth be captured by an eye tracker over a stereo display such as Wheatstone’s?
Figure 1. Wheatstone’s mirror stereoscope. The head is brought up to mirrors A’ and A, and pictures E’ and E are viewed stereoscopically (Wheatstone, 1838).

Motivation
The idea is to measure the amount of switching as gaze depth changes across layered stereo displays, and then to develop effective means of visualization to facilitate disambiguation of the layers (e.g., via grids).
Figure 2. Layered illustration of a horse’s hoof showing the outer layer and coffin bone.

Background
Early informal measurements showed promise in simply using horizontal gaze disparity as a depth cue.
Daugherty et al. (2010) used a similar idea to measure the vergence response to anaglyphic stereo.
Figure 3. Horizontal coordinates of the left and right gaze points.
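The disparity-to-depth computation can be sketched in a few lines. The following Python function is illustrative only (the pinhole geometry, the function name, and the default interocular and viewing distances are assumptions, not the authors' code):

```python
import numpy as np

def depth_from_disparity(x_left, x_right, ipd=6.3, view_dist=80.0):
    """Triangulate fixation depth (measured from the eyes, in the same
    units as view_dist) from the left/right horizontal gaze coordinates
    on the screen plane (same units as ipd).

    Simple pinhole model: the eyes sit ipd apart on a baseline parallel
    to the screen at distance view_dist. Disparity d = 0 means the gaze
    rays cross at the screen; as d approaches ipd, the crossing point
    recedes toward infinity.
    """
    d = np.asarray(x_right) - np.asarray(x_left)  # screen disparity
    return view_dist * ipd / (ipd - d)
```

Under this geometry, crossed disparity (x_right < x_left) yields depths in front of the screen, and uncrossed disparity yields depths behind it.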

Background
Other work:
Holliman et al. (2007) studied depth perception on desktop 3D displays, but did not measure eye movements.
Essig et al. (2004) measured gaze atop random-dot stereograms.
Kwon and Shul (2006) measured interocular distance while rendering a stereo image at five different depths.
Here we report observations of the vergence response when viewing stereo at different depths vs. monoscopic rendering.

Methodology
Apparatus:
two IBM T221 “Big Bertha” LCD displays
LC Technologies’ Eyegaze system (60 Hz)
Figure 4. Custom-built, high-resolution Wheatstone-style stereoscope.

Methodology
Stimulus:
grid of cubes rotates about its vertical axis, inducing motion parallax
screen plane aligned so that three rows lie in front of the screen and two rows behind it
Figure 5. 5x5 grid of cubes rendered with the closest row of cubes 30 cm in front of the screen and each of the four remaining rows 12 cm farther from the viewer.

Methodology
Procedure & participants:
2D calibration performed until 1.3 cm accuracy was achieved
a random-dot stereogram was used to pre-screen depth perception
20 participants (18 M, 2 F; ages 16-34)
the task was to fixate the individually rotating cube
Figure 6. Calibration result image; random-dot stereogram pair used for pre-screening.

Methodology
Experimental design:
within-subjects, 2 (display) x 2 (motion parallax) x 5 (depth)
familiarity and fatigue effects were mitigated by counterbalancing the 2 x 2 display and motion combinations and by alternating depth order
depth order varied so that all even-numbered participants started with the cube target behind the screen, progressing to the front; the reverse held for odd-numbered participants (a sketch of this scheme follows below)
a pilot study was used to fine-tune the data processing tools (filters, fitting)
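One plausible way to implement the counterbalancing just described, as a minimal Python sketch (the helper name, the rotation of the display x parallax blocks, and the exact depth values are assumptions for illustration, not the authors' code):

```python
from itertools import product

# Row depths relative to the screen plane, in cm (positive = in front),
# per Figure 5: closest row 30 cm in front, rows spaced 12 cm apart.
DEPTHS_CM = [30, 18, 6, -6, -18]

def condition_order(pid):
    """Rotate the four display x parallax blocks across participants
    and alternate depth order by participant parity: even-numbered
    participants run back-to-front, odd-numbered front-to-back."""
    blocks = list(product(["mono", "stereo"], ["static", "rocking"]))
    blocks = blocks[pid % 4:] + blocks[:pid % 4]  # rotate block order
    depths = sorted(DEPTHS_CM) if pid % 2 == 0 else sorted(DEPTHS_CM, reverse=True)
    return [(display, parallax, depths) for display, parallax in blocks]
```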

Pilot study
Same design as the main experiment, but:
fewer subjects (essentially the paper’s authors)
test the effect of stereo (if negative, cancel the study!)
double-check the file naming scheme, Python scripts, etc.
Figure 7. From left to right: Dixon Cleveland (LC Tech president), coauthors AD, BP, DH.

Pilot study
Outcomes:
a definite, observable gaze depth response to stereo
a need for filtering
a need for 3D calibration
Figure 8. Raw gaze depth with outliers beyond 2 SD removed.
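A minimal sketch of the 2 SD rejection (assuming a single mean and SD per depth trace; the paper's exact windowing is not specified here):

```python
import numpy as np

def reject_outliers(depth, n_sd=2.0):
    """Drop raw gaze-depth samples more than n_sd standard deviations
    from the trace mean, as in Figure 8."""
    depth = np.asarray(depth, dtype=float)
    keep = np.abs(depth - np.nanmean(depth)) <= n_sd * np.nanstd(depth)
    return depth[keep]
```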

Pilot study: filtering
Cascading three 2nd-order filters gave good results with the cutoff frequency set to 0.15 Hz.
Problems: lag (1.15 s) and initial conditions.
Figure 9. Gaze depth filtered with a 6th-order Butterworth filter.
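A sketch of this filter in SciPy, expressed directly as cascaded second-order sections (the 60 Hz rate is the Eyegaze tracker's; seeding the filter state with the first sample is one plausible way to tame the initial-condition problem, not necessarily what the authors did):

```python
from scipy import signal

FS = 60.0      # Eyegaze sampling rate (Hz)
CUTOFF = 0.15  # low-pass cutoff from the slide (Hz)

# 6th-order low-pass Butterworth as three cascaded 2nd-order sections.
SOS = signal.butter(6, CUTOFF, btype="low", fs=FS, output="sos")

def smooth_depth(depth):
    """Causally filter a gaze-depth trace. Initializing the filter
    state to the first sample reduces the start-up transient; the
    ~1 s group delay noted on the slide remains."""
    zi = signal.sosfilt_zi(SOS) * depth[0]
    smoothed, _ = signal.sosfilt(SOS, depth, zi=zi)
    return smoothed
```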

Pilot study: 3D calibration
Monoscopic data were filtered and shifted (mean set to 0).
Stereo data were scaled and shifted via least-squares minimization (over all depth targets).
Figure 10. Filtered and either shifted (left) or fit (right) gaze depth.
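The fit step is a one-dimensional linear regression of gaze depth onto the known target depths; a minimal sketch (variable names are illustrative):

```python
import numpy as np

def calibrate_depth(gaze_z, target_z):
    """Scale-and-shift calibration: solve min over (a, b) of
    sum_i (a * gaze_z[i] + b - target_z[i])**2 across all depth
    targets, then return the calibrated trace a * gaze_z + b."""
    gaze_z = np.asarray(gaze_z, dtype=float)
    A = np.column_stack([gaze_z, np.ones_like(gaze_z)])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(target_z, dtype=float),
                                 rcond=None)
    return a * gaze_z + b
```

For the monoscopic (shift-only) case this reduces to subtracting the mean: `gaze_z - gaze_z.mean()`.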

Results
RMS error of gaze depth was computed for each of the 15 targets.
Shifted: the monoscopic display elicits no depth response.
Fit: stereo elicits closer depth agreement.
Figure 11. Root mean square error of gaze depth under the four viewing conditions.
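The per-target error metric is the root mean square of the residual; a small sketch (the grouping of samples by target is assumed to happen upstream):

```python
import numpy as np

def rms_error(calibrated_z, target_z):
    """RMS error of calibrated gaze depth against one target's depth."""
    residual = np.asarray(calibrated_z) - np.asarray(target_z)
    return float(np.sqrt(np.mean(residual ** 2)))
```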

Results
Shifted data, within-subjects three-way ANOVA:
depth significant (F(4,76) = 59.50, p < 0.01)
display not significant (F(1,19) = 3.29, p = 0.09)
motion parallax not significant (F(1,19) = 1.21, p = 0.29)
Fit data, within-subjects three-way ANOVA:
depth significant (F(4,76) = 323.29, p < 0.01)
display significant (F(1,19) = 126.00, p < 0.01)
motion parallax not significant (F(1,19) = 3.12, p = 0.09)

Results
The interaction between depth and display is significant (F(4,76) = 83.77, p < 0.01).
Gaze depth error increases with target depth.
Figure 12. Root mean square error of gaze depth over the five viewing depth intervals.

Results
Subjective data, pairwise t-tests:
significant difference between perception of the static monoscopic and rocking stereo conditions (p < 0.01)
Figure 13. Mean responses to each of the four viewing conditions (7-point Likert scale of agreement).

Discussion
Eye vergence movements clearly respond to, and match, the depth component of a 3D stereo target.
Noise may be due to the eye tracking equipment.
The Wheatstone setup requires splitting the binocular eye tracking optics; most modern eye trackers are binocular (we are currently observing similar effects with them).

Conclusion
Our work documents the gaze depth response to stereoscopic manipulation of target depth.
Gaze depth is currently measured by the eye tracker vendor’s proprietary algorithm (we now have our own computation based on horizontal gaze disparity).
The combined use of the Butterworth filter and least-squares fitting is an effective means of depth calibration (we now have this running in real time).

Q & A Thank you! Questions?