Why is Spatial Stereoacuity so Poor?

Why is Spatial Stereoacuity so Poor? Martin S. Banks (School of Optometry & Dept. of Psychology, UC Berkeley), Sergei Gepshtein (Vision Science Program, UC Berkeley), Michael S. Landy (Dept. of Psychology & Center for Neural Science, NYU). Supported by NIH.

Depth Perception

How precise is the depth map generated from disparity?

Precision of Stereopsis (from Tyler, 1977). Stereo precision measured in various ways. A: precision of detecting a depth change along the line of sight. D: precision of detecting spatial variation in depth.

Spatial Stereoacuity. Modulate disparity sinusoidally, creating corrugations in depth. The least disparity required for detection, as a function of the spatial frequency of the corrugations, is the "disparity MTF": an index of the precision of the depth map.

Disparity MTF Disparity modulation threshold as a function of spatial frequency of corrugations. Bradshaw & Rogers (1999). Horizontal & vertical corrugations. Disparity MTF: acuity = 2-3 cpd; peak at 0.3 cpd.

Luminance Contrast Sensitivity & Acuity Luminance contrast sensitivity function (CSF): contrast for detection as function of spatial frequency. Proven useful for characterizing limits of visual performance and for understanding optical, retinal, & post-retinal processing. Highest detectable spatial frequency (grating acuity): 40-50 c/deg.

Disparity MTF. Spatial stereoacuity is more than 1 log unit lower than luminance acuity. Why is spatial stereoacuity so low?

Likely Constraints to Spatial Stereoacuity
Sampling constraints in the stimulus: Stereoacuity is measured using random-element stereograms. Discrete sampling limits the highest spatial frequency one can reconstruct.
Disparity gradient limit: With increasing spatial frequency, the disparity gradient increases. If the gradient approaches 1.0, binocular fusion fails.
Spatial filtering at the front end: Optical quality and retinal sampling limit acuity in other tasks, so they probably limit spatial stereoacuity as well.
The correspondence problem: The manner in which binocular matching occurs presumably affects spatial stereoacuity.

Spatial Sampling Limit: Nyquist Frequency. Signal reconstruction from discrete samples requires at least 2 samples per cycle. In 1D, the highest recoverable spatial frequency is the Nyquist frequency f_N = N/2, where N is the number of samples per unit distance.

Spatial Sampling Limit: Nyquist Frequency. For signal reconstruction from 2D discrete samples, the Nyquist frequency is approximately f_N = (1/2)√(N/A), where N is the number of samples in area A (half the reciprocal of the mean sample spacing).
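
A minimal Python sketch of these two sampling limits (ours, not from the talk; the 2D case assumes the square-lattice approximation f_N ≈ (1/2)√(N/A)):

```python
import numpy as np

def nyquist_1d(samples_per_deg):
    """1D Nyquist frequency: half the sampling rate (c/deg)."""
    return samples_per_deg / 2.0

def nyquist_2d(n_samples, area_deg2):
    """Approximate 2D Nyquist frequency for n_samples spread over area_deg2,
    based on the mean sample spacing sqrt(area / n)."""
    density = n_samples / area_deg2        # dots per deg^2
    return 0.5 * np.sqrt(density)          # c/deg

# Example: 16 dots/deg^2 over a 20 deg^2 patch gives a Nyquist limit of ~2 c/deg.
print(nyquist_2d(16 * 20, 20.0))           # 2.0
```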

Methodology Random-dot stereograms with sinusoidal disparity corrugations. Corrugation orientations: +/-20 deg (near horizontal). Observers identified orientation in 2-IFC psychophysical procedure; phase randomized. Spatial frequency of corrugations varied according to adaptive staircase procedure. Spatial stereoacuity threshold obtained for wide range of dot densities. Duration = 600 msec; disparity amplitude = 16 minarc.
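
A rough Python sketch of such a stimulus (the function and parameter names are ours; dot rendering and anti-aliasing details are omitted):

```python
import numpy as np

def corrugated_stereogram(n_dots=800, size_deg=10.0, corr_freq=0.5,
                          amp_arcmin=16.0, tilt_deg=20.0, rng=None):
    """Left/right dot positions (deg) for a random-dot stereogram whose
    disparity varies sinusoidally, giving corrugations tilted +/- tilt_deg
    from horizontal. The corrugation phase is randomized on each call."""
    rng = np.random.default_rng() if rng is None else rng
    phase = rng.uniform(0, 2 * np.pi)
    x, y = rng.uniform(0, size_deg, size=(2, n_dots))    # cyclopean dot positions
    theta = np.deg2rad(tilt_deg)
    # distance along the modulation axis (perpendicular to the corrugations)
    axis = -x * np.sin(theta) + y * np.cos(theta)
    disparity = (amp_arcmin / 60.0) * np.sin(2 * np.pi * corr_freq * axis + phase)
    left = np.column_stack([x - disparity / 2.0, y])      # each half-image shifted
    right = np.column_stack([x + disparity / 2.0, y])     # by half the disparity
    return left, right
```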

Stimuli

Spatial Stereoacuity as a Function of Dot Density. Acuity is proportional to the square root of dot density (scale invariance), with an asymptote at high density. [Figure: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) on log-log axes.]

Spatial Stereoacuity & Nyquist Limit. Calculated the Nyquist frequency for our displays: acuity is approximately equal to the Nyquist frequency except at high densities. [Figure: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) with the Nyquist frequency overlaid.]

Types of Random-element Stereograms. Jittered-lattice: dots displaced randomly from a regular lattice. Sparse random: dots positioned randomly.
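
The two sampling schemes in code form (a sketch; function names are ours):

```python
import numpy as np

def sparse_random_dots(density, size_deg, rng=None):
    """Dots positioned uniformly at random over a size_deg x size_deg field."""
    rng = np.random.default_rng() if rng is None else rng
    n = int(round(density * size_deg ** 2))
    return rng.uniform(0, size_deg, size=(n, 2))

def jittered_lattice_dots(density, size_deg, rng=None):
    """One dot per cell of a regular lattice, displaced randomly within its cell."""
    rng = np.random.default_rng() if rng is None else rng
    spacing = 1.0 / np.sqrt(density)
    centers = np.arange(spacing / 2.0, size_deg, spacing)
    gx, gy = np.meshgrid(centers, centers)
    jitter = rng.uniform(-spacing / 2.0, spacing / 2.0, size=(2,) + gx.shape)
    return np.column_stack([(gx + jitter[0]).ravel(), (gy + jitter[1]).ravel()])
```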

Spatial Sampling Limit: Nyquist Frequency Same acuities with jittered-lattice and sparse random stereograms. Both follow Nyquist limit at low densities.

Next constraint: the disparity gradient limit. With increasing spatial frequency, the disparity gradient increases; if the gradient approaches 1.0, binocular fusion fails.

Disparity Gradient. For two points P1 and P2, with angular separations αL and αR in the left and right eyes: disparity gradient = disparity / separation = (αR − αL) / [(αL + αR)/2].
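
The same definition as a small Python function (variable names are ours):

```python
def disparity_gradient(alpha_left, alpha_right):
    """Burt & Julesz disparity gradient for a pair of points: their relative
    disparity (difference of the angular separations seen by the two eyes)
    divided by their cyclopean separation (the mean of those separations).
    Angles in degrees; the result is dimensionless."""
    disparity = alpha_right - alpha_left
    separation = (alpha_left + alpha_right) / 2.0
    return disparity / separation

# On the horopter the two separations are equal, so the gradient is zero:
print(disparity_gradient(1.0, 1.0))   # 0.0
```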

Disparity Gradient. P1 & P2 on the horopter: αR = αL, so disparity = 0.

Disparity Gradient. P1 & P2 on the cyclopean line of sight: αR = −αL, so separation = 0 and the disparity gradient is infinite.

Disparity Gradient. The disparity gradient can be computed for different directions of the separation between P1 and P2 relative to the horizontal disparity.

Disparity Gradient Limit. Burt & Julesz (1980) measured fusion as a function of disparity, separation, and direction (tilt): they set the direction and horizontal disparity and found the smallest fusible separation.

Disparity Gradient Limit. Fusion breaks when the disparity gradient reaches a constant value; the critical gradient is ~1 (the "disparity gradient limit"). The limit is the same for all directions.

Disparity Gradient Limit

Disparity Gradient Limit. Panum's fusion area (hatched in the figure). The disparity gradient limit means that the fusion area is affected by nearby objects (point A). The forbidden zone is conical (isotropic).

Disparity Gradient & Spatial Frequency. The disparity gradient for a sinusoid is indeterminate, but for fixed amplitude the gradient is proportional to spatial frequency, so we may have approached the disparity gradient limit. Tested by reducing the disparity amplitude from 16 to 4.8 arcmin. [Figure: disparity (deg) vs. horizontal position (deg), illustrating the highest local gradient and the peak-to-trough gradient.]
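
A back-of-the-envelope check (ours, not a slide from the talk) of why the gradient grows with spatial frequency at fixed amplitude, for a corrugation of amplitude a (deg) and frequency f (c/deg):

```latex
d(x) = a \sin(2\pi f x)
% steepest local gradient, at the zero crossings:
\max_x \left| \frac{\mathrm{d}d}{\mathrm{d}x} \right| = 2\pi a f
% peak-to-trough gradient over a half cycle:
\frac{2a}{1/(2f)} = 4 a f
```

Both measures are proportional to f when a is held fixed.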

Spatial Stereoacuity & Disparity Gradient Limit. Reducing the disparity amplitude increases acuity at high dot densities (where the disparity gradient is high) and lowers acuity slightly at low densities (where the gradient is low). [Figure: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) for the two amplitudes.]

Next constraint: spatial filtering at the front end. Optical quality and retinal sampling limit acuity in other tasks, so they probably limit spatial stereoacuity as well.

Stereoacuity & Front-end Spatial Filtering
Low-pass spatial filtering at the front end of the visual system determines the high-frequency roll-off of the luminance CSF. We tested for similar effects on spatial stereoacuity by:
decreasing the retinal image size of the dots by increasing viewing distance;
measuring stereoacuity as a function of retinal eccentricity;
measuring stereoacuity as a function of blur.

Spatial Stereoacuity at Higher Densities. Monocular artifacts appear at high dot densities. To test the upper limit, we reduced dot size by increasing the viewing distance from 39 to 154 cm. Acuity still levels off, but at a higher value. [Figure: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) for a 4.8 arcmin modulation amplitude at viewing distances of 39 and 154 cm, with the Nyquist frequency overlaid.]

Spatial Stereoacuity & Retinal Eccentricity. Elliptical patch with a sinusoidal corrugation, centered at one of three eccentricities (subject dependent). Eccentricity was randomized; duration = 250 ms. Same task as before; dot density again varied. [Figure: stimulus layout showing the fixation point and patches at 4 and 8 deg eccentricity.]

Spatial Stereoacuity & Retinal Eccentricity

Spatial Stereoacuity & Retinal Eccentricity. Same acuities at low dot densities (Nyquist limit); the asymptote varies significantly with retinal eccentricity. [Figure: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) for three observers at eccentricities of 0, 5.2-6.8, and 10.4-13.6 deg.]

Spatial Stereoacuity & Blur. We examined the effect of blur on foveal spatial stereoacuity. Three levels of blur were introduced with a diffusion plate: no blur (σ = 0 deg), low blur (σ = 0.12 deg), high blur (σ = 0.25 deg).
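
In the experiment the blur came from a physical diffusion plate; a digital stand-in might look like the following sketch (the pixels-per-degree value is an assumption; gaussian_filter is from SciPy):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_half_image(image, sigma_deg, pix_per_deg=30.0):
    """Low-pass an image with a Gaussian of width sigma_deg of visual angle,
    converted to pixels via an assumed display resolution."""
    return gaussian_filter(image, sigma=sigma_deg * pix_per_deg)

# The three blur levels from the talk (sigma in deg of visual angle):
blur_levels = {"no blur": 0.0, "low blur": 0.12, "high blur": 0.25}
```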

Spatial Stereoacuity & Blur Same acuities at low dot densities; Nyquist. Asymptote varies significantly with spatial lowpass filtering.

Final constraint: the correspondence problem. The manner in which binocular matching occurs presumably affects spatial stereoacuity.

Binocular Matching by Correlation. Binocular matching by correlation is a basic and well-studied technique for obtaining a depth map from binocular images. Computer vision: Kanade & Okutomi (1994); Panton (1978). Physiology: Ohzawa, DeAngelis, & Freeman (1990); Cumming & Parker (1997). We developed a cross-correlation algorithm for binocular matching and compared its properties to the psychophysics.

Binocular Matching by Correlation. Compute the cross-correlation between the eyes' images. A window in the left eye's image is moved orthogonal to the signal; for each position, a window in the right eye's image is moved horizontally and the cross-correlation is computed.

Binocular Matching by Correlation. Plot the correlation as a function of position in the left eye's image and of relative position in the right eye's image (i.e., disparity). Correlation (gray value) is high where the images are similar and low where they are dissimilar.
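
A toy Python version of correlation-based matching (our sketch; the talk's algorithm, window sizes, and interpolation details will differ):

```python
import numpy as np

def estimate_disparity_map(left_img, right_img, window=9, max_disp=8):
    """Windowed normalized cross-correlation along horizontal shifts.
    For each pixel of the left image, slide a window horizontally over the
    right image and keep the shift (disparity) with the highest correlation.
    left_img, right_img: 2D float arrays of equal shape."""
    h, w = left_img.shape
    half = window // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch_l = left_img[y - half:y + half + 1, x - half:x + half + 1]
            best_corr, best_d = -np.inf, 0
            for d in range(-max_disp, max_disp + 1):
                if x + d - half < 0 or x + d + half + 1 > w:
                    continue  # shifted window would fall off the image
                patch_r = right_img[y - half:y + half + 1,
                                    x + d - half:x + d + half + 1]
                # normalized cross-correlation of the two windows
                a = patch_l - patch_l.mean()
                b = patch_r - patch_r.mean()
                denom = np.sqrt((a * a).sum() * (b * b).sum())
                corr = (a * b).sum() / denom if denom > 0 else 0.0
                if corr > best_corr:
                    best_corr, best_d = corr, d
            disp[y, x] = best_d
    return disp
```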

Examples of Output. Dot density: 16 dots/deg²; spatial frequency: 1 c/deg; window size: 0.2 deg. The disparity waveform is evident in the output. [Figure: correlation output for this condition.]

Effect of Window Size & Dot Density

Effect of Window Size & Dot Density Correlation window must be large enough to contain sufficient luminance variation to find correct matches

Effect of Window Size & Spatial Frequency

Effect of Window Size & Spatial Frequency When significant depth variation is present in a region, window must be small enough to respond differentially

Window Size, # Samples, & Spatial Frequency. From the two constraints above (the window must contain enough dots, yet span no more than a half cycle of the corrugation), substitute for w and take the log to get the predicted relation between acuity and dot density; see the sketch below.
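
A sketch of the derivation, assuming the two constraints are that a window of width w holds at least n dots at density D and spans no more than a half cycle of a corrugation of frequency f:

```latex
w \le \frac{1}{2f}, \qquad w^{2} D \ge n
% substitute w = 1/(2f):
\frac{D}{4 f^{2}} \ge n \;\Longrightarrow\; f \le \frac{1}{2}\sqrt{\frac{D}{n}}
% taking logs gives a line of slope 1/2 in log-log coordinates:
\log f \le \tfrac{1}{2}\log D - \log\!\left(2\sqrt{n}\right)
```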

Effect of Disparity Gradient. Fix the spatial frequency, dot density, and window size, and increase the disparity amplitude (which also increases the disparity gradient: 0.21, 0.59, 1.77). As the gradient approaches 1.0, disparity estimation becomes poor: the images are too dissimilar in the two eyes. Matching by correlation yields piecewise-frontal estimates.

Effect of Low-pass Spatial Filtering. The amount of variation in the image depends on its spatial-frequency content. If σ is proportional to w (the image's cut-off frequency being inversely proportional to σ), the variation is constant in cycles per window, and the algorithm yields similar outputs for these images. For each σ, there is a window just large enough to yield good disparity estimates, so the highest detectable spatial frequency is inversely proportional to σ (sketched below).
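
One way to formalize the last point (an assumption-laden sketch, not from the talk): if the smallest usable window must contain a fixed number of cycles of the blurred texture, and can span at most a half cycle of the corrugation, then

```latex
w_{\min} \propto \sigma
\qquad\Longrightarrow\qquad
f_{\max} \approx \frac{1}{2\, w_{\min}} \propto \frac{1}{\sigma}
```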

Effect of Low-pass Spatial Filtering. Spatial stereoacuity for different amounts of blur; σ includes all filtering elements: dots, optics, diffusion screen. Horizontal lines: predictions for asymptotic acuities. Asymptotic acuity is limited by filtering before binocular combination.

Summary of Matching Effects
The correlation algorithm reveals two effects.
1. Disparity estimation is poor when there is insufficient intensity variation within the correlation window: (a) when the window is too small for the presented dot density; (b) when the spatial-frequency content is too low. Remedy: employ a larger window (or receptive field).
2. Disparity estimation is poor when the correlation window is too large in the direction of maximum disparity gradient: (a) when the window width is greater than a half cycle of the stimulus. Remedy: employ a smaller window (or receptive field).

Summary
Sampling constraints in the stimulus: Stereoacuity follows the Nyquist limit at all but the highest densities; this holds in the peripheral visual field and in the fovea with blur.
Disparity gradient limit: Stereoacuity is reduced at high gradients.
Spatial filtering at the front end: Low-pass filtering before binocular combination determines asymptotic acuity.
The correspondence problem: Binocular matching by correlation requires sufficient information in the correlation window and thereby reduces the highest attainable acuity. The visual system measures disparity in a piecewise-frontal fashion.

Depth Map from Disparity. Disparity estimates are piecewise frontal: only one perceived depth per direction. [Figure: depth-varying scene.]