Why is Spatial Stereoacuity so Poor? Martin S. Banks (School of Optometry & Dept. of Psychology, UC Berkeley); Sergei Gepshtein (Vision Science Program, UC Berkeley).

Why is Spatial Stereoacuity so Poor? Martin S. Banks (School of Optometry & Dept. of Psychology, UC Berkeley), Sergei Gepshtein (Vision Science Program, UC Berkeley), Michael S. Landy (Dept. of Psychology & Center for Neural Science, NYU). Supported by NIH.

Depth Perception

How precise is the depth map generated from disparity?

Precision of Stereopsis Stereo precision has been measured in various ways. A: precision of detecting a depth change along the line of sight. D: precision of detecting spatial variation in depth. From Tyler (1977).

Spatial Stereoacuity Modulate disparity sinusoidally, creating corrugations in depth. The least disparity required for detection, as a function of the spatial frequency of the corrugations, is the "disparity MTF": an index of the precision of the depth map.

Disparity MTF Disparity modulation threshold as a function of spatial frequency of corrugations. Bradshaw & Rogers (1999). Horizontal & vertical corrugations. Disparity MTF: acuity = 2-3 cpd; peak at 0.3 cpd.

Luminance Contrast Sensitivity & Acuity Luminance contrast sensitivity function (CSF): contrast required for detection as a function of spatial frequency. Proven useful for characterizing limits of visual performance and for understanding optical, retinal, & post-retinal processing. Highest detectable spatial frequency (grating acuity): c/deg.

Disparity MTF Why is spatial stereoacuity so low? Spatial stereoacuity more than 1 log unit lower than luminance acuity.

Likely Constraints to Spatial Stereoacuity
1. Sampling constraints in the stimulus: stereoacuity is measured using random-element stereograms, and discrete sampling limits the highest spatial frequency one can reconstruct.
2. Disparity gradient limit: with increasing spatial frequency, the disparity gradient increases; if the gradient approaches 1.0, binocular fusion fails.
3. Spatial filtering at the front end: optical quality & retinal sampling limit acuity in other tasks, so they probably limit spatial stereoacuity as well.
4. The correspondence problem: the manner in which binocular matching occurs presumably affects spatial stereoacuity.

Spatial Sampling Limit: Nyquist Frequency Signal reconstruction from discrete samples requires at least 2 samples per cycle. In 1D, the highest recoverable spatial frequency is the Nyquist frequency f_N = N/2, where N is the number of samples per unit distance.

Spatial Sampling Limit: Nyquist Frequency Signal reconstruction from 2D discrete samples. In 2D, the Nyquist frequency is f_N = (1/2)·sqrt(N/A), where N is the number of samples in area A.
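The two sampling limits above can be sketched in a few lines; this is an illustrative helper (not code from the talk), assuming the 2D formula given on the slide:

```python
import math

def nyquist_1d(samples_per_deg):
    # With N samples per degree, at least 2 samples are needed per
    # cycle, so the highest recoverable frequency is N/2 (c/deg).
    return samples_per_deg / 2.0

def nyquist_2d(n_dots, area_deg2):
    # With N dots over area A, mean sample spacing is sqrt(A/N),
    # so the 2D Nyquist frequency is (1/2) * sqrt(N/A).
    return 0.5 * math.sqrt(n_dots / area_deg2)
```

For example, 16 dots/deg² gives a Nyquist frequency of 2 c/deg, in the range of the measured stereoacuity asymptotes.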

Methodology Random-dot stereograms with sinusoidal disparity corrugations. Corrugation orientations: +/-20 deg (near horizontal). Observers identified orientation in 2-IFC psychophysical procedure; phase randomized. Spatial frequency of corrugations varied according to adaptive staircase procedure. Spatial stereoacuity threshold obtained for wide range of dot densities. Duration = 600 msec; disparity amplitude = 16 minarc.
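A stimulus of this kind can be sketched as follows; this is a minimal illustration of a sparse random-dot stereogram with a sinusoidal disparity corrugation, not the authors' stimulus code (the function name, patch size, and representation as dot coordinates are assumptions):

```python
import numpy as np

def corrugation_stereogram(density, freq_cpd, amp_arcmin,
                           size_deg=10.0, rng=None):
    # Sparse random-dot stereogram: dot positions in degrees, with a
    # horizontal disparity that varies sinusoidally with vertical
    # position, producing near-horizontal corrugations in depth.
    rng = np.random.default_rng() if rng is None else rng
    n = int(density * size_deg ** 2)
    x = rng.uniform(0.0, size_deg, n)
    y = rng.uniform(0.0, size_deg, n)
    disp = (amp_arcmin / 60.0) * np.sin(2 * np.pi * freq_cpd * y)
    # Split the disparity symmetrically between the two eyes' images.
    left = np.column_stack([x - disp / 2.0, y])
    right = np.column_stack([x + disp / 2.0, y])
    return left, right
```

Rotating the dot coordinates by ±20 deg before applying the disparity would give the two corrugation orientations used in the task.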

Stimuli

Spatial Stereoacuity as a Function of Dot Density [plot: spatial stereoacuity (c/deg) vs. dot density (dots/deg²)]. Acuity is proportional to the square root of dot density: scale invariance! Asymptote at high density.

Spatial Stereoacuity & Nyquist Limit Calculated Nyquist frequency for our displays [plot: spatial stereoacuity (c/deg) vs. dot density (dots/deg²), with the Nyquist frequency shown]. Acuity is approximately equal to the Nyquist frequency except at high densities.

Types of Random-element Stereograms Jittered-lattice: dots displaced randomly from a regular lattice. Sparse random: dots positioned randomly.

Spatial Sampling Limit: Nyquist Frequency Same acuities with jittered-lattice and sparse random stereograms. Both follow Nyquist limit at low densities.

Disparity Gradient Two points P1 & P2 viewed by the left (L) & right (R) eyes, with angular separations θL & θR in the two eyes' images. Disparity gradient = disparity / separation = (θR − θL) / [(θL + θR)/2].

Disparity Gradient Disparity gradient = 0: P1 & P2 on the horopter; θR = θL, so disparity = 0.

Disparity Gradient Disparity gradient = ∞: P1 & P2 on the cyclopean line of sight; θR = −θL, so separation = 0.

Disparity Gradient Disparity gradient for different directions [diagram: P1 and P2 (left & right eyes), showing horizontal disparity and separation].

Disparity Gradient Limit Burt & Julesz (1980) measured fusion as a function of disparity, separation, & direction (tilt): they set the direction & horizontal disparity and found the smallest fusable separation [diagram: disparity, separation, and direction for P1 and P2].

Disparity Gradient Limit Fusion breaks when the disparity gradient reaches a constant value. Critical gradient ≈ 1: the "disparity gradient limit". The limit is the same for all directions.
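The gradient and the fusion criterion can be written out directly; a minimal sketch, with the ~1.0 critical value taken from the Burt & Julesz result described above (function names are illustrative):

```python
def disparity_gradient(disp1_deg, disp2_deg, separation_deg):
    # Difference in disparity between two points divided by their
    # cyclopean separation (Burt & Julesz, 1980).
    return abs(disp2_deg - disp1_deg) / separation_deg

def predicted_fusable(disp1_deg, disp2_deg, separation_deg, limit=1.0):
    # Fusion is predicted to fail once the gradient reaches the
    # critical value (~1.0, the disparity gradient limit).
    return disparity_gradient(disp1_deg, disp2_deg, separation_deg) < limit
```

For example, two points 1 deg apart whose disparities differ by 0.5 deg have a gradient of 0.5 and are predicted to be fusable; a 1.5 deg difference is not.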

Disparity Gradient Limit

Panum’s fusion area (hatched). Disparity gradient limit means that fusion area affected by nearby objects (A). Forbidden zone is conical (isotropic).

Disparity Gradient & Spatial Frequency The disparity gradient for a sinusoid is indeterminate [plot: disparity (deg) vs. horizontal position (deg), marking the highest gradient and the peak-trough gradient]. But for fixed amplitude, the gradient is proportional to spatial frequency, so we may have approached the disparity gradient limit. Tested by reducing the disparity amplitude from 16 to 4.8 minarc.
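The proportionality between gradient and spatial frequency follows from differentiating the corrugation: for a disparity profile a·sin(2πf·y), the steepest local slope is 2πf·a. A small sketch (illustrative, using the highest-gradient reading of the slide rather than the peak-trough one):

```python
import math

def peak_disparity_gradient(amp_arcmin, freq_cpd):
    # For disparity a*sin(2*pi*f*y) with amplitude a in degrees and
    # frequency f in c/deg, the steepest gradient is 2*pi*f*a.
    a_deg = amp_arcmin / 60.0
    return 2.0 * math.pi * freq_cpd * a_deg
```

At fixed amplitude the result is linear in f, so raising the corrugation frequency drives the stimulus toward the ~1.0 gradient limit.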

Spatial Stereoacuity & Disparity Gradient Limit [plot: spatial stereoacuity (c/deg) vs. dot density (dots/deg²)]. Reducing the disparity amplitude increases acuity at high dot densities (where the disparity gradient is high) and lowers it slightly at low densities (where the gradient is low).

Stereoacuity & Front-end Spatial Filtering Low-pass spatial filtering at the front end of the visual system determines the high-frequency roll-off of the luminance CSF. We tested for similar effects on spatial stereoacuity by:
1. Decreasing the retinal image size of the dots by increasing viewing distance.
2. Measuring stereoacuity as a function of retinal eccentricity.
3. Measuring stereoacuity as a function of blur.

Spatial Stereoacuity at Higher Densities Monocular artifacts appear at high dot densities, so we reduced dot size by increasing viewing distance from 39 to 154 cm to test the upper limit (modulation amplitude 4.8 minarc at both distances). Acuity still levels off, but at a higher value [plot: spatial stereoacuity (c/deg) vs. dot density (dots/deg²), with the Nyquist frequency shown].

Spatial Stereoacuity & Retinal Eccentricity 8 deg 4 deg fixation point eccentricity Elliptical patch with sinusoidal corrugation. Patch centered at one of three eccentricities (subject dependent). Eccentricity random; duration = 250 ms. Same task as before. Again vary dot density.

Spatial Stereoacuity & Retinal Eccentricity

[plots for observers TMG, YHH, SSG: spatial stereoacuity (c/deg) vs. dot density (dots/deg²) at three retinal eccentricities, 0 deg and two peripheral loci]. Same acuities at low dot densities (Nyquist); the asymptote varies significantly with retinal eccentricity.

Spatial Stereoacuity & Blur We examined the effect of blur on foveal spatial stereoacuity. Three levels of blur were introduced with a diffusion plate: no blur (σ = 0 deg), low blur (σ = 0.12 deg), high blur (σ = 0.25 deg).

Spatial Stereoacuity & Blur Same acuities at low dot densities (Nyquist); the asymptote varies significantly with spatial low-pass filtering.

Binocular Matching by Correlation
1. Binocular matching by correlation is a basic, well-studied technique for obtaining a depth map from binocular images. Computer vision: Kanade & Okutomi (1994); Panton (1978). Physiology: Ohzawa, DeAngelis, & Freeman (1990); Cumming & Parker (1997).
2. We developed a cross-correlation algorithm for binocular matching & compared its properties to the psychophysics.

Binocular Matching by Correlation [left eye's image | right eye's image] Compute the cross-correlation between the eyes' images. A window in the left eye's image is moved orthogonal to the signal; for each position in the left eye, a window in the right eye's image is moved horizontally & the cross-correlation is computed.

Plot the correlation as a function of position in the left eye's image (red arrow) & relative position in the right eye's image (blue arrow), i.e., disparity. Correlation (gray value) is high where the images are similar & low where they are dissimilar.
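The matching step just described can be sketched as window-based normalized cross-correlation; this is an illustrative pixel-domain version, not the authors' algorithm (the function name, window size, and shift range are assumptions):

```python
import numpy as np

def estimate_disparity(left, right, row, col, win=5, max_shift=8):
    # Take a square window from the left image, slide a same-size
    # window horizontally across the right image, and return the
    # shift (disparity, in pixels) with the highest normalized
    # cross-correlation.
    h = win // 2
    patch = left[row - h:row + h + 1, col - h:col + h + 1].ravel()
    patch = patch - patch.mean()
    best_shift, best_r = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        cand = right[row - h:row + h + 1,
                     col - h + s:col + h + 1 + s].ravel()
        cand = cand - cand.mean()
        denom = np.linalg.norm(patch) * np.linalg.norm(cand)
        r = float(patch @ cand) / denom if denom > 0 else 0.0
        if r > best_r:
            best_r, best_shift = r, s
    return best_shift
```

Repeating this at each left-image position, and plotting correlation against shift, gives the correlation maps described on the previous slide.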

Examples of Output Dot density: 16 dots/deg²; spatial frequency: 1 c/deg; window size: 0.2 deg [output maps arranged by dot density & spatial frequency]. The disparity waveform is evident in the output.

Effect of Window Size & Dot Density

The correlation window must be large enough to contain sufficient luminance variation to find correct matches.

Effect of Window Size & Spatial Frequency

When significant depth variation is present in a region, the window must be small enough to respond differentially.

Window Size, # Samples, & Spatial Frequency Two constraints: the window must contain enough samples for reliable matching (w²·D ≥ n, for dot density D), and it must span no more than half a corrugation cycle (w ≤ 1/(2f)). Substituting for w and taking logs: log f ≤ (1/2)·log D + constant, so the highest usable frequency grows as the square root of dot density.
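Combining a window-must-hold-enough-dots constraint with a window-no-larger-than-half-a-cycle constraint gives the square-root scaling directly; a sketch, where the required dot count per window (`dots_per_window`) is an illustrative value, not a number from the talk:

```python
import math

def max_corrugation_frequency(dot_density, dots_per_window=10):
    # Constraint 1: w*w*D >= n  (enough dots in the window)
    # Constraint 2: w <= 1/(2f) (no more than half a cycle)
    # Eliminating w: f_max = 0.5 * sqrt(D / n), so log(f_max)
    # rises with slope 1/2 against log dot density.
    return 0.5 * math.sqrt(dot_density / dots_per_window)
```

This reproduces the Nyquist-like dependence on dot density seen in the psychophysical data at low densities.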

Effect of Disparity Gradient Fix spatial frequency, dot density, & window size; increase the disparity amplitude (which also increases the disparity gradient: 0.21, 0.59, 1.77). As the gradient approaches 1.0, disparity estimation becomes poor: the images are too dissimilar in the two eyes. Matching by correlation yields piecewise-frontal estimates.

Effect of Low-pass Spatial Filtering The amount of variation in the image depends on its spatial-frequency content. If the window size w is proportional to the blur σ, the variation is constant in cycles per window, and the algorithm yields similar outputs for these images. For each σ, there is a window just large enough to yield good disparity estimates, so the highest detectable spatial frequency is inversely proportional to σ.

Effect of Low-pass Spatial Filtering Spatial stereoacuity for different amounts of blur. σ includes all filtering elements: dots, optics, diffusion screen. Horizontal lines: predictions for asymptotic acuities. Asymptotic acuity is limited by filtering before binocular combination.

Summary of Matching Effects The correlation algorithm reveals two effects.
1. Disparity estimation is poor when there is insufficient intensity variation within the correlation window:
a. when the window is too small for the presented dot density;
b. when the spatial-frequency content is too low;
c. remedy: employ a larger window (or receptive field).
2. Disparity estimation is poor when the correlation window is too large in the direction of the maximum disparity gradient:
a. when the window width is greater than a half cycle of the stimulus;
b. remedy: employ a smaller window (or receptive field).

Summary
1. Sampling constraints in the stimulus: stereoacuity follows the Nyquist limit at all but the highest densities; this holds in the peripheral visual field and in the fovea with blur.
2. Disparity gradient limit: stereoacuity is reduced at high gradients.
3. Spatial filtering at the front end: low-pass filtering before binocular combination determines asymptotic acuity.
4. The correspondence problem: binocular matching by correlation requires sufficient information in the correlation window & thereby reduces the highest attainable acuity. The visual system measures disparity in a piecewise-frontal fashion.

Depth Map from Disparity [depth-varying scene] Disparity estimates are piecewise frontal: only one perceived depth per visual direction.