Neural Correlates of Shape from Shading


Neural Correlates of Shape from Shading
Mamassian, Jentzsch, Bacon, Schweinberger, NeuroReport, 2003
or Hou, Pettet, Vildavski, Norcia, Journal of Vision, 2006
Brian Potetz, 2/15/06
http://www.cnbc.cmu.edu/cns

Neural Correlates of Shape from Shading Mamassian, Jentzsch, Bacon, Schweinberger NeuroReport, 2003 Human observers are biased towards perceiving light coming from above-left. Where in the brain is this prior represented? How quickly can this prior be observed in neural signals?

Stimulus

Thin stripes lit from above, thick from below

Thin stripes lit from the left, thick from the right

Thin stripes lit from below, thick from above

Thin stripes lit from the right, thick from the left

Stimulus Repeated for 16 orientations and 2 phases (responses to the two phases were averaged together due to their similarity)

Results Subjects prefer above-left lighting percepts (13.8° bias)

Results Subjects prefer above-left lighting percepts (13.8° bias) N2 (280-300ms) VEP signal resembled the “narrow score” (~26° bias)
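As a rough illustration of how a lighting bias like the 13.8° figure can be read out of such judgments, one could fit a cosine to the proportion of “narrow convex” responses across the 16 stimulus orientations and take its phase as the preferred lighting direction. This is only a sketch on synthetic data, not the authors’ actual fitting procedure; the variable names and the sign convention for “above-left” are assumptions.

import numpy as np
from scipy.optimize import curve_fit

# Synthetic stand-in for the behavioral data: proportion of trials on which
# the narrow stripes were seen as convex, at each of 16 stimulus orientations.
orientations = np.arange(0, 360, 22.5)   # degrees
true_bias = -13.8                        # assumed sign: negative = left of vertical
rng = np.random.default_rng(0)
p_narrow_convex = (0.5 + 0.4 * np.cos(np.deg2rad(orientations - true_bias))
                   + rng.normal(0.0, 0.03, orientations.size))

def cosine_model(theta, amplitude, bias, baseline):
    """Proportion of 'narrow convex' responses as a function of orientation (deg)."""
    return baseline + amplitude * np.cos(np.deg2rad(theta - bias))

params, _ = curve_fit(cosine_model, orientations, p_narrow_convex, p0=(0.4, 0.0, 0.5))
print(f"estimated lighting bias: {params[1]:.1f} deg")   # recovers roughly -13.8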

No Controls for Low-Level Cues Perceived shape is not the only property that changes with stimulus orientation. Many low-level image cues could have similar response profiles.

VEP does not vary according to stimulus orientation until ~232ms.

Behavioral Response vs Early VEP Authors select ambiguous orientations (90°, 105°), (270°, 285°) Divide trials according to perceived 3D shape (narrow or thick stripes) Using ANOVA, they find an interaction between behavioral response and the early-response (96-104ms) VEP signal. This interaction is found for “all lateral electrode sites”

Some missing statistics The authors use ANOVA to find an interaction between behavioral response and the early-response VEP signal at some recording sites. The authors claim: The interaction of the P1 amplitude with the participants’ response indicates that the shape was disambiguated within the first 100ms of the stimulus presentation.
Was this interaction in the same direction as the interaction between VEP and orientation? Based on Fig. 2, we would expect VEP amplitudes to be higher when narrow stripes are perceived. Are they? (Not always, as we will see.)
How strong was this effect? Could I accurately predict participants’ responses from the early VEP?
Even if the VEP were strongly correlated with behavioral response, there are multiple possible conclusions. Differences in VEP might be due to variations in a neural signal that encodes the perceived lighting direction (as the authors suggest), or in priors on lighting direction. Or the VEP may merely reflect a signal that is largely unrelated to shape from shading unless the stimulus is completely bistable, so that the perceived shape is effectively a toss-up. Could the VEP have been correlated with the response even before the stimulus was presented?
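To make the “could I predict the response from the early VEP?” question concrete, one could run a cross-validated decoder on single-trial P1 amplitudes from the ambiguous orientations. The sketch below uses simulated data and assumed variable names (p1_amplitude, perceived_narrow); nothing like it is reported in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials = 200

# Simulated single-trial data from the ambiguous orientations:
# the reported percept (1 = narrow stripes convex, 0 = thick stripes convex)
# and the mean VEP amplitude in the 96-104 ms (P1) window at one lateral site.
perceived_narrow = rng.integers(0, 2, n_trials)
p1_amplitude = 0.3 * perceived_narrow + rng.normal(0.0, 1.0, n_trials)  # weak assumed effect

# Cross-validated decoding accuracy: ~0.5 would mean the early VEP carries
# essentially no single-trial information about the eventual percept,
# however significant the ANOVA interaction is.
accuracy = cross_val_score(LogisticRegression(),
                           p1_amplitude.reshape(-1, 1),
                           perceived_narrow, cv=5).mean()
print(f"mean decoding accuracy: {accuracy:.2f}")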

Hemispheric Difference The relationship between VEP and behavioral response depends on the hemisphere. The extent of that hemispheric difference is correlated with the subject’s preferred lighting direction (R = 0.83).

A Bottom-Up Mechanism for Shape from Shading? The authors argue that these results are evidence that “shape from shading is mostly a bottom-up mechanism”. However, a static prior like p(L) is not the only source of information about the lighting direction. It makes sense for static priors to be encoded in low-level visual areas. But dynamic, contextual priors change frequently, and may need to be represented in higher cortical areas. Examples: I can see where the light source is; I was told where it is; I’ve been here before; etc. p(L | context)
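To make the static-versus-contextual distinction concrete, here is a standard Bayesian sketch (textbook formulation, not taken from the paper). The perceived shape S given image I marginalizes over the unknown lighting direction L, and the two cases differ only in which prior over L is used:

p(S \mid I) \;\propto\; p(S) \int p(I \mid S, L)\, p(L)\, dL            (static prior on lighting)
p(S \mid I, C) \;\propto\; p(S) \int p(I \mid S, L)\, p(L \mid C)\, dL  (prior conditioned on context C)

The static p(L), peaked above-left, could plausibly be wired into early visual areas; p(L | C), which changes whenever the observer sees or is told where the light is, seems more likely to require top-down input.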

A Bottom-Up Mechanism for Shape from Shading? Also, ambiguity in lighting direction is not the only potential source of ambiguity. Even under known illumination conditions, solving shape from shading in natural images is a difficult, unsolved problem. Harder problems may require more top-down cues. One example image is ambiguous but computationally simple once a light-source direction is chosen; the other is a harder problem that may benefit from contextual information (recognizing the material as fabric, etc.).

A Bottom-Up Mechanism for Shape from Shading? Also, ambiguity in lighting direction is not the only potential source of ambiguity. Even under known illumination conditions, solving shape from shading in natural images is a difficult, unsolved problem. Harder problems may require more top-down cues. Disambiguating surface markings and shadows from shading variations is even more difficult, and can benefit strongly from contextual cues. In this image, our perception is aided by mid-level context, such as the recognition that the object is dirty, tarnished metal, and also by high-level contextual cues, such as the recognition of the object as a coin and of the figure as a face.
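A minimal way to see where these ambiguities come from is the textbook Lambertian shading model (not from either paper): image intensity constrains only the albedo-weighted angle between the surface normal and the light direction,

I(x, y) \;=\; \rho(x, y)\, \max\!\big(0,\; \mathbf{n}(x, y) \cdot \mathbf{L}\big)

With constant albedo \rho, a convex surface lit from one side and a concave surface lit from the opposite side produce identical intensities, so fixing \mathbf{L} (the prior) is enough to resolve the stripe stimulus. In natural images, the unknown spatially varying \rho(x, y), cast shadows, and interreflections remain entangled with shading even when \mathbf{L} is known, which is why contextual cues matter more there.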

Neural Correlates of Shape from Shading
Hou, Pettet, Vildavski, Norcia, Journal of Vision, 2006

Switching the stimulus from 3D→2D results in a flatter response than switching from 2D→3D

Controlling for low-level cues

Controlling for low-level cues. Grey line: 3D on/off percept; black line: 3D lateral motion