Multisensory integration: perceptual grouping by eye and ear

Andrew J. King, Gemma A. Calvert
Current Biology, Volume 11, Issue 8, Pages R322–R325 (April 2001). DOI: 10.1016/S0960-9822(01)00175-0

Fig. 1. The influence of visual processing on the perception of speech sounds, commonly known as the McGurk effect. When the speaker mouths the syllable /ga/ but the auditory stimulus is actually /ba/, subjects tend to hear /da/.

Fig. 2. Multisensory interactions at the single-cell level. Many neurons in the deeper layers of the superior colliculus receive converging inputs from two or more sensory modalities, and their responses depend on the spatiotemporal relationship between the different stimuli. (a) Example of a neuron that responds weakly to both visual (V) and auditory (A) stimuli presented separately, but much more vigorously when they are combined (VA). (b) Example of a neuron in which the response to a visual stimulus is depressed when combined with an auditory stimulus, even though the latter alone is apparently ineffective in activating the neuron.
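The enhancement and depression illustrated in Fig. 2 are conventionally quantified by comparing the combined-modality response with the best unimodal response (the multisensory enhancement index of Stein and Meredith). A minimal Python sketch, using hypothetical spike counts chosen to mirror the two neurons shown:

def enhancement_index(visual, auditory, combined):
    # Percentage change of the combined response relative to the best
    # unimodal response: positive values indicate multisensory
    # enhancement, negative values indicate depression.
    best_unimodal = max(visual, auditory)
    return 100.0 * (combined - best_unimodal) / best_unimodal

# (a) Weak unimodal responses, vigorous combined response (enhancement).
print(enhancement_index(visual=4, auditory=3, combined=18))   # 350.0

# (b) The auditory stimulus alone is ineffective, yet combining it with
# the visual stimulus depresses the response.
print(enhancement_index(visual=10, auditory=0, combined=4))   # -60.0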

Fig. 3. Evidence from fMRI for binding of the visual and auditory components of speech in the human superior temporal sulcus. (a) Listening to congruent auditory and visual speech signals — lip movements synchronized to the same heard words — produces larger responses than the sum of the unimodal responses, whereas incongruent stimuli — lip movements corresponding to different words from those heard — evoke smaller responses. (b) Location of the region in the posterior bank of the left superior temporal sulcus where these response interactions are observed; the image follows the usual radiological convention of showing the left hemisphere on the right side. (Modified from Calvert et al. [16].)
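The criterion in (a) amounts to testing each voxel's bimodal response against the sum of its unimodal responses: superadditive for congruent speech, subadditive for incongruent speech. A small sketch of that comparison, with hypothetical BOLD percent-signal-change values (not taken from the paper):

def classify_interaction(a, v, av):
    # Compare the audiovisual (AV) response with the additive
    # prediction A + V.
    additive = a + v
    if av > additive:
        return "superadditive"
    if av < additive:
        return "subadditive"
    return "additive"

# Congruent speech: AV response exceeds the sum of the unimodal responses.
print(classify_interaction(a=0.4, v=0.3, av=1.1))  # superadditive

# Incongruent speech: AV response falls below the additive prediction.
print(classify_interaction(a=0.4, v=0.3, av=0.3))  # subadditive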