Neuroscience: What You See and Hear Is What You Get

Martin S. Banks
Current Biology, Volume 14, Issue 6, Pages R236–R238 (March 2004)
DOI: 10.1016/j.cub.2004.02.055

Figure 1. Optimal combination of signals from different senses. The green and yellow curves represent the probability distributions associated with an auditory stimulus presented at an azimuth of −6 degrees (green vertical arrow) and a visual stimulus presented at −2 degrees (yellow arrow). The white curve represents the distribution that results from the optimal combination rule (equation (1) in the text): its peak lies closer to the visual than to the auditory distribution, and its variance is smaller than that of either single-cue distribution. Alais and Burr [1] presented two types of auditory–visual stimulus: a non-conflict stimulus, in which the auditory and visual directions were identical, and a conflict stimulus, in which the two differed slightly. They varied the direction of the non-conflict stimulus to find the point of subjective equality (PSE): the value that, on average, had the same perceived direction as the conflict stimulus. For the distributions shown in the figure, the perceived direction of the conflict stimulus should be roughly −3 degrees, so if the brain uses the weighted-average rule, the PSE would be at −3 degrees.
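The weighted-average rule described in the caption can be sketched in a few lines. Under the standard maximum-likelihood account, each cue's weight is inversely proportional to its variance, and the combined estimate has lower variance than either cue alone. The variances below are hypothetical, chosen so that vision is three times as reliable as audition, which reproduces the roughly −3 degree prediction from the figure:

```python
def combine_cues(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted (maximum-likelihood) combination of two
    Gaussian cues: weights are inversely proportional to variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)  # visual weight
    w_a = 1.0 - w_v                                    # auditory weight
    mu_c = w_a * mu_a + w_v * mu_v                     # combined mean
    var_c = 1.0 / (1.0 / var_a + 1.0 / var_v)          # combined variance
    return mu_c, var_c

# Auditory cue at -6 deg, visual cue at -2 deg (as in the figure).
# Variances are illustrative assumptions, not values from the paper.
mu_c, var_c = combine_cues(mu_a=-6.0, var_a=3.0, mu_v=-2.0, var_v=1.0)
print(mu_c, var_c)  # -3.0 0.75: peak nearer vision, variance below either cue
```

Note that the combined variance (0.75) is smaller than both single-cue variances (3.0 and 1.0), which is why the white curve in the figure is narrower than the green and yellow ones.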