Sound Facilitates Visual Learning

Presentation transcript:

Sound Facilitates Visual Learning. Aaron R. Seitz, Robyn Kim, Ladan Shams. Current Biology, Volume 16, Issue 14, Pages 1422-1427 (July 2006). DOI: 10.1016/j.cub.2006.05.048. Copyright © 2006 Elsevier Ltd.

Figure 1. Task Schematic. All sessions for all groups involved the same basic task. A sequence of two stimulus displays was presented (one containing a directional signal and the other containing only noise). After the sequence, subjects reported, with a key press, first the interval that contained the signal (1 or 2) and second the direction of that signal (left or right). In the unisensory condition (top), stimuli consisted of randomly moving dots, a fraction of which moved coherently (the direction signal is depicted here by white arrows). In the multisensory condition (bottom), the visual stimuli were the same, but auditory motion stimuli (see Experimental Procedures) were added, moving in a direction consistent with that of the visual stimuli. Subjects were required to fixate on a central cross while performing the task. In training sessions, feedback was given on the interval response; in test sessions, no feedback was given. Current Biology 2006, 16, 1422-1427. DOI: 10.1016/j.cub.2006.05.048. Copyright © 2006 Elsevier Ltd.
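
For readers who think in code, the trial structure in this schematic (two displays, one carrying the coherent-motion signal; a detection response followed by a discrimination response; feedback on detection in training sessions only) can be sketched as below. This is a minimal illustration with a made-up observer and made-up coherence scaling, not the authors' stimulus or experiment code.

```python
import random

def run_trial(coherence, multisensory=False, training=True):
    """Simulate the bookkeeping of one two-interval trial from the task schematic.

    One of the two displays carries a coherent-motion signal (a fraction
    `coherence` of the dots moves left or right); the other is pure noise.
    `multisensory` stands in for the direction-congruent auditory motion.
    Responses come from a toy observer whose accuracy grows with coherence;
    none of these numbers are the published values.
    """
    signal_interval = random.choice([1, 2])
    signal_direction = random.choice(["left", "right"])

    # Toy observer: chance (0.5) plus a coherence-dependent boost, with a
    # small extra benefit standing in for the congruent sound.
    p_correct = min(1.0, 0.5 + coherence * (3.0 if multisensory else 2.5))

    detection = signal_interval if random.random() < p_correct else 3 - signal_interval
    flipped = {"left": "right", "right": "left"}[signal_direction]
    discrimination = signal_direction if random.random() < p_correct else flipped

    # Feedback is given on the interval (detection) response in training
    # sessions only; test sessions give no feedback.
    feedback = (detection == signal_interval) if training else None

    return detection == signal_interval, discrimination == signal_direction, feedback
```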

Figure 2. Fast-Learning Results. (Top) Data from the first training session for the multisensory group (black) and the unisensory group (gray) on visual-only trials. Only the detection data are shown here; the discrimination data also show improved learning for the multisensory group (see Figure S2). The ordinate is percent correct averaged across the three signal levels of visual-only trials, and the abscissa is the proportion of the first session completed. Curves were constructed by averaging sequential trials of each signal level for each subject, smoothing the result with a 30-trial-wide boxcar filter (other filter widths give qualitatively similar results), and then averaging the smoothed data across subjects; data at the very beginning and end of the session were excluded because of filtering boundary effects. Error bars reflect within-group standard error [46]. (Bottom) Running paired t test for the multisensory (black) and unisensory (gray) groups, comparing performance at the beginning of the session (the first valid smoothed bin) with each subsequent point in the session on visual-only trials. Current Biology 2006, 16, 1422-1427. DOI: 10.1016/j.cub.2006.05.048. Copyright © 2006 Elsevier Ltd.
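
The learning-curve construction and running paired t test described in this caption can be approximated with the following sketch. It assumes an accuracy matrix of subjects × sequential visual-only trial bins (already collapsed across the three signal levels) and uses one common within-subject SE correction (subject-mean centering); the exact procedure of reference [46] and other analysis details may differ, so treat this as an illustration rather than the authors' code.

```python
import numpy as np
from scipy.stats import ttest_rel

def smoothed_curves(trials, width=30):
    """trials: subjects x trials array of 0/1 accuracy on visual-only trials.
    Boxcar-smooth each subject's trial sequence and return percent correct;
    'valid' convolution drops bins at the session edges, mirroring the
    exclusion of data affected by filtering boundary effects."""
    kernel = np.ones(width) / width
    smoothed = np.array([np.convolve(s, kernel, mode="valid") for s in trials])
    return 100.0 * smoothed

def within_group_se(smoothed):
    """Within-group standard error via subject-mean centering: remove each
    subject's overall mean, add back the grand mean, then take the SE per bin."""
    centered = smoothed - smoothed.mean(axis=1, keepdims=True) + smoothed.mean()
    return centered.std(axis=0, ddof=1) / np.sqrt(smoothed.shape[0])

def running_paired_t(smoothed):
    """Paired t test of the first valid smoothed bin against every later bin."""
    first = smoothed[:, 0]
    return np.array([ttest_rel(smoothed[:, i], first).statistic
                     for i in range(1, smoothed.shape[1])])
```

With `trials` shaped (n_subjects, n_trials), the group mean of `smoothed_curves(trials)` corresponds to one curve in the top panel, and `running_paired_t(...)` to one trace in the bottom panel.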

Figure 3. Slow-Learning Results. (Top) Data from each training session for the multisensory group (black) and the unisensory group (gray) on visual-only trials. Only the detection data are shown here; the discrimination data are qualitatively similar (see Figure S3), as is the effect when broken down by individual stimulus level (see Figure S5). The ordinate is percent correct averaged across the three signal levels of visual-only trials during the first third of each session; the abscissa is the training session number. We chose the first third of the session because, during this phase, neither group showed fast learning and performance was similar across groups in the first session; results are qualitatively similar for other early phases of the sessions. Error bars reflect within-group standard error [46]. (Bottom) Running paired t test for the multisensory (black) and unisensory (gray) groups, comparing performance in the first training session with that of each subsequent session on visual-only trials. Current Biology 2006, 16, 1422-1427. DOI: 10.1016/j.cub.2006.05.048. Copyright © 2006 Elsevier Ltd.
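
The slow-learning analysis is the same idea applied at the level of sessions: tabulate percent correct over the first third of each session's visual-only trials per subject, then run a paired t test of the first session against each later one. A minimal sketch under those assumptions, not the published analysis code:

```python
import numpy as np
from scipy.stats import ttest_rel

def first_third_accuracy(sessions):
    """sessions: subjects x sessions x trials array of 0/1 accuracy on
    visual-only trials.  Percent correct over the first third of each session."""
    n_third = sessions.shape[2] // 3
    return 100.0 * sessions[:, :, :n_third].mean(axis=2)

def running_session_t(acc):
    """Paired t test of session 1 against each subsequent training session."""
    return np.array([ttest_rel(acc[:, s], acc[:, 0]).statistic
                     for s in range(1, acc.shape[1])])
```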