Cognitive Brain Dynamics Lab


Teasing out the multi-scale representational space of cross-modal speech perception: mechanisms
Arpan Banerjee
Cognitive Brain Dynamics Lab


Overview
Cross-modal perception in speech
Spatio-temporal localization of cross-modal perception
Computational model of underlying network dynamics

I. Speech perception
Most fundamental mode of human communication (before Facebook!), and one that is affected in brain injuries (e.g., stroke) and spectrum disorders
Animal models are hard to construct
Key question: What is the representational space that processes the acts of listening and speaking?

Cross-modal perception in speech McGurk and MacDonald, 1976

Behavior as a function of time lags
Illusory perception is mostly reported for 0 and 150 ms lagged videos
Munhall, 1996; Thakur et al. (2016); Mukherjee et al. (in prep)

Points to conceptualize
Multiple senses interact to give rise to perception, sometimes illusory
Ecological (environmental) contexts give rise to stability and instability of perception, e.g., variation with AV lags
What are the minimum requirements of a multisensory experience?

II. Spatiotemporal localization of speech perception Abhishek Mukherjee G. Vinodh Kumar

Spatial boundaries of perceptual networks
fMRI of 55 human subjects, 3T scanner at NBRC, Manesar
Right-handed, healthy, bilingual volunteers
Abhishek Mukherjee

Representation of cross-modal perception
Behavior: illusory perception is mostly reported for 0 and 150 ms lagged videos
Brain: activation > rest (p < 0.01, FDR corrected; fMRI data, 34 human volunteers), with activation maps (IFG, pSTS) shown for AV lags of −300, −150, 0, 150, 300 and 450 ms
Result: more activity in pSTS only when lags are 0 to 150 ms, along with an increase in connectivity between IPL, pSTS and IFG
Mukherjee, Raghunathan & Banerjee (in prep)
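The FDR correction referred to here is typically the Benjamini-Hochberg step-up procedure. As a minimal sketch on synthetic voxel-wise p-values (the function `fdr_bh` and all values are illustrative, not the study's actual pipeline):

```python
import numpy as np

def fdr_bh(pvals, q=0.01):
    """Benjamini-Hochberg FDR: boolean mask of p-values surviving at level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Compare each ranked p-value to its step-up threshold q * rank / m
    below = ranked <= q * np.arange(1, m + 1) / m
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()   # largest rank meeting the criterion
        mask[order[: k + 1]] = True      # all smaller p-values also survive
    return mask

# Synthetic voxel-wise p-values: three strong effects among null voxels
pvals = [1e-6, 2e-5, 5e-4, 0.2, 0.6, 0.9]
surviving = fdr_bh(pvals, q=0.01)
```

In practice this is applied to the full voxel-wise statistical map; packages such as statsmodels offer the same procedure via `multipletests(..., method='fdr_bh')`.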

Representation of cross-modal perception

Key observations
Activation in posterior superior temporal sulcus (pSTS), inferior frontal gyrus (IFG), auditory cortex and V5 increases with the temporal congruency of the audio-visual stimulus
When blocks were sorted by perceptual categorization into maximal-illusory and minimal-illusory, pSTS and SPL activations were observed in the maximal /ta/ blocks
We hypothesize that cross-modal perception is exclusively facilitated by a network of pSTS, SPL and higher-order visual and auditory areas that may not necessarily be activated by cross-modal stimuli

Large-scale network dynamics of cross-modal perception: EEG
25 healthy, right-handed, bilingual human volunteers
No history of impaired auditory processing or neurological disorders
Kumar et al., Frontiers in Psychology (2016); Kumar et al., Multisensory Research (2017)
G. Vinodh Kumar

Results: Behavior (15 frequent and 10 rare perceivers)

Coherence: spectral representation of neuronal coordination. Coherence is an estimator of brain-network interactions.
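As a sketch of what this estimator computes, here is magnitude-squared coherence between two synthetic "sensor" signals that share an alpha-band (10 Hz) rhythm; the sampling rate and all signal parameters are illustrative, not the study's recording setup:

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0                         # illustrative EEG sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)       # 10 s of data
rng = np.random.default_rng(0)

# Two "sensors" sharing a 10 Hz (alpha-band) rhythm plus independent noise
shared = np.sin(2 * np.pi * 10 * t)
x = shared + rng.standard_normal(t.size)
y = shared + rng.standard_normal(t.size)

# Welch-based magnitude-squared coherence, in [0, 1] at each frequency
f, Cxy = coherence(x, y, fs=fs, nperseg=512)

alpha_coh = Cxy[np.argmin(np.abs(f - 10))]   # coherence at the shared rhythm
noise_coh = Cxy[np.argmin(np.abs(f - 50))]   # coherence where only noise lives
```

Coherence near the shared 10 Hz component is high, while frequencies carrying only independent noise stay near the chance level set by the number of Welch segments; this is the sense in which a coherence peak (or its absence) indexes coordination between neuronal populations.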

15 frequent perceivers vs. 10 rare perceivers: the alpha coherence peak is absent in rare perceivers

Within frequent perceivers (inter-trial variability)
At 0 ms lag, broadband gamma coherence increases along with a decrease in alpha coherence
At 450 and −450 ms lags, the broadband increase in gamma coherence was absent

Source localization in frequent and rare perceivers
Cortical locations where source power for /ta/ versus /pa/ was significantly higher
Surprisingly, the distributions were identical for frequent and rare perceivers

Interim summary
Inter-trial and inter-group variability may originate from two different cortical oscillations
Several areas observed in fMRI could also be identified by source localization, to create a network model

III. Detailed model of cross-modal perception
[Schematic: auditory cortex and pSTS]
Shrey Dutta

Large-scale model of multisensory perception Kumar, Dutta et al. (in progress)
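The slides do not specify the model equations, so as a generic illustration of the coupled-oscillator dynamics that large-scale brain-network models of this kind build on, here is a minimal Kuramoto simulation (oscillator count, frequencies and coupling strengths are all illustrative):

```python
import numpy as np

def simulate_kuramoto(omega, K, dt=0.001, steps=5000, seed=0):
    """Euler-integrate dtheta_i/dt = omega_i + (K/N) sum_j sin(theta_j - theta_i);
    return the order parameter r = |mean(exp(i*theta))|."""
    rng = np.random.default_rng(seed)
    n = omega.size
    theta = rng.uniform(0, 2 * np.pi, n)          # random initial phases
    for _ in range(steps):
        coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta = theta + dt * (omega + (K / n) * coupling)
    # r in [0, 1]: 0 = incoherent phases, 1 = full phase synchrony
    return np.abs(np.exp(1j * theta).mean())

# 20 oscillators with heterogeneous natural frequencies (rad/s, illustrative)
omega = np.random.default_rng(1).normal(10.0, 0.5, 20)
r_uncoupled = simulate_kuramoto(omega, K=0.0)     # no coupling: phases drift apart
r_coupled = simulate_kuramoto(omega, K=20.0)      # strong coupling: phase-locking
```

Above a critical coupling strength the oscillators phase-lock and r approaches 1; modulating inter-areal coupling is one common way such network models capture coherence changes of the kind seen in the EEG data.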

Predicting the neural dynamics of rare perceivers

Summary
A dynamic framework of multisensory experience that captures biophysically realistic functional connectivity and environmental constraints as key mediators of perceptual experience
Multimodal (EEG/MRI) and multi-level (segregation / integration / coherence of EEG signals) representation of perceptual experience in the brain
Whole-brain analysis techniques give insights into the representational space of multisensory perception
Interactions between fast and slow time-scale systems are crucial in multisensory integration

Summary
Cross-modal perception in speech
Spatio-temporal localization of cross-modal perception
Computational model of underlying network dynamics

Funding
Department of Biotechnology (Ramalingaswami & IYBA)
Department of Science and Technology (CSRI)
Science and Engineering Research Board (SERB)
NBRC Core

THANK YOU