A new theoretical framework for multisensory integration
Michael W. Hadley and Elaine R. Reynolds
Neuroscience Program, Lafayette College, Easton PA 18042

Introduction

The multisensory integration (MSI) literature has focused on the superior colliculus (SC), the subcortical area responsible for gaze orientation. This focus has produced an understanding of the development, classes, and computations of SC MSI. Cortical MSI lacks such an understanding, and hence the recent shift in views on cortical MSI (from 1.1 to 1.2) has yet to be computationally modeled.

[Figures 1.1 "Traditional view of multisensory areas" and 1.2 "Revised view of multisensory areas", adapted from [2].]

I took the important facets of computational models of SC MSI ([1], [5], [7] and [8]) and applied them to a cortical setting. SOMs are a solid foundation for cortical MSI:
- [1], [5] and [7] used excitatory self-organizing maps (SOMs) to explain MSI
- [7] and [8] used layered, topographic architectures to model multisensory information processing
- [7] showed that SOMs can be used in a multilayer system
- [8] shows the importance of using inhibition and feedback

I built extensions onto [5] to test the applicability of the SOM-based models to the cortex and discovered that a SOM alone cannot explain cortical MSI. I propose an additional training rule and a hierarchy based on context to allow inhibition and feedback in a multilayer SOM.

Martin, Meredith, Ahmad (MMA) SOM model

2.1 MMA's architecture. MMA's model [5] consists of m senses (visual, auditory, and tactile) projecting to a 10x10 grid of neurons representing the SC. The projections to the SC were trained with a SOM by presenting many examples of the different firing combinations.

2.2 MMA's results. MMA found that the grid formed unisensory areas in its corners, with multisensory areas in between. The response of the network to multisensory stimuli showed a nonlinear increase compared to the component unisensory stimuli, i.e., multisensory enhancement (MSE).

Analysis of the use of SOMs to model MSI

3.1 How SOMs form. Each example that a SOM is trained on maps to a location in the grid. Similar examples map to similar locations, forming unisensory and multisensory areas (denoted by the colors in 3.1).

3.3 No noise. Noise is essential to smooth map formation: the random variations are micro-examples that fill in the gaps in the map (contrast the map in 2.2 with the noise-free map in 3.3). Evidence suggests that our sensory areas have a topographic organization, and [4] suggests this is the result of SOMs.

3.2 Sigmoidal curve yields MSE. SOMs form a weight distribution that allows MSI through the sigmoidal firing curve. The key to the integration is that the weights in the multisensory areas attend to each modality equivalently, so the unisensory response is subthreshold while the multisensory response is above threshold (see 3.2).
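To make the mechanism in 3.2 concrete, here is a minimal numerical sketch. It is not the poster's actual implementation: the weight value of 0.7, the threshold, and the gain are illustrative assumptions. With equal per-modality weights, each unisensory input lands below the sigmoid's threshold while the bimodal sum crosses it, giving a superadditive response.

```python
import numpy as np

def sigmoid(x, threshold=1.0, gain=8.0):
    """Sigmoidal firing curve: near zero below threshold, saturating above it."""
    return 1.0 / (1.0 + np.exp(-gain * (x - threshold)))

# The multisensory unit weights each modality equivalently (both 0.7),
# so either modality alone stays subthreshold.
w_visual = w_auditory = 0.7

r_vis = sigmoid(w_visual * 1.0)                     # visual stimulus alone
r_aud = sigmoid(w_auditory * 1.0)                   # auditory stimulus alone
r_bi  = sigmoid(w_visual * 1.0 + w_auditory * 1.0)  # both stimuli together

# Enhancement index: bimodal response relative to the best unisensory response.
mse = 100 * (r_bi - max(r_vis, r_aud)) / max(r_vis, r_aud)
print(f"unisensory ~{r_vis:.3f}, bimodal ~{r_bi:.3f}, enhancement ~{mse:.0f}%")
```

With these illustrative numbers the bimodal response (~0.96) exceeds even the sum of the two unisensory responses (~0.08 each), which is the kind of nonlinear increase MMA report as MSE.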
Moving SOMs from MMA to a Cortical Hierarchy

I extended the MMA model in four ways: a virtual multisensory world, larger "multi-neuron" modalities, inhibition, and multiple receptive fields (RFs).

4.1 World with unisensory and multisensory space.
4.2 Extending to larger senses decreased the signal-to-noise ratio, resulting in MSE of noise activity. Adjusting parameters (the adjusted 4 x 4 condition) was not enough to fix the ratio:

    Sensory size    Signal:Noise
    2 x 2           1:3
    3 x 3           1:8
    4 x 4           1:15

4.3 Inhibition increases signal-to-noise, allowing for larger grids and potentially both MSE (red) and multisensory depression, MSD (green).
4.4 2D SOMs can only map one relationship. They can either map the overlapping RFs within a sense or the overlapping RFs between senses, but not both.

[Panels 4.1-4.4 show responses to Sense 1, Sense 2, and bimodal stimulation, with MSE/MSD maps, for single and multiple RFs.]

Hierarchy

[Hierarchy diagram: a multisensory world feeding a visual area and an auditory area, which interconnect and project to a cortical multisensory area with feedback.]

The flow of information in the hierarchy and the additional rule set address the problems of signal-to-noise and inhibition while conforming to the literature on cortical MSI. The literature has yet to suggest reasons for the existence of interconnections between low-level sensory areas and of feedback from cortical areas. This model works by setting up a hierarchy of contexts: the visual, auditory, and cortical areas each have their own view of the world, and the cross talk and feedback allow these contexts to be enhanced and suppressed as needed to create a coherent view. This view of information flow through a hierarchy has been expressed in [3] and [5]. [3] has successfully implemented a contextual hierarchy to simulate advanced computer vision.

Training rule sets
- Feed-forward and excitatory connections are trained with a traditional SOM.
- A modified Hebb rule with inhibition deals with the issues of multiple RFs and signal-to-noise (a minimal sketch of this rule appears after the references).

The hierarchy is trained in four stages:
1) Unisensory extraction: the visual and auditory areas are trained with a SOM to store unisensory patterns.
2) Cross-modal interaction: interactions between the two sensory areas align sensory information:
   - If two neurons fire in response to the same input, increase the connection weight.
   - If one neuron fires but the other does not, decrease the connection weight.
   - The weights are capped to allow subthreshold influences that generate MSE.
3) Multisensory integration: the projections from the primarily "unisensory" areas to the cortical area are trained with a SOM to extract a multisensory view of the world.
4) Cortical feedback: feedback is trained with the same scheme as stage 2 to allow for top-down integration.

Acknowledgements

I would like to thank Dr. Elaine Reynolds for her continued advice and mentorship through the course of this research.

References

[1] Anastasio, T. J., & Patton, P. E. (2003). A two-stage unsupervised learning algorithm reproduces multisensory enhancement in a neural network model of the corticotectal system. Journal of Neuroscience, 23.
[2] Ghazanfar, A. A., & Schroeder, C. E. (2006). Is neocortex essentially multisensory? Trends in Cognitive Sciences, 10.
[3] Hawkins, J., & Blakeslee, S. (2004). On Intelligence. New York: Holt.
[4] Kohonen, T., & Hari, R. (1999). Where the abstract feature maps of the brain might come from. Trends in Neurosciences, 22.
[5] Martin, J. G., Meredith, M. A., & Ahmad, K. (2009). Modeling multisensory enhancement with self-organizing maps. Frontiers in Computational Neuroscience, 3.
[6] Meyer, K., & Damasio, A. (2009). Convergence and divergence in a neural architecture for recognition and memory. Trends in Neurosciences, 32.
[7] Pavlou, A., & Casey, M. (2010). Simulating the effects of cortical feedback in the superior colliculus with topographic maps. Proceedings of the International Joint Conference on Neural Networks 2010, Barcelona, July.
[8] Ursino, M., Cuppini, C., Magosso, E., Serino, A., & Pellegrino, G. (2009). Multisensory integration in the superior colliculus: a neural network model. Journal of Computational Neuroscience, 26.
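As promised above, here is a minimal sketch of the stage-2 cross-modal rule. It is not the poster's implementation: the learning rate, cap value, matrix formulation, and the toy correlated-activity setup are illustrative assumptions. Only the three-part rule itself (strengthen on co-firing, weaken on mismatch, cap the weights) comes from the poster.

```python
import numpy as np

def update_cross_modal_weights(w, pre, post, lr=0.05, cap=0.7):
    """One step of the modified Hebbian rule with inhibition.

    w    : (n_post, n_pre) cross-modal weight matrix; weights driven
           negative act as inhibition
    pre  : (n_pre,)  binary firing vector of the source area
    post : (n_post,) binary firing vector of the target area
    """
    co_fire  = np.outer(post, pre)                                # both fired
    mismatch = np.outer(post, 1 - pre) + np.outer(1 - post, pre)  # only one fired
    w = w + lr * co_fire - lr * mismatch
    # Cap the weights so cross-modal input alone stays subthreshold,
    # leaving room for MSE rather than driving the target outright.
    return np.clip(w, -cap, cap)

# Toy demo: 4 source neurons, 3 target neurons whose activity tracks
# the first 3 source neurons (correlated RFs).
rng = np.random.default_rng(0)
w = np.zeros((3, 4))
for _ in range(200):
    pre = (rng.random(4) > 0.5).astype(float)
    post = pre[:3]
    w = update_cross_modal_weights(w, pre, post)
print(np.round(w, 2))  # ~ +cap on matched pairs, ~ -cap elsewhere
```

In this toy run, matched pairs saturate at the positive cap while uncorrelated pairs are driven negative, illustrating how the rule can supply the inhibition that panel 4.3 shows is needed for signal-to-noise.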