Time Organized Maps – "Learning cortical topography from spatiotemporal stimuli", J. Wiemer

Presentation transcript:

Time Organized Maps – Learning cortical topography from spatiotemporal stimuli
"Learning cortical topography from spatiotemporal stimuli", J. Wiemer, F. Spengler, F. Joublin, P. Stagge, S. Wacquant, Biological Cybernetics, 2000
"The Time-Organized Map Algorithm: Extending the Self-Organizing Map to Spatiotemporal Signals", Jan C. Wiemer, Neural Computation, 2003
Presented by: Mojtaba Solgi

Outline
1. The purpose and biological motivation
2. The model: the TOM algorithm – wave propagation, learning
3. Experiments and results – Gaussian stimuli, generic artificial stimuli, semi-natural stimuli
4. Discussion
5. Summary

Neurobiological experiments, Spengler et al., 1996, 1999

Terminology
Integration: fusion of different stimuli into one representation
Segregation: process of increasing the representational distance between different stimuli

2D Network Architecture (figure: activation and positional shift)

One-dimensional model

Wave propagation

Integration and Segregation

Algorithm
1. Compute the neurons' activations and the position of the top winner neuron.
2. Compute the neural position of the wave propagated from the previous time step's activation.

Algorithm – cont.
3. Shift the position of the top winner neuron due to its interaction with the propagated wave.

Algorithm – cont.
4. Shift the position of the winner neuron again, this time due to noise.
5. Update the winner neuron's weights (a SOM-style neighborhood update combined with Hebbian learning); a code sketch follows below.
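To make steps 1-5 concrete, here is a loose Python sketch of a single TOM update on a one-dimensional layer. It is an illustration under stated assumptions, not the paper's exact equations: the function name tom_step and all of its parameters (wave_speed, coupling, noise_std, lr, sigma) are hypothetical, and details such as bidirectional wave spread and the precise winner-wave interaction rule are simplified.

import numpy as np

def tom_step(weights, positions, x, prev_wave_origin, isi,
             wave_speed=1.0, coupling=0.5, noise_std=0.05,
             lr=0.1, sigma=1.0, rng=None):
    """One illustrative update of a 1-D time-organized map.

    weights          : (N, d) array, one weight vector per neuron
    positions        : (N,) array, neuron positions on the 1-D layer
    x                : (d,) current stimulus
    prev_wave_origin : position where the previous wave was launched
                       (the previous winner), or None for the first stimulus
    isi              : inter-stimulus interval since the previous stimulus
    Returns the updated weights and the shifted winner position, which
    becomes the origin of the next wave.
    """
    if rng is None:
        rng = np.random.default_rng()

    # 1. Activations and the top winner (best-matching unit, as in a SOM)
    dists = np.linalg.norm(weights - x, axis=1)
    winner = positions[np.argmin(dists)]

    if prev_wave_origin is not None:
        # 2. How far the wave launched at the previous winner has travelled
        #    during the inter-stimulus interval (constant propagation speed)
        wave_reach = wave_speed * isi

        # 3. Shift the winner toward the wave front: short ISIs keep the
        #    front close to its origin, pulling temporally close stimuli
        #    together on the map (integration); long ISIs leave the winner
        #    mostly unshifted (segregation).
        direction = np.sign(prev_wave_origin - winner)
        wave_front = prev_wave_origin - direction * wave_reach
        winner = winner + coupling * (wave_front - winner)

    # 4. Additional random shift of the winner position (noise term)
    winner = winner + rng.normal(0.0, noise_std)

    # 5. SOM-style neighborhood update of the weights around the shifted
    #    winner position (the Hebbian/SOM learning step)
    h = np.exp(-((positions - winner) ** 2) / (2.0 * sigma ** 2))
    weights = weights + lr * h[:, None] * (x - weights)

    return weights, winner

In use, tom_step would be called once per stimulus, feeding the returned winner position back in as prev_wave_origin together with the measured inter-stimulus interval, so that temporally adjacent stimuli interact through the propagating wave.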

Experiments with Gaussian stimuli & 2D neural layer 1. Simulation of ‘ontogenesis’ (Development)

Experiments with Gaussian stimuli & 2D neural layer 2. Simulation of post-ontogenetic plasticity

One-dimensional model

Experiments with generic artificial stimuli & 1D neural layer (figure: the input)

Experiments with semi-natural stimuli & 1D neural layer

Discussion
- Importance of temporal stimulus structure for the development of cortical topography
- Continuous mapping of related stimuli
- Inter-stimulus-interval-dependent representations
- Hardly scalable
- No recognition performance demonstrated on real-world problems
- Tested only on artificial input

Summary
- Utilizes temporal information in developing cortical topography
- Wave-like spread of cortical activity
- Experiments and results show the model's compatibility with neurobiological observations
- Biologically inspired and plausible, but no engineering performance demonstrated

Thank you! Any thoughts/questions?