
Associative memory using coupled non-linear oscillators
Vlad Trifa, Biologically Inspired Robotics Group (BIRG), EPFL
Semester project, final presentation

Project summary
- Literature review
- Implementation of an associative memory using coupled oscillators; analysis of its performance and drawbacks
- Combination with the BIRG model: generalization to complex signals, better control of the capacity
- Final discussion of issues concerning the performance of both models
- Conclusion

Associative Memory
- Animal and human memory works by association: a stored pattern can be retrieved upon presentation of a partial, noisy version of the input signal.
- Many models have been developed since the early 1980s; concepts from statistical mechanics and the Hebbian learning rule turned neural networks into dynamical systems.
- Useful for understanding the dynamics of networks (emergence), but:
  - lack of biologically plausible mechanisms (coupling, binary units, ...)
  - low capacity and performance (global coupling: N² parameters)
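To make the classical picture concrete, here is a minimal Hopfield-style associative memory: binary units, a Hebbian weight matrix (the source of the N² coupling parameters mentioned above), and a noisy cue that relaxes to the stored pattern. Network size, pattern count, and noise level are illustrative choices, not values from the project.

```python
# Minimal Hopfield associative memory with Hebbian learning.
import numpy as np

rng = np.random.default_rng(0)
N = 100                                        # number of binary neurons
patterns = rng.choice([-1, 1], size=(3, N))    # 3 random patterns to store

# Hebbian rule: W_jk = (1/N) * sum_mu xi_j^mu xi_k^mu, no self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(probe, sweeps=5):
    """Asynchronous updates until the state settles into an attractor."""
    s = probe.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 20% of one stored pattern, then retrieve it from the noisy cue.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1
print("overlap after recall:", recall(cue) @ patterns[0] / N)  # typically ~1.0
```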

Oscillators
- Oscillating systems are very common in nature and possess very interesting properties:
  - synchronization
  - an energy-efficient mechanism for temporal correlation
- Many brain processes rely on interacting oscillators: CPGs, the olfactory and visual cortex, the temporal correlation hypothesis and the binding problem [Singer, 1995].
- Information can be stored as patterns of phase relationships onto which coupled oscillators converge.
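As a textbook illustration of synchronization (not the model used in the project), the sketch below integrates the standard Kuramoto system, in which globally coupled phase oscillators lock once the coupling K exceeds a critical value; all parameters are illustrative.

```python
# Kuramoto model: dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
import numpy as np

rng = np.random.default_rng(1)
N, K, dt = 50, 2.0, 0.01
omega = rng.normal(1.0, 0.1, N)        # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases

for _ in range(5000):
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += dt * (omega + K / N * coupling)

r = abs(np.exp(1j * theta).mean())     # order parameter: 1 = full synchrony
print(f"order parameter r = {r:.2f}")
```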

Analyzed model
The model can be found in [Borisyuk, 2001]. Each oscillator is described by its phase, amplitude, and frequency.

The model

Dynamics
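The sketch below illustrates the generic principle behind such oscillatory memories: phase patterns are written into complex-valued Hebbian couplings, and the network relaxes from a noisy cue back to a stored phase pattern. It is only a stand-in for the idea, written in the co-rotating frame so the common frequency drops out; the actual [Borisyuk, 2001] model additionally includes amplitude dynamics and random phase shifts.

```python
# Generic phase-pattern associative memory with complex Hebbian couplings.
import numpy as np

rng = np.random.default_rng(2)
N, P, dt = 60, 2, 0.05
xi = rng.uniform(0, 2 * np.pi, size=(P, N))   # P phase patterns to store

# Hebbian complex coupling: W_jk = (1/N) * sum_mu e^{i(xi_j^mu - xi_k^mu)}
z = np.exp(1j * xi)
W = (z.T @ z.conj()) / N

# Cue: pattern 0 with strong phase noise on half of the oscillators.
theta = xi[0].copy()
theta[: N // 2] += rng.normal(0.0, 1.0, N // 2)

for _ in range(2000):
    u = np.exp(1j * theta)
    # dtheta_j/dt = sum_k |W_jk| sin(theta_k - theta_j + arg W_jk)
    theta += dt * np.imag(np.conj(u) * (W @ u))

# Retrieval overlap, invariant to a global phase rotation; near 1 = success.
m = abs(np.mean(np.exp(1j * (theta - xi[0]))))
print(f"overlap with stored pattern: m = {m:.2f}")
```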

Model performance
- The capacity of this model is not easy to derive, due to the random phase shifts and to the dynamics of the nonlinear coupling term.
- We do not know what degree of overlap between patterns is possible, since memorized patterns can be "overwritten"; this implies that the retrieval error increases as the memory fills up.
- Distributed memorization provides robustness, but the loss of whole groups strongly degrades retrieval.
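The overwriting effect can be shown on the same toy phase memory sketched above: as more random patterns are packed into the coupling matrix, crosstalk grows and the retrieval overlap drops. The numbers below come from the toy model only; they are not measurements of the analyzed model.

```python
# Toy capacity illustration: retrieval overlap vs. number of stored patterns.
import numpy as np

rng = np.random.default_rng(3)
N, dt = 60, 0.05

def overlap_after_recall(P):
    xi = rng.uniform(0, 2 * np.pi, size=(P, N))
    z = np.exp(1j * xi)
    W = (z.T @ z.conj()) / N                   # Hebbian complex coupling
    theta = xi[0] + rng.normal(0.0, 0.3, N)    # lightly perturbed cue
    for _ in range(2000):
        u = np.exp(1j * theta)
        theta += dt * np.imag(np.conj(u) * (W @ u))
    return abs(np.mean(np.exp(1j * (theta - xi[0]))))

for P in (1, 3, 6, 12):
    print(f"{P:2d} patterns stored -> overlap {overlap_after_recall(P):.2f}")
```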

Discussion
- The model is interesting because it is based on oscillating systems, so it can easily be implemented with many kinds of physical oscillators (PLLs, etc.).
- A very nice methodology, embedded in the system's dynamics, is proposed for deciding where to store an input signal.
- The random phase shifts give the system some robustness, but they influence the performance too strongly.

Drawbacks
- The all-to-all coupling within groups is computationally inefficient and uses too many oscillators.
- Because the input signal appears explicitly in the equations, we can only learn sine functions.
- The input dimension is problematic: complexity increases with no gain in performance.
- Time is reset after each stimulus, so the input must be presented in phase with the oscillators and sequences cannot be learned.

Improvements
We want to be able to learn complex signals. Starting from the adaptive oscillator model of [Righetti et al., 2005], we extend it to form a network.
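For reference, here is a minimal sketch of an adaptive-frequency Hopf oscillator in the spirit of [Righetti et al., 2005]: the input perturbation F(t) drives the oscillator frequency toward a frequency component of the teaching signal. Gains, time step, and initial conditions are illustrative choices.

```python
# Adaptive-frequency Hopf oscillator (single unit, forward Euler).
import numpy as np

mu, eps, dt = 1.0, 0.9, 0.001
x, y, omega = 1.0, 0.0, 20.0            # initial frequency guess (rad/s)
target = 30.0                            # frequency of the teaching signal

for k in range(int(200 / dt)):           # integrate for 200 s
    t = k * dt
    F = np.cos(target * t)               # input signal to be learned
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y + eps * F
    dy = (mu - r2) * y + omega * x
    domega = -eps * F * y / np.sqrt(r2)  # dynamic Hebbian frequency adaptation
    x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega

print(f"learned frequency: {omega:.1f} rad/s")  # should approach the target
```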

Discussion
- The model is simpler and more computationally efficient.
- We gained much better control over the number of oscillators dedicated to each frequency component.
- We are able to memorize complex signals in a robust and fault-tolerant manner, under some constraints, but:
  - the capacity depends on the complexity of the signals to store;
  - we lost the selection of storage sites based on phase relationships that the previous model offered.
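One plausible way to wire such oscillators into a network is sketched below: each oscillator receives the residual F(t), the teaching signal minus what the pool already reproduces, so different oscillators settle on different frequency components. The negative-feedback and amplitude-adaptation wiring is an assumption for illustration, not necessarily the project's exact architecture.

```python
# Pool of adaptive Hopf oscillators learning a two-component signal.
import numpy as np

dt, eps, eta, mu = 0.001, 0.9, 0.5, 1.0
true_freqs = np.array([15.0, 40.0])       # components of the taught signal
x, y = np.ones(2), np.zeros(2)
omega = np.array([10.0, 50.0])            # initial frequency guesses (rad/s)
alpha = np.zeros(2)                       # learned component amplitudes

for k in range(int(300 / dt)):
    t = k * dt
    P = np.cos(true_freqs[0] * t) + np.cos(true_freqs[1] * t)
    F = P - alpha @ x                     # residual not yet reproduced
    r = np.sqrt(x * x + y * y)
    dx = (mu - r * r) * x - omega * y + eps * F
    dy = (mu - r * r) * y + omega * x
    domega = -eps * F * y / r             # frequency adaptation
    dalpha = eta * x * F                  # amplitude adaptation
    x, y = x + dt * dx, y + dt * dy
    omega, alpha = omega + dt * domega, alpha + dt * dalpha

print("learned frequencies:", np.round(omega, 1))  # should approach [15, 40]
print("learned amplitudes: ", np.round(alpha, 2))  # should approach [1, 1]
```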

Future work
- Find a mechanism, embedded in the dynamics, that selects where each component should be stored depending on the signal.
- It would be very interesting to create links between the different clusters activated by the same signal, similar to associative connections that form according to the correlation of neural activity between assemblies; this would enhance robustness when components are attenuated.
- Reduce the number of parameters, so that only the number of oscillators allocated per component needs to be selected.

Conclusion
This work should be considered an attempt to provide insight into how information encoded as a complex signal can be stored reliably, simply by using oscillating systems with local interactions. Our approach is interesting because it uses concepts common in biological neuronal networks:
- oscillating components with local interactions;
- no global external process supervising the learning procedure.

Thank you!
References: [Borisyuk, 2001], [Righetti et al., 2005], [Singer, 1995]