Designing High-Capacity Neural Networks for Storing, Retrieving and Forgetting Patterns in Real-Time
Dmitry O. Gorodnichy
IMMS, Cybernetics Center of the Ukrainian Academy of Sciences, Kiev, Ukraine
Dept. of Computing Science, University of Alberta, Edmonton, Canada
As presented at IJCNN'99, Washington DC, July 12-17, 1999

Outline
What is memory? What is good memory?
The Pseudo-Inverse Neural Network is a tool to build it!
Setting the paradigm for designing non-iterative high-capacity real-time adaptive systems:
-> Increasing the attraction radius of the network
-> Desaturating the network
-> A solution for cycles and for fast retrieval
Just some results
Processing a stream of data: Dynamic Desaturation
Conclusions. Some food for thought

What is Memory? Memory is that which stores data and retrieves it.

What do we want?
To store as much as possible (for a given amount of space), as fast as possible
To retrieve from as much noise as possible, as fast as possible
To continuously update the contents of the memory as new data arrive
We are interested in theoretically grounded solutions

Neural Network - a tool to do that
A fully connected network of N neurons Y_i, which evolves in time according to the update rule
   Y_i(t+1) = sign( sum_j C_ij * Y_j(t) )
until it reaches a stable state (attractor). Patterns can be stored as attractors.
-> Non-iterative learning - fast learning
-> Synchronous dynamics - fast retrieval
How to find the best weight matrix C?
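Below is a minimal Python sketch (not from the original presentation) of the synchronous retrieval dynamics described above; the bipolar +1/-1 coding, the tie-breaking to +1 and the iteration cap are assumptions:

```python
import numpy as np

def retrieve(C, y0, max_iters=100):
    """Synchronous Hopfield-style retrieval: iterate y <- sign(C y) until a fixed point."""
    y = y0.copy()
    for _ in range(max_iters):
        y_new = np.sign(C @ y)
        y_new[y_new == 0] = 1          # break ties to +1 (an assumption)
        if np.array_equal(y_new, y):   # stable state (attractor) reached
            return y_new
        y = y_new
    return y                           # no fixed point within the cap (possible cycle)
```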

Pseudo-Inverse Learning Rule
Obtained from the stability condition [Personnaz'85, Reznik'93]: CV = eV, which gives C = VV+ (V is the matrix of stored patterns, V+ its pseudo-inverse; Widrow-Hoff's rule is its approximation).
Optical and hardware implementations exist.
Its dynamics can be studied theoretically:
-> it can retrieve up to 0.3N patterns (the Hebbian rule retrieves only 0.13N patterns)
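A short sketch of the pseudo-inverse (projection) rule itself, assuming the M stored bipolar patterns are the columns of an N x M matrix V; np.linalg.pinv stands in here for the optical/hardware implementations mentioned above:

```python
import numpy as np

def pseudo_inverse_weights(V):
    """Projection (pseudo-inverse) learning rule: C = V V+,
    where the columns of V are the patterns to store."""
    return V @ np.linalg.pinv(V)

# usage sketch: store M = 0.3N random bipolar patterns
N, M = 100, 30
V = np.random.choice([-1, 1], size=(N, M))
C = pseudo_inverse_weights(V)
```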

Attraction Radius
The attraction radius (AR) tells us how good the retrieval is.
The direct AR can be calculated theoretically [Gorodnichy'95]:
-> the weights C determine the AR...
-> ... and the weights satisfy the stability condition above.
The indirect AR can be estimated by Monte-Carlo simulations.
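One way to estimate the indirect AR by Monte-Carlo simulation, as mentioned above, is to flip a growing number of bits of a stored pattern and check whether retrieval still recovers it exactly. This is an illustrative sketch rather than the procedure from [Gorodnichy'95]; it reuses the retrieve() helper from the earlier sketch, and the trial count and stopping rule are assumptions:

```python
import numpy as np

def indirect_ar(C, V, trials=50, max_iters=100):
    """Monte-Carlo estimate of the attraction radius: the largest number of
    flipped bits from which every tested distortion is still retrieved exactly."""
    N, M = V.shape
    radius = 0
    for r in range(1, N // 2):
        for _ in range(trials):
            m = np.random.randint(M)
            probe = V[:, m].copy()
            flip = np.random.choice(N, r, replace=False)
            probe[flip] *= -1                                  # distort r bits
            if not np.array_equal(retrieve(C, probe, max_iters), V[:, m]):
                return radius                                  # first failure: stop
        radius = r
    return radius
```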

Desaturation of the network
When there are too many patterns in memory, the network gets saturated: there are too many spurious local attractors [Reznik'93], and the global attractors are never reached.
Solution [Gorodnichy'95&96]: desaturate the network by partially reducing the self-connections:
   C_ii = D * C_ii,  0 < D < 1
Desaturation:
-> preserves the main attractors
-> decreases the number of static spurious attractors
-> makes the network more flexible (increases the number of iterations)
-> drastically increases the attraction radius [Gorodnichy&Reznik'97]
But what about cycles (dynamic spurious attractors)?
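A short sketch of the desaturation step described above, reusing pseudo_inverse_weights and the pattern matrix V from the earlier sketch; D = 0.15 is the value recommended in the conclusions:

```python
import numpy as np

def desaturate(C, D=0.15):
    """Partially reduce the self-connections: C_ii <- D * C_ii, with 0 < D < 1."""
    C = C.copy()
    np.fill_diagonal(C, D * np.diag(C))
    return C

# usage sketch: desaturated pseudo-inverse network
Cd = desaturate(pseudo_inverse_weights(V), D=0.15)
```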

Increase of AR with Desaturation [Figure: direct AR and indirect AR plotted against the desaturation coefficient D]

Dynamics of the network
The behaviour of the network is governed by energy functions.
Cycles are possible when D < 1. However:
-> they are few when D > 0.1 [Gorodnichy&Reznik'97]
-> they are detected automatically (see the Update Flow technique below)
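For reference, a minimal sketch of the standard Hopfield energy; the exact energy functions used for the desaturated, synchronously updated network are not given on this slide, so this form is an assumption:

```python
def energy(C, y):
    """Standard Hopfield energy E(y) = -1/2 * y^T C y (an assumed form)."""
    return -0.5 * y @ C @ y
```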

Update Flow neuro-processing
[Gorodnichy&Reznik'94]: "Process only those neurons which change during the evolution", i.e. instead of N multiplications per neuron at each iteration, do only a few of them.
-> it is very fast (as only a few neurons actually change in one iteration)
-> it detects cycles automatically
-> it is suitable for parallel implementation
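A hedged Python sketch of the update-flow idea (not the original neurocomputer code): keep the postsynaptic potentials S = C y, correct them only for the neurons that actually flipped, and detect cycles by remembering previously visited states; the state-hashing cycle test is an assumption:

```python
import numpy as np

def retrieve_update_flow(C, y0, max_iters=100):
    """Update-flow retrieval: only the contributions of flipped neurons are
    recomputed; a repeated state signals a cycle (dynamic spurious attractor)."""
    y = y0.copy()
    S = C @ y                                    # full matrix-vector product only once
    seen = {y.tobytes()}
    for _ in range(max_iters):
        y_new = np.where(S >= 0, 1, -1)
        flipped = np.nonzero(y_new != y)[0]
        if flipped.size == 0:
            return y, False                      # attractor (fixed point) reached
        S += C[:, flipped] @ (y_new[flipped] - y[flipped])   # incremental correction
        y = y_new
        key = y.tobytes()
        if key in seen:
            return y, True                       # cycle detected
        seen.add(key)
    return y, False
```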

Dealing with a stream of data
Dynamic Desaturation:
-> maintains the capacity of 0.2N (with complete retrieval)
-> allows data to be stored in real time (no need for iterative learning methods!)
-> provides a means of forgetting obsolete data
-> is the basis for the design of adaptive filters
-> gives new insights into how the brain works
-> is a ground for the revision of traditional learning theory
This is what the Neurocomputer designed at the IMMS of the Cybernetics Center of the Ukrainian NAS implements.
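As a hedged illustration of the real-time (non-iterative) storage step, here is the standard incremental projection update for adding one pattern to a pseudo-inverse network; this is not necessarily the author's Dynamic Desaturation procedure, and the forgetting step is not shown:

```python
import numpy as np

def store_incrementally(C, v):
    """Add one bipolar pattern v to a projection (pseudo-inverse) network
    without retraining: C <- C + u u^T / (u^T u), where u = v - C v."""
    u = v - C @ v
    norm = u @ u
    if norm < 1e-10:              # v already lies in the stored subspace
        return C
    return C + np.outer(u, u) / norm
```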

Conclusions
The best performance of Hopfield-like networks is achieved with the Desaturated Pseudo-Inverse learning rule: C = VV+, C_ii = D*C_ii, D = 0.15. E.g. complete retrieval of M = 0.5N patterns from 8% noise, and of M = 0.7N patterns from 2% noise.
The basis for non-iterative learning (to replace traditional iterative learning methods) is set. This basis is Dynamic Desaturation, which allows one to build real-time adaptive systems.
The Update Flow neuro-processing technique makes retrieval very fast. It also resolves the issue of spurious dynamic attractors.
Free code of the Pseudo-Inverse Memory is available! (see our web-site)
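Putting the sketches above together, a hedged end-to-end example of the recipe from the conclusions (pseudo-inverse weights, D = 0.15, update-flow retrieval), reusing the helper functions defined earlier; the pattern count and noise level follow the figures quoted above:

```python
import numpy as np

N = 100
M = N // 2                                       # M = 0.5N patterns
V = np.random.choice([-1, 1], size=(N, M))
C = desaturate(pseudo_inverse_weights(V), D=0.15)

probe = V[:, 0].copy()
flip = np.random.choice(N, int(0.08 * N), replace=False)
probe[flip] *= -1                                # 8% noise
y, is_cycle = retrieve_update_flow(C, probe)
print(np.array_equal(y, V[:, 0]))                # complete retrieval expected
```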