Feedback Networks and Associative Memories


Feedback Networks and Associative Memories. 虞台文, Intelligent Multimedia Lab, Graduate Institute of Computer Science and Engineering, Tatung University.

Content: Introduction; Discrete Hopfield NNs; Continuous Hopfield NNs; Associative Memories (Hopfield Memory, Bidirectional Memory).

Feedback Networks and Associative Memories: Introduction

Feedforward/Feedback NNs. Feedforward NNs: the connections between units do not form cycles; they usually produce a response to an input quickly, and most feedforward NNs can be trained using a wide variety of efficient algorithms. Feedback or recurrent NNs: there are cycles in the connections; in some feedback NNs, each time an input is presented the network must iterate for a potentially long time before it produces a response, and they are usually more difficult to train than feedforward NNs.

Supervised-Learning NNs. Feedforward NNs: Perceptron; Adaline, Madaline; Backpropagation (BP); ARTMAP; Learning Vector Quantization (LVQ); Probabilistic Neural Network (PNN); General Regression Neural Network (GRNN). Feedback or recurrent NNs: Brain-State-in-a-Box (BSB); Fuzzy Cognitive Map (FCM); Boltzmann Machine (BM); Backpropagation Through Time (BPTT).

Unsupervised-Learning NNs. Feedforward NNs: Learning Matrix (LM); Sparse Distributed Associative Memory (SDM); Fuzzy Associative Memory (FAM); Counterpropagation (CPN). Feedback or recurrent NNs: Binary Adaptive Resonance Theory (ART1); Analog Adaptive Resonance Theory (ART2, ART2a); Discrete Hopfield (DH); Continuous Hopfield (CH); Discrete Bidirectional Associative Memory (BAM); Kohonen Self-Organizing Map / Topology-Preserving Map (SOM/TPM).

The Hopfield NNs. In 1982, Hopfield, a Caltech physicist, mathematically tied together many ideas from previous research: a fully connected, symmetrically weighted network in which each node functions both as an input and as an output node. Used for associative memories and combinatorial optimization.

Associative Memories. An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. There are two types: autoassociative and heteroassociative. Auto-association retrieves the previously stored pattern that most closely resembles the current input pattern. In hetero-association, the retrieved pattern is, in general, different from the input pattern, not only in content but possibly also in type and format.

Associative Memories (example figure): auto-association maps a noisy letter "A" back to the stored "A"; hetero-association maps the word "Niagara" to a picture of a waterfall.

Optimization Problems. Associate costs with the energy function of a Hopfield network; the cost must be expressed in quadratic form. The Hopfield network finds locally satisfactory solutions rather than choosing a solution from a set: it settles into local optima, not the global optimum.

Feedback Networks and Associative Memories: Discrete Hopfield NNs

The Discrete Hopfield NNs (figure): n fully interconnected units; each unit i receives an external input I_i, produces an output v_i, and is connected to every other unit j through a weight w_ij.

The Discrete Hopfield NNs. The weights are symmetric, w_ij = w_ji, and there are no self-connections, w_ii = 0.

State Update Rule. In asynchronous mode, one neuron at a time is selected and updated according to the update rule below. Is the resulting dynamics stable?
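The update equations on this slide were rendered as images; a standard form of the asynchronous update rule for bipolar states, consistent with the H_k and v_k notation used in the proof below, is:

    H_k(t) = \sum_{j \ne k} w_{kj}\, v_j(t) + I_k
    v_k(t+1) = \begin{cases} +1 & \text{if } H_k(t) \ge 0 \\ -1 & \text{if } H_k(t) < 0 \end{cases}

At each step one neuron k is chosen (randomly or in a fixed order) and updated; all other neurons keep their current values.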

Energy Function. Fact: E is lower bounded (respectively, upper bounded). If E is monotonically decreasing (respectively, increasing), the system is stable.
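The energy expression on the slide was an image; the standard discrete Hopfield energy, which is bounded below for the bipolar states used here, is:

    E = -\frac{1}{2}\sum_{i}\sum_{j \ne i} w_{ij}\, v_i v_j - \sum_{i} I_i v_i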

The Proof. Suppose that at time t + 1, the kth neuron is selected for update.

The Proof (cont.). The values of all neurons other than the kth are not changed at time t + 1.

The Proof
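The algebra on this slide was an image; using the symmetric weights (w_ij = w_ji, w_ii = 0) and the fact that only neuron k changes, the energy change reduces to the standard expression:

    \Delta E = E(t+1) - E(t) = -\big(v_k(t+1) - v_k(t)\big)\Big(\sum_{j \ne k} w_{kj}\, v_j(t) + I_k\Big) = -\Delta v_k\, H_k(t)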

The Proof (case table). In every case the energy change ΔE is non-positive, so the asynchronous updates are stable:

    v_k(t)   v_k(t+1)   H_k(t)   ΔE
      1         1        ≥ 0      0
      1        −1        < 0     < 0
     −1         1        ≥ 0     ≤ 0
     −1        −1        < 0      0
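A minimal Python sketch of the asynchronous dynamics and the energy bookkeeping described above; the 4-neuron weight matrix and the patterns are illustrative assumptions, not values taken from the slides:

import numpy as np

def energy(W, I, v):
    # Discrete Hopfield energy: E = -1/2 v^T W v - I^T v  (W symmetric, zero diagonal)
    return -0.5 * v @ W @ v - I @ v

def async_update(W, I, v, steps=200, seed=0):
    # Asynchronous dynamics: pick one neuron k at a time and set v_k = sgn(H_k).
    rng = np.random.default_rng(seed)
    v = v.astype(float)
    for _ in range(steps):
        k = rng.integers(len(v))
        h = W[k] @ v + I[k]              # H_k(t)
        v[k] = 1.0 if h >= 0 else -1.0
    return v

# Hypothetical 4-neuron example.
W = np.array([[ 0.,  1., -1.,  1.],
              [ 1.,  0., -1.,  1.],
              [-1., -1.,  0., -1.],
              [ 1.,  1., -1.,  0.]])
I = np.zeros(4)
v0 = np.array([1., -1., 1., -1.])
vT = async_update(W, I, v0)
print(energy(W, I, v0), energy(W, I, vT))  # the energy never increases under these updates

Running it shows the final state has energy no higher than the initial state, as the case table above guarantees.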

Feedback Networks and Associative Memories: Continuous Hopfield NNs

The Neuron of Continuous Hopfield NNs (figure): neuron i is an RC circuit with capacitance C_i and leakage conductance g_i; it receives an external input I_i and weighted inputs w_i1 v_1, ..., w_in v_n, integrates them into an internal potential u_i, and outputs v_i = a(u_i), where a is a sigmoidal activation with inverse u_i = a^{-1}(v_i).

The Dynamics (figure: the neuron circuit above, with total conductance G_i).
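The dynamics equation was an image; the standard continuous Hopfield (1984) form, written with the circuit quantities labelled in the figure, is:

    C_i \frac{du_i}{dt} = \sum_{j} w_{ij} v_j - G_i u_i + I_i, \qquad v_i = a(u_i)

where G_i denotes the total conductance seen by neuron i (assumed here to be g_i plus the conductances of the connecting weights).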

The Continuous Hopfield NNs (figure): n such neurons, each with capacitance C_i, leakage conductance g_i, and external input I_i, fully interconnected through the weights w_ij.

The Continuous Hopfield NNs (same network figure). Is this network stable?

Equilibrium Points. Consider an autonomous system dy/dt = φ(y). Its equilibrium points ȳ are the states that satisfy φ(ȳ) = 0.

Lyapunov Theorem. Call E(y) the energy function. The system is asymptotically stable at an equilibrium if there exists a positive-definite function E(y) satisfying the conditions below.
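The conditions appeared as formulas on the slide; the standard statement for an equilibrium ȳ of dy/dt = φ(y) is:

    E(\bar{y}) = 0, \quad E(y) > 0 \text{ for } y \ne \bar{y}, \qquad \frac{dE}{dt} = \nabla E(y) \cdot \varphi(y) < 0 \text{ for } y \ne \bar{y}

Under these conditions ȳ is asymptotically stable.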

Lyapunov Energy Function (figure: the continuous Hopfield network).
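The energy expression itself was an image; the standard Lyapunov (energy) function for the continuous Hopfield network, consistent with the G_i notation above, is:

    E = -\frac{1}{2}\sum_i \sum_j w_{ij}\, v_i v_j + \sum_i G_i \int_0^{v_i} a^{-1}(v)\, dv - \sum_i I_i v_i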

Lyapunov Energy Function u=a1(v) 1 1 ui vi =a(ui) 1 1 v 1 1 I1 I2 I3 In w1n w3n w2n wn3 w13 w23 w12 w32 wn2 w31 w21 wn1 u1 u2 u3 un C1 g1 C2 g2 C3 g3 Cn gn . . . . . . v1 v2 v3 vn v1 v2 v3 vn

Stability of Continuous Hopfield NNs. Differentiate the energy function along the dynamics.
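The derivation on this slide was an image; differentiating the energy above along the dynamics gives the standard chain:

    \frac{dE}{dt} = \sum_i \frac{\partial E}{\partial v_i}\,\frac{dv_i}{dt}
                  = -\sum_i \Big(\sum_j w_{ij} v_j - G_i u_i + I_i\Big)\frac{dv_i}{dt}
                  = -\sum_i C_i\,\frac{du_i}{dt}\,\frac{dv_i}{dt}
                  = -\sum_i C_i\,(a^{-1})'(v_i)\,\Big(\frac{dv_i}{dt}\Big)^2 \;\le\; 0

using u_i = a^{-1}(v_i), so du_i/dt = (a^{-1})'(v_i)\, dv_i/dt.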

Stability of Continuous Hopfield NNs (cont.). Since u = a^{-1}(v) is monotonically increasing, its slope (a^{-1})'(v) > 0, so dE/dt ≤ 0 and the energy decreases along trajectories.

Stability of Continuous Hopfield NNs (figure: the full network). Since E is bounded and non-increasing, the continuous Hopfield network is stable.

Feedback Networks and Associative Memories: Associative Memories

Associative Memories. Also called content-addressable memory. Autoassociative memory: Hopfield memory. Heteroassociative memory: Bidirectional Associative Memory (BAM).

Associative Memories. Stored patterns: an autoassociative memory stores each pattern as a pair (x^k, x^k), mapping the pattern to itself; a heteroassociative memory stores pairs (x^k, y^k), mapping an input pattern to a different output pattern.

Feedback Networks and Associative Memories: Hopfield Memory

Hopfield Memory. Example: a fully connected memory for 12 × 10 binary images has 120 neurons and 120 × 120 = 14,400 weights.

Example (figure): a set of bit-map patterns stored in the memory, and the association retrieved when a pattern is presented.

Example (figure, cont.). How are the patterns stored in the memory?

The Storage Algorithm. Suppose each pattern in the stored set is of dimension n. What should the weight matrix W be?

The Storage Algorithm
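The formula on this slide was an image; the standard outer-product (Hebbian) storage rule for p bipolar patterns x^1, ..., x^p of dimension n is:

    W = \sum_{k=1}^{p} x^k (x^k)^{T} - p\, I_n, \qquad\text{i.e.}\quad w_{ij} = \sum_{k=1}^{p} x_i^k x_j^k \ (i \ne j), \quad w_{ii} = 0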

Analysis. Suppose that the input x is one of the stored patterns x^i.
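The computation was an image; with the outer-product weights above, presenting a stored pattern x^i gives the standard result:

    W x^i = \sum_{k} x^k (x^k)^{T} x^i - p\, x^i = (n - p)\, x^i + \sum_{k \ne i} \big((x^k)^{T} x^i\big)\, x^k

If the stored patterns are mutually orthogonal (or p is small compared with n and the patterns are nearly orthogonal), the crosstalk sum is negligible, sgn(W x^i) = x^i, and x^i is a fixed point of the memory.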

Example

Example 1 2 3 4 2

Example 1 2 3 4 2 1 1 E=4 1 1 Stable E=0 1 1 E=4

Example 1 2 3 4 2 1 1 E=4 1 1 Stable E=0 1 1 E=4

Problems of Hopfield Memory: complement memories (the complement of every stored pattern is also a stable state), spurious stable states, and limited capacity.

Capacity of Hopfield Memory. The number of storable patterns with respect to the size of the network. Study methods: experiments, probability, information theory, and the radius of attraction (ρ).

Capacity of Hopfield Memory. Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in a Hopfield model of n nodes, before the error in the retrieved patterns becomes severe, is around 0.15n. Later work (e.g., by Andrecut) showed that the memory capacity of the Hopfield model can be increased.

Radius of attraction ρ (0 ≤ ρ ≤ 1/2); false memories.

Feedback Networks and Associative Memories: Bidirectional Memory

Bidirectional Memory (figure): an X layer with m units x_1, ..., x_m and a Y layer with n units y_1, ..., y_n, connected through the weight matrix W = [w_ij]_{n×m}; a forward pass computes the Y layer from the X layer through W, and a backward pass computes the X layer from the Y layer through the transposed weights.
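The recall equations were images; the standard BAM dynamics alternate between the two layers (a sketch using the layer names in the figure):

    \text{forward:}\quad y(t+1) = \operatorname{sgn}\big(W\, x(t)\big), \qquad \text{backward:}\quad x(t+2) = \operatorname{sgn}\big(W^{T} y(t+1)\big)

iterated until the pair (x, y) stops changing.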

Bidirectional Memory (figure, cont.): given a set of stored pattern pairs, how should the weight matrix W be chosen?

The Storage Algorithm. Stored patterns: the pairs (x^k, y^k), k = 1, ..., p.
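The storage formula was an image; the standard BAM (Kosko) correlation encoding for the stored pairs is:

    W = \sum_{k=1}^{p} y^k (x^k)^{T}, \qquad w_{ij} = \sum_{k=1}^{p} y_i^k\, x_j^k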

Analysis. Suppose x^{k'} is one of the stored vectors.
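The computation was an image; presenting a stored vector x^{k'} gives:

    W x^{k'} = \sum_{k} y^k (x^k)^{T} x^{k'} = \big((x^{k'})^{T} x^{k'}\big)\, y^{k'} + \sum_{k \ne k'} \big((x^k)^{T} x^{k'}\big)\, y^k

If the stored x-patterns are mutually orthogonal, the crosstalk sum is 0 and sgn(W x^{k'}) = y^{k'}, so the correct associate is retrieved.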

Energy Function

Bidirectional Memory Energy Function:
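The expression was an image; the standard BAM energy function is:

    E(x, y) = -\, y^{T} W\, x

which is bounded below for bipolar states and non-increasing under the forward/backward updates, so the BAM converges to a stable pair.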