Neural Network: The Hopfield Model
Kim, Il Joong
Contents
- Neural network: introduction
  - Definition & applications
  - Network architectures
  - Learning processes (training)
- Hopfield model
  - Summary of the model
  - Example
  - Limitations
- Hopfield pattern recognition on a scale-free neural network
Definition of a Neural Network
A massively parallel system made up of simple processing units and dense interconnections, which has a natural propensity for storing experiential knowledge and making it available for use. The interconnection strengths, known as synaptic weights, store the acquired knowledge; setting them is the job of the learning process.
Applications of Neural Networks
- Pattern-to-pattern mapping, pattern completion, pattern classification
- Image analysis
- Speech analysis and generation
- Financial analysis
- Diagnosis
- Automated control
Network Architectures
- Single-layer feedforward network
- Multilayer feedforward network
- Recurrent network
Learning Processes (Training)
- Error-correction learning
- Memory-based learning
- Hebbian learning
- Competitive learning
- Boltzmann learning
Hebbian Learning Process
If the two neurons on either side of a synaptic connection are activated simultaneously, the strength of that synapse is increased. If the two neurons are activated asynchronously, the strength of that synapse is weakened or eliminated.
Hopfield Model: Network Architecture
- N binary processing units
- Fully connected: N(N-1) connections
- Single layer (no hidden layer)
- Recurrent (feedback) network with no self-feedback loops
Hopfield Model: Learning Process
Let $\xi^{\mu}$, $\mu = 1, \dots, M$, denote a known set of N-dimensional memory vectors with binary components $\xi_i^{\mu} = \pm 1$. Following the Hebbian rule, the synaptic weights are set to
$$w_{ij} = \frac{1}{N} \sum_{\mu=1}^{M} \xi_i^{\mu} \xi_j^{\mu}, \qquad w_{ii} = 0.$$
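A minimal NumPy sketch of this storage rule (the function name and the use of NumPy are illustrative choices, not from the original slides):

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian rule: w_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with w_ii = 0."""
    patterns = np.asarray(patterns)   # shape (M, N), entries are +/-1
    N = patterns.shape[1]
    W = patterns.T @ patterns / N     # sum of outer products, scaled by 1/N
    np.fill_diagonal(W, 0.0)          # no self-feedback loops
    return W
```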
Hopfield Model: Inputting and Updating
Let $s = (s_1, \dots, s_N)$, with $s_i = \pm 1$, denote an unknown N-dimensional input vector, taken as the initial state of the network. Update the units asynchronously (i.e., randomly and one at a time) according to the rule
$$s_i \leftarrow \operatorname{sgn}\!\Big( \sum_{j} w_{ij} s_j \Big),$$
leaving $s_i$ unchanged when the sum is zero.
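A sketch of one asynchronous sweep implementing this rule (again illustrative; W is the weight matrix from the previous sketch):

```python
import numpy as np

def async_sweep(W, s, rng):
    """Visit every unit once in random order and set s_i = sgn(sum_j w_ij s_j),
    keeping the previous value of s_i when the local field is exactly zero."""
    s = s.copy()
    for i in rng.permutation(len(s)):
        h = W[i] @ s                  # local field acting on unit i
        if h != 0:
            s[i] = 1 if h > 0 else -1
    return s
```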
Hopfield Model: Convergence and Outputting
Repeat the updates until the state vector remains unchanged, and let $s^{\mathrm{fix}}$ denote this fixed point (stable state). Each asynchronous update can only lower (or keep) the energy
$$E = -\frac{1}{2} \sum_{i \neq j} w_{ij} s_i s_j.$$
Associative memory: the memory vectors are states that correspond to (local) minima of E, so any input vector converges to the stored memory that is most similar, or most accessible, to it.
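The energy function and the full recall loop can be sketched as follows (self-contained; the stopping criterion is simply that a complete sweep changes no unit):

```python
import numpy as np

def energy(W, s):
    """E = -(1/2) * sum_{i != j} w_ij s_i s_j (the diagonal of W is zero)."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100, seed=0):
    """Repeat asynchronous sweeps until the state vector stops changing."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in rng.permutation(len(s)):   # one asynchronous sweep
            h = W[i] @ s
            if h != 0:
                s[i] = 1 if h > 0 else -1
        if np.array_equal(s, prev):         # fixed point reached: the output
            break
    return s
```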
Hopfield Model: N = 3 Example
Let (1, -1, 1) and (-1, 1, -1) denote the stored memories (M = 2).
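Working the example through in code (the corrupted probe vector is an illustrative choice, to show the error-correcting recall):

```python
import numpy as np

# The two stored memories from the example (M = 2, N = 3). Note that
# xi^2 = -xi^1, so both patterns yield the same Hebbian weight matrix.
patterns = np.array([[ 1, -1,  1],
                     [-1,  1, -1]])
W = patterns.T @ patterns / 3
np.fill_diagonal(W, 0)
# W = [[   0, -2/3,  2/3],
#      [-2/3,    0, -2/3],
#      [ 2/3, -2/3,    0]]

s = np.array([1, 1, 1])          # (1, -1, 1) with its middle bit flipped
for i in range(3):               # one deterministic sweep suffices here
    h = W[i] @ s
    if h != 0:
        s[i] = 1 if h > 0 else -1
print(s)                         # [ 1 -1  1]: the nearer stored memory
```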
Limitations of the Hopfield Model
- The stored memories are not always stable (for large M).
- There may be stable states that were not stored memories (spurious states).
- The signal-to-noise ratio of recall scales as $\sqrt{N/M}$ for large M, and the quality of memory recall breaks down at $M \approx 0.14N$; for example, a network of N = 1000 units can reliably store only about 140 random patterns.
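This breakdown is easy to check numerically; the following sketch probes a network with one of its own stored patterns and reports how well the pattern is retained (all parameter values are illustrative):

```python
import numpy as np

def retention(N, M, sweeps=20, seed=0):
    """Store M random patterns, start at the first one, run asynchronous
    updates, and return the final overlap with that pattern."""
    rng = np.random.default_rng(seed)
    xi = rng.choice([-1, 1], size=(M, N))
    W = xi.T @ xi / N
    np.fill_diagonal(W, 0)
    s = xi[0].copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = W[i] @ s
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return (xi[0] @ s) / N

for M in (50, 100, 140, 200):
    print(M, retention(1000, M))  # the overlap should fall off near M ~ 0.14 N
```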
Limitations of the Hopfield Model
The stable state reached may not be the stored memory that is most similar to the input state.
On a Scale-Free Neural Network
Network architecture: the Barabási-Albert (BA) scale-free network.
- Start from a small, fully connected core of m nodes.
- N (≫ m) nodes are added, each connecting to m existing nodes.
- Total: N + m processing units and approximately Nm connections (for 1 ≪ m ≪ N).
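A quick way to build such a network is with networkx (an assumed tool, not mentioned in the slides; note that networkx seeds the growth with a star rather than a fully connected core, a difference that is negligible when N ≫ m):

```python
import networkx as nx

N, m = 10000, 3
G = nx.barabasi_albert_graph(N + m, m, seed=0)  # preferential-attachment growth
print(G.number_of_nodes())   # N + m nodes
print(G.number_of_edges())   # m * N edges, i.e. ~ Nm for 1 << m << N
```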
On a Scale-Free Neural Network: Hopfield Pattern Recognition
- Store P different random patterns $\xi^{\mu}$ on the network's connections.
- Input pattern: the stored pattern $\xi^{1}$ with 10% of its bits reversed (initial overlap $m_0 = 0.8$).
- Output pattern: the fixed point $s^{\mathrm{fix}}$ reached by the update dynamics.
- Quality of recognition: the overlap $m = \frac{1}{N} \sum_i \xi_i^{1} s_i^{\mathrm{fix}}$.
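A self-contained sketch of this experiment (with a smaller N than in the figures below so it runs quickly; restricting the Hebbian weights to the graph's edges is one natural reading of the construction, and the sign update is insensitive to the overall weight scale):

```python
import numpy as np
import networkx as nx

def scale_free_recall(N=2000, m=5, P=10, flip=0.10, sweeps=10, seed=0):
    """Store P random patterns on a BA network, corrupt the first pattern by
    flipping 10% of its bits (overlap 0.8), and return the final overlap."""
    rng = np.random.default_rng(seed)
    A = nx.to_numpy_array(nx.barabasi_albert_graph(N + m, m, seed=seed))
    n = A.shape[0]
    xi = rng.choice([-1, 1], size=(P, n))
    W = A * (xi.T @ xi)               # Hebbian weights, kept only on existing edges
    s = xi[0].copy()
    flipped = rng.choice(n, size=int(flip * n), replace=False)
    s[flipped] *= -1                  # 10% reversal: initial overlap 0.8
    for _ in range(sweeps):
        for i in rng.permutation(n):
            h = W[i] @ s
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return (xi[0] @ s) / n

print(scale_free_recall())            # an overlap near 1 means good recognition
```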
On a Scale-Free Neural Network: Small m
[Figure: quality of recognition for N = 10000 and m = 2, 3, 5]
On a Scale-Free Neural Network: Large m
[Figure: quality of recognition for N + m = 10000 and P = 10, 100, 1000]
On a Scale-Free Neural Network: Comparison with a Fully Connected Network (m = N)
- For small m, the quality of recognition is low.
- For 1 ≪ m ≪ N, the quality of recognition is good, while gaining a factor of N/m ≫ 1 in computer memory and time.
- Relative to the fully connected network, the quality of recognition decreases only gradually.
References
- D. Stauffer et al., http://xxx.lanl.gov/abs/cond-mat/0212601 (2002)
- A. S. Mikhailov, Foundations of Synergetics I, Springer-Verlag, Berlin Heidelberg (1990)
- J. Hertz et al., Introduction to the Theory of Neural Computation, Addison-Wesley (1991)
- J. E. Dayhoff, Neural Network Architectures, Van Nostrand Reinhold (1990)
- S. Haykin, Neural Networks, Prentice-Hall (1999)