1
Synaptic Dynamics: Unsupervised Learning
Part Ⅱ Wang Xiumei 2018/9/20
2
1.Stochastic unsupervised learning and stochastic equilibrium;
2.Signal Hebbian Learning;
3.Competitive Learning.
3
1.Stochastic unsupervised learning and stochastic equilibrium
⑴ The noisy random unsupervised learning law;
⑵ Stochastic equilibrium;
⑶ The random competitive learning law;
⑷ The learning vector quantization system.
4
The noisy random unsupervised learning law
The random-signal Hebbian learning law is (4-92); its diffusion term denotes a Brownian-motion diffusion process, and each term in (4-92) denotes a separate random process.
5
The noisy random unsupervised learning law
Using the noise relationship, we can rewrite (4-92) as (4-93). We assume a zero-mean, Gaussian white-noise process and use the corresponding equation.
6
The noisy random unsupervised learning law
We obtain a noisy random unsupervised learning law, (4-94). Lemma: the quantity in (4-95) has finite variance (proof: P132).
7
The noisy random unsupervised learning law
The lemma implies two points: 1) stochastic synapses vibrate in equilibrium, and they vibrate at least as much as the driving noise process vibrates; 2) the synaptic vector changes or vibrates at every instant t even though its expectation equals a constant value: it wanders in a Brownian motion about that constant expected value.
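A minimal numerical sketch of this behavior, assuming a noisy unsupervised law of the form m' = -m + S_i(x_i) S_j(y_j) + n_ij with zero-mean Gaussian white noise (the explicit form and all constants below are assumptions for illustration, not a verbatim copy of (4-94)):

```python
import numpy as np

# Euler simulation of an assumed noisy unsupervised (random-signal Hebbian)
# learning law:  m' = -m + S_i(x_i) * S_j(y_j) + noise.
rng = np.random.default_rng(0)

dt = 0.01            # integration step
steps = 20000
sigma = 0.2          # noise strength (finite variance)
s_i, s_j = 0.8, 0.9  # hypothetical bounded signal values, held constant

m = 0.0
trace = np.empty(steps)
for t in range(steps):
    dB = rng.normal(0.0, sigma) * np.sqrt(dt)   # Brownian increment
    m += (-m + s_i * s_j) * dt + dB
    trace[t] = m

# The synapse relaxes toward E[m] = S_i * S_j = 0.72 and then keeps
# vibrating about that value, as the lemma describes.
print(trace[-5000:].mean(), trace[-5000:].std())
```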
8
Stochastic equilibrium
Synaptic equilibrium occurs in "steady state" when the synaptic vector stops moving, (4-101). The synaptic vector reaches stochastic equilibrium when only the random noise vector still changes it: (4-103)
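A hedged reconstruction of what these two conditions presumably state, following the surrounding wording ("stops moving" and "only the noise still changes"); the symbols are assumptions:

```latex
% Assumed reconstruction, not a verbatim copy of the textbook equations.
\dot{m}_{ij}(t) = 0            % (4-101): synaptic (steady-state) equilibrium
dm_{ij}(t)      = dB_{ij}(t)   % (4-103): stochastic equilibrium, only the
                               %          Brownian noise still changes m_ij
```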
9
The random competitive learning law
The random linear competitive learning law
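The law itself is not spelled out on the slide; a commonly used form (stated here as an assumption) couples the win-lose competitive signal with a linear difference term plus Brownian noise:

```latex
% Assumed standard form of the random linear competitive learning law
dm_j(t) = S_j\bigl(y_j(t)\bigr)\,\bigl[x(t) - m_j(t)\bigr]\,dt + dB_j(t)
```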
10
The learning vector quantization system.
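As a rough, illustrative sketch of the unsupervised vector-quantization idea behind the LVQ system (the function name, constants, and data below are invented for the example, not taken from the slide's equations): the winning synaptic vector moves toward the sample while all losing vectors stay fixed.

```python
import numpy as np

# Sketch of unsupervised learning vector quantization (adaptive VQ):
# the code vector that wins the distance competition moves toward the
# sample; all losers stay put.
rng = np.random.default_rng(1)

def lvq_train(samples, n_codes=4, lr=0.05, epochs=20):
    codes = samples[rng.choice(len(samples), n_codes, replace=False)].copy()
    for _ in range(epochs):
        for x in samples:
            j = np.argmin(np.linalg.norm(codes - x, axis=1))  # winner
            codes[j] += lr * (x - codes[j])                   # move winner only
    return codes

samples = rng.normal(size=(500, 2)) + rng.choice([-3, 3], size=(500, 1))
print(lvq_train(samples))
```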
11
The self-organizing map system
The self-organizing map system equations:
12
The self-organizing map system
The self-organizing map is an unsupervised clustering algorithm. Compared with traditional clustering algorithms, its centroids can be mapped onto a curve or a plane, and it preserves topological structure.
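A minimal one-dimensional SOM sketch, under assumed learning-rate and neighborhood schedules (none of these constants come from the slide): nodes near the winner also move toward the input, which is what preserves the topological structure mentioned above.

```python
import numpy as np

# Minimal 1-D self-organizing map: every node near the winning node, not
# just the winner, moves toward the input, so neighboring nodes end up
# coding neighboring regions of input space.
rng = np.random.default_rng(2)

n_nodes = 10
weights = rng.uniform(0, 1, size=(n_nodes, 2))   # 2-D inputs, 1-D map
data = rng.uniform(0, 1, size=(2000, 2))

for t, x in enumerate(data):
    lr = 0.5 * (1 - t / len(data))                        # decaying rate
    radius = max(1.0, n_nodes / 2 * (1 - t / len(data)))  # shrinking radius
    j = np.argmin(np.linalg.norm(weights - x, axis=1))    # best-matching node
    dist = np.abs(np.arange(n_nodes) - j)                 # grid distance
    h = np.exp(-(dist ** 2) / (2 * radius ** 2))          # neighborhood
    weights += lr * h[:, None] * (x - weights)

print(weights)   # neighboring rows now lie near each other in input space
```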
13
2.Signal Hebbian Learning
⑴ Recency effects and forgetting;
⑵ Asymptotic correlation encoding;
⑶ Hebbian correlation decoding.
14
Signal Hebbian Learning
The deterministic first-order signal Hebbian learning law: (4-132), (4-133)
15
Recency effects and forgetting
Hebbian synapses learn an exponentially weighted average of sampled patterns; the forgetting term drives this exponential decay. This is the simplest local unsupervised learning law.
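Assuming the first-order law of (4-132) has the standard form m_ij' = -m_ij + S_i(x_i) S_j(y_j) (the explicit right-hand side is an assumption here), solving it makes the exponential weighting and the recency effect explicit:

```latex
% Solution of  \dot{m}_{ij} = -m_{ij} + S_i S_j  (assumed form of (4-132)):
m_{ij}(t) = m_{ij}(0)\,e^{-t}
          + \int_0^t S_i\bigl(x_i(s)\bigr)\,S_j\bigl(y_j(s)\bigr)\,e^{s-t}\,ds
% Recent samples (s close to t) receive weight near 1; older samples are
% exponentially forgotten through the factor e^{s-t}.
```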
16
Asymptotic correlation encoding
The synaptic matrix of long-term memory traces asymptotically approaches the bipolar correlation matrix, where X and Y denote the bipolar signal vectors.
17
Asymptotic correlation encoding
In practice we use a diagonal fading-memory exponential matrix W that compensates for the inherent exponential decay of learned information: (4-142)
18
Hebbian correlation decoding
First we consider the bipolar correlation encoding of the M bipolar associations, and turn the bipolar associations into binary vector associations by replacing -1s with 0s.
19
Hebbian correlation decoding
The Hebbian encoding of the bipolar associations corresponds to the weighted Hebbian encoding scheme if the weight matrix W equals the identity matrix: (4-143)
20
Hebbian correlation decoding
We use the Hebbian synaptic matrix M for bidirectional processing of the neuronal signals: we pass the neural signal through M in the forward direction and through its transpose in the backward direction.
21
Hebbian correlation decoding
Signal-noise decomposition:
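A hedged sketch of the decomposition (the symbols X_k, Y_k, n, and c_ik are assumptions consistent with a correlation encoding M equal to the sum of the outer products X_k^T Y_k): passing X_i through M splits into a signal term and additive crosstalk noise.

```latex
% Assumed form of the signal-noise decomposition for M = \sum_k X_k^T Y_k:
X_i M = (X_i X_i^T)\,Y_i + \sum_{k \ne i} (X_i X_k^T)\,Y_k
      = \underbrace{n\,Y_i}_{\text{signal}}
        + \underbrace{\sum_{k \ne i} c_{ik}\,Y_k}_{\text{crosstalk noise}},
\qquad c_{ik} = X_i X_k^T .
```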
22
Hebbian correlation decoding
Correction coefficients (4-149): they make each decoded vector resemble the stored vector in sign as much as possible. The same correction property holds in the backward direction.
23
Hebbian correlation decoding
We define the Hamming distance between two binary vectors as the number of bit positions in which they differ.
24
Hebbian correlation decoding
[number of common bits] - [number of different bits]
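A small check of this identity (helper names are illustrative): for bipolar versions X, Y of binary vectors A, B of length n, the inner product X . Y equals (common bits) minus (different bits), which is n - 2H(A, B).

```python
import numpy as np

# For bipolar versions X, Y of binary vectors A, B of length n:
#   X . Y = (number of agreeing bits) - (number of differing bits)
#         = n - 2 * H(A, B),   where H is the Hamming distance.
def hamming(a, b):
    return int(np.sum(a != b))

rng = np.random.default_rng(3)
a = rng.integers(0, 2, size=16)
b = rng.integers(0, 2, size=16)
x, y = 2 * a - 1, 2 * b - 1          # binary -> bipolar (0 -> -1)

assert int(x @ y) == len(a) - 2 * hamming(a, b)
print(hamming(a, b), int(x @ y))
```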
25
Hebbian correlation decoding
Suppose one binary vector is close to another. Then, geometrically, the two patterns are less than half their space away from each other, so the correction coefficient is positive; in the extreme case of identical patterns it reaches its maximum. The rare case of patterns exactly half their space apart results in a zero coefficient, and such correction coefficients should be discarded.
26
Hebbian correlation decoding
3) Suppose one binary vector is far away from another, so the two patterns are more than half their space apart and the correction coefficient is negative. The extreme case is that of complementary patterns.
27
Hebbian encoding method
Hebbian encoding method: convert the binary vectors to bipolar vectors, then sum the contiguous correlation-encoded associations to form T.
28
Hebbian encoding method
Example (P144): consider the three-step limit cycle, and convert the bit vectors to bipolar vectors.
29
Hebbian encoding method
Produce the asymmetric TAM matrix T:
30
Hebbian encoding method
Passing the bit vectors through T in the forward direction produces the forward limit cycle.
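A hedged sketch of the whole encoding-and-recall procedure with made-up bit vectors (the P144 vectors themselves are not reproduced here): convert to bipolar form, sum the contiguous correlation matrices into the asymmetric TAM matrix T, and run forward recall passes.

```python
import numpy as np

# Hebbian (correlation) encoding of a three-step limit cycle
# A1 -> A2 -> A3 -> A1 in a temporal associative memory (TAM).
# These bit vectors are illustrative stand-ins, not the textbook's example.
A = np.array([[1, 1, 1, 1, 1, 1, 1, 1],
              [1, 0, 1, 0, 1, 0, 1, 0],
              [1, 1, 0, 0, 1, 1, 0, 0]])
X = 2 * A - 1                                   # binary -> bipolar

# Sum of contiguous correlation-encoded associations (with wraparound)
# gives the asymmetric TAM matrix  T = X1'X2 + X2'X3 + X3'X1.
T = sum(np.outer(X[i], X[(i + 1) % 3]) for i in range(3))

def forward(x_bipolar, T):
    """One synchronous forward pass, thresholded back to a bit vector."""
    return (x_bipolar @ T > 0).astype(int)

# Each state recalls its successor, tracing the forward limit cycle.
for i in range(3):
    print(A[i], "->", forward(X[i], T))
```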
31
Competitive Learning
The deterministic competitive learning law: (4-165), (4-166). We see that the competitive learning law uses a nonlinear forgetting term.
32
Competitive Learning
The Hebbian learning law uses the linear forgetting term, so the two laws differ in how they forget, not in how they learn. In both cases, when the jth competing neuron wins, the synaptic value encodes the forcing signal and encodes it exponentially quickly.
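Side by side, assuming the standard forms of the two laws (an assumption consistent with the references to (4-132) and (4-165)), the difference sits entirely in the forgetting term:

```latex
% Assumed standard forms:
\dot{m}_{ij} = -m_{ij} + S_i(x_i)\,S_j(y_j)              % Hebbian: linear forgetting term -m_{ij}
\dot{m}_{ij} = S_j(y_j)\,\bigl[S_i(x_i) - m_{ij}\bigr]   % competitive: nonlinear forgetting term -S_j(y_j)\,m_{ij}
% When S_j(y_j) = 1 (the jth neuron wins), both laws drive m_{ij}
% exponentially quickly toward the forcing signal S_i(x_i).
```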
33
3.Competitive Learning
⑴ Competition as indication;
⑵ Competition as correlation detection;
⑶ Asymptotic centroid estimation;
⑷ Competitive covariance estimation.
34
Competition as indication
Centroid estimation requires that the competitive signal approximate the indicator function of the locally sampled pattern class: (4-168)
35
Competition as indication
If a sample pattern X comes from the jth decision region, the jth competing neuron should win, and all other competing neurons should lose. In practice we usually use the random linear competitive learning law and a simple additive model: (4-169)
36
Competition as indication
The inhibitive-feedback term equals the additive sum of synapse-weighted signals: (4-170). It takes one value if the jth neuron wins and another if instead the kth neuron wins.
37
Competition as correlation detection
The metrical indicator function: (4-171). If the input vector X is closer to the jth synaptic vector than to all other stored synaptic vectors, the jth competing neuron will win.
38
Competition as correlation detection
Using the equinorm property, we obtain the equivalent equalities (P147): (4-174), (4-178), (4-179)
39
Competition as correlation detection
From the above equalities we conclude that the jth competing neuron wins iff the input signal or pattern correlates maximally with its synaptic vector. The cosine law: (4-180)
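A quick numerical confirmation of this equivalence (random data, names illustrative): when the synaptic vectors share a common norm, the minimum-distance winner and the maximum-correlation winner coincide.

```python
import numpy as np

# When all synaptic vectors have the same norm (equinorm property), the
# minimum-distance winner is exactly the neuron whose synaptic vector
# correlates maximally (largest inner product / cosine) with the input x.
rng = np.random.default_rng(4)

m = rng.normal(size=(8, 5))
m = m / np.linalg.norm(m, axis=1, keepdims=True)   # force equal norms

for _ in range(1000):
    x = rng.normal(size=5)
    dist_winner = np.argmin(np.linalg.norm(m - x, axis=1))
    corr_winner = np.argmax(m @ x)
    assert dist_winner == corr_winner
print("minimum distance and maximum correlation pick the same winner")
```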
40
Asymptotic centroid estimation
The simpler competitive law: (4-181). We use the equilibrium condition: (4-182)
41
Asymptotic centroid estimation
Solving for the equilibrium synaptic vector gives (4-186), which shows that it equals the centroid of its decision region.
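A hedged statement of what (4-186) presumably expresses, writing the decision region as D_j and the sampled pattern density as p(x) (both symbols are assumptions):

```latex
% Assumed form: the equilibrium synaptic vector is the centroid of D_j
\bar{m}_j \;=\; \frac{\int_{D_j} x\,p(x)\,dx}{\int_{D_j} p(x)\,dx}
          \;=\; E\left[\,x \mid x \in D_j\,\right]
```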
42
Competitive covariance estimation
Centroids provide a first-order estimate of how the unknown probability density function behaves in the decision regions, and local covariances provide a second-order description.
43
Competitive covariance estimation
We extend the competitive learning laws to asymptotically estimate the local conditional covariance matrices: (4-187), (4-189). The reference vector in (4-189) denotes the centroid.
44
Competitive covariance estimation
The fundamental theorem of estimation theory [Mendel 1987]: (4-190), in which the estimator is a Borel-measurable random vector function.
45
Competitive covariance estimation
At each iteration we estimate the unknown centroid as the current synaptic vector. In this sense the estimate becomes an error conditional covariance matrix. The stochastic difference-equation algorithm: (4-192)
46
Competitive covariance estimation
In (4-192), the coefficient sequence denotes an appropriately decreasing sequence of learning coefficients. If the ith neuron loses the metrical competition, its estimate does not change.
47
Competitive covariance estimation
The algorithm (4-192) corresponds to the stochastic differential equation: (4-195), (4-199)
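An illustrative end-to-end sketch of the competitive covariance estimation recipe described above (the learning schedule, cluster data, and symbols are assumptions, not the textbook's (4-192)): the winning neuron updates its centroid and covariance estimates from the current error, while losing neurons keep theirs unchanged.

```python
import numpy as np

# Competitive covariance estimation sketch: the winning neuron updates a
# running centroid estimate m[j] and a running covariance estimate C[j]
# from the current error (x - m[j]); losers keep their estimates unchanged.
rng = np.random.default_rng(5)
k, d = 3, 2

# synthetic data from three clusters with different spreads
centers = np.array([[-5.0, 0.0], [0.0, 5.0], [5.0, 0.0]])
spreads = [0.5, 1.0, 2.0]
data = np.concatenate([c + s * rng.normal(size=(2000, d))
                       for c, s in zip(centers, spreads)])
rng.shuffle(data)

m = data[:k].copy()                          # centroid estimates
C = np.array([np.eye(d) for _ in range(k)])  # covariance estimates
wins = np.ones(k)                            # per-neuron win counts

for x in data:
    j = np.argmin(np.linalg.norm(m - x, axis=1))   # metrical competition
    wins[j] += 1
    c_t = 1.0 / wins[j]                            # decreasing coefficient
    err = x - m[j]
    m[j] += c_t * err                              # centroid update
    C[j] += c_t * (np.outer(err, err) - C[j])      # covariance update

print(np.round(m, 2))
print(np.round(C, 2))
```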