Published by Jacob Wilcox. Modified over 9 years ago.
1
Unsupervised Learning Networks Lecturer: 虞台文
2
Content Introduction Important Unsupervised Learning NNs – Hamming Networks – Kohonen’s Self-Organizing Feature Maps – Grossberg’s ART Networks – Counterpropagation Networks – Adaptive BAM – Neocognitron Conclusion
3
Unsupervised Learning Networks Introduction
4
What is Unsupervised Learning? Learning without a teacher. No feedback to indicate the desired outputs. The network must by itself discover the relationship of interest from the input data. – E.g., patterns, features, regularities, correlations, or categories. Translate the discovered relationship into output.
5
A Strange World
6
Supervised Learning [Figure: Height vs. IQ scatter plot with three labeled classes A, B, C]
7
Supervised Learning [Figure: Height vs. IQ plot, classes A, B, C] Try Classification
8
The Probabilities of Populations [Figure: Height vs. IQ plot, classes A, B, C]
9
The Centroids of Clusters [Figure: Height vs. IQ plot, classes A, B, C, with cluster centroids marked]
10
The Centroids of Clusters [Figure: Height vs. IQ plot, classes A, B, C, with centroids marked] Try Classification
11
Unsupervised Learning [Figure: Height vs. IQ scatter plot, unlabeled data]
12
Unsupervised Learning [Figure: Height vs. IQ scatter plot, unlabeled data]
13
Clustering Analysis [Figure: Height vs. IQ scatter] Categorize the input patterns into several classes based on the similarity among patterns.
14
Clustering Analysis [Figure: Height vs. IQ scatter] Categorize the input patterns into several classes based on the similarity among patterns. How many classes might we have?
15
Clustering Analysis [Figure: Height vs. IQ scatter] Categorize the input patterns into several classes based on the similarity among patterns. 2 clusters
16
Clustering Analysis [Figure: Height vs. IQ scatter] Categorize the input patterns into several classes based on the similarity among patterns. 3 clusters
17
Clustering Analysis [Figure: Height vs. IQ scatter] Categorize the input patterns into several classes based on the similarity among patterns. 4 clusters
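The number of clusters is a modeling choice. The slides do not name an algorithm, but the centroid pictures match k-means, which can serve as a sketch here. A minimal pure-Python version for 2-D points such as (height, IQ) pairs; the function name and parameters are illustrative, not from the slides:

```python
import random

def kmeans(points, k, iters=20):
    """Lloyd's algorithm: alternate nearest-centroid assignment and centroid update."""
    centroids = random.sample(points, k)  # initialize from k distinct data points
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for j, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if a cluster goes empty
                centroids[j] = tuple(sum(c) / len(cluster) for c in zip(*cluster))
    return centroids, clusters
```

Running it with k = 2, 3, or 4 on the same data reproduces the different groupings shown on the slides.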
18
Unsupervised Learning Networks The Hamming Networks
19
The Nearest Neighbor Classifier Suppose that we have p prototypes centered at x^(1), x^(2), …, x^(p). Given a pattern x, it is assigned the class label of the i-th prototype if d(x, x^(i)) = min_{1≤j≤p} d(x, x^(j)) for the chosen distance measure d. Examples of distance measures include the Hamming distance and the Euclidean distance.
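The rule above can be sketched directly. This assumes squared Euclidean distance and tuple-like patterns; the function name `nearest_prototype` is illustrative:

```python
def nearest_prototype(x, prototypes):
    """Return the index i of the prototype closest to x (squared Euclidean distance)."""
    def sqdist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(prototypes)), key=lambda i: sqdist(x, prototypes[i]))
```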
20
The Nearest Neighbor Classifier: The Stored Prototypes [Figure: four prototypes x^(1), x^(2), x^(3), x^(4), one per class 1–4]
21
The Nearest Neighbor Classifier [Figure: prototypes x^(1), x^(2), x^(3), x^(4); a new pattern arrives: which class?]
22
The Hamming Networks Store a set of classes represented by a set of binary prototypes. Given an incomplete binary input, find the class to which it belongs. Use the Hamming distance as the distance measure. Distance vs. similarity.
23
The Hamming Net [Figure: inputs x_1, x_2, …, x_n feed a similarity-measurement layer, followed by a MAXNET (winner-take-all) layer]
24
The Hamming Distance y = 1 1 1 1 1 1 1 x = 1 1 1 1 1 1 1 (bipolar ±1 vectors; the minus signs of the original example were lost in extraction) Hamming Distance = ?
25
y = 1 1 1 1 1 1 1 x = 1 1 1 1 1 1 1 (bipolar ±1 vectors; minus signs lost in extraction) The Hamming Distance Hamming Distance = 3
26
The Hamming Distance Multiply y and x component-wise and sum: Σᵢ xᵢyᵢ = 1. With m = 7 components, Hamming Distance = (m − Σᵢ xᵢyᵢ)/2 = (7 − 1)/2 = 3.
27
The Hamming Distance
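For bipolar (±1) vectors of length m, matching components contribute +1 and mismatching components −1 to the inner product, so HD(x, y) = (m − x·y)/2; the slide's m = 7, Σᵢ xᵢyᵢ = 1 example gives HD = 3. A sketch of this identity; the test vectors below are chosen to reproduce those numbers and are not the slide's originals (whose signs were lost in extraction):

```python
def hamming_distance(x, y):
    """Hamming distance between two equal-length bipolar (+1/-1) vectors.
    Matching entries add +1 and mismatches add -1 to the inner product,
    so HD = (m - x.y) / 2."""
    m = len(x)
    dot = sum(a * b for a, b in zip(x, y))
    return (m - dot) // 2
```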
29
The Hamming Net [Figure: similarity-measurement layer with units 1, 2, …, n−1, n receiving inputs x_1, x_2, …, x_{m−1}, x_m; MAXNET (winner-take-all) layer with units 1, 2, …, n−1, n producing outputs y_1, y_2, …, y_{n−1}, y_n]
30
The Hamming Net [Figure: similarity-measurement layer (inputs x_1, …, x_m) and MAXNET layer (outputs y_1, …, y_n)] What are the weight matrices W_S (similarity layer) and W_M (MAXNET)?
31
The Stored Patterns [Figure: similarity-measurement layer (inputs x_1, …, x_m) and MAXNET layer (outputs y_1, …, y_n)] W_S = ? W_M = ?
32
The Stored Patterns [Figure: similarity-measurement unit k with inputs x_1, x_2, …, x_m and bias m/2]
33
Weights for Stored Patterns [Figure: similarity layer, units 1, 2, …, n−1, n, inputs x_1, x_2, …, x_{m−1}, x_m] W_S = ?
34
Weights for Stored Patterns [Figure: similarity layer, units 1, …, n, inputs x_1, …, x_m; each unit has bias m/2] W_S = ?
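With each unit k given weights x^(k)/2 and bias m/2, its net input is ½ x·x^(k) + m/2 = m − HD(x, x^(k)), i.e., the number of components on which x agrees with prototype k. A sketch under those assumptions (`matching_scores` is an illustrative name):

```python
def matching_scores(x, prototypes):
    """Hamming-net similarity layer: unit k has weights x^(k)/2 and bias m/2,
    so its net input equals (x . x^(k))/2 + m/2 = m - HD(x, x^(k)),
    the number of components on which x matches prototype k."""
    m = len(x)
    return [sum(xj * pj for xj, pj in zip(x, p)) / 2 + m / 2 for p in prototypes]
```

The scores are then fed into the MAXNET layer, which keeps only the largest one.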
35
The MAXNET [Figure: similarity-measurement layer (inputs x_1, …, x_m) followed by the MAXNET winner-take-all layer (outputs y_1, …, y_n)]
36
Weights of MAXNET [Figure: MAXNET layer, units 1, …, n, outputs y_1, …, y_n; each unit has self-excitation weight 1]
37
Weights of MAXNET [Figure: MAXNET layer, outputs y_1, …, y_n; self-excitation weight 1, mutual inhibition −ε with 0 < ε < 1/n]
38
Updating Rule [Figure: MAXNET initialized with the similarity scores s_1, s_2, s_3, …, s_n; weights 1 (self) and −ε with 0 < ε < 1/n]
39
Updating Rule [Figure: MAXNET with initial activations s_1, s_2, s_3, …, s_n] y_j(t+1) = max(0, y_j(t) − ε Σ_{k≠j} y_k(t)), with 0 < ε < 1/n.
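The MAXNET iteration can be sketched as follows. The default eps of 1/(2n) is one admissible value in (0, 1/n); the stopping test and the `max_iters` cap are implementation choices, not from the slides:

```python
def maxnet(scores, eps=None, max_iters=1000):
    """MAXNET winner-take-all: each unit inhibits every other unit with
    weight -eps (0 < eps < 1/n) and excites itself with weight 1; iterate
    until at most one unit remains positive, then return its index."""
    n = len(scores)
    if eps is None:
        eps = 1.0 / (2 * n)  # any value in (0, 1/n) works
    y = list(scores)
    for _ in range(max_iters):
        total = sum(y)
        y = [max(0.0, yj - eps * (total - yj)) for yj in y]
        if sum(1 for v in y if v > 0) <= 1:
            break
    return y.index(max(y))  # index of the winning unit
```

Feeding it the matching scores of the similarity layer yields the class of the nearest stored prototype.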
40
Analysis: Updating Rule Let s(t) = Σ_k y_k(t). The update can then be rewritten as y_j(t+1) = max(0, (1+ε) y_j(t) − ε s(t)).
41
Analysis: Updating Rule If y_i(t) > y_j(t) now, then before clipping y_i(t+1) − y_j(t+1) = (1+ε)(y_i(t) − y_j(t)): each iteration widens the gap, so only the unit with the largest initial activation keeps a positive output.
42
Example
43
Unsupervised Learning Networks The Self-organizing Feature Map
44
Feature Mapping Map high-dimensional input signals onto a lower-dimensional (usually one- or two-dimensional) structure. Similarity relations present in the original data are still present after the mapping. Dimensionality Reduction. Topology-Preserving Map.
45
Somatotopic Map Illustration: The “Homunculus” The relationship between body surfaces and the regions of the brain that control them.
46
Another Depiction of the Homunculus
47
Phonotopic maps
48
Example: the Finnish word “humppila” shown as a trajectory on the phonotopic map.
49
Self-Organizing Feature Map Developed by Professor Teuvo Kohonen. One of the most popular neural network models. Unsupervised learning. Competitive learning network.
50
The Structure of SOM
51
Example
52
Local Excitation, Distal Inhibition
53
Topological Neighborhood Square and hexagonal grids.
54
Size Shrinkage The neighborhood size decreases as training proceeds.
56
Learning Rule Similarity matching: select the winner c with ‖x − w_c‖ = min_j ‖x − w_j‖. Updating: move the winner and its topological neighbors toward the input, w_j(t+1) = w_j(t) + η(t) [x − w_j(t)] for units j in the neighborhood of c; all other units are left unchanged.
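The two steps above, combined with the shrinking neighborhood and a decaying learning rate from the earlier slides, give a minimal SOM training loop. The grid size, Gaussian neighborhood, decay schedules, and fixed seed are illustrative choices, not prescribed by the slides:

```python
import math
import random

def train_som(data, grid_w, grid_h, dim, iters=500):
    """Minimal SOM sketch: a grid_w x grid_h grid of units, a Gaussian
    neighborhood whose radius shrinks over time, and a decaying learning rate."""
    random.seed(0)  # fixed seed so the sketch is repeatable
    # One weight vector per grid position, randomly initialized in [0, 1).
    w = {(i, j): [random.random() for _ in range(dim)]
         for i in range(grid_w) for j in range(grid_h)}
    for t in range(iters):
        x = random.choice(data)
        # Similarity matching: the winner is the unit with the closest weights.
        c = min(w, key=lambda u: sum((a - b) ** 2 for a, b in zip(x, w[u])))
        eta = 0.5 * (1.0 - t / iters)  # learning rate decays toward 0
        sigma = max(0.5, 0.5 * max(grid_w, grid_h) * (1.0 - t / iters))  # shrinking radius
        for u, wu in w.items():
            # Distance measured on the grid (not in input space) to the winner.
            d2 = (u[0] - c[0]) ** 2 + (u[1] - c[1]) ** 2
            h = math.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian neighborhood factor
            for k in range(dim):
                wu[k] += eta * h * (x[k] - wu[k])
    return w
```

Because neighbors of the winner are dragged along with it, nearby grid units end up with similar weight vectors, which is what makes the resulting map topology-preserving.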
57
Example