1
Unsupervised Learning Networks
Lecturer: 虞台文
2
Content
Introduction
Important Unsupervised Learning NNs
  Hamming Networks
  Kohonen's Self-Organizing Feature Maps
  Grossberg's ART Networks
  Counterpropagation Networks
  Adaptive BAM
  Neocognitron
Conclusion
3
Unsupervised Learning Networks
Introduction
4
What is Unsupervised Learning?
Learning without a teacher: there is no feedback to indicate the desired outputs.
The network must discover by itself the relationships of interest in the input data, e.g., patterns, features, regularities, correlations, or categories.
It then translates the discovered relationships into outputs.
5
A Strange World
6
Supervised Learning
[Scatter plot: labeled classes A, B, C on Height vs. IQ axes]
7
Supervised Learning: Try Classification
[Scatter plot: classify new points among the labeled classes A, B, C on Height vs. IQ axes]
8
The Probabilities of Populations
[Figure: class-conditional probability densities of the populations on Height vs. IQ axes]
9
The Centroids of Clusters
[Figure: cluster centroids A, B, C on Height vs. IQ axes]
10
The Centroids of Clusters: Try Classification
[Figure: classify new points by the nearest centroid A, B, or C on Height vs. IQ axes]
11
Unsupervised Learning
[Scatter plot: the same data, now unlabeled, on Height vs. IQ axes]
12
Unsupervised Learning
[Scatter plot: the same unlabeled data on Height vs. IQ axes]
13
Clustering Analysis
Categorize the input patterns into several classes based on the similarity among patterns.
[Scatter plot: unlabeled data on Height vs. IQ axes]
14
Clustering Analysis
Categorize the input patterns into several classes based on the similarity among patterns. How many classes may we have?
[Scatter plot: unlabeled data on Height vs. IQ axes]
15
Clustering Analysis: 2 clusters
Categorize the input patterns into several classes based on the similarity among patterns.
[Scatter plot: the data grouped into 2 clusters on Height vs. IQ axes]
16
Clustering Analysis: 3 clusters
Categorize the input patterns into several classes based on the similarity among patterns.
[Scatter plot: the same data grouped into 3 clusters on Height vs. IQ axes]
17
Clustering Analysis: 4 clusters
Categorize the input patterns into several classes based on the similarity among patterns.
[Scatter plot: the same data grouped into 4 clusters on Height vs. IQ axes]
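The slides leave the clustering procedure implicit; one standard way to produce such clusters and centroids is k-means. Here is a minimal sketch in Python (numpy assumed; the toy (Height, IQ)-style data, seed, and constants are illustrative, not from the slides):

```python
# A minimal k-means sketch: the same unlabeled data grouped into 2, 3, or 4 clusters.
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # init from the data
    for _ in range(n_iters):
        # Assignment step: each point joins its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its members.
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# Illustrative (Height, IQ)-like data around three population centers.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, scale=5.0, size=(30, 2))
               for loc in [(150, 90), (170, 110), (180, 130)]])
for k in (2, 3, 4):
    centroids, _ = kmeans(X, k)
    print(k, "clusters -> centroids:\n", centroids.round(1))
```

Running it with k = 2, 3, and 4 mirrors the three slides above: the data alone do not dictate k; the analyst chooses it.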
18
Unsupervised Learning Networks
The Hamming Networks
19
The Nearest Neighbor Classifier
Suppose that we have p prototypes centered at x(1), x(2), …, x(p). A given pattern x is assigned the class label of the ith prototype if
d(x, x(i)) ≤ d(x, x(j)) for all j = 1, …, p, i.e., i = argmin_j d(x, x(j)).
Examples of distance measures include the Hamming distance and the Euclidean distance.
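A minimal sketch of this rule with the Euclidean distance (the four prototypes and the probe pattern below are illustrative assumptions):

```python
# Nearest-neighbor classification: assign x to the class of the closest prototype.
import numpy as np

def nearest_neighbor(x, prototypes):
    """Return the index i of the prototype closest to x (Euclidean distance)."""
    dists = [np.linalg.norm(x - p) for p in prototypes]
    return int(np.argmin(dists))

prototypes = [np.array([0.0, 0.0]), np.array([5.0, 5.0]),
              np.array([0.0, 5.0]), np.array([5.0, 0.0])]  # x(1)..x(4)
x = np.array([4.2, 4.5])
print("x belongs to class", nearest_neighbor(x, prototypes) + 1)
```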
20
The Nearest Neighbor Classifier
The Stored Prototypes
[Figure: four stored prototypes x(1), x(2), x(3), x(4) defining class regions 1-4]
21
The Nearest Neighbor Classifier
[Figure: an unknown pattern (class ?) placed among the stored prototypes x(1)-x(4)]
22
The Hamming Networks
Store a set of classes, each represented by a binary prototype.
Given an incomplete binary input, find the class to which it belongs.
Use the Hamming distance as the distance measure.
Distance vs. similarity.
23
The Hamming Net
[Architecture: inputs x1 … xn feed a similarity-measurement layer whose scores drive a MAXNET winner-take-all layer]
24
The Hamming Distance
[Example: two 7-component bipolar (±1) vectors y and x. Hamming distance = ?]
25
The Hamming Distance
[The two vectors differ in 3 positions: Hamming distance = 3]
26
The Hamming Distance
[Componentwise comparison of y and x: agreements contribute +1 and disagreements contribute −1 to the inner product, so yᵀx = Sum = 1]
27
The Hamming Distance
For m-dimensional bipolar vectors, yᵀx = (number of agreements) − (number of disagreements) = m − 2 · HD(x, y).
28
The Hamming Distance
Hence HD(x, y) = (m − yᵀx) / 2, and the similarity m − HD(x, y) = (m + yᵀx) / 2 is linear in x: it can be computed by a unit with weights y/2 and bias m/2.
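A quick numerical check of this identity (the two 7-component bipolar vectors are illustrative, chosen so that HD = 3 and yᵀx = 1 as in the example above):

```python
# Verify y.x = m - 2*HD(x, y) for bipolar (+1/-1) vectors.
import numpy as np

y = np.array([ 1, -1,  1,  1, -1,  1, -1])
x = np.array([ 1,  1,  1, -1, -1,  1,  1])
m = len(y)

hd = int(np.sum(x != y))                  # Hamming distance: count disagreements
print("HD(x, y)    =", hd)                # -> 3
print("y.x         =", int(y @ x))        # -> m - 2*HD = 7 - 6 = 1
print("(m - y.x)/2 =", (m - y @ x) // 2)  # recovers the Hamming distance
```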
29
The Hamming Net
[Architecture: inputs x1 … xm feed n similarity-measurement units; their scores y1 … yn feed the MAXNET winner-take-all layer]
30
The Hamming Net
[Same architecture: what are the weights WS of the similarity-measurement layer and WM of the MAXNET?]
31
The Stored Patterns
[Same architecture: the stored prototypes determine the similarity-layer weights WS]
32
The Stored Patterns
[Similarity unit k: inputs x1 … xm weighted by the kth stored pattern, with bias m/2]
33
Weights for Stored Patterns
[Similarity-measurement layer: units 1 … n over inputs x1 … xm; WS = ?]
34
Weights for Stored Patterns
Set the weights of unit k to x(k)/2 and its bias to m/2. Then its output is s_k = (x(k)ᵀx + m) / 2 = m − HD(x, x(k)), so the unit whose stored pattern is most similar to the input responds most strongly.
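A sketch of the similarity layer under this weight choice (numpy assumed; the stored prototypes and the probe pattern are illustrative):

```python
# Similarity layer: weights x(k)/2 and bias m/2 give s_k = m - HD(x, x(k)).
import numpy as np

prototypes = np.array([[ 1, -1,  1,  1, -1,  1, -1],   # x(1)
                       [-1, -1,  1, -1,  1,  1,  1],   # x(2)
                       [ 1,  1, -1,  1,  1, -1, -1]])  # x(3)
m = prototypes.shape[1]
WS = prototypes / 2.0            # one row of weights per stored pattern
bias = m / 2.0

x = np.array([1, 1, 1, -1, -1, 1, 1])                  # probe pattern
s = WS @ x + bias                                      # similarity scores
print("scores:           ", s)                         # = m - HD per unit
print("Hamming distances:", [int(np.sum(x != p)) for p in prototypes])
```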
35
The MAXNET
[The similarity scores initialize the MAXNET units 1 … n, which then compete in winner-take-all fashion]
36
Weights of MAXNET
[MAXNET layer y1 … yn: each unit has a self-feedback weight and lateral weights to all other units]
37
Weights of MAXNET
Each unit excites itself with weight 1 and inhibits every other unit with weight −ε, where 0 < ε < 1/n.
38
Updating Rule
The units are initialized with the similarity scores, y_j(0) = s_j, and iterate with self-weight 1 and mutual inhibition −ε, 0 < ε < 1/n.
39
Updating Rule
y_j(t+1) = f( y_j(t) − ε Σ_{k≠j} y_k(t) ), where f(v) = v for v > 0 and f(v) = 0 otherwise.
The iteration stops when only one unit, the winner, remains positive.
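A sketch of this winner-take-all iteration (the value of ε and the initial scores are illustrative):

```python
# MAXNET: repeated mutual inhibition until a single unit stays positive.
import numpy as np

def maxnet(s, eps=None, max_iters=1000):
    y = np.asarray(s, dtype=float).copy()
    n = len(y)
    if eps is None:
        eps = 0.9 / n                      # any 0 < eps < 1/n works
    for _ in range(max_iters):
        # Each unit keeps its own activation minus eps times the others' sum.
        y_new = np.maximum(0.0, y - eps * (y.sum() - y))
        if np.count_nonzero(y_new) <= 1:   # a single winner survives
            return y_new
        y = y_new
    return y

s = [4, 6, 5, 3]          # similarity scores from the first layer
print(maxnet(s))          # only unit 2 (initial score 6) stays positive
```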
40
Analysis: Updating Rule
Let y_M(t) be the largest activation at step t. If y_j(t) < y_M(t), then
y_M(t+1) − y_j(t+1) = (1 + ε)(y_M(t) − y_j(t))
while both units are positive, so the gap to the leader grows at every step.
41
Analysis: Updating Rule
Now, every non-maximal activation is eventually driven below zero and clipped to 0, so after finitely many iterations only the unit with the largest initial activation, i.e., the stored pattern nearest the input, remains positive.
42
Example
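The original worked example is a figure; as an illustrative stand-in (the stored patterns and probe are assumptions, not the slide's values), the two layers can be wired together end to end:

```python
# End-to-end Hamming net: similarity layer followed by MAXNET competition.
import numpy as np

prototypes = np.array([[ 1, -1,  1,  1, -1,  1, -1],
                       [-1, -1,  1, -1,  1,  1,  1],
                       [ 1,  1, -1,  1,  1, -1, -1]])
m = prototypes.shape[1]

def hamming_net(x):
    s = prototypes @ x / 2.0 + m / 2.0       # similarity layer: m - HD
    y, n = s.astype(float), len(s)
    eps = 0.9 / n
    while np.count_nonzero(y) > 1:           # MAXNET winner-take-all
        y = np.maximum(0.0, y - eps * (y.sum() - y))
    return int(np.argmax(y))                 # index of the winning class

x = np.array([1, -1, 1, -1, -1, 1, -1])      # noisy version of prototype 1
print("class:", hamming_net(x) + 1)          # -> class 1
```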
43
Unsupervised Learning Networks
The Self-Organizing Feature Map
44
Feature Mapping
Map high-dimensional input signals onto a lower-dimensional (usually 1D or 2D) structure.
Similarity relations present in the original data are still present after the mapping.
Dimensionality reduction; topology-preserving map.
45
Somatotopic Map Illustration: The “Homunculus”
The relationship between body surfaces and the regions of the brain that control them.
46
Another Depiction of the Homunculus
47
Phonotopic maps
48
Phonotopic maps
[The response trajectory of the Finnish word "humppila" traced across the phonotopic map]
49
Self-Organizing Feature Map
Developed by Professor Teuvo Kohonen.
One of the most popular neural network models.
Unsupervised learning.
A competitive learning network.
50
The Structure of SOM
51
The Structure of SOM
52
Example
53
Local Excitation, Distal Inhibition
[Figure: the "Mexican hat" lateral-interaction profile: nearby neurons excite one another, more distant neurons inhibit]
54
Topological Neighborhood
[Figure: square and hexagonal neighborhood topologies around a winning unit]
55
Size Shrinkage
56
Size Shrinkage
The neighborhood starts large and shrinks as training proceeds: global ordering first, local fine-tuning later.
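One common realization of the shrinkage, sketched below, is a Gaussian neighborhood whose radius decays exponentially (the schedule and constants are assumptions, not taken from the slides):

```python
# Gaussian neighborhood with an exponentially shrinking radius.
import numpy as np

def neighborhood(dist_to_winner, t, sigma0=3.0, tau=200.0):
    """Neighborhood strength for a unit at lattice distance d, at step t."""
    sigma_t = sigma0 * np.exp(-t / tau)     # radius shrinks over time
    return np.exp(-dist_to_winner**2 / (2.0 * sigma_t**2))

for t in (0, 200, 400):                     # strengths at lattice distances 0..3
    print(t, [round(neighborhood(d, t), 3) for d in range(4)])
```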
57
Learning Rule
Similarity matching: find the winning unit i(x) = argmin_j ||x − w_j||.
Updating: w_j(t+1) = w_j(t) + η(t) Λ(j, i(x)) [x(t) − w_j(t)] for units j in the winner's neighborhood, and w_j(t+1) = w_j(t) otherwise, where the learning rate η(t) and the neighborhood Λ shrink over time.
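A minimal training sketch combining the two steps (the 8x8 lattice, exponential decay schedules, and uniform toy data are illustrative assumptions):

```python
# SOM training: similarity matching, then neighborhood-weighted updating.
import numpy as np

rng = np.random.default_rng(0)
grid = np.array([(r, c) for r in range(8) for c in range(8)])  # 8x8 lattice
W = rng.random((64, 2))                                        # weight vectors

def train_step(x, t, eta0=0.5, sigma0=3.0, tau=500.0):
    global W
    i = np.argmin(np.linalg.norm(W - x, axis=1))   # similarity matching
    d2 = np.sum((grid - grid[i])**2, axis=1)       # squared lattice distances
    sigma = sigma0 * np.exp(-t / tau)              # shrinking neighborhood
    h = np.exp(-d2 / (2 * sigma**2))               # neighborhood function
    eta = eta0 * np.exp(-t / tau)                  # decaying learning rate
    W += eta * h[:, None] * (x - W)                # updating

for t in range(2000):                              # uniform 2-D toy data
    train_step(rng.random(2), t)
print(W[:4].round(2))                              # the map has organized itself
```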
58
Example
59
Example
60
Example
61
Example