Unsupervised Learning Networks Lecturer: 虞台文
Content: Introduction; Important Unsupervised Learning NNs: Hamming Networks, Kohonen's Self-Organizing Feature Maps, Grossberg's ART Networks, Counterpropagation Networks, Adaptive BAM, Neocognitron; Conclusion
Unsupervised Learning Networks Introduction
What is Unsupervised Learning? Learning without a teacher: there is no feedback to indicate the desired outputs. The network must discover by itself the relationships of interest in the input data, e.g., patterns, features, regularities, correlations, or categories, and translate the discovered relationships into outputs.
A Strange World
Supervised Learning (figure: labeled samples of classes A, B, C in the Height vs. IQ plane)
Try Classification: Supervised Learning (figure: classifying the labeled samples)
The Probabilities of Populations (figure: population distributions over the Height vs. IQ plane)
The Centroids of Clusters (figure: centroids of classes A, B, C)
The Centroids of Clusters: Try Classification (figure: classification using the centroids)
Unsupervised Learning (figure: the same samples, unlabeled, in the Height vs. IQ plane)
Clustering Analysis: categorize the input patterns into several classes based on the similarity among patterns. (figure: unlabeled samples in the Height vs. IQ plane)
Clustering Analysis: how many classes may we have?
Clustering Analysis: 2 clusters
Clustering Analysis: 3 clusters
Clustering Analysis: 4 clusters
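The slides above partition the same unlabeled data into 2, 3, or 4 clusters by similarity. As an illustration, here is a minimal sketch of one standard clustering procedure (k-means, which the slides do not name); the data points and cluster count are hypothetical:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Partition `points` (n x d array) into k clusters by iterative
    centroid refinement (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct input points.
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated groups in a hypothetical (height, IQ) plane.
data = np.array([[160.0, 95], [162, 100], [158, 98],
                 [180, 130], [182, 128], [179, 132]])
labels, centers = kmeans(data, k=2)
```

Note that k must be chosen in advance, which is exactly the "how many classes may we have?" question the slides raise.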
Unsupervised Learning Networks The Hamming Networks
The Nearest Neighbor Classifier: suppose that we have p prototypes centered at x(1), x(2), …, x(p). A given pattern x is assigned to the class label of the i-th prototype if d(x, x(i)) ≤ d(x, x(k)) for all k = 1, …, p. Examples of distance measures include the Hamming distance and the Euclidean distance.
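A minimal sketch of this decision rule with both distance measures; the prototypes and query point below are hypothetical:

```python
import numpy as np

def nearest_prototype(x, prototypes, dist):
    """Return the index of the stored prototype closest to x."""
    d = [dist(x, p) for p in prototypes]
    return int(np.argmin(d))

euclidean = lambda a, b: np.linalg.norm(np.asarray(a) - np.asarray(b))
# Hamming distance: the number of positions where two binary vectors differ.
hamming = lambda a, b: int(np.sum(np.asarray(a) != np.asarray(b)))

protos = [np.array([0, 0]), np.array([10, 0]), np.array([0, 10])]
idx = nearest_prototype([1, 9], protos, euclidean)  # closest to [0, 10]
```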
The Nearest Neighbor Classifier (figure: stored prototypes x(1), x(2), x(3), x(4) defining four decision regions 1-4)
The Nearest Neighbor Classifier (figure: an unknown pattern is assigned the class of its nearest prototype)
The Hamming Networks: store a set of classes represented by a set of binary prototypes. Given an incomplete binary input, find the class to which it belongs. Use the Hamming distance as the distance measure. Distance vs. similarity.
The Hamming Net (figure: inputs x1 … xn feed a similarity-measurement layer whose outputs drive a winner-take-all MAXNET)
The Hamming Distance: the number of positions in which two vectors differ. (figure: two bipolar 7-bit vectors x and y; Hamming distance = ?)
The Hamming Distance: the two vectors differ in 3 positions, so their Hamming distance is 3.
The Hamming Distance: for bipolar (±1) vectors of length m, the element-wise products sum to the inner product x·y = m − 2·HD(x, y); for the vectors above, x·y = 7 − 2·3 = 1 (Sum = 1).
The Hamming Distance: hence HD(x, y) = (m − x·y) / 2.
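The inner-product formula can be checked directly; the two illustrative vectors below are hypothetical (the slide's original bit patterns are not recoverable) but are chosen to differ in 3 positions, matching the slide's example distance:

```python
import numpy as np

def hamming_bipolar(x, y):
    """Hamming distance between two bipolar (+1/-1) vectors of length m,
    computed from the inner product: HD = (m - x.y) / 2."""
    x, y = np.asarray(x), np.asarray(y)
    return (len(x) - int(x @ y)) // 2

x = np.array([ 1, -1,  1,  1, -1,  1,  1])
y = np.array([ 1,  1,  1, -1, -1, -1,  1])
# x and y differ in positions 1, 3, and 5, so HD = 3 and x.y = 7 - 2*3 = 1.
```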
The Hamming Net (figure: inputs x1 … xm feed a similarity-measurement layer with units 1 … n; its outputs y1 … yn feed a winner-take-all MAXNET)
The Hamming Net: what are the weights WS of the similarity layer and the weights WM of the MAXNET?
The Stored Patterns: the stored prototypes determine the similarity-layer weights WS.
The Stored Patterns (figure: similarity unit k, with inputs x1, x2, …, xm and bias m/2)
Weights for Stored Patterns: to store pattern x(k) in similarity unit k, set the weight from input j to w_kj = x_j(k)/2 with bias m/2, so that unit k outputs s_k = (x·x(k) + m)/2 = m − HD(x, x(k)), the number of matching bits.
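A sketch of the similarity layer with these weights; the three stored 7-bit prototypes and the noisy input are hypothetical:

```python
import numpy as np

# Stored bipolar prototypes, one per row (hypothetical 7-bit patterns).
P = np.array([[ 1,  1,  1, -1, -1, -1,  1],
              [-1, -1,  1,  1,  1, -1, -1],
              [ 1, -1, -1, -1,  1,  1,  1]])
m = P.shape[1]

def similarity(x):
    """Similarity layer of the Hamming net: weights P/2 and bias m/2,
    so unit k outputs m - HD(x, x(k)), the number of matching bits."""
    return (P @ x + m) / 2

x = np.array([1, 1, 1, -1, -1, -1, -1])   # noisy version of prototype 0
s = similarity(x)                          # highest score at unit 0
```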
The MAXNET (figure: the similarity scores from the lower layer are fed into the winner-take-all MAXNET as initial activations)
Weights of MAXNET: each unit excites itself with weight 1 and inhibits every other unit with weight −ε, where 0 < ε < 1/n.
Updating Rule: starting from y_k(0) = s_k, each unit updates y_k(t+1) = f( y_k(t) − ε Σ_{l≠k} y_l(t) ), where f(v) = max(v, 0) and 0 < ε < 1/n, until only one unit (the winner) remains positive.
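The iteration above can be sketched as follows; the particular choice ε = 1/(2n) is an assumption (any value in (0, 1/n) works):

```python
import numpy as np

def maxnet(s, eps=None, max_iter=1000):
    """Winner-take-all by mutual inhibition: each unit keeps its own
    activation and subtracts eps times the sum of all the others,
    clipped at zero, until a single unit survives."""
    y = np.asarray(s, dtype=float).copy()
    n = len(y)
    if eps is None:
        eps = 1.0 / (2 * n)        # assumed choice within (0, 1/n)
    for _ in range(max_iter):
        # y.sum() - y is the total activation of all the *other* units.
        y = np.maximum(0.0, y - eps * (y.sum() - y))
        if np.count_nonzero(y) <= 1:
            break                   # only the winner remains positive
    return y

winner = maxnet([6.0, 3.0, 2.0]).argmax()  # unit with the largest score wins
```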
Analysis of the Updating Rule: let y_w(t) be the largest activation at step t. If y_k(t) < y_w(t), then while both units remain positive, y_w(t+1) − y_k(t+1) = (1 + ε)(y_w(t) − y_k(t)): the gap to the winner grows at every step, so every non-maximal activation is eventually clipped to zero and only the winner survives.
Example
Unsupervised Learning Networks The Self-Organizing Feature Map
Feature Mapping: map high-dimensional input signals onto a lower-dimensional (usually one- or two-dimensional) structure, so that similarity relations present in the original data are still present after the mapping. Dimensionality reduction; topology-preserving map.
Somatotopic Map Illustration: The “Homunculus” The relationship between body surfaces and the regions of the brain that control them.
Another Depiction of the Homunculus
Phonotopic maps
Phonotopic maps (figure: the trajectory of the word "humppila" on the map)
Self-Organizing Feature Map: developed by Professor Kohonen. One of the most popular neural network models. Unsupervised learning; a competitive learning network.
The Structure of SOM
Example
Local Excitation, Distal Inhibition
Topological Neighborhood: square or hexagonal lattice.
Size Shrinkage: the neighborhood size shrinks as training proceeds.
Learning Rule. Similarity matching: select the winning unit c whose weight vector is closest to the input, ‖x − w_c‖ = min_i ‖x − w_i‖. Updating: w_i(t+1) = w_i(t) + α(t)[x(t) − w_i(t)] for units i in the neighborhood of c, and w_i(t+1) = w_i(t) otherwise.
Example