Unsupervised Learning Networks


Unsupervised Learning Networks. Presented by: 虞台文

Content: Introduction; Important Unsupervised Learning NNs (Hamming Networks, Kohonen’s Self-Organizing Feature Maps, Grossberg’s ART Networks, Counterpropagation Networks, Adaptive BAM, Neocognitron); Conclusion

Unsupervised Learning Networks Introduction

What is Unsupervised Learning? Learning without a teacher: no feedback indicates the desired outputs. The network must by itself discover the relationships of interest in the input data, e.g., patterns, features, regularities, correlations, or categories, and translate the discovered relationships into outputs.

A Strange World

Supervised Learning [figure: Height vs. IQ scatter plot with labeled classes A, B, C]

Try Classification (Supervised Learning) [figure: classification of the Height vs. IQ data into classes A, B, C]

The Probabilities of Populations [figure: Height vs. IQ plot with population distributions; classes B and C labeled]

The Centroids of Clusters [figure: Height vs. IQ plot with the centroids of clusters A, B, C]

The Centroids of Clusters: Try Classification [figure: Height vs. IQ plot with centroids A, B, C used for classification]

Unsupervised Learning [figure: unlabeled Height vs. IQ scatter plot]

Clustering Analysis: categorize the input patterns into several classes based on the similarity among patterns. [figure: Height vs. IQ scatter plot]

Clustering Analysis: categorize the input patterns into several classes based on the similarity among patterns. How many classes might we have? [figure: Height vs. IQ scatter plot]

Clustering Analysis: 2 clusters [figure: Height vs. IQ data grouped into two clusters]

Clustering Analysis: 3 clusters [figure: Height vs. IQ data grouped into three clusters]

Clustering Analysis: 4 clusters [figure: Height vs. IQ data grouped into four clusters]
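One standard way to realize this kind of clustering is k-means; the following is a minimal sketch (not the method the slides derive) over hypothetical (height, IQ) samples, with the number of clusters k chosen up front:

```python
def kmeans(points, k, iters=20):
    """Minimal k-means sketch: farthest-point initialization, then repeated
    assign-to-nearest-centroid / recompute-centroid passes."""
    def d2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    # Start from the first point; each new centroid is the point farthest
    # from all centroids chosen so far.
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points, key=lambda p: min(d2(p, c) for c in centroids)))

    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: d2(p, centroids[j]))].append(p)
        centroids = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    return centroids, clusters

# Hypothetical (height_cm, IQ) samples forming two loose groups
data = [(150, 95), (152, 98), (149, 97), (180, 130), (182, 128), (179, 131)]
centroids, clusters = kmeans(data, k=2)
```

Different values of k (2, 3, 4, …) give different partitions of the same data, which is exactly the ambiguity the slides illustrate.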

Unsupervised Learning Networks The Hamming Networks

The Nearest Neighbor Classifier. Suppose that we have p prototypes centered at x(1), x(2), …, x(p). A given pattern x is assigned the class label of the i-th prototype if d(x, x(i)) = min over 1 ≤ j ≤ p of d(x, x(j)). Examples of distance measures include the Hamming distance and the Euclidean distance.
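This rule can be sketched directly, assuming Euclidean distance and hypothetical 2-D prototypes:

```python
def nearest_prototype(x, prototypes):
    """Return the index i of the prototype minimizing the distance to x
    (squared Euclidean distance; the minimizer is the same)."""
    def d2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(range(len(prototypes)), key=lambda i: d2(x, prototypes[i]))

# Four hypothetical prototypes x(1)..x(4), one per class
protos = [(0, 0), (10, 0), (0, 10), (10, 10)]
```

For example, `nearest_prototype((1, 2), protos)` picks prototype 0 and `nearest_prototype((9, 9), protos)` picks prototype 3.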

The Nearest Neighbor Classifier: The Stored Prototypes [figure: four prototypes x(1), x(2), x(3), x(4) defining regions 1, 2, 3, 4]

The Nearest Neighbor Classifier [figure: prototypes x(1), x(2), x(3), x(4) and a new pattern whose class is to be determined: Class = ?]

The Hamming Networks. Stores a set of classes represented by a set of binary prototypes. Given an incomplete binary input, finds the class to which it belongs. Uses the Hamming distance as the distance measure. Distance vs. similarity.

The Hamming Net [figure: inputs x1, x2, …, xn feed a Similarity Measurement layer, whose outputs feed a MAXNET (Winner-Take-All) layer]

The Hamming Distance. Example: let x and y be bipolar (±1) vectors of length 7 that differ in three positions; then Hamming distance HD(x, y) = 3. In general, for bipolar vectors of length m, the inner product satisfies x · y = m − 2 · HD(x, y) (here x · y = 7 − 2 · 3 = 1), so the distance can be computed from the inner product: HD(x, y) = (m − x · y) / 2 = (7 − 1) / 2 = 3.
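The identity HD = (m − x · y) / 2 can be checked directly; the vectors below are hypothetical, chosen to match the slide’s distance of 3 and inner product (Sum) of 1:

```python
def hamming_bipolar(x, y):
    """Hamming distance between two bipolar (+1/-1) vectors, computed
    from the inner product via x . y = m - 2 * HD(x, y)."""
    m = len(x)
    dot = sum(a * b for a, b in zip(x, y))
    return (m - dot) // 2

# Hypothetical length-7 bipolar vectors differing in positions 1, 3, 6
x = [1, -1, 1, 1, -1, 1, -1]
y = [1, 1, 1, -1, -1, 1, 1]
```

Here the inner product is 1 and the distance is (7 − 1) / 2 = 3.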

The Hamming Net [figure: inputs x1, …, xm feed the Similarity Measurement layer (units 1, …, n with outputs y1, …, yn), which feeds the MAXNET Winner-Take-All layer]

The Hamming Net: what are the MAXNET weights WM and the similarity-layer weights WS? [figure: the same architecture with WM (MAXNET) and WS (Similarity Measurement) left to be determined]

The Stored Patterns [figure: the Hamming net architecture, with the weights WM and WS to be determined from the stored patterns]

The Stored Patterns [figure: similarity unit k receives inputs x1, x2, …, xm and has bias m/2]

Weights for Stored Patterns [figure: the Similarity Measurement layer (units 1, …, n over inputs x1, …, xm); WS = ?]

Weights for Stored Patterns: unit k stores pattern x(k) with weights wkj = xj(k)/2 and bias m/2, so its output for input x is (x · x(k) + m)/2 = m − HD(x, x(k)), i.e., the number of bits on which x agrees with the k-th stored pattern.

The MAXNET [figure: the MAXNET (Winner-Take-All) layer, units 1, …, n with outputs y1, …, yn, placed on top of the Similarity Measurement layer]

Weights of MAXNET [figure: MAXNET units 1, …, n with outputs y1, …, yn; each unit has a self-connection of weight 1]

Weights of MAXNET: each unit has self-excitation weight 1 and lateral inhibition weight −ε to every other unit, where 0 < ε < 1/n. [figure: MAXNET units 1, …, n]

Updating Rule: the MAXNET units are initialized with the similarity scores s1, s2, …, sn, and each unit iterates yk(t+1) = max(0, yk(t) − ε · Σ over j ≠ k of yj(t)), with 0 < ε < 1/n, until only one unit (the winner) remains active.

Analysis  Updating Rule Let If now

Analysis  Updating Rule Let If now

Example
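An end-to-end sketch of a Hamming net in this style: similarity scores m − HD computed as (x · p + m)/2, then the MAXNET iteration with 0 < ε < 1/n. The prototypes and inputs below are hypothetical:

```python
def hamming_net(x, prototypes, eps=None, max_iters=200):
    """Hamming net sketch: the similarity layer scores each stored bipolar
    prototype p as (x . p + m)/2 = m - HD(x, p); MAXNET then iterates
    y_k <- max(0, y_k - eps * sum_{j != k} y_j) until one unit survives."""
    m = len(x)
    scores = [(sum(a * b for a, b in zip(x, p)) + m) / 2 for p in prototypes]
    n = len(scores)
    eps = eps if eps is not None else 0.5 / n   # must satisfy 0 < eps < 1/n
    y = list(scores)
    for _ in range(max_iters):
        total = sum(y)
        y = [max(0.0, yk - eps * (total - yk)) for yk in y]
        active = [k for k, yk in enumerate(y) if yk > 0]
        if len(active) <= 1:
            break
    return active[0] if active else None

# Three hypothetical stored bipolar prototypes of length 4
prototypes = [[1, 1, 1, 1], [1, -1, -1, 1], [-1, -1, 1, 1]]
```

For instance, `hamming_net([1, 1, 1, -1], prototypes)` returns 0 (one bit away from the first prototype), and `hamming_net([-1, -1, 1, -1], prototypes)` returns 2. Exact ties between top scores would leave multiple units active, so in practice ties need a tie-breaking convention.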

Unsupervised Learning Networks The Self-Organizing Feature Map

Feature Mapping: map high-dimensional input signals onto a lower-dimensional (usually 1-D or 2-D) structure, such that similarity relations present in the original data are still present after the mapping. Dimensionality reduction; topology-preserving map.

Somatotopic Map Illustration: the “Homunculus”, depicting the relationship between body surfaces and the regions of the brain that control them.

Another Depiction of the Homunculus

Phonotopic maps

Phonotopic maps [figure: trajectory of the spoken word “humppila” across the phonotopic map]

Self-Organizing Feature Map. Developed by Professor Teuvo Kohonen. One of the most popular neural network models. Unsupervised learning. A competitive learning network.

The Structure of SOM [figure: a low-dimensional lattice of output units, each connected to all inputs]

Example

Local Excitation, Distal Inhibition

Topological Neighborhood [figure: square and hexagonal lattice neighborhoods]

Size Shrinkage: the neighborhood size shrinks as training proceeds. [figure: successively smaller neighborhoods around the winning unit]

Learning Rule. Similarity matching: find the winning unit i* = argmin over i of ‖x − wi‖. Updating: wi(t+1) = wi(t) + η(t) · Λ(i, i*) · (x(t) − wi(t)), where η(t) is the (decreasing) learning rate and Λ(i, i*) is the neighborhood function centered on the winner.
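The learning rule can be sketched as a short training loop. The Gaussian neighborhood and the linearly decaying rate and radius below are one common choice among many; all data and parameters are illustrative:

```python
import math
import random

def train_som(data, rows, cols, iters=300, eta0=0.5, sigma0=None, seed=0):
    """Minimal SOM sketch: similarity matching picks the winner (smallest
    Euclidean distance to the input); the winner and its lattice neighbours
    move toward the input, with learning rate eta and neighbourhood radius
    sigma both shrinking over time."""
    rng = random.Random(seed)
    dim = len(data[0])
    # Weight vector per lattice position (r, c), randomly initialized in [0, 1)
    w = {(r, c): [rng.random() for _ in range(dim)]
         for r in range(rows) for c in range(cols)}
    sigma0 = sigma0 or max(rows, cols) / 2.0
    for t in range(iters):
        x = rng.choice(data)
        frac = t / iters
        eta = eta0 * (1.0 - frac)                    # decaying learning rate
        sigma = max(sigma0 * (1.0 - frac), 0.5)      # shrinking neighbourhood
        # Similarity matching: winner minimizes ||x - w_i||
        win = min(w, key=lambda i: sum((a - b) ** 2 for a, b in zip(x, w[i])))
        for i, wi in w.items():
            grid_d2 = (i[0] - win[0]) ** 2 + (i[1] - win[1]) ** 2
            h = math.exp(-grid_d2 / (2 * sigma * sigma))  # neighbourhood function
            for j in range(dim):
                wi[j] += eta * h * (x[j] - wi[j])
    return w

# Hypothetical 2-D samples forming two clusters
data = [(0.1, 0.1), (0.2, 0.1), (0.1, 0.2), (0.8, 0.9), (0.9, 0.8), (0.9, 0.9)]
w = train_som(data, rows=3, cols=3)
```

Because each update moves a weight vector a fraction of the way toward a data point, the trained weights stay inside the region spanned by the initial weights and the data.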

Example [figure sequence: SOM training progression shown across several slides]