Kohonen Self Organising Maps Michael J. Watts

Lecture Outline Vector Quantisation Unsupervised learning Kohonen Self Organising Topological Maps

Vector Quantisation Represents an n-dimensional space as an m-dimensional one, where m < n Preserves similarity between examples o examples that are close in n-dimensional space will be close in m-dimensional space

Supervised vs Unsupervised Learning Supervised Learning o Network is taught by a teacher o Desired outputs play a strong role during training o Network cannot be trained without known output values

Supervised vs Unsupervised Learning Unsupervised Learning o Learning without a teacher o No desired outputs present o Network learns patterns in the data

Kohonen Self Organising Topological Maps Referred to as Kohonen SOMs or just SOMs Invented by Teuvo Kohonen around 1982 Motivated by neurobiology o regions of the cortex specialise o similar items are stored nearby in biological brains

SOM Architecture Two layers of neurons Input layer Output map layer Each output neuron is connected to each input neuron o Fully connected network

SOM Architecture Output map usually has two dimensions o one and three dimensions are also used Neurons in the output map can be laid out in different patterns o rectangular o hexagonal

SOM Architecture (diagram: input layer fully connected to the output map layer)

SOMs are competitive networks Neurons in the network compete with each other Other kinds of competitive network exist o e.g. ART

SOM Algorithm We are interested in finding the winning neuron in the output map layer The winning neuron is that neuron which is closest to the current input vector

SOM Algorithm Each output neuron is connected to each neuron in the input layer Therefore, each output neuron has an incoming connection weight vector Dimensionality of this vector is the same as the dimensionality of the input vector

SOM Algorithm Since the dimensionality of these vectors is the same, we can measure the Euclidean distance between them:

D_j = \sqrt{ \sum_i (x_i - w_{ij})^2 }

where: o D_j is the distance between the input vector and the weight vector of neuron j o x_i is element i of the input vector o w_{ij} is element i of the weight vector of neuron j

SOM Algorithm The winning node is the one with the least distance o i.e. the lowest value of D_j Outputs from a SOM are binary A node is either the winner, or it is not Only one node can win
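
To make winner selection concrete, here is a minimal sketch in Python/NumPy. The function name find_winner and the row-per-neuron weight layout are assumptions for this example, not part of the original lecture:

```python
import numpy as np

def find_winner(weights, x):
    """Return the index of the output neuron whose incoming weight
    vector is closest (lowest D_j) to the input vector x.

    weights : shape (n_outputs, n_inputs), one row per output neuron
    x       : shape (n_inputs,)
    """
    distances = np.sqrt(((weights - x) ** 2).sum(axis=1))  # D_j for every j
    return int(np.argmin(distances))  # only one node wins
```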

SOM Training Based on rewarding the winning node This is a form of competitive learning The winner's weights are adjusted to be closer to the input vector Why not make them equal? o we want the output map to learn regions, not examples

SOM Training SOM weight update rule:

\Delta w_{ij} = \alpha (x_i - w_{ij})

where: o \Delta w_{ij} is the weight change o \alpha is the learning rate
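
A sketch of this update rule under the same conventions (update_winner is a hypothetical name; weights is assumed to be the NumPy array from the earlier sketch):

```python
def update_winner(weights, x, winner, alpha):
    """Move the winner's weight vector a fraction alpha of the way
    towards the input: delta_w_ij = alpha * (x_i - w_ij)."""
    weights[winner] += alpha * (x - weights[winner])
```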

SOM Training An important concept in SOM training is that of the “Neighbourhood” o the output map neurons that adjoin the winner Neighbourhood size describes how far out from the winner the neighbours can be Neighbours' weights are also modified

Neighbourhood (diagram: the winner and the adjoining neighbours on the output map)

SOM Training The number of neighbours is affected by the shape of the map o rectangular grids → 4 neighbours o hexagonal grids → 6 neighbours Neighbourhood size and learning rate are reduced gradually during training (see the sketch below)
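
One way to extend the update to the neighbourhood, sketched under the assumption of a rectangular grid with (row, col) co-ordinates and a Chebyshev-distance neighbourhood (a hexagonal grid would need a different grid distance):

```python
import numpy as np

def update_neighbourhood(weights, grid, x, winner, alpha, radius):
    """Apply the weight update to the winner and to every output neuron
    whose grid position lies within `radius` of the winner's position.

    grid : shape (n_outputs, 2), the (row, col) of each output neuron
    """
    # Chebyshev distance on the grid: radius 1 covers the adjoining neurons
    dist = np.abs(grid - grid[winner]).max(axis=1)
    mask = dist <= radius
    weights[mask] += alpha * (x - weights[mask])
```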

SOM Training Overall effect of training o groups, or “clusters”, form in the output map o clusters represent spatially nearby regions in input space o since the dimensionality of the output map is less than the dimensionality of the input space → vector quantisation

SOM Training It has been suggested that the total number of training cycles should be greater than 500 times the number of output neurons A training cycle is one presentation of one training example
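
Putting the pieces together, a sketch of the whole training loop. The linear decay schedule, random weight initialisation, and cycling through the data in order are assumptions for illustration; the number of cycles follows the 500-times-the-output-neurons heuristic above:

```python
import numpy as np

def train_som(data, n_rows, n_cols, alpha0=0.5):
    """Sketch of SOM training: winner selection, neighbourhood update,
    and gradual (here linear) decay of learning rate and neighbourhood
    size over 500 cycles per output neuron."""
    n_outputs = n_rows * n_cols
    rng = np.random.default_rng(0)
    weights = rng.random((n_outputs, data.shape[1]))
    grid = np.array([(r, c) for r in range(n_rows) for c in range(n_cols)])
    radius0 = max(n_rows, n_cols) / 2
    n_cycles = 500 * n_outputs                 # suggested lower bound
    for t in range(n_cycles):
        frac = 1.0 - t / n_cycles              # decays from 1 towards 0
        alpha = alpha0 * frac
        radius = radius0 * frac
        x = data[t % len(data)]                # one cycle = one presentation
        winner = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
        dist = np.abs(grid - grid[winner]).max(axis=1)
        mask = dist <= radius
        weights[mask] += alpha * (x - weights[mask])
    return weights, grid
```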

SOM Mapping The labelled training data set is fed through the trained SOM Finds the winner for each training example Assigns the label(s) for that example to that neuron Creates a set of co-ordinates and labels o co-ordinates identify output neurons
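
A sketch of the mapping step under the same conventions (the dictionary keyed on grid co-ordinates is an assumed representation):

```python
import numpy as np

def build_label_map(weights, grid, data, labels):
    """Feed the labelled training set through the trained SOM, and for
    each example record its label at the co-ordinates of the winning
    neuron."""
    label_map = {}
    for x, label in zip(data, labels):
        winner = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
        coords = tuple(grid[winner])            # (row, col) on the map
        label_map.setdefault(coords, []).append(label)
    return label_map
```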

SOM Recall Finds the winner for each recall vector The co-ordinates of the winner are taken

Labelling Looks up the map label using the recall co-ordinates Results in a list of labels for each recall vector Allows you to identify the class each recall example belongs to
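
Recall and labelling can then be sketched as a single lookup (again using the assumed label_map dictionary from the mapping sketch):

```python
import numpy as np

def recall_labels(weights, grid, label_map, x):
    """Find the winner for a recall vector, take its co-ordinates, and
    look up the labels stored there (empty list if the neuron was
    never a winner during mapping)."""
    winner = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
    return label_map.get(tuple(grid[winner]), [])
```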

Conclusion Kohonen SOMs are competitive networks SOMs learn via an unsupervised algorithm SOM training is based on forming clusters SOMs perform vector quantisation