Self-Organizing Maps

This presentation is based on:

SOMs were invented by Teuvo Kohonen. They represent multidimensional data in much lower-dimensional spaces, usually two dimensions. A common example is the mapping of colors from their three-dimensional components (red, green, and blue) into two dimensions: eight colors are presented to the network as 3D vectors, and the system learns to represent them in 2D space. In addition to clustering the colors into distinct regions, regions with similar properties usually end up adjacent to each other.

Network Architecture

The data consists of vectors V of n dimensions: V = (V1, V2, V3, ..., Vn). Each node contains a corresponding weight vector W of n dimensions: W = (W1, W2, W3, ..., Wn).

Network Example

Each node in a 40-by-40 lattice has three weights, one for each element of the input vector: red, green, and blue. When the lattice is displayed, each node is drawn as a rectangular cell.
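
As a minimal sketch in code (NumPy is assumed; the shape follows the 40-by-40 RGB example above, and the random initialization anticipates the "Details" slide below):

```python
import numpy as np

# 40-by-40 lattice, one 3-element weight vector (R, G, B) per node,
# initialized to small random values 0 < w < 1.
rng = np.random.default_rng(seed=0)
weights = rng.random((40, 40, 3))
```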

Overview of the Algorithm

Idea: any new, previously unseen input vector presented to the network will stimulate nodes in the zone with similar weight vectors.

1. Each node's weights are initialized.
2. A vector is chosen at random from the set of training data and presented to the lattice.
3. Every node is examined to determine which one's weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).
4. The radius of the BMU's neighborhood is calculated. This value starts large, typically set to the 'radius' of the lattice, and diminishes each time-step. Any nodes found within this radius are deemed to be inside the BMU's neighborhood.
5. The weights of each neighboring node (the nodes found in step 4) are adjusted to make them more like the input vector. The closer a node is to the BMU, the more its weights are altered.
6. Repeat from step 2 for N iterations.
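
A hedged sketch of the whole loop in Python/NumPy; find_bmu, neighborhood_radius, and update_weights are hypothetical helpers, sketched under the "Details" slides that follow:

```python
import numpy as np

def train(weights, data, n_iterations, init_radius, init_lr):
    # Step 1 (initializing the weights) is assumed to have happened already.
    rng = np.random.default_rng()
    # A common choice of decay time constant (an assumption, not from
    # the slides): lambda = N / ln(initial radius).
    time_constant = n_iterations / np.log(init_radius)
    for t in range(n_iterations):
        v = data[rng.integers(len(data))]         # step 2: random training vector
        bmu = find_bmu(weights, v)                # step 3: Best Matching Unit
        radius = neighborhood_radius(init_radius, t, time_constant)  # step 4
        lr = init_lr * np.exp(-t / n_iterations)  # learning rate also decays
        update_weights(weights, v, bmu, radius, lr)  # step 5
    return weights                                # step 6: repeated N times
```

For the color example, data would be an (8, 3) array of RGB vectors and weights the 40-by-40-by-3 lattice from the previous sketch.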

Details

Initializing the weights: set them to small standardized random values 0 < w < 1.

Calculating the Best Matching Unit: use some distance measure, typically the Euclidean distance between the input vector and each node's weight vector.

Determining the Best Matching Unit's local neighborhood: the area of the neighborhood shrinks over time, by shrinking the radius of the neighborhood over time. For this, use the exponential decay function

    σ(t) = σ₀ exp(-t / λ)

where σ₀ is the width of the lattice at time t₀, λ is a constant, and t is the current iteration number.
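
The BMU search and the radius decay might look like this (NumPy assumed; these are the hypothetical helpers used in the training loop above):

```python
import numpy as np

def find_bmu(weights, v):
    # Squared Euclidean distance from every node's weight vector to v.
    dist_sq = np.sum((weights - v) ** 2, axis=-1)
    # (row, col) index of the closest node: the Best Matching Unit.
    return np.unravel_index(np.argmin(dist_sq), dist_sq.shape)

def neighborhood_radius(sigma0, t, time_constant):
    # sigma(t) = sigma0 * exp(-t / lambda)
    return sigma0 * np.exp(-t / time_constant)
```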

Details

Over time the neighborhood will shrink to the size of just one node: the BMU.

Details

Adjusting the weights: every node within the BMU's neighborhood (including the BMU itself) has its weight vector adjusted according to the following equation:

    W(t + 1) = W(t) + L(t) (V(t) - W(t))

where t represents the time-step and L is a small variable called the learning rate, which decreases with time. The decay of the learning rate is calculated each iteration using the following equation:

    L(t) = L₀ exp(-t / λ)

Details

Also, the effect of learning should be proportional to a node's distance from the BMU. A common choice is a Gaussian influence function

    Θ(t) = exp(-d² / (2 σ²(t)))

where d is the node's distance from the BMU in the lattice and σ(t) is the current neighborhood radius. Θ multiplies the learning term, so the full update becomes

    W(t + 1) = W(t) + Θ(t) L(t) (V(t) - W(t))
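
A sketch of this update in code, again assuming NumPy; the Gaussian influence and the boolean neighborhood mask are implementation choices, not taken from the slides:

```python
import numpy as np

def update_weights(weights, v, bmu, radius, lr):
    rows, cols = np.indices(weights.shape[:2])
    # Squared lattice distance of every node from the BMU.
    dist_sq = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    # Gaussian influence: 1 at the BMU, falling off with distance.
    influence = np.exp(-dist_sq / (2 * radius ** 2))
    # Only nodes inside the current neighborhood radius are adjusted.
    mask = dist_sq <= radius ** 2
    # W(t+1) = W(t) + theta(t) * L(t) * (V(t) - W(t))
    weights[mask] += (influence[mask] * lr)[:, None] * (v - weights[mask])
```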

Applications

SOMs are commonly used as visualization aids: they can make it easy to see relationships between vast amounts of data.