Self-Organizing Maps Corby Ziesman March 21, 2007.

Overview A Self-Organizing Map (SOM) is a way to represent higher-dimensional data in a lower-dimensional (usually 2-D or 3-D) space, such that similar data are grouped together. It runs unsupervised and performs the grouping on its own. Unlike traditional neural nets, which continuously learn and adapt, a SOM stops adapting once it converges; from then on it only classifies new data. SOMs run in two phases: Training phase: the map is built; the network organizes itself through a competitive process and is trained on a large number of input vectors (or the same input vectors administered multiple times). Mapping phase: new vectors are quickly assigned a location on the converged map, easily classifying or categorizing the new data.

Uses → Example: data sets for poverty levels in different countries, with many different statistics for each country. The SOM does not show poverty levels directly; rather, it shows how similar the poverty data for different countries are to each other (similar color = similar data sets).

Example from ASU’s IMPACT Lab Wireless Localization Using Self-Organizing Maps by Gianni Giorgetti, Sandeep K. S. Gupta, and Gianfranco Manes. IPSN ’07. It uses a modified SOM to estimate node locations in a wireless network.

Forming the Map

SOM Structure Every node is connected to the input in the same way, and no nodes are connected to each other. In a SOM that groups RGB color values by similarity, each square “pixel” in the display is a node in the SOM. Notice in the converged SOM above that dark blue is near light blue and dark green is near light green.

The Basic Process
1. Initialize each node's weights.
2. Choose a random vector from the training data and present it to the SOM.
3. Examine every node to find the Best Matching Unit (BMU).
4. Calculate the radius of the neighborhood around the BMU. The size of the neighborhood decreases with each iteration.
5. Adjust the weights of each node in the BMU's neighborhood to become more like the input vector. Nodes closest to the BMU are altered more than the nodes furthest away in the neighborhood.
6. Repeat from step 2 for enough iterations to reach convergence.
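The steps above can be sketched in Python with NumPy. The grid size, iteration count, and decay schedules below are illustrative choices, not values given in the slides:

```python
import numpy as np

def train_som(data, grid_w=10, grid_h=10, n_iters=1000,
              lr0=0.1, radius0=5.0):
    """Train a SOM on data of shape (n_samples, n_features)."""
    rng = np.random.default_rng(0)
    dim = data.shape[1]
    # Step 1: initialize each node's weights randomly.
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every node, used for neighborhood distances.
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)
    lam = n_iters / np.log(radius0)  # time constant for the decay
    for t in range(n_iters):
        # Step 2: present a random training vector.
        v = data[rng.integers(len(data))]
        # Step 3: find the Best Matching Unit (smallest Euclidean distance).
        dists = np.linalg.norm(weights - v, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Step 4: shrink the neighborhood radius over time.
        radius = radius0 * np.exp(-t / lam)
        lr = lr0 * np.exp(-t / lam)
        # Step 5: pull nodes near the BMU toward the input vector,
        # weighted by a Gaussian falloff with grid distance.
        grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        theta = np.exp(-grid_dist2 / (2 * radius ** 2))
        weights += (lr * theta)[..., None] * (v - weights)
    return weights
```

For instance, training on random RGB triples (values in [0, 1]) reproduces the color-grouping behavior of the demo described later.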

Calculating the Best Matching Unit The BMU is found by computing the Euclidean distance between each node’s weights (W1, W2, …, Wn) and the input vector’s values (V1, V2, …, Vn). This gives a good measure of how similar the two sets of values are to each other.
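As a minimal sketch, the distance sqrt(sum_i (W_i − V_i)^2) and the BMU search over a flat list of nodes might look like (function names are illustrative):

```python
import numpy as np

def euclidean(w, v):
    # sqrt of the sum of squared componentwise differences
    return np.sqrt(np.sum((np.asarray(w) - np.asarray(v)) ** 2))

def find_bmu(weights, v):
    """Return the index of the node whose weight vector is closest to v.

    weights: array of shape (n_nodes, n_features).
    """
    dists = np.linalg.norm(weights - v, axis=1)
    return int(np.argmin(dists))
```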

Determining the BMU Neighborhood Size of the neighborhood: an exponential decay function shrinks the radius on each iteration until eventually the neighborhood is just the BMU itself. Effect of location within the neighborhood: the influence follows a Gaussian curve, so nodes closer to the BMU are influenced more than farther nodes.
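These two effects can be written as two small functions. The initial radius sigma0 and time constant lam are illustrative parameter names, not fixed by the slides:

```python
import numpy as np

def neighborhood_radius(t, sigma0=5.0, lam=100.0):
    # Exponential decay: the radius shrinks on each iteration t.
    return sigma0 * np.exp(-t / lam)

def influence(grid_dist, radius):
    # Gaussian falloff: nodes closer to the BMU are influenced more.
    return np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
```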

Modifying Nodes’ Weights The new weight for a node is the old weight, plus a fraction (the learning rate L) of the difference between the input vector and the old weight, scaled by a factor (theta) based on distance from the BMU: W(t+1) = W(t) + theta(t) L(t) (V(t) − W(t)). The learning rate L is also an exponential decay function, which ensures that the SOM will converge. Lambda represents a time constant, and t is the time step.
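A minimal sketch of this update rule, with L(t) = L0·exp(−t/lambda) and illustrative default values for L0 and lambda:

```python
import numpy as np

def update_weight(w, v, t, theta, lr0=0.1, lam=100.0):
    """W(t+1) = W(t) + theta * L(t) * (V - W(t)),
    where L(t) = lr0 * exp(-t / lam) decays over time."""
    lr = lr0 * np.exp(-t / lam)
    return w + theta * lr * (v - w)
```

Because theta and L(t) are both at most 1, the new weight always moves part of the way from W toward V, never past it.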

See It in Action

Example 1 [Figures: input space, initial weights, final weights] A 2-D square grid of nodes. Inputs are colors. The SOM converges so that similar colors are grouped together. Program with source code and pre-compiled Win32 binary: http://www.ai-junkie.com/files/SOMDemo.zip or mirror.

Example 2 Similar to the work done in Wireless Localization Using Self-Organizing Maps by Gianni Giorgetti, Sandeep K. S. Gupta, and Gianfranco Manes. IPSN ’07. Program with source code and pre-compiled Win32 binary: http://www.ai-junkie.com/files/KirkD_SOM.zip or mirror. Occasionally the SOM will appear to have a twist in it; this rare outcome results from unfortunate initial weights.

Questions/Discussion

References
Giorgetti, G., Gupta, S. K. S., and Manes, G. “Wireless Localization Using Self-Organizing Maps.” IPSN ’07.
Wikipedia: “Self-organizing map”. http://en.wikipedia.org/wiki/Self-organizing_map (retrieved March 21, 2007).
AI-Junkie: “SOM tutorial”. http://www.ai-junkie.com/ann/som/som1.html (retrieved March 20, 2007).