KOHONEN SELF ORGANISING MAP. Seminar by M.V.MAHENDRAN, Reg no: 200831328, III Sem, M.E., Control and Instrumentation Engg.

Presentation transcript:


Introduction to Neural Networks
Artificial Intelligence: automation of intelligent human behaviour.
Turing test: the machine's output is indistinguishable from a human's.
Hard computing: computation using exact ones and zeros.
Soft computing: computation in the manner of the human brain.

Introduction to Neural Networks (cont.)
Soft computing: neural networks, fuzzy logic, genetic algorithms.
Features of neural networks:
- Highly parallel and distributed computation
- Fault tolerance
- Mapping, generalisation, robustness

Neural Networks
ANN learning: training with known examples of the problem.
- Supervised: as if a teacher were present.
- Unsupervised: as if no teacher were present.
How are the weights updated? Using learning rules, applied either continuously or in discrete steps, e.g. the delta rule.

Weight Update Rule
Delta rule (the input/output pairs are known):
- Apply the input to the ANN.
- Compute the output.
- Compute the error.
- Form an error function.
- Minimise the error, reaching a local or global minimum.
*** The delta rule can be generalised to any number of layers.
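The delta-rule steps on this slide can be sketched for a single linear neuron as follows; this is a minimal illustration, and the names `delta_rule_step` and `eta` are assumptions, not from the slides:

```python
# Minimal sketch of the delta rule for one linear neuron:
# compute output, compute error, step weights down the error gradient.

def delta_rule_step(w, x, target, eta=0.1):
    """One delta-rule update: w <- w + eta * (target - y) * x."""
    y = sum(wi * xi for wi, xi in zip(w, x))    # compute output
    error = target - y                          # compute error
    return [wi + eta * error * xi for wi, xi in zip(w, x)]

# Repeat the update on a known input/output pair until the error is small,
# i.e. until a minimum of the error function is reached.
w = [0.0, 0.0]
x, target = [1.0, 2.0], 3.0
for _ in range(100):
    w = delta_rule_step(w, x, target)
y = sum(wi * xi for wi, xi in zip(w, x))
```

With a small enough learning rate the error shrinks geometrically at each step, which is the minimisation the slide describes.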

Kohonen Self-Organising Map
The SOM is motivated by features of the human brain. The neurons are organised in a one- or multi-dimensional lattice and compete among themselves to be activated according to a competitive learning scheme. In the 'WINNER TAKES ALL' scheme, only the weight vector associated with the winning neuron is updated. In the 'soft max' rule, however, not only the winning neuron but also its neighbourhood neurons take part in the self-organising process.

Kohonen Self-Organising Map (cont.)
Kohonen introduced a novel neighbourhood concept whereby the topology of the input data space can be learnt through the SOM. In this scheme the neural lattice can be one- or multi-dimensional, and a neighbourhood relation among the individual neurons in the lattice is embedded a priori. As the neurons update their weights through competition, a meaningful coordinate system for the different input features develops over the lattice.

2-D Kohonen Lattices

SOM LEARNING ALGORITHM
Initialise the weights randomly. Three essential processes are involved in the formation of the SOM:
1. Competition (to find the winner)
2. Co-operation (neighbourhood)
3. Weight update

1. COMPETITION
For each input, compute a distance measure between the input vector and every neuron's weight vector. The neuron for which this distance is minimum is declared the winner.
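The competition step can be sketched as follows; `find_winner` is an illustrative name, not from the slides:

```python
# Competition: the winner is the neuron whose weight vector lies
# closest (here, minimum squared Euclidean distance) to the input.

def find_winner(weights, x):
    """Return the index of the neuron with minimum distance to input x."""
    def dist2(w):
        return sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    return min(range(len(weights)), key=lambda i: dist2(weights[i]))

weights = [[0.0, 0.0], [1.0, 1.0], [0.5, 0.2]]
winner = find_winner(weights, [0.9, 1.1])   # neuron 1 is closest
```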

2. CO-OPERATION
The winning neuron defines a topological neighbourhood according to a pre-defined neighbourhood function. Let D(i) denote the topological neighbourhood centred on the winning neuron i, and let d(i,j) denote the lateral distance between winning neuron i and excited neuron j. The amplitude of the topological neighbourhood decreases monotonically with increasing lateral distance, decaying to zero; this is a necessary condition for convergence.

3. WEIGHT UPDATE
At the beginning all other neurons are considered neighbours of the winning neuron; as learning progresses the neighbourhood shrinks. The weights of the winning neuron and its neighbours are updated according to the neighbourhood index: the winning neuron is allowed to benefit maximally from the weight update, while the neurons farthest from the winner benefit minimally.
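The three processes can be tied together in a minimal one-dimensional SOM; the shrinking `sigma` implements the shrinking neighbourhood described above. All names and parameter values here are illustrative assumptions, not from the slides:

```python
import math
import random

# Minimal 1-D SOM: competition, co-operation (Gaussian neighbourhood)
# and weight update, with the neighbourhood width shrinking over time.

def train_som(data, n_neurons=5, epochs=50, eta=0.5, sigma0=2.0, seed=0):
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_neurons)]
    for t in range(epochs):
        sigma = sigma0 * (1.0 - t / epochs) + 0.1   # neighbourhood shrinks
        for x in data:
            # 1. Competition: find the winning neuron.
            win = min(range(n_neurons), key=lambda i: sum(
                (w - v) ** 2 for w, v in zip(weights[i], x)))
            # 2./3. Co-operation and weight update: every neuron moves
            # toward the input, scaled by its neighbourhood amplitude.
            for j in range(n_neurons):
                h = math.exp(-(j - win) ** 2 / (2.0 * sigma ** 2))
                weights[j] = [w + eta * h * (v - w)
                              for w, v in zip(weights[j], x)]
    return weights

data = [[0.0, 0.0], [0.1, 0.1], [0.9, 0.9], [1.0, 1.0]]
weights = train_som(data)
```

After training, the prototype vectors settle near the two clusters in the data, so neurons close in the lattice respond to similar inputs.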

CLUSTERING
How can we represent voluminous data using a finite number of samples? The SOM does this while preserving the topology of the input space.
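Representing voluminous data by a finite set of samples amounts to assigning each point to its nearest prototype (weight) vector; a sketch, with illustrative names:

```python
# SOM-style clustering: each data point is represented by the index
# of its nearest prototype vector (minimum squared Euclidean distance).

def assign_clusters(prototypes, data):
    """Map each data point to the index of its nearest prototype."""
    def nearest(x):
        return min(range(len(prototypes)), key=lambda i: sum(
            (p - v) ** 2 for p, v in zip(prototypes[i], x)))
    return [nearest(x) for x in data]

prototypes = [[0.0, 0.0], [1.0, 1.0]]
labels = assign_clusters(prototypes, [[0.1, 0.0], [0.9, 1.0], [0.2, 0.1]])
```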

SOM Graphics: video demo of how the SOM is formed

Architecture of KSOM

Kohonen SOM

Problem

Problem solution:

Matlab SOM video

THANK YOU