Artificial Neural Networks Dr. Abdul Basit Siddiqui Assistant Professor FURC

NEURAL NETWORKS BASED ON COMPETITION Kohonen SOM (Learning in an Unsupervised Environment)

Unsupervised Learning We can include additional structure in the network so that the net is forced to decide which single unit will respond. The mechanism by which this is achieved is called competition. Competition can be used in unsupervised learning; a common application is clustering-based neural networks.

Unsupervised Learning In a clustering net, there are as many input units as the input vector has components. Every output unit represents a cluster, and the number of output units limits the number of clusters that can be formed. During training, the network finds the output unit that best matches the current input vector (the winner). The weight vector of the winner is then updated according to the learning algorithm.
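As a rough illustration of how the best-matching (winning) output unit can be found, here is a minimal Python/NumPy sketch; the function name find_winner and the use of squared Euclidean distance are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def find_winner(weights, x):
    """Return the index of the output (cluster) unit whose weight
    vector is closest to the input vector x.

    weights : array of shape (n_clusters, n_inputs), one row per output unit
    x       : input vector of shape (n_inputs,)
    """
    # Squared Euclidean distance from x to every weight vector
    distances = np.sum((weights - x) ** 2, axis=1)
    return int(np.argmin(distances))

# Example: 3 cluster units, 2-component input vectors
weights = np.array([[0.2, 0.8],
                    [0.9, 0.1],
                    [0.5, 0.5]])
x = np.array([0.85, 0.2])
print(find_winner(weights, x))  # -> 1, the closest weight vector
```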

Kohonen Learning A variety of nets use Kohonen learning. –The new weight vector is a linear combination of the old weight vector and the current input vector. –The weight update for cluster unit (output unit) j can be calculated as: w_j(new) = w_j(old) + α [x − w_j(old)] = (1 − α) w_j(old) + α x. –The learning rate α decreases as the learning process proceeds.
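A minimal sketch of this update in Python/NumPy follows; kohonen_update is an illustrative name, and the body simply applies the linear-combination rule stated above:

```python
import numpy as np

def kohonen_update(w_old, x, alpha):
    """Kohonen learning rule for a single cluster unit.

    Returns the new weight vector as a linear combination of the old
    weight vector and the current input vector:
        w_new = w_old + alpha * (x - w_old) = (1 - alpha) * w_old + alpha * x
    """
    return w_old + alpha * (x - w_old)

w = np.array([0.2, 0.8])
x = np.array([1.0, 0.0])
print(kohonen_update(w, x, alpha=0.5))  # -> [0.6 0.4]
```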

Kohonen SOM (Self Organizing Maps) Since the learning environment is unsupervised, these networks are called Self-Organizing Maps. Self-organizing NNs are also called Topology-Preserving Maps, which leads to the idea of a neighborhood around the winning cluster unit. During the self-organizing process, the weight vectors of the winning unit and of its neighbors are updated.

Kohonen SOM (Self Organizing Maps) Normally, the Euclidean distance measure is used to find the cluster unit whose weight vector matches the input vector most closely. For a linear array of m cluster units, the neighborhood of radius R around cluster unit J consists of all units j such that max(1, J − R) ≤ j ≤ min(J + R, m).
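The sketch below shows, under the same illustrative assumptions as the earlier snippets, how this neighborhood can be computed for a linear array of m cluster units; here the indices are 0-based and are simply clipped at the array boundaries:

```python
def linear_neighborhood(J, R, m):
    """Indices of all cluster units within radius R of winner J
    in a linear array of m units (0-based indices, no wrap-around)."""
    lo = max(0, J - R)
    hi = min(m - 1, J + R)
    return list(range(lo, hi + 1))

print(linear_neighborhood(J=4, R=2, m=10))  # -> [2, 3, 4, 5, 6]
print(linear_neighborhood(J=0, R=2, m=10))  # -> [0, 1, 2] (clipped at the edge)
```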

Kohonen SOM (Self Organizing Maps) Architecture of SOM

Kohonen SOM (Self Organizing Maps) Structure of Neighborhoods

Kohonen SOM (Self Organizing Maps) –Neighborhoods do not wrap around from one side of the grid to the other; units that would fall outside the grid are simply ignored. Algorithm:

Kohonen SOM (Self Organizing Maps) Algorithm: –The radius and the learning rate may be decreased after each epoch. –The learning-rate decrease may be either linear or geometric.
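Putting the pieces together, here is a hedged end-to-end sketch of the training loop for a 1-D (linear array) Kohonen SOM in Python/NumPy. The geometric learning-rate decay of 0.95 and the radius decrement every 20 epochs are arbitrary illustrative choices, not values given in the slides:

```python
import numpy as np

def train_som_1d(data, n_clusters, n_epochs=100, alpha=0.6, radius=2, seed=0):
    """Train a 1-D (linear array) Kohonen SOM.

    data : array of shape (n_samples, n_inputs)
    Returns the trained weight matrix of shape (n_clusters, n_inputs).
    """
    rng = np.random.default_rng(seed)
    weights = rng.random((n_clusters, data.shape[1]))

    for epoch in range(n_epochs):
        for x in data:
            # Step 1: find the winning unit (smallest Euclidean distance)
            J = int(np.argmin(np.sum((weights - x) ** 2, axis=1)))
            # Step 2: update the winner and its neighbors within the radius
            lo, hi = max(0, J - radius), min(n_clusters - 1, J + radius)
            for j in range(lo, hi + 1):
                weights[j] += alpha * (x - weights[j])
        # Step 3: decrease the learning rate (geometrically) after each epoch,
        # and shrink the radius every 20 epochs
        alpha *= 0.95
        if epoch % 20 == 19 and radius > 0:
            radius -= 1
    return weights

# Toy example: cluster 2-D points drawn around four centers into 4 ordered units
data = np.vstack([np.random.default_rng(1).normal(c, 0.05, size=(20, 2))
                  for c in [0.1, 0.4, 0.6, 0.9]])
print(train_som_1d(data, n_clusters=4))
```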

KOHONEN SELF ORGANIZING MAPS Architecture: the input vector X = [x1, x2, …, xn] ∈ R^n is fed to every neuron i of the Kohonen layer; each neuron i has a weight vector wi = [wi1, wi2, …, win] ∈ R^n, and the winning neuron is the one whose weight vector is closest to X.

Kohonen SOM (Self Organizing Maps) Example

Kohonen SOM (Self Organizing Maps)