CS623: Introduction to Computing with Neural Nets (lecture-20) Pushpak Bhattacharyya Computer Science and Engineering Department IIT Bombay.

Self Organization: Biological Motivation (the brain)

The brain has 3 layers: Cerebrum, Cerebellum, Higher brain.

Maslow’s hierarchy (top to bottom): Search for meaning; Contributing to humanity; Achievement, recognition; Food, rest, survival.

3 layers: Cerebrum (crucial for survival), Cerebellum, Higher brain (responsible for higher needs).

Mapping of the Brain: side areas process auditory information; the back of the brain processes vision. There is a lot of resilience: visual and auditory areas can do each other’s job.

Left Brain and Right Brain Dichotomy

Left Brain: logic, reasoning, verbal ability. Right Brain: emotion, creativity. In music, the words go to the left brain and the tune to the right brain. Maps in the brain: the limbs are mapped to areas of the brain.

Character Recognition: many instances of “A” presented as input. (Figure: input neurons connected to an output grid.)

Kohonen Net. A Self-Organizing (Kohonen) network fires a group of neurons instead of a single one; the group somehow produces a “picture” of the cluster. Fundamentally, SOM is competitive learning, but the weight change is applied over a neighborhood: find the winner neuron, then apply the weight change to the winner and its “neighbors”.

(Figure: the winner neuron at the center; neurons on the surrounding contour are the “neighborhood” neurons.)

Weight change rule for SOM:

W_{P+δ(n)}(n+1) = W_{P+δ(n)}(n) + η(n) ( I(n) – W_{P+δ(n)}(n) )

i.e. the update is applied to the winner neuron P and its neighbors within δ(n). The neighborhood radius δ(n) is a decreasing function of n; the learning rate η(n) is also a decreasing function of n, with 0 < η(n) < η(n–1) ≤ 1.
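A minimal sketch of this update in Python, assuming a one-dimensional output layer and linearly decaying schedules for η(n) and δ(n); the function name, array shapes, and decay constants are illustrative, not from the lecture:

    import numpy as np

    def som_step(W, x, n, eta0=0.5, delta0=3, T=1000):
        # One SOM iteration on a 1-D output layer.
        # W : (num_neurons, dim) weight matrix, x : (dim,) input I(n), n : iteration count.
        eta = eta0 * (1.0 - n / T)                          # learning rate eta(n), decreasing in n
        delta = int(round(delta0 * (1.0 - n / T)))          # neighborhood radius delta(n), decreasing in n
        P = int(np.argmin(np.linalg.norm(W - x, axis=1)))   # winner = weight vector closest to the input
        lo, hi = max(0, P - delta), min(len(W) - 1, P + delta)
        for j in range(lo, hi + 1):                         # update winner and neighbors P +/- delta(n)
            W[j] += eta * (x - W[j])
        return W

    # Usage: 10 neurons self-organizing over 200 random 2-D points
    rng = np.random.default_rng(0)
    W = rng.random((10, 2))
    for n, x in enumerate(rng.random((200, 2))):
        W = som_step(W, x, n, T=200)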

(Figure: the winner and its neighborhood of radius δ(n).) Convergence of the Kohonen net has not been proved except in one dimension.

(Figure: an output layer of P neurons fully connected to n input neurons through weight vectors W_p; clusters A, B, C, … form at the output.)

Clustering algorithms:
1. Competitive learning
2. K-means clustering
3. Counter-propagation

K-means clustering: K output neurons are required, from the knowledge that K clusters are present. (Figure: K output neurons, e.g. 26, fully connected to n input neurons.)

Steps:
1. Initialize the weights randomly.
2. Let I_k be the input vector presented at the k-th iteration.
3. Find the winner W* such that |W* – I_k| < |W_j – I_k| for all j.
4. Update: W*(new) = W*(old) + η (I_k – W*).
5. k ← k + 1; go to step 2 until the error is below a threshold.
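A sketch of these steps in Python; the slide only gives the winner-take-all update, so the function name, the per-epoch loop, and the stopping test (largest weight shift below a tolerance) are illustrative assumptions:

    import numpy as np

    def competitive_kmeans(inputs, K, eta=0.1, tol=1e-4, max_epochs=100, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.random((K, inputs.shape[1]))                # step 1: initialize weights randomly
        for _ in range(max_epochs):
            max_shift = 0.0
            for I_k in inputs:                              # step 2: present I_k
                j_star = int(np.argmin(np.linalg.norm(W - I_k, axis=1)))  # step 3: winner W*
                shift = eta * (I_k - W[j_star])             # step 4: W*(new) = W*(old) + eta (I_k - W*)
                W[j_star] += shift
                max_shift = max(max_shift, float(np.linalg.norm(shift)))
            if max_shift < tol:                             # step 5: repeat until the change is below a threshold
                break
        return W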

Two-part assignment. (Figure: the supervised part, with a hidden layer.)

Cluster discovery by SOM/Kohonen net. (Figure: 4 input neurons A1, A2, A3, A4.)

Neocognitron (Fukushima et al.)

Based on hierarchical feature extraction.

Corresponding Network

S-Layer. Each S-layer in the neocognitron is intended for extracting features at the corresponding stage of the hierarchy. Each S-layer is formed by a distinct number of S-planes; the number of S-planes depends on the number of extracted features.

V-Layer. Each V-layer in the neocognitron is intended for obtaining information about the average activity in the previous C-layer (or the input layer). Each V-layer is always formed by only one V-plane.

C-Layer. Each C-layer in the neocognitron is intended to ensure tolerance to shifts of the features extracted in the previous S-layer. Each C-layer is formed by a distinct number of C-planes; their number depends on the number of features extracted in the previous S-layer.
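For intuition only, here is a loose Python sketch of one S → C stage, using modern convolution and pooling as analogues for the S- and C-layers and a simple mean for the V-layer; the ReLU, the max-pooling, and the kernel set are simplifying assumptions, not Fukushima’s exact formulation:

    import numpy as np
    from scipy.signal import correlate2d

    def s_layer(image, kernels):
        # S-layer: one S-plane per feature (kernel); each plane extracts that feature everywhere.
        return [np.maximum(correlate2d(image, k, mode="same"), 0.0) for k in kernels]

    def v_layer(planes):
        # V-layer: a single plane carrying the average activity of the previous layer.
        return np.mean(np.stack(planes), axis=0)

    def c_layer(s_planes, pool=2):
        # C-layer: one C-plane per S-plane; pooling gives tolerance to small shifts of the feature.
        out = []
        for p in s_planes:
            h, w = (p.shape[0] // pool) * pool, (p.shape[1] // pool) * pool
            blocks = p[:h, :w].reshape(h // pool, pool, w // pool, pool)
            out.append(blocks.max(axis=(1, 3)))
        return out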