
EE368 Soft Computing Dr. Unnikrishnan P.C. Professor, EEE

Module II Adaptive Resonance Theory

Adaptive Resonance Theory
› Unsupervised ANN
› Stability-Plasticity Dilemma
› Adaptive Resonance Theory basics
› ART Architecture
› Algorithm
› Types of ART NN
› Applications
› References

Unsupervised ANN
› Usually 2-layer ANN
› Only input data are given
› ANN must self-organise output
› Two main models: Kohonen’s Self-Organizing Map (SOM) and Grossberg’s ART
› Clustering applications

Stability Plasticity Dilemma (SPD)

SPD … Every learning system faces the plasticity-stability dilemma, which poses questions such as:
› How can a learning system remain plastic enough to learn significant new patterns, yet stable enough not to erode what it has already learned?
› How does the system know when to switch between its plastic and its stable modes?

What is ART? ART stands for "Adaptive Resonance Theory", introduced by Stephen Grossberg in 1976. ART represents a family of neural networks. The basic ART system is an unsupervised learning model. The term "resonance" refers to the resonant state of a neural network, in which a category prototype vector matches the current input vector closely enough. ART matching leads to this resonant state, which permits learning: the network learns only in its resonant state.

Key Innovation The key innovation of ART is the use of "expectations."
› As each input is presented to the network, it is compared with the prototype vector that most closely matches it (the expectation).
› If the match between the prototype and the input vector is NOT adequate, a new prototype is selected.
In this way, previously learned memories (prototypes) are not eroded by new learning.
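A minimal NumPy sketch of this match-or-create behaviour is given below; the function name categorize, the cosine-similarity match, and the threshold match_threshold are illustrative assumptions, not Grossberg's exact formulation:

```python
import numpy as np

def categorize(p, prototypes, match_threshold=0.8, lr=0.5):
    """Compare input p with its closest stored prototype; refine that
    prototype on an adequate match, otherwise start a new category.
    categorize, match_threshold and lr are illustrative names only."""
    p = p / np.linalg.norm(p)                     # work with unit vectors
    if not prototypes:                            # nothing learned yet
        prototypes.append(p.copy())
        return 0
    sims = [float(w @ p) for w in prototypes]     # cosine similarity
    j = int(np.argmax(sims))                      # expectation = best match
    if sims[j] >= match_threshold:                # adequate: refine, don't erode
        prototypes[j] += lr * (p - prototypes[j])
        prototypes[j] /= np.linalg.norm(prototypes[j])
    else:                                         # inadequate: new prototype
        prototypes.append(p.copy())
        j = len(prototypes) - 1
    return j
```

Note that an inadequate match never overwrites an existing prototype; this is precisely how old memories are protected from new learning.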

Grossberg Network The L1-L2 connections are instars, which perform a clustering (or categorization) operation. When an input pattern is presented, it is multiplied (after normalization) by the L1-L2 weight matrix. A competition is performed at Layer 2 to determine which row of the weight matrix is closest to the input vector. That row is then moved toward the input vector. After learning is complete, each row of the L1-L2 weight matrix is a prototype pattern, which represents a cluster (or a category) of input vectors.
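The following sketch mirrors that description step by step, under the assumption of a row-per-prototype weight matrix W and an illustrative learning rate lr:

```python
import numpy as np

def instar_step(W, p, lr=0.1):
    """One L1-L2 instar/competition step as described above:
    normalize the input, find the closest weight row at Layer 2,
    and move that row toward the input. W has one row per category;
    the learning rate lr is an illustrative assumption."""
    p = p / np.linalg.norm(p)        # normalization mentioned in the slide
    a = W @ p                        # Layer 2 net input
    j = int(np.argmax(a))            # competition: winner-take-all
    W[j] += lr * (p - W[j])          # move winning row toward the input
    W[j] /= np.linalg.norm(W[j])     # rows stay unit-length prototypes
    return j                         # index of the winning category
```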

ART Network … Learning in ART networks also occurs in a set of feedback connections from Layer 2 to Layer 1. These connections are outstars, which perform pattern recall. When a node in Layer 2 is activated, it reproduces a prototype pattern (the expectation) at Layer 1. Layer 1 then performs a comparison between the expectation and the input pattern. When the expectation and the input pattern are NOT closely matched, the orienting subsystem causes a reset in Layer 2.
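For binary patterns, the Layer 1 comparison can be sketched as the fraction of the input covered by the expectation; the function name match_degree and the 0/1 coding are assumptions here:

```python
import numpy as np

def match_degree(p, expectation):
    """Layer 1 comparison between the binary input p and the recalled
    expectation (top-down prototype): |p AND expectation| / |p|.
    The orienting subsystem triggers a reset when this falls below
    the vigilance parameter."""
    overlap = np.sum(np.minimum(p, expectation))  # AND for 0/1 vectors
    return overlap / max(np.sum(p), 1)            # avoid dividing by zero
```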

ART Network … The reset disables the current winning neuron, and the current expectation is removed. A new competition is then performed in Layer 2, while the previous winning neuron remains disabled. The new winning neuron in Layer 2 projects a new expectation to Layer 1 through the L2-L1 connections. This process continues until the L2-L1 expectation provides a close enough match to the input pattern.
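Putting the pieces together, a hedged sketch of this search/reset cycle for binary patterns might look as follows; B (bottom-up weights), T (top-down weights), and the vigilance rho are illustrative names rather than a definitive implementation:

```python
import numpy as np

def art_search(p, B, T, rho=0.7):
    """ART-style search for a resonating category: pick the best enabled
    Layer 2 node via the bottom-up weights B, recall its expectation
    through the top-down weights T, and reset (disable) the node if the
    expectation does not match the binary input p closely enough."""
    scores = B @ p.astype(float)                    # Layer 2 net input
    disabled = set()
    while len(disabled) < B.shape[0]:
        a = scores.copy()
        a[list(disabled)] = -np.inf                 # resets stay in force
        j = int(np.argmax(a))                       # new Layer 2 winner
        expectation = np.minimum(T[j], p)           # L2-L1 recall, AND'ed
        if np.sum(expectation) >= rho * np.sum(p):  # close enough: resonance
            return j
        disabled.add(j)                             # orienting subsystem reset
    return None                                     # no existing node resonates
```

With rho close to 1 the network demands near-exact matches and forms many fine-grained categories; with a small rho it tolerates coarser matches and forms fewer, broader ones.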

ART Architecture
› Bottom-up weights bij (forward matching)
› Top-down weights tij (store the class template)
› Input nodes (input normalisation, vigilance test)
› Output nodes (forward matching)
› Long-term memory: the ANN weights
› Short-term memory: the ANN activation pattern
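For ART-1, the bottom-up and top-down long-term memories are typically initialized and updated with the standard textbook fast-learning equations (e.g., as presented by Fausett); the helper names below are assumptions:

```python
import numpy as np

def art1_init(n_inputs, n_categories, L=2.0):
    """Common ART-1 initialization: top-down templates start all-ones
    (so an uncommitted node matches anything), bottom-up weights start
    small and uniform."""
    T = np.ones((n_categories, n_inputs))
    B = np.full((n_categories, n_inputs), 1.0 / (1.0 + n_inputs))
    return B, T

def art1_commit(j, p, B, T, L=2.0):
    """Fast-learning update once node j resonates with binary input p:
    the template shrinks to the intersection x = p AND t_j, and the
    bottom-up row becomes a normalized copy of that template."""
    x = np.minimum(T[j], p)               # refined class template
    T[j] = x                              # top-down LTM stores the template
    B[j] = L * x / (L - 1.0 + x.sum())    # bottom-up LTM, normalized
```

Because the template only ever shrinks toward the intersection of the patterns it has matched, committed categories remain stable no matter what arrives later; this is how the architecture resolves the stability-plasticity dilemma.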

Basic ART Structure

Overview STM – Short Term Memory; LTM – Long Term Memory

A simple ART-1 Structure