Neural Networks Dr. Peter Phillips. The Human Brain (Recap of week 1)


Neural Networks Dr. Peter Phillips

The Human Brain (Recap of week 1)

A Classic Artificial Neuron (Recap cont.) [Diagram: inputs X1, X2, X3, each multiplied by a weight W1, W2, W3, are summed at the neuron to give S_j; the neuron's output is the activation f(S_j).]
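As a rough sketch, the neuron in the diagram can be written in a few lines of Python. The sigmoid activation is an assumption here, since the slide leaves f unspecified:

```python
import math

def neuron_output(inputs, weights,
                  activation=lambda s: 1.0 / (1.0 + math.exp(-s))):
    """Weighted sum S_j of the inputs, passed through an activation f(S_j)."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return activation(s)

# Three inputs X1..X3 with weights W1..W3, as in the diagram
out = neuron_output([1.0, 0.5, -1.0], [0.2, 0.4, 0.1])
```

With a zero weighted sum the sigmoid gives 0.5, which is a handy sanity check.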

Unsupervised Learning Today's lecture considers the Self Organising Map (SOM) and unsupervised learning. Recall that supervised learning matches inputs to target outputs; unsupervised learning instead groups the data into classes without any target outputs.

The Biological Basis for Unsupervised Neural Networks Major sensory and motor systems are ‘topographically mapped’ in the brain –Vision: retinotopic map –Hearing: tonotopic map –Touch: somatotopic map

Kohonen Self-Organising Maps The best-known unsupervised learning network is the Kohonen network, developed by Teuvo Kohonen. It is a neural network algorithm that uses unsupervised competitive learning, and is primarily used for the organisation and visualisation of complex data.

Understanding the Data Set A good understanding of the data set is essential to use a SOM – or any network for that matter A ‘distance measure’ and/or suitable rescaling must be defined to allow meaningful comparison The data must be of good quality and must be representative of the application area

SOM - Architecture A 2-D array (lattice) of neurons. A set of input signals x1, x2, …, xn is connected to every neuron in the lattice through weighted synapses: neuron j has a weight vector (w_j1, w_j2, …, w_jn).

Finding a Winner (2) The Euclidean distance between two vectors a and b, a = (a1, a2, …, an), b = (b1, b2, …, bn), is calculated as: d(a, b) = sqrt( (a1 - b1)^2 + (a2 - b2)^2 + … + (an - bn)^2 ) i.e. Pythagoras' Theorem. Other distance measures could be used, e.g. Manhattan distance.
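The two distance measures, and the winner search they drive, can be sketched as follows; `find_winner` is a hypothetical helper name, not something from the lecture:

```python
import math

def euclidean(a, b):
    """Pythagoras' Theorem generalised to n dimensions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def manhattan(a, b):
    """Sum of absolute coordinate differences (city-block distance)."""
    return sum(abs(ai - bi) for ai, bi in zip(a, b))

def find_winner(x, weight_vectors, distance=euclidean):
    # Winner = the node whose weight vector is closest to the input x
    return min(range(len(weight_vectors)),
               key=lambda j: distance(x, weight_vectors[j]))
```

For the classic 3-4-5 triangle, `euclidean([0, 0], [3, 4])` gives 5.0 while `manhattan` gives 7.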

SOM Parameters The learning rate and the neighbourhood function determine to what extent the weights of each node are adjusted towards the current input.

Neighbourhood function [Diagram: the degree of neighbourhood plotted against distance from the winner; the curve peaks at the winner and falls off with distance, and it narrows over time as training proceeds.]
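One common choice that matches the curves in the diagram is a Gaussian neighbourhood whose radius shrinks over time. This is a sketch under that assumption; the slide does not commit to a particular function:

```python
import math

def neighbourhood(dist_to_winner, sigma):
    """Gaussian degree of neighbourhood: 1 at the winner, falling off with distance."""
    return math.exp(-dist_to_winner ** 2 / (2 * sigma ** 2))

def sigma_at(t, n_epochs, sigma_start, sigma_end):
    """Linear shrink of the neighbourhood radius over the training run."""
    frac = t / max(1, n_epochs - 1)
    return sigma_start + frac * (sigma_end - sigma_start)
```

As sigma shrinks, distant nodes receive ever smaller updates, which is what narrows the curve over time.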

Data For Tutorial Work Data collected from a UHT plant at Leatherhead. Consists of 300 cases: 150 are used for training and 150 for testing. The data were collected with the plant running in its normal state, during cleaning of the exchangers, and with a fault present.

Tutorial 2 (UHT Plant Data)

My settings First run: 70 epochs, learning rate 0.6 decreasing to 0.1, neighbourhood kept at 1. Second run: 50 epochs, learning rate constant at 0.1, neighbourhood decreasing from 1 to 0.
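The two-run schedule above can be sketched as a small self-contained trainer. The Gaussian neighbourhood, linear decay, and random initialisation are assumptions; `init_som` and `train_som` are hypothetical names, and the toy four-corner data set stands in for the UHT plant data:

```python
import math
import random

def init_som(grid_w, grid_h, dim, seed=0):
    """Random weights for a grid_w x grid_h lattice, plus each node's position."""
    rng = random.Random(seed)
    weights = [[rng.random() for _ in range(dim)] for _ in range(grid_w * grid_h)]
    pos = [(j % grid_w, j // grid_w) for j in range(grid_w * grid_h)]
    return weights, pos

def train_som(data, weights, pos, epochs, lr_start, lr_end, sigma_start, sigma_end):
    """One run: winner search, then neighbourhood-weighted updates (in place)."""
    for t in range(epochs):
        frac = t / max(1, epochs - 1)
        lr = lr_start + frac * (lr_end - lr_start)              # learning-rate schedule
        sigma = max(0.05, sigma_start + frac * (sigma_end - sigma_start))
        for x in data:
            # Winner = node with the closest weight vector (squared Euclidean distance)
            win = min(range(len(weights)),
                      key=lambda j: sum((a - b) ** 2 for a, b in zip(x, weights[j])))
            for j, w in enumerate(weights):
                d2 = (pos[j][0] - pos[win][0]) ** 2 + (pos[j][1] - pos[win][1]) ** 2
                h = math.exp(-d2 / (2 * sigma ** 2))            # Gaussian neighbourhood
                for k in range(len(w)):
                    w[k] += lr * h * (x[k] - w[k])
    return weights

# The two runs from the slide, on toy stand-in data
data = [[0.1, 0.1], [0.9, 0.9], [0.1, 0.9], [0.9, 0.1]]
weights, pos = init_som(3, 3, 2)
train_som(data, weights, pos, 70, 0.6, 0.1, 1.0, 1.0)  # first run: lr 0.6 -> 0.1, neighbourhood 1
train_som(data, weights, pos, 50, 0.1, 0.1, 1.0, 0.0)  # second run: lr 0.1, neighbourhood 1 -> 0
```

The wide first run orders the map globally; the narrow second run fine-tunes individual nodes.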

Trajan Classification

SOM – New Data A trained SOM can be used to classify new input data: the input is assigned to the node whose weight vector is 'best', i.e. closest to it. Previous knowledge of other data samples assigned to the same node enables inferences to be made about the 'new' input sample.
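That classification step amounts to a winner search plus a lookup of the labels attached to each node during training. The node weights and the "normal"/"fault" labels below are hypothetical, loosely echoing the UHT plant states:

```python
def classify(x, weights, node_labels):
    """Assign x to the class of its best-matching (closest) node."""
    win = min(range(len(weights)),
              key=lambda j: sum((a - b) ** 2 for a, b in zip(x, weights[j])))
    return node_labels.get(win, "unknown")

# Hypothetical trained map: two nodes, labelled from earlier training samples
trained_weights = [[0.1, 0.1], [0.9, 0.9]]
labels = {0: "normal", 1: "fault"}
```

A new reading near the "normal" node inherits that label, which is exactly the inference the slide describes.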