Slides from: Doug Gray, David Poole

1 Lecture 6 Neural Network Training

2 Neural Network Training Network training is fundamental to establishing the functional relationship between the inputs and the outputs of any neural network. Considerable effort has been spent on finding faster and more efficient training algorithms that reduce the time required to train a network.

3 Neural Network Training There are two basic classes of network training: supervised learning, which involves an external source of knowledge about the system, and unsupervised learning, which involves no external source of knowledge and instead relies on local information and internal data.

4 Neural Network Training In supervised learning, the desired outputs of the network for every given input condition are specified, and the network learns the appropriate functional relationship between them through repeated application of training sets of input-output pairs. The popular back-propagation algorithm, used in many applications, belongs to this class. It gets its name from the fact that the synaptic weights of a multi-layer network are adapted iteratively by propagating some measure of the error between the desired and actual output of the network from its output back to its input.

5 Supervised Learning
- Teacher presents ANN input-output pairs
- ANN weights adjusted according to error
- Iterative algorithms (e.g. Delta rule, BP rule)
- Quality of training examples is critical

6 Neural Network Training In unsupervised learning, no information on the desired output of the network corresponding to a particular input is available. Here, the network is auto-associative, learning to respond to different inputs in different ways. Typical applications of this class are feature detection and data clustering. Hebb's algorithm and competitive learning are two examples of unsupervised learning algorithms. A wide range of network topologies, such as those due to Hopfield, Hamming and Boltzmann, also use the same method of learning. In general, these networks, with their ability to generate arbitrary mappings between their inputs and outputs, are used as associative memories and classifiers.

7 Unsupervised Learning
- ANN adapts weights to cluster input data
- Hebbian learning
  - Connection stimulus-response strengthened (Hebbian)
- Competitive learning algorithms
  - Kohonen & ART
  - Input weights adjusted to resemble stimulus
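To make the two learning styles concrete, here is a minimal Python sketch (an illustration added to the transcript, not from the original slides); the learning rate eta, the synthetic data, and the number of competing units are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # 100 unlabeled input patterns (assumed data)
eta = 0.01                      # learning rate (assumed value)

# Plain Hebbian rule: strengthen a connection in proportion to the product
# of its input (stimulus) and the neuron's output (response). In practice
# a decay or normalization term is added to keep the weights bounded.
w = rng.normal(size=3)
for x in X:
    y = w @ x                   # neuron response
    w += eta * y * x            # Hebb: delta_w = eta * y * x

# Competitive (winner-take-all) rule, in the spirit of Kohonen/ART:
# only the winning unit adjusts its weights to resemble the stimulus.
W = rng.normal(size=(4, 3))     # 4 competing units (assumed)
for x in X:
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    W[winner] += eta * (x - W[winner])   # move winner toward the input
```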

8 6.1 The Widrow-Hoff Training Algorithm Introduced by Widrow and Hoff in the 1960s. It applies to a single neuron with inputs x1, x2, ..., xn, an array of synaptic weights w1, w2, ..., wn, and output equal to the weighted sum y = w1x1 + w2x2 + ... + wnxn = wᵀx.

9 6.1 The Widrow-Hoff Training Algorithm Training is based on some measure of the discrepancy or error e = d - y between the desired and actual output of the element. The LMS algorithm (for Least Mean Squares) uses the squared error e² for each training set as the objective function; the optimum synaptic weight vector is then the Wiener solution w* = R⁻¹p, where R = E[xxᵀ] is the autocorrelation matrix of the inputs and p = E[dx] is the cross-correlation between the inputs and the desired output. In practice, R and p are not known and the Wiener solution unfortunately cannot be used.
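As a concrete illustration (not from the slides; the synthetic data and the generating weights are assumptions), the following sketch estimates R and p from samples and solves for the Wiener weights, which is exactly what is unavailable when only a stream of training pairs is at hand:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 3))                  # input patterns (assumed data)
w_true = np.array([0.5, -1.0, 2.0])          # unknown system to identify (assumed)
d = X @ w_true + 0.01 * rng.normal(size=n)   # desired outputs

# Sample estimates of the autocorrelation matrix R = E[x xᵀ]
# and the cross-correlation vector p = E[d x].
R = X.T @ X / n
p = X.T @ d / n

# Wiener solution w* = R⁻¹ p, available only when R and p are known.
w_star = np.linalg.solve(R, p)
print(w_star)                                # close to w_true
```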

10 6.1 The Widrow-Hoff Training Algorithm Widrow and Hoff observed that the partial derivative of the squared instantaneous error with respect to the weights, ∂e²/∂w = -2e x, can be computed from a single training pair. The Widrow-Hoff algorithm can therefore be expressed as the iteration w(k+1) = w(k) + λ e(k) x(k), where λ is some positive constant that defines the rate of convergence of the algorithm.
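A minimal Python sketch of this iteration (the synthetic data, the value of λ, and the number of passes are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))           # training inputs (assumed data)
d = X @ np.array([0.5, -1.0, 2.0])      # desired outputs (assumed)

lam = 0.01                              # rate constant λ (assumed)
w = np.zeros(3)
for _ in range(20):                     # repeated passes over the training set
    for x, target in zip(X, d):
        e = target - w @ x              # instantaneous error e = d - y
        w += lam * e * x                # w(k+1) = w(k) + λ e(k) x(k)
print(w)                                # approaches the Wiener solution
```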

11 6.1 The Widrow-Hoff Training Algorithm At each iteration the change in the error is Δe(k) = -λ xᵀ(k)x(k) e(k), i.e. the error on the current training pair is multiplied by the factor (1 - λ xᵀ(k)x(k)), so the algorithm converges for 0 < λ xᵀ(k)x(k) < 2.
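A quick numerical check of this relation, with assumed random values:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=3)                 # one training input (assumed)
w = rng.normal(size=3)                 # current weights (assumed)
d, lam = 1.0, 0.05                     # desired output and λ (assumed)

e = d - w @ x                          # error before the update
w_new = w + lam * e * x                # one Widrow-Hoff step
e_new = d - w_new @ x                  # error after the update
assert np.isclose(e_new - e, -lam * (x @ x) * e)   # matches Δe above
```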

12 6.2 The Delta Training Algorithm The original Widrow-Hoff algorithm must be modified when the neural element contains a nonlinear element. With output y = f(s), where s = wᵀx and the nonlinearity f is differentiable or smooth, the Delta training algorithm is given by the iteration w(k+1) = w(k) + λ e(k) f′(s(k)) x(k).
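A minimal Python sketch of the Delta iteration with a sigmoid nonlinearity (the data, the value of λ, and the choice of sigmoid are assumptions for illustration):

```python
import numpy as np

def f(s):
    """A smooth (differentiable) nonlinearity: the sigmoid."""
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))                    # inputs (assumed data)
d = (X[:, 0] + X[:, 1] > 0).astype(float)        # desired outputs (assumed)

lam, w = 0.1, np.zeros(2)
for _ in range(50):
    for x, target in zip(X, d):
        s = w @ x
        y = f(s)
        e = target - y
        # Delta rule: for the sigmoid, f'(s) = y (1 - y).
        w += lam * e * y * (1.0 - y) * x
```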

13 6.3 Multi-layer ANN Training Algorithms For control applications, the number of layers and the number of neurons per layer must be selected. Minimal neural networks involving an input layer, a single hidden layer and an output layer, with a total of fewer than 10 nodes, have proved quite successful in practical neural controllers. Training is performed first for a given number of neurons; the number is then reduced and the ANN is re-trained. The number of neurons is further reduced at each successive training session until the measure of the error starts to increase (see the sketch below). Otherwise, the choice is arbitrary and a matter of experimentation.
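The successive-reduction procedure can be sketched as a simple loop; `train_and_evaluate` is a hypothetical helper standing in for a full training run that returns the trained network's error measure:

```python
def prune_hidden_layer(n_start, train_and_evaluate):
    """Shrink the hidden layer until the error measure starts to increase.

    train_and_evaluate(n) is a hypothetical helper: it trains a fresh
    network with n hidden neurons and returns its error measure.
    """
    best_n = n_start
    best_err = train_and_evaluate(n_start)
    for n in range(n_start - 1, 0, -1):
        err = train_and_evaluate(n)     # re-train with fewer neurons
        if err > best_err:              # error started to increase: stop
            break
        best_n, best_err = n, err
    return best_n
```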

14 6.4 The Back-propagation (BP) Algorithm Consider a simple ANN.

15 6.4 The Back-propagation (BP) Algorithm [Figure: a layered network showing the first layer and the output layer]

16 6.4 The Back-propagation (BP) Algorithm The distorting or compression function of the node squashes the weighted sum s into a bounded interval; a common choice is the sigmoid f(s) = 1/(1 + e^(-s)).

17 6.4 The Back-propagation (BP) Algorithm Each weight is adapted by gradient descent on the squared output error: for an output node the local error is δ = e f′(s); for a hidden node it is δ = f′(s) Σ δₖwₖ, summed over the nodes k that the hidden node feeds; each weight then changes by Δw = λ δ x, where x is the input arriving at that weight.

18 Steps in Back-propagation Algorithm
STEP ONE: initialize the weights and biases. The weights in the network are initialized to random numbers from the interval [-1, 1]. Each unit has a bias associated with it; the biases are similarly initialized to random numbers from the interval [-1, 1].
STEP TWO: feed the training sample.

19 Steps in Back-propagation Algorithm (cont.)
STEP THREE: propagate the inputs forward; compute the net input and output of each unit in the hidden and output layers.
STEP FOUR: back-propagate the error.
STEP FIVE: update weights and biases to reflect the propagated errors.
STEP SIX: check the terminating conditions.
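Putting the six steps together, here is a compact numpy sketch for a network with a single hidden layer (the layer sizes, XOR training data, learning rate, and fixed epoch budget are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)   # training samples (assumed)
d = np.array([[0], [1], [1], [0]], float)               # desired outputs (XOR, assumed)

def f(s):
    """Node compression function: the sigmoid."""
    return 1.0 / (1.0 + np.exp(-s))

# STEP ONE: initialize weights and biases to random numbers in [-1, 1].
W1 = rng.uniform(-1, 1, (2, 4)); b1 = rng.uniform(-1, 1, 4)
W2 = rng.uniform(-1, 1, (4, 1)); b2 = rng.uniform(-1, 1, 1)

eta = 0.5                                 # learning rate (assumed)
for epoch in range(5000):                 # STEP SIX: fixed epoch budget (assumed)
    # STEP TWO: feed the training samples.
    # STEP THREE: propagate the inputs forward.
    h = f(X @ W1 + b1)                    # hidden-layer outputs
    y = f(h @ W2 + b2)                    # output-layer outputs
    # STEP FOUR: back-propagate the error (sigmoid derivative is y(1-y)).
    delta2 = (d - y) * y * (1 - y)        # output-layer deltas
    delta1 = (delta2 @ W2.T) * h * (1 - h)   # hidden-layer deltas
    # STEP FIVE: update weights and biases by the propagated errors.
    W2 += eta * h.T @ delta2; b2 += eta * delta2.sum(axis=0)
    W1 += eta * X.T @ delta1; b1 += eta * delta1.sum(axis=0)

print(y.round(2))                         # network outputs after training
```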