1 Pattern Recognition: Statistical and Neural Lonnie C. Ludeman Lecture 20 Oct 26, 2005 Nanjing University of Science & Technology

2 Lecture 20 Topics
1. Perceptron Algorithm Revisited
2. Local Delta Training Algorithm for the ANE
3. General Definition of Neural Networks
4. Basic Neural Network Structures: Examples
5. Analysis and Synthesis of Neural Networks

3 Review: Signum-Activation Training Algorithm (Perceptron) Weight Update
Decision rule:
y = +1 if input vector x is from C_1
y = −1 if input vector x is from C_2
Weight update:
w(p+1) = w(p) + η ( d[x(p)] − sgn( wᵀ(p) x(p) ) ) x(p)
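
As a concrete illustration (not from the slides), here is a minimal Python sketch of this signum-activation perceptron step; numpy, the learning rate eta, and folding the bias into w are assumptions made for the example:

```python
import numpy as np

def sgn(v):
    """Signum activation: +1 for v >= 0, -1 otherwise."""
    return 1.0 if v >= 0 else -1.0

def perceptron_step(w, x, d, eta=0.1):
    """One perceptron update: w(p+1) = w(p) + eta*(d - sgn(w.x))*x."""
    return w + eta * (d - sgn(w @ x)) * x
```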

4 Question: How do we train an Artificial Neural Element (ANE) to do classification?
Answer: Use the Delta Training Algorithm!

5 Given an Artificial Neural Element with output y = f(net), where net = wᵀx [figure omitted in transcript], we wish to find a weight vector w such that the training patterns are correctly classified.

6 Given: training samples x(p) ∈ { x_1, x_2, …, x_K } with desired outputs d[x(p)] ∈ { d(x_1), d(x_2), …, d(x_K) }.
Define a performance measure E_p for sample x(p) and decision d[x(p)] as
E_p = ½ ( d[x(p)] − f(net) )², where net = wᵀ(p) x(p)

7 Derivation of the Delta Weight Update Equation
Use the gradient method to minimize E_p. The new weight w_{k+1} in terms of the previous weight w_k is
w_{k+1} = w_k − η ∇_w E_p
where the gradient is
∇_w E_p = − ( d[x(p)] − f(net) ) f′(net) x(p)

8 General Local Delta Algorithm Weight Update Equation
Substituting the gradient vector into the weight update gives the General Local Delta Algorithm; rewriting,
w(p+1) = w(p) + η { d[x(p)] − f(net) } f′(net) x(p), where net = wᵀ(p) x(p)
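
A sketch of this general local delta step, assuming (for illustration only) that the activation f and its derivative f_prime are passed in as Python callables; each of the three cases below just plugs a specific pair into this one function:

```python
import numpy as np

def delta_step(w, x, d, f, f_prime, eta=0.1):
    """General local delta update:
    w(p+1) = w(p) + eta*(d - f(net))*f'(net)*x, with net = w.x."""
    net = w @ x
    return w + eta * (d - f(net)) * f_prime(net) * x
```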

9 This is sometimes called the Continuous Perceptron Training Algorithm.

10 Case 1: Local Delta Algorithm for Training an ANE with Logistic Activation Function
Given: f(net) = 1 / ( 1 + e^(−net) )
Solution: f′(net) = f(net) ( 1 − f(net) )

11 Local Weight Update Equation for Logistic Activation Function
Substituting the derivative gives the local algorithm for the logistic activation function:
w(p+1) = w(p) + η { d[x(p)] − f(net) } f(net) ( 1 − f(net) ) x(p)
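
A compact sketch of Case 1 (eta and numpy are assumptions, as before); the derivative f(1 − f) is computed from the activation value itself:

```python
import numpy as np

def logistic_delta_step(w, x, d, eta=0.1):
    f = 1.0 / (1.0 + np.exp(-(w @ x)))           # logistic activation
    return w + eta * (d - f) * f * (1.0 - f) * x  # f'(net) = f(1 - f)
```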

12 Case 2: Local Delta Algorithm for Training an ANE with Hyperbolic Tangent Activation Function
Given: f(net) = tanh(net)
Solution: Taking the derivative of the nonlinearity, f′(net) = 1 − f²(net), and substituting into the general update equation yields the Local Weight Update Equation for the Hyperbolic Tangent Activation Function:
w(p+1) = w(p) + η { d[x(p)] − f(net) } ( 1 − f²(net) ) x(p)
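
The corresponding sketch for Case 2, under the same assumptions:

```python
import numpy as np

def tanh_delta_step(w, x, d, eta=0.1):
    f = np.tanh(w @ x)                            # tanh activation
    return w + eta * (d - f) * (1.0 - f**2) * x   # f'(net) = 1 - f^2
```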

13 Scale Factors for Case 2: Tanh Activation Function
SF = ( d[x(p)] − f(net) ) ( 1 − f²(net) )
For d[x(p)] = +1: SF_{+1} = ( 1 − f(net) ) ( 1 − f²(net) )
For d[x(p)] = −1: SF_{−1} = ( −1 − f(net) ) ( 1 − f²(net) )
[Plot of SF_{+1} and SF_{−1} versus net omitted in transcript.]
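
A quick numeric check of the scale factor (the sample net values are chosen for illustration): SF shrinks toward zero as the unit saturates on the correct side, which is why tanh units learn slowly once |net| is large.

```python
import numpy as np

def scale_factor(d, net):
    """SF = (d - f(net)) * (1 - f(net)**2) for f = tanh."""
    f = np.tanh(net)
    return (d - f) * (1.0 - f**2)

for net in (-2.0, 0.0, 2.0):
    print(f"net={net:+.1f}  SF_+1={scale_factor(+1, net):+.4f}  "
          f"SF_-1={scale_factor(-1, net):+.4f}")
```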

14 Scale Factors for Case 2: Tanh Activation Function (desired values = +0.9 and −0.9) [plot omitted in transcript]

15 Case 3: Local Delta Algorithm for Training an ANE with Linear Activation Function
Given: f(net) = net
Solution: Taking the derivative, f′(net) = 1, and substituting into the general update equation gives the Local Weight Update Equation for the Linear Activation Function (the Widrow-Hoff training rule):
w(p+1) = w(p) + η { d[x(p)] − wᵀ(p) x(p) } x(p)
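
Sketch of Case 3: with f(net) = net the derivative is identically 1, so the step reduces to the familiar LMS form (eta assumed, as before):

```python
import numpy as np

def widrow_hoff_step(w, x, d, eta=0.1):
    """LMS / Widrow-Hoff update: f(net) = net, so f'(net) = 1."""
    return w + eta * (d - w @ x) * x
```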

16 General Global Delta Algorithm
Define a performance measure E_TOT over all samples x_k and decisions d[x_k] as
E_TOT = Σ_{k=1}^{K} E_k
Using the gradient technique gives the Global Delta Algorithm (Global Weight Update Equation):
w(new) = w(old) + η Σ_{k=1}^{K} { d[x_k] − f(net_k) } f′(net_k) x_k, where net_k = wᵀ x_k
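
A sketch of the global (batch) version, assuming the samples are rows of X with desired outputs in D; the whole-set gradient is accumulated before w changes, in contrast to the sample-by-sample local algorithm:

```python
import numpy as np

def global_delta_step(w, X, D, f, f_prime, eta=0.1):
    """One batch update: gradient of E_TOT = sum of E_k over all K samples."""
    grad = np.zeros_like(w)
    for x, d in zip(X, D):
        net = w @ x
        grad += (d - f(net)) * f_prime(net) * x
    return w + eta * grad
```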

17 Definitions A Neural Network is defined as any connection of Neural Elements. An Artificial Neural Network is defined as any connection of Artificial Neural Elements.

18 Examples of Artificial Neural Networks
Feedforward Artificial Neural Networks:
(a) Two-layer neural network
(b) Special three-layer form: Hyperplane-AND-OR structure
(c) General three-layer feedforward structure and nomenclature
Feedback Artificial Neural Networks:
(d) One-layer Hopfield net
(e) Two-layer feedback

19 (a) Example: Two-Layer Neural Network Using the Signum Nonlinearity [figure omitted in transcript]

20 (b) Special Hyperplane-AND-OR Structure
input x → Layer 1 (hyperplanes) → Layer 2 (logical AND) → Layer 3 (logical OR) → output y

21 Building Block: Hyperplane [figure omitted in transcript]

22 Building Block: AND — a unit-step element with inputs μ_i and bias −(n − ½), so the output is 1 only when all n inputs equal 1 [figure omitted in transcript]

23 Building Block: OR — a unit-step element with inputs μ_i and bias −½, so the output is 1 when at least one input equals 1 [figure omitted in transcript]

24 (b) Example: Hyperplane-AND-OR Structure — hyperplanes layer, AND layer, OR layer; all activations f(·) = u(·), the unit step [figure omitted in transcript]
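
An illustrative Python sketch of the three-layer Hyperplane-AND-OR computation with unit-step activations; the example region (a unit square) and the weights W, b are made-up stand-ins for the slide's figure, not taken from it:

```python
import numpy as np

u = lambda v: (v >= 0).astype(float)  # unit-step activation

def hyperplane_and_or(x, W, b):
    """Layer 1: hyperplane units h = u(Wx + b).
    Layer 2: AND of all n hyperplane outputs via u(sum(h) - (n - 0.5)).
    Layer 3: OR over region outputs (a single region here)."""
    h = u(W @ x + b)                        # which side of each hyperplane
    a = u(h.sum() - (len(h) - 0.5))         # AND: 1 only if every h_i = 1
    return u(a - 0.5)                       # OR: 1 if any region fires

# Assumed example region: the unit square 0 <= x1 <= 1, 0 <= x2 <= 1
W = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b = np.array([0., 1., 0., 1.])
print(hyperplane_and_or(np.array([0.5, 0.5]), W, b))  # 1.0 (inside)
print(hyperplane_and_or(np.array([2.0, 0.5]), W, b))  # 0.0 (outside)
```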

25 (c) General Feedforward Structure

26 (d) Example: One-Layer Feedback Structure

27 (e) Example: Two-Layer Feedback Structure

28 Definitions
Analysis of Neural Networks: given a neural network, describe the output for all inputs (mathematical or computer generated).
Synthesis of Neural Networks: given a list of properties and requirements, build a neural network to satisfy the requirements (mathematical or computer generated).

29 Example: Analyze the following neural network [figure omitted in transcript]. Determine the output y_1^(2) for all (x_1, x_2). Solution: (next lecture)

30 Example: Synthesize a Neural Network. Given the following decision regions [figure omitted in transcript], build a neural network to perform the classification. Solution: use the Hyperplane-AND-OR structure (next lecture).

31 Summary of Lecture 20
1. Perceptron Algorithm Revisited
2. Local Delta Training Algorithms for the ANE
3. General Definition of Neural Networks
4. Basic Neural Network Structures: Examples
5. Analysis and Synthesis of Neural Networks

32 Question: How do we train an Artificial Neural Network to perform classification?
Answer: There is no simple answer, but we will look at one way that uses the backpropagation algorithm to do the training. Not today; we have to wait until Friday.

33 End of Lecture 20