Artificial Neural Networks. Sanun Srisuk, 42973003. EECP0720 Expert Systems – Artificial Neural Networks.


1 Artificial Neural Networks Sanun Srisuk EECP0720 Expert Systems – Artificial Neural Networks

2 Introduction Artificial neural networks (ANNs) provide a general, practical method for learning real-valued, discrete-valued, and vector-valued functions from examples. Algorithms such as BACKPROPAGATION use gradient descent to tune network parameters to best fit a training set of input-output pairs. ANN learning is robust to errors in the training data and has been successfully applied to problems such as face recognition/detection, speech recognition, and learning robot control strategies.

3 Autonomous Vehicle Steering

4 Characteristics of ANNs
- Instances are represented by many attribute-value pairs.
- The target function output may be discrete-valued, real-valued, or a vector of several real- or discrete-valued attributes.
- The training examples may contain errors.
- Long training times are acceptable.
- Fast evaluation of the learned target function may be required.
- The ability of humans to understand the learned target function is not important.

5 Perceptrons One type of ANN system is based on a unit called a perceptron. The perceptron function can sometimes be written as o(x) = sgn(w . x), where sgn(y) = 1 if y > 0 and -1 otherwise, and the input vector x includes a constant x0 = 1 so that w0 acts as a threshold. The space H of candidate hypotheses considered in perceptron learning is the set of all possible real-valued weight vectors.
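The thresholded unit just described can be sketched in Python. The function name and the convention of a leading bias input x0 = 1 are illustrative assumptions, not part of the slides:

```python
import numpy as np

def perceptron_output(w, x):
    """Perceptron unit: o(x) = sgn(w . x), returning +1 or -1.
    x is assumed to carry a leading constant 1 so w[0] acts as the threshold."""
    return 1 if np.dot(w, x) > 0 else -1
```

For example, w = [-0.8, 0.5, 0.5] implements AND over inputs in {0, 1}: only x = [1, 1, 1] gives w . x = 0.2 > 0.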

6 Representational Power of Perceptrons

7 Decision Surface (Figures: a linear decision surface and a nonlinear decision surface; programming example of a decision surface.)

8 The Perceptron Training Rule One way to learn an acceptable weight vector is to begin with random weights, then iteratively apply the perceptron to each training example, modifying the weights whenever it misclassifies an example. This process is repeated, iterating through the training examples as many times as needed, until the perceptron classifies all training examples correctly. Weights are modified at each step according to the perceptron training rule, which revises the weight w_i associated with input x_i as w_i <- w_i + eta * (t - o) * x_i, where t is the target output, o is the perceptron output, and eta is the learning rate.
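The loop described on this slide can be sketched as follows; the function name, the zero initialization (the slide suggests random weights), and the epoch cap are illustrative assumptions:

```python
import numpy as np

def train_perceptron(X, t, eta=0.1, max_epochs=100):
    """Perceptron training rule: w_i <- w_i + eta * (t - o) * x_i.
    X: examples, each with a leading bias input of 1; t: targets in {-1, +1}."""
    w = np.zeros(X.shape[1])  # zeros keep the demo deterministic; random init also works
    for _ in range(max_epochs):
        errors = 0
        for x, target in zip(X, t):
            o = 1 if np.dot(w, x) > 0 else -1  # thresholded perceptron output
            if o != target:
                w += eta * (target - o) * x    # update only on misclassification
                errors += 1
        if errors == 0:  # every training example classified correctly
            break
    return w
```

For linearly separable data such as boolean OR, this loop terminates with all examples classified correctly.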

9 Gradient Descent and the Delta Rule The delta training rule is best understood by considering the task of training an unthresholded perceptron; that is, a linear unit for which the output o is given by o(x) = w . x. In order to derive a weight learning rule for linear units, let us begin by specifying a measure for the training error of a hypothesis (weight vector) relative to the training examples: E(w) = 1/2 * sum over d in D of (t_d - o_d)^2, where D is the set of training examples.

10 Visualizing the Hypothesis Space (Figure: the error surface over weight space; gradient descent moves a randomly initialized weight vector toward the minimum error.)

11 Derivation of the Gradient Descent Rule The vector derivative grad E(w) = [dE/dw0, dE/dw1, ..., dE/dwn] is called the gradient of E with respect to w. The gradient specifies the direction that produces the steepest increase in E; the negative of this vector therefore gives the direction of steepest decrease. The training rule for gradient descent is w <- w + delta_w, where delta_w = -eta * grad E(w).

12 Derivation of the Gradient Descent Rule (cont.) The negative sign is present because we want to move the weight vector in the direction that decreases E. This training rule can also be written in its component form, delta_w_i = -eta * dE/dw_i, which makes it clear that steepest descent is achieved by altering each component w_i of w in proportion to dE/dw_i.

13 Derivation of the Gradient Descent Rule (cont.) The vector of derivatives that forms the gradient can be obtained by differentiating E: dE/dw_i = sum over d of (t_d - o_d)(-x_id). The weight update rule for standard gradient descent can therefore be summarized as delta_w_i = eta * sum over d of (t_d - o_d) * x_id.
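The batch update just derived can be sketched in a few lines of NumPy; the function name, zero initialization, and the fixed epoch count are illustrative assumptions:

```python
import numpy as np

def train_linear_unit(X, t, eta=0.05, epochs=500):
    """Batch gradient descent on E(w) = 1/2 * sum_d (t_d - o_d)^2 for a
    linear unit o = w . x. Update: delta_w_i = eta * sum_d (t_d - o_d) * x_id."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        o = X @ w                   # outputs for all training examples at once
        w += eta * X.T @ (t - o)    # accumulate the update over the whole batch
    return w
```

With a leading bias input of 1, this fits a straight line; e.g. training on points from t = 2x + 1 drives w toward [1, 2].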

14 Stochastic Approximation to Gradient Descent

15 Summary of Perceptron
- The perceptron training rule is guaranteed to succeed if the training examples are linearly separable and the learning rate is sufficiently small.
- The linear unit training rule uses gradient descent; it is guaranteed to converge to the hypothesis with minimum squared error, given a sufficiently small learning rate, even when the training data contains noise.

16 BACKPROPAGATION Algorithm

17 Error Function The BACKPROPAGATION algorithm learns the weights for a multilayer network, given a network with a fixed set of units and interconnections. It employs gradient descent to attempt to minimize the squared error between the network output values and the target values for those outputs. We begin by redefining E to sum the errors over all of the network output units: E(w) = 1/2 * sum over d in D, sum over k in outputs of (t_kd - o_kd)^2, where outputs is the set of output units in the network, and t_kd and o_kd are the target and output values associated with the k-th output unit and training example d.
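This redefined error is a one-liner in NumPy; the function name and the row-per-example layout of the target and output matrices are illustrative assumptions:

```python
import numpy as np

def network_error(T, O):
    """E = 1/2 * sum over examples d and output units k of (t_kd - o_kd)^2.
    T and O are (n_examples, n_outputs) arrays of targets and network outputs."""
    return 0.5 * np.sum((T - O) ** 2)
```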

18 Architecture of Backpropagation

19 Backpropagation Learning Algorithm

20 Backpropagation Learning Algorithm (cont.)

21 Backpropagation Learning Algorithm (cont.)

22 Backpropagation Learning Algorithm (cont.)

23 Backpropagation Learning Algorithm (cont.)
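The algorithm slides above (shown as figures in the original deck) present the stochastic-gradient version of BACKPROPAGATION for a two-layer sigmoid network. A minimal NumPy sketch follows, assuming one hidden layer, a leading bias input x0 = 1 in each example, and illustrative function names:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_backprop(X, T, n_hidden=3, eta=0.5, epochs=5000, seed=1):
    """Stochastic-gradient BACKPROPAGATION for one hidden layer of sigmoid units.
    Output error terms: delta_k = o_k * (1 - o_k) * (t_k - o_k);
    hidden error terms: delta_h = o_h * (1 - o_h) * sum_k w_kh * delta_k."""
    rng = np.random.default_rng(seed)
    W1 = rng.uniform(-0.5, 0.5, (n_hidden, X.shape[1]))      # input -> hidden
    W2 = rng.uniform(-0.5, 0.5, (T.shape[1], n_hidden + 1))  # hidden (+bias) -> output
    for _ in range(epochs):
        for x, t in zip(X, T):
            h = np.append(1.0, sigmoid(W1 @ x))  # hidden activations, with bias unit
            o = sigmoid(W2 @ h)                  # network outputs
            delta_o = o * (1 - o) * (t - o)
            delta_h = h[1:] * (1 - h[1:]) * (W2[:, 1:].T @ delta_o)
            W2 += eta * np.outer(delta_o, h)     # delta w_kh = eta * delta_k * h_h
            W1 += eta * np.outer(delta_h, x)     # delta w_hi = eta * delta_h * x_i
    return W1, W2

def predict(W1, W2, x):
    h = np.append(1.0, sigmoid(W1 @ x))
    return sigmoid(W2 @ h)
```

The classic demonstration is training this network on XOR, which no single perceptron can represent; small random initial weights and repeated presentation of the examples are the usual recipe.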

24 Face Detection using Neural Networks (Diagram: in the training process, a neural network is trained on a face database and a non-face database, with target output 1 for face images and 0 for non-face images; in the testing process, the trained network classifies a new image as face or non-face.)

25 End of Presentation

26 Derivation of Backpropagation

27 Derivation of Backpropagation (cont.)