Introduction to Neural Networks: Applied to OCR and Speech Recognition
An actual neuron; a crude model of a neuron. Computational neural networks: a computational approach inspired by the architecture of the biological nervous system.
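As a hedged illustration of the "crude model of a neuron" named on this slide, the sketch below sums weighted inputs and fires when the sum crosses a threshold; the weights, inputs, and threshold are invented for the example, not taken from the slides.

```python
# Minimal sketch of a "crude model of a neuron": multiply each input by a
# weight, sum the results, and fire (output 1) if the sum exceeds a threshold.
# The specific numbers are illustrative only.

def neuron(inputs, weights, threshold=0.0):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

print(neuron(inputs=[1, 0], weights=[0.7, 0.3], threshold=0.5))  # prints 1
```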

Cat Neural Probe to Study Response

The Perceptron Model

Example Weights: AND and OR Problems
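The slide shows the example weights as a figure; as an assumed but standard illustration, the snippet below uses w1 = w2 = 1 with a bias weight of -1.5 for AND and -0.5 for OR, which reproduce both truth tables with a single threshold unit.

```python
# Illustrative weights (not necessarily the ones on the slide) for a single
# threshold unit computing AND and OR over binary inputs I1, I2. The third
# weight w3 plays the role of a bias on a constant input of 1.

def threshold_unit(i1, i2, w1, w2, w3):
    v = w1 * i1 + w2 * i2 + w3   # weighted sum including the bias
    return 1 if v > 0 else 0

for i1 in (0, 1):
    for i2 in (0, 1):
        and_out = threshold_unit(i1, i2, 1.0, 1.0, -1.5)  # AND: fires only on (1, 1)
        or_out = threshold_unit(i1, i2, 1.0, 1.0, -0.5)   # OR: fires unless (0, 0)
        print((i1, i2), "AND:", and_out, "OR:", or_out)
```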

Weight Adjustments

3-D and 2-D Plots of the AND Table

Training Procedure
1. First assign arbitrary values to w1, w2, and w3.
2. Using the current weights w1, w2, and w3 and the next training item's inputs I1 and I2, compute the value V = w1·I1 + w2·I2 + w3 (taking w3 as the bias weight).
3. If V > 0, set the computed output C to 1; otherwise set it to 0.
4. If the computed output C is not the same as the current training item's output O, adjust the weights.
5. Repeat steps 2-4. If you run out of training items, start again with the first one. Stop when no weights change through one complete training cycle.
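A minimal sketch of this procedure on the AND table, assuming w3 is the bias weight and using the standard perceptron weight adjustment (the slide only says "adjust weights", so the exact rule here is an assumption); the learning rate and starting weights are illustrative.

```python
# Perceptron training loop for the AND table, following steps 1-5 above.

AND_TABLE = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
rate = 0.1

w1, w2, w3 = 0.2, -0.4, 0.1           # step 1: arbitrary starting weights

changed = True
while changed:                        # step 5: stop after a change-free full cycle
    changed = False
    for (i1, i2), o in AND_TABLE:     # step 2: next training item
        v = w1 * i1 + w2 * i2 + w3    #         compute V
        c = 1 if v > 0 else 0         # step 3: computed output C
        if c != o:                    # step 4: adjust weights when C differs from O
            w1 += rate * (o - c) * i1
            w2 += rate * (o - c) * i2
            w3 += rate * (o - c)
            changed = True

print(w1, w2, w3)                     # weights that now reproduce the AND table
```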

Gradient Descent Algorithm
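This slide is a figure; as a hedged, generic illustration of the idea, the snippet below minimizes the toy error function f(x) = (x - 3)^2 by repeatedly stepping against its derivative. The function and step size are invented for the example, not taken from the slides.

```python
# Gradient descent on a one-variable toy error function f(x) = (x - 3)**2.
# Each step moves x a small amount opposite the derivative, so f decreases.

def f_prime(x):
    return 2 * (x - 3)        # derivative of (x - 3)**2

x, step = 10.0, 0.1
for _ in range(100):
    x -= step * f_prime(x)    # move downhill

print(round(x, 4))            # close to the minimum at x = 3
```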

Linearly vs. Non-Linearly Separable

The XOR Problem Is Linearly Non-separable

The Back Propagation Model

Advantage of Backprop over the Perceptron

Backprop Learning Algorithm
1. Assign random values to all the weights.
2. Choose a pattern from the training set (as with the perceptron).
3. Propagate the signal forward through the network to get the final output (as with the perceptron).
4. Compute the error for the output layer (as with the perceptron).
5. Compute the errors in the preceding layers by propagating the error backwards.
6. Change the weight between neuron A and each neuron B in the layer that feeds it by an amount proportional to the observed output of B and the error of A.
7. Repeat from step 2 for the next training sample.
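A minimal sketch of these seven steps for a tiny 2-2-1 sigmoid network trained on the XOR table from the earlier slide. The network size, learning rate, iteration count, and random seed are illustrative choices; an unlucky initialization can settle in a poor local minimum, and re-running with another seed usually fixes that.

```python
# Backprop sketch: 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output,
# trained on XOR. Step numbers refer to the algorithm above.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Step 1: random initial weights; the last entry of each row is a bias weight.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]
rate = 0.5
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

for _ in range(10000):
    for (x1, x2), target in XOR:                       # step 2: choose a pattern
        # Step 3: forward propagation
        h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w_hidden]
        y = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
        # Step 4: error at the output layer (sigmoid derivative included)
        delta_out = (target - y) * y * (1 - y)
        # Step 5: propagate the error back to the hidden layer
        delta_h = [delta_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Step 6: change each weight in proportion to (error of A) * (output of B)
        for j in range(2):
            w_out[j] += rate * delta_out * h[j]
        w_out[2] += rate * delta_out                    # output bias update
        for j in range(2):
            w_hidden[j][0] += rate * delta_h[j] * x1
            w_hidden[j][1] += rate * delta_h[j] * x2
            w_hidden[j][2] += rate * delta_h[j]         # hidden bias update

for (x1, x2), target in XOR:                            # check the trained net
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w_hidden]
    y = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    print((x1, x2), "->", round(y, 2), "target", target)
```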

Application: Needs Enough Training

Application to Speech Processing

Speech

Appendix

Error Propagation Algorithm
If neuron A in one layer is connected to neurons B, C, and D in the output layer, then A is partly responsible for the errors observed at B, C, and D. The error at A is therefore computed by summing the errors at B, C, and D, each weighted by the strength of the connection between A and that neuron.
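In standard notation this reads as follows (a reconstruction, since the slide states the rule only in prose; the derivative factor f'(net_A) is the usual extra term in the sigmoid back-propagation formulation):

```latex
\delta_A \;=\; f'(\mathrm{net}_A)\,\bigl( w_{AB}\,\delta_B \;+\; w_{AC}\,\delta_C \;+\; w_{AD}\,\delta_D \bigr)
```

Here w_AB, w_AC, w_AD are the connection strengths between A and the neurons B, C, D, and delta_B, delta_C, delta_D are the errors already computed at those neurons.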