1
Introduction to Neural Networks Applied to OCR and Speech Recognition. [Figures: an actual neuron; a crude model of a neuron.] Computational Neural Networks: a computational approach inspired by the architecture of the biological nervous system.
2
Introduction to Neural Networks Cat Neural Probe to Study Response
3
Introduction to Neural Networks The Perceptron Model
4
Introduction to Neural Networks Example Weights: AND & OR Problems
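The slide's weight tables are not reproduced here. As an illustration only (one common choice of weights, not necessarily the values shown on the slide), with the perceptron computing V = w1 I1 + w2 I2 + w3 and outputting 1 when V ≥ 0:

    AND: w1 = 1, w2 = 1, w3 = -1.5   (V ≥ 0 only when both inputs are 1)
    OR:  w1 = 1, w2 = 1, w3 = -0.5   (V ≥ 0 when at least one input is 1)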
5
Introduction to Neural Networks Weight Adjustments
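The adjustment itself is not spelled out in this transcript. The standard perceptron rule, assumed here, nudges each weight in proportion to its input and to the difference between the desired output O and the computed output C:

    w_i ← w_i + η (O − C) I_i    (with I_3 = 1 for the bias weight w3)

where η is a small positive learning rate.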
6
Introduction to Neural Networks 3-D and 2-D Plot of AND Table
7
Introduction to Neural Networks Training Procedure
1. First assign any values to w1, w2, and w3.
2. Using the current weight values w1, w2, w3 and the next training item's inputs I1 and I2, compute the value V = w1 I1 + w2 I2 + w3.
3. If V ≥ 0, set the computed output C to 1; else set it to 0.
4. If the computed output C is not the same as the current training item's output O, adjust the weights.
5. Repeat steps 2-4. If you run out of training items, start again with the first training item. Stop repeating if no weight changes occur through one complete training cycle. (A code sketch of this procedure follows below.)
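A minimal sketch of this procedure in Python, using the weight adjustment rule noted above. The training set (the AND truth table), the starting weights, and the learning rate eta are illustrative choices, not values taken from the slides.

    # Perceptron training sketch for the AND problem.
    # Each training item is ((I1, I2), O).
    training_items = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    w1, w2, w3 = 0.2, -0.4, 0.1   # step 1: any starting values
    eta = 0.1                     # learning rate (illustrative)

    changed = True
    while changed:                # step 5: stop when a full cycle makes no change
        changed = False
        for (i1, i2), o in training_items:
            v = w1 * i1 + w2 * i2 + w3      # step 2: weighted sum plus bias
            c = 1 if v >= 0 else 0          # step 3: threshold at zero
            if c != o:                      # step 4: adjust weights on error
                w1 += eta * (o - c) * i1
                w2 += eta * (o - c) * i2
                w3 += eta * (o - c)         # bias input is fixed at 1
                changed = True

    print("learned weights:", w1, w2, w3)

Because AND is linearly separable, this loop is guaranteed to stop with weights that classify all four training items correctly.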
8
Introduction to Neural Networks Gradient Descent Algorithm
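As a generic illustration of gradient descent (not necessarily the exact formulation on the slide), each weight of a single linear unit can be stepped repeatedly opposite the gradient of a squared-error function; the data and learning rate below are hypothetical.

    # Gradient descent on squared error for a single linear unit (the delta rule).
    # Hypothetical data generated by y = 2*x1 - 3*x2 + 1.
    data = [((1.0, 2.0), -3.0), ((0.0, 1.0), -2.0), ((2.0, 0.0), 5.0), ((1.0, 1.0), 0.0)]

    w1, w2, b = 0.0, 0.0, 0.0
    eta = 0.05                      # learning rate

    for epoch in range(2000):
        for (x1, x2), y in data:
            pred = w1 * x1 + w2 * x2 + b
            err = pred - y          # derivative of 0.5*(pred - y)**2 w.r.t. pred
            # Step each weight opposite the gradient of the error.
            w1 -= eta * err * x1
            w2 -= eta * err * x2
            b  -= eta * err

    print(w1, w2, b)                # approaches 2, -3, 1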
9
Introduction to Neural Networks Linearly vs. Non-Linearly Separable
10
Introduction to Neural Networks The XOR Problem is Linearly Non-separable
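As a concrete illustration (not taken from the slides): no single threshold unit can produce XOR's outputs, but a network with one hidden layer of two threshold units can, which motivates the multi-layer back propagation model on the following slides. One hand-built solution computes XOR(a, b) as OR(a, b) AND NOT AND(a, b).

    # XOR from a two-layer threshold network with hand-set weights.
    def step(v):
        return 1 if v >= 0 else 0

    def xor_net(a, b):
        h_or  = step(1*a + 1*b - 0.5)         # hidden unit 1: OR
        h_and = step(1*a + 1*b - 1.5)         # hidden unit 2: AND
        return step(1*h_or - 1*h_and - 0.5)   # output: OR and not AND

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor_net(a, b))        # prints the XOR truth table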
11
Introduction to Neural Networks The Back Propagation Model
12
Introduction to Neural Networks Advantage of Backprop over Perceptron
13
Introduction to Neural Networks Backprop Learning Algorithm
1. Assign random values to all the weights.
2. Choose a pattern from the training set (similar to the perceptron).
3. Propagate the signal through to get the final output (similar to the perceptron).
4. Compute the error for the output layer (similar to the perceptron).
5. Compute the errors in the preceding layers by propagating the error backwards.
6. Change the weight between neuron A and each neuron B in the preceding layer by an amount proportional to the observed output of B and the error of A.
7. Repeat from step 2 for the next training sample. (A code sketch of these steps follows.)
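A minimal sketch of these steps in Python for a small sigmoid network trained on XOR. The layer sizes (2 inputs, 2 hidden, 1 output), the learning rate, and the use of NumPy are assumptions made for illustration, not details from the slides.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Training set: XOR inputs and targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # Step 1: random weights (2 inputs -> 2 hidden -> 1 output, with biases).
    W1 = rng.normal(scale=1.0, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=1.0, size=(2, 1)); b2 = np.zeros(1)
    eta = 0.5

    for epoch in range(20000):
        for x, t in zip(X, T):                         # step 2: pick a training pattern
            h = sigmoid(x @ W1 + b1)                   # step 3: forward propagate
            y = sigmoid(h @ W2 + b2)
            err_out = (t - y) * y * (1 - y)            # step 4: output-layer error
            err_hid = (err_out @ W2.T) * h * (1 - h)   # step 5: back-propagate error
            W2 += eta * np.outer(h, err_out)           # step 6: change proportional to
            b2 += eta * err_out                        #   (output of B) * (error of A)
            W1 += eta * np.outer(x, err_hid)
            b1 += eta * err_hid

    for x in X:
        # Outputs should approach 0, 1, 1, 0 for most random initializations.
        print(x, sigmoid(sigmoid(x @ W1 + b1) @ W2 + b2))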
14
Introduction to Neural Networks Application: Needs Enough Training
15
Introduction to Neural Networks Application to Speech Processing
16
Introduction to Neural Networks Speech
17
Introduction to Neural Networks Appendix
18
Introduction to Neural Networks Error Propagation Algorithm If neuron A in one layer is connected to B, C, and D in the output layer, it is responsible for part of the errors observed at B, C, and D. Thus, the error at A can be computed by summing the errors at B, C, and D, each weighted by the strength of the connection between A and that neuron.
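In symbols (with the activation-derivative factor that full back propagation adds, which the slide leaves implicit): if w_AB, w_AC, w_AD are the connection strengths from A to B, C, and D, and δ_B, δ_C, δ_D are their errors, then a common form of the error at A is

    δ_A = f'(net_A) · (w_AB δ_B + w_AC δ_C + w_AD δ_D)

where f' is the derivative of A's activation function and net_A is A's weighted input.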