Artificial Neural Networks
a network of simple neuron-like computing elements
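A neuron-like computing element is essentially a weighted sum of inputs passed through a nonlinearity. Below is a minimal sketch assuming a sigmoid unit; the inputs, weights, and function names are illustrative, not from the slides.

```python
# A minimal sketch of one neuron-like computing element: a weighted sum of
# inputs passed through a sigmoid nonlinearity. Values here are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """Output of a single unit: sigmoid(w . x + b)."""
    return sigmoid(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # example inputs
w = np.array([0.1, 0.4, -0.3])   # example weights
print(neuron(x, w, b=0.0))
```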
Learning to Recognize Input Patterns
- feedforward processing network
- weights can be learned from training examples (a mapping from inputs to correct outputs)
- back-propagation: an iterative algorithm that progressively reduces the error between the computed and desired output until performance is satisfactory (see the sketch after this list)
- on each iteration:
  - compute the output of the current network and assess performance
  - compute weight adjustments from the hidden to the output layer that reduce output errors
  - compute weight adjustments from the input to the hidden units that improve the hidden-layer representation
  - change the network weights, scaled by a rate parameter
[figure: back-propagation algorithm to learn network weights]
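The per-iteration steps above can be made concrete in code. Below is a minimal back-propagation sketch assuming a single hidden layer, sigmoid units, and squared-error loss; all names (train_step, W1, W2, lr) and the toy shapes are our own choices, since the slides do not specify an implementation.

```python
# A minimal back-propagation sketch: single hidden layer, sigmoid units,
# squared-error loss, no bias terms. All names are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, W1, W2, lr=0.1):
    # forward pass: compute the output of the current network
    h = sigmoid(W1 @ x)          # hidden-layer activations
    y = sigmoid(W2 @ h)          # output-layer activations

    # assess performance: error between computed and desired output
    err = y - target

    # weight adjustments from hidden to output layer
    delta_out = err * y * (1 - y)             # sigmoid derivative is y(1-y)
    grad_W2 = np.outer(delta_out, h)

    # weight adjustments from input to hidden units (error propagated back)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)
    grad_W1 = np.outer(delta_hid, x)

    # change network weights, scaled by the rate parameter lr
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
    return 0.5 * np.sum(err ** 2)             # loss on this example

# one update on a toy example (shapes: 3 inputs, 4 hidden, 2 outputs)
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(2, 4))
print(train_step(np.array([1.0, 0.5, -0.5]), np.array([1.0, 0.0]), W1, W2))
```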
Example: Learning Handwritten Digits
- MNIST database: 28x28 images of handwritten digits
- network: one input unit for each pixel (28x28 = 784 inputs), 25 hidden units, one output unit for each digit
- start with random initial weights and use back-propagation to learn weights that recognize the digits
- classify by selecting the output unit with the maximum response (e.g. 9)
[figure: network diagram with a sample image from the MNIST database]
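The architecture described on this slide (784 inputs, 25 hidden units, 10 outputs, with the classification taken as the output unit with the maximum response) could be sketched as follows; the weight-initialization scale and function names are assumptions, and the final call uses a random array in place of a real MNIST image.

```python
# Sketch of the slide's architecture: one input unit per pixel of a 28x28
# image (784 inputs), 25 hidden units, 10 output units (one per digit),
# starting from random initial weights. Shapes and names are our own choices.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(25, 28 * 28))   # input -> hidden
W2 = rng.normal(scale=0.1, size=(10, 25))        # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classify(image):
    """Classify a 28x28 image: pick the output unit with maximum response."""
    x = image.reshape(-1)          # flatten pixels into a 784-vector
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)
    return int(np.argmax(y))       # e.g. 9 for a handwritten nine

print(classify(rng.random((28, 28))))  # random "image", untrained weights
```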
Results: Learning Handwritten Digits
[figures: correctly classified examples, misclassified examples, confusion matrix]
overall classification accuracy: 87.1%
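For reference, here is one way the reported numbers could be computed: accuracy as the fraction of correct predictions, and the confusion matrix as counts of each true digit versus each predicted digit. The label arrays below are placeholders, not the slide's actual data.

```python
# Sketch of the evaluation: confusion matrix (rows = true digit, columns =
# predicted digit) and overall accuracy as the fraction of correct
# predictions. The label arrays are placeholders, not the slide's data.
import numpy as np

def evaluate(y_true, y_pred, n_classes=10):
    confusion = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        confusion[t, p] += 1       # count how often true t is predicted as p
    accuracy = np.trace(confusion) / confusion.sum()
    return accuracy, confusion

y_true = np.array([3, 5, 9, 9, 0])     # placeholder labels
y_pred = np.array([3, 5, 9, 4, 0])     # placeholder predictions
acc, cm = evaluate(y_true, y_pred)
print(f"overall classification accuracy: {acc:.1%}")
```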