Back-propagation
Chih-yun Lin
5/16/2015
Agenda
- Perceptron vs. back-propagation network
- Network structure
- Learning rule
- Why a hidden layer?
- An example: Jets or Sharks
- Conclusions
Network Structure – Perceptron
[Diagram: input units I_j feed through weights W_j into a single output unit O.]
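As a concrete reading of the diagram, here is a minimal Python sketch of the perceptron's forward pass. The step activation and the threshold parameter t are assumptions; the slide does not name the activation.

```python
import numpy as np

def perceptron_output(I, W, t=0.0):
    """Threshold unit: output 1 when the weighted input sum exceeds t, else 0."""
    return 1 if np.dot(W, I) > t else 0
```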
Network Structure – Back-propagation Network
[Diagram: input units I_k feed through weights W_k,j into hidden units a_j, which feed through weights W_j,i into output units O_i.]
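The two-layer forward pass looks like this in the same sketch style; the sigmoid is an assumed choice for the activation g, which the slides leave unspecified.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(I, W_kj, W_ji):
    """Forward pass: inputs I_k -> hidden activations a_j -> outputs O_i."""
    a = sigmoid(W_kj @ I)   # a_j = g(sum_k W_k,j * I_k)
    O = sigmoid(W_ji @ a)   # O_i = g(sum_j W_j,i * a_j)
    return a, O
```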
Learning Rule
- Measure the error.
- Reduce that error by appropriately adjusting each of the weights in the network.
Learning Rule – Perceptron
Err = T − O, where O is the predicted output and T is the correct output.
W_j ← W_j + α · I_j · Err, where I_j is the activation of unit j in the input layer and α is a constant called the learning rate.
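Transcribed into the earlier sketch (step activation and threshold t remain assumptions), one perceptron update is:

```python
import numpy as np

def perceptron_update(W, I, T, alpha=0.1, t=0.0):
    """One step of the perceptron rule: W_j <- W_j + alpha * I_j * Err."""
    O = 1 if np.dot(W, I) > t else 0   # predicted output
    Err = T - O                        # Err = T - O
    return W + alpha * I * Err         # update every weight W_j at once
```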
Learning Rule – Back-propagation Network
Err_i = T_i − O_i
Output layer: W_j,i ← W_j,i + α · a_j · Δ_i, where Δ_i = Err_i · g′(in_i); g′ is the derivative of the activation function g, and a_j is the activation of the hidden unit.
Hidden layer: W_k,j ← W_k,j + α · I_k · Δ_j, where Δ_j = g′(in_j) · Σ_i W_j,i · Δ_i.
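The two rules combine into one gradient step as follows; this sketch assumes a sigmoid activation, for which g′(in) = g(in) · (1 − g(in)).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_step(I, T, W_kj, W_ji, alpha=0.1):
    """One update of both weight layers, following the slide's rules."""
    # Forward pass
    a = sigmoid(W_kj @ I)                    # hidden activations a_j
    O = sigmoid(W_ji @ a)                    # outputs O_i

    # Delta_i = Err_i * g'(in_i); for the sigmoid, g'(in_i) = O_i * (1 - O_i)
    Delta_i = (T - O) * O * (1 - O)
    # Delta_j = g'(in_j) * sum_i W_j,i * Delta_i
    Delta_j = a * (1 - a) * (W_ji.T @ Delta_i)

    W_ji = W_ji + alpha * np.outer(Delta_i, a)   # W_j,i <- W_j,i + alpha * a_j * Delta_i
    W_kj = W_kj + alpha * np.outer(Delta_j, I)   # W_k,j <- W_k,j + alpha * I_k * Delta_j
    return W_kj, W_ji
```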
Learning Rule – Back-propagation Network
E = ½ · Σ_i (T_i − O_i)², and differentiating this error with respect to a first-layer weight gives ∂E/∂W_k,j = −I_k · Δ_j.
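Filling in the chain-rule step behind that gradient, in the deck's notation (with in_i = Σ_j W_j,i · a_j and in_j = Σ_k W_k,j · I_k):

```latex
\[
\frac{\partial E}{\partial W_{k,j}}
  = -\sum_i (T_i - O_i)\, g'(in_i)\, \frac{\partial\, in_i}{\partial W_{k,j}}
  = -\sum_i \Delta_i\, W_{j,i}\, g'(in_j)\, I_k
  = -I_k \underbrace{g'(in_j) \sum_i W_{j,i}\, \Delta_i}_{\Delta_j}
  = -I_k\, \Delta_j .
\]
```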
Why a hidden layer?
A single threshold unit cannot compute XOR. Writing t for the threshold (the unit outputs 1 when the weighted sum exceeds t), the four input cases require:
(1 · w_1) + (1 · w_2) = w_1 + w_2 < t
(1 · w_1) + (0 · w_2) > t ⟹ w_1 > t
(0 · w_1) + (1 · w_2) > t ⟹ w_2 > t
(0 · w_1) + (0 · w_2) = 0 < t
These constraints are inconsistent: w_1 > t and w_2 > t with t > 0 force w_1 + w_2 > t, contradicting the first line.
Why a hidden layer? (cont.)
The same argument extends to three inputs:
(1 · w_1) + (1 · w_2) + (1 · w_3) = w_1 + w_2 + w_3 < t
(1 · w_1) + (0 · w_2) + (0 · w_3) > t ⟹ w_1 > t
(0 · w_1) + (1 · w_2) + (0 · w_3) > t ⟹ w_2 > t
(0 · w_1) + (0 · w_2) + (0 · w_3) = 0 < t
The remaining single-input case (0, 0, 1) likewise requires w_3 > t, so w_1 + w_2 + w_3 > 3t > t, contradicting the first line. A hidden layer removes the inconsistency, as the sketch below shows.
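The following Python sketch wires a 2-2-1 network of threshold units that computes XOR with hand-picked weights; the specific weights and the shared threshold 0.5 are illustrative assumptions, not values from the slides.

```python
import numpy as np

def step(x, t=0.5):
    """Threshold unit: 1 where the input exceeds t, else 0."""
    return (x > t).astype(int)

# Hidden unit 0 computes OR, hidden unit 1 computes AND;
# the output fires for "OR but not AND", which is exactly XOR.
W_hidden = np.array([[1.0, 1.0],    # OR:  x1 + x2 > 0.5
                     [0.5, 0.5]])   # AND: 0.5*x1 + 0.5*x2 > 0.5 only for (1, 1)
W_out = np.array([1.0, -1.0])       # OR minus AND

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    h = step(W_hidden @ np.array(x, dtype=float))
    y = step(W_out @ h)
    print(x, int(y))   # -> 0, 1, 1, 0
```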
An example: Jets or Sharks
Conclusions
- Expressiveness: well suited to continuous inputs, unlike most decision-tree systems.
- Computational efficiency: time to error convergence is highly variable.
- Generalization: neural networks have had reasonable success in a number of real-world problems.
Conclusions (cont.)
- Sensitivity to noise: very tolerant of noise in the input data.
- Transparency: neural networks are essentially black boxes.
- Prior knowledge: it is hard to use one's knowledge to "prime" a network to learn better.