BACKPROPAGATION NETWORKS
Terms used in the algorithm:
Works in 3 phases:
(1) Feed Forward
(2) Back Propagation of Error
(3) Weight and Bias Update
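The three phases can be sketched for a single training sample on a one-hidden-layer network. The layer sizes, learning rate, and variable names below are illustrative assumptions, not part of the original slides; the activation is the binary sigmoid.

```python
import numpy as np

def sigmoid(x):
    # Binary sigmoid, range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 1))      # input vector (assumed size)
t = np.array([[1.0]])            # desired (target) output
V = rng.normal(size=(3, 2))      # input -> hidden weights
W = rng.normal(size=(1, 3))      # hidden -> output weights
lr = 0.5                         # learning rate (assumed)

# Phase 1: feed forward
z = sigmoid(V @ x)               # hidden activations
y = sigmoid(W @ z)               # network output

# Phase 2: back propagation of error
# Error terms use the sigmoid derivative f'(a) = f(a) * (1 - f(a))
delta_out = (t - y) * y * (1 - y)
delta_hid = (W.T @ delta_out) * z * (1 - z)

# Phase 3: weight update (biases would be updated the same way)
W += lr * delta_out @ z.T
V += lr * delta_hid @ x.T
```

One pass over these three phases is a single training step; training repeats the cycle over the whole data set until the output error is acceptably small.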
Difficulty: Training the network takes substantial time, but once training is complete the network performs very fast. The challenge is to calculate the hidden-layer weights efficiently so that the output error becomes very small or zero. (As the number of hidden layers increases, training the network becomes more complex.)
Note: A back propagation network differs from other networks in how the weights are calculated during training. Back propagation is supervised learning: the actual output is compared with the desired (target) output. The output of a back propagation network can be binary (1, 0) or bipolar (-1, 1), so the activation functions used are (1) binary sigmoid and (2) bipolar sigmoid.
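The two activation functions mentioned above can be written directly. The function names are mine; the bipolar sigmoid is equivalent to tanh(x/2).

```python
import numpy as np

def binary_sigmoid(x):
    # Range (0, 1): used when targets are binary {0, 1}
    return 1.0 / (1.0 + np.exp(-x))

def bipolar_sigmoid(x):
    # Range (-1, 1): used when targets are bipolar {-1, 1};
    # algebraically equal to tanh(x / 2)
    return 2.0 / (1.0 + np.exp(-x)) - 1.0
```

Both are smooth and differentiable, which is what back propagation requires for computing the error terms.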
Momentum is used to damp possible oscillations in the weights, which can arise from alternately signed error signals. (The momentum factor also helps achieve faster convergence.) The BPN is considered the best network for generalization.
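A minimal sketch of a weight update with a momentum term, using assumed values for the learning rate `lr` and momentum factor `mu`: each update mixes the new gradient step with a fraction of the previous update, which smooths out sign-alternating error signals.

```python
import numpy as np

lr, mu = 0.1, 0.9                   # learning rate and momentum factor (assumed)
w = np.zeros(3)                     # weights
delta_w = np.zeros(3)               # previous weight change

grad = np.array([1.0, -2.0, 0.5])   # gradient of the error w.r.t. w (example values)

# delta_w(t) = -lr * grad + mu * delta_w(t-1)
delta_w = -lr * grad + mu * delta_w
w += delta_w
```

When successive gradients flip sign, the `mu * delta_w` term partially cancels the reversal; when they agree, it accumulates, giving faster convergence.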
APPLICATIONS OF BACKPROPAGATION NETWORK
Load forecasting problems in power systems
Image processing
Fault diagnosis and fault detection
Gesture recognition, speech recognition
Signature verification
Bioinformatics
Structural engineering design (civil)