BACKPROPAGATION NETWORKS
The algorithm works in 3 phases:
(1) Feedforward
(2) Backpropagation of error
(3) Weight and bias update
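The three phases above can be sketched as one training step for a tiny 2-2-1 network. This is a minimal illustration in pure Python, not a production implementation; the network size, learning rate, and function names are assumptions chosen only to keep the example small.

```python
import math

def sigmoid(x):
    # Binary sigmoid activation, output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, W1, b1, W2, b2, lr=0.5):
    """One full pass of the 3-phase algorithm for a 2-2-1 net.
    W1: per-hidden-unit input weight lists, b1: hidden biases,
    W2: output weights, b2: output bias (illustrative names)."""
    # Phase 1: feedforward -- compute hidden activations and output
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(W1, b1)]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    # Phase 2: backpropagation of error -- error terms (deltas)
    delta_out = (target - y) * y * (1 - y)          # output-layer delta
    delta_h = [delta_out * w * hi * (1 - hi)        # hidden-layer deltas
               for w, hi in zip(W2, h)]
    # Phase 3: weight and bias update -- gradient-descent step
    W2 = [w + lr * delta_out * hi for w, hi in zip(W2, h)]
    b2 = b2 + lr * delta_out
    W1 = [[w + lr * d * xi for w, xi in zip(ws, x)]
          for ws, d in zip(W1, delta_h)]
    b1 = [b + lr * d for b, d in zip(b1, delta_h)]
    return W1, b1, W2, b2, y
```

Calling `train_step` repeatedly on the same sample should drive the output `y` toward the target, shrinking the error at each pass.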
Difficulty:- Training the network takes substantial time, but once training is complete the network runs very fast. The challenge is calculating the hidden-layer weights efficiently so that the output error becomes very small or zero. (As the number of hidden layers increases, training the network becomes more complex.)
Note:- Backpropagation networks differ from other networks in how the weights are calculated during training. Backpropagation is supervised learning (the actual output is compared with the desired/target output). The output of a backpropagation network can be binary (1, 0) or bipolar (-1, 1), so the activation functions used are (1) binary sigmoid and (2) bipolar sigmoid.
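The two sigmoid activations mentioned above can be written out directly. A small sketch, using the standard formulas (ranges and derivative identities noted in the docstrings):

```python
import math

def binary_sigmoid(x):
    """f(x) = 1 / (1 + e^-x), output range (0, 1).
    Derivative: f'(x) = f(x) * (1 - f(x))."""
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    """f(x) = 2 / (1 + e^-x) - 1, output range (-1, 1).
    Derivative: f'(x) = 0.5 * (1 + f(x)) * (1 - f(x))."""
    return 2.0 / (1.0 + math.exp(-x)) - 1.0
```

Use the binary sigmoid for (1, 0) targets and the bipolar sigmoid for (-1, 1) targets; both are smooth, which is what lets the error be backpropagated through them.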
Momentum is used to control possible oscillations in the weights, which can be caused by alternately signed error signals. (The momentum factor also helps achieve faster convergence.) The BPN is among the best networks for generalization.
APPLICATIONS OF BACKPROPAGATION NETWORK
(1) Load forecasting problems in power systems
(2) Image processing
(3) Fault diagnosis and fault detection
(4) Gesture recognition, speech recognition
(5) Signature verification
(6) Bioinformatics
(7) Structural engineering design (civil)