Artificial neural networks Šarūnas Stanskis
Some history
1943: Warren McCulloch and Walter Pitts introduced models of neurological networks, recreating threshold switches based on neurons and showing that even simple networks of this kind can compute nearly any logic or arithmetic function.
1949: Donald O. Hebb formulated the classical Hebbian rule, which in its generalized form is the basis of nearly all neural learning procedures.
1951: For his dissertation, Marvin Minsky developed the neurocomputer Snark, which was already capable of adjusting its weights automatically …
Neural networks Biological neural network Artificial neural network
Biological and artificial neurons Biological neuron Artificial neuron
Neuron model
Inputs with corresponding weights: x1,w1  x2,w2  …  xn,wn
Summation and transfer function: y = f(∑ xi·wi), i = 1 … n
Output: y
Decision boundary
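The summation-and-transfer step above can be sketched in a few lines; the step transfer function and the sample inputs below are illustrative assumptions, not values from the slides:

```python
# A minimal sketch of a single artificial neuron: a weighted sum of the
# inputs passed through a transfer function f.
def neuron_output(inputs, weights, f):
    # Summation: s = sum over i of x_i * w_i
    s = sum(x * w for x, w in zip(inputs, weights))
    # Transfer function applied to the weighted sum
    return f(s)

# A hard-limit step function as an example transfer function
step = lambda s: 1 if s >= 0 else 0

print(neuron_output([1, 0, 1], [0.5, -0.2, 0.3], step))  # weighted sum 0.8 -> 1
```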
Perceptron Perceptron model
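A perceptron's weights can be trained with the classic perceptron learning rule, w ← w + lr·(target − prediction)·x. The sketch below is illustrative (learning rate, epoch count, and the logical-AND example are assumptions); it uses an integer learning rate so the arithmetic stays exact:

```python
# Sketch of the perceptron learning rule on a linearly separable problem.
def train_perceptron(samples, lr=1, epochs=20):
    n = len(samples[0][0])
    w = [0] * n   # weights
    b = 0         # bias
    for _ in range(epochs):
        for x, target in samples:
            s = sum(xi * wi for xi, wi in zip(x, w)) + b
            y = 1 if s >= 0 else 0            # hard-limit activation
            err = target - y                  # 0 when prediction is correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn logical AND, which is linearly separable:
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

The learned weights and bias define the decision boundary w·x + b = 0; for non-separable data (e.g. XOR) this rule never converges, which motivates multilayer networks.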
Activation function examples: Linear, Threshold, Logistic sigmoid, Tangent sigmoid (tanh)
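The four activation functions named above can be written out directly; the names below are just labels for this sketch:

```python
import math

# Sketches of the four activation functions from the slide:
linear    = lambda s: s                        # identity, unbounded output
threshold = lambda s: 1 if s >= 0 else 0       # hard-limit step, output {0, 1}
logsig    = lambda s: 1 / (1 + math.exp(-s))   # logistic sigmoid, output in (0, 1)
tansig    = lambda s: math.tanh(s)             # tangent sigmoid, output in (-1, 1)
```

They are needed because the transfer function decides how a neuron's weighted sum maps to its output; the nonlinear ones (threshold, sigmoids) are what let multilayer networks represent functions a purely linear network cannot.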
Feedforward ANN Feedforward ANN includes: Single layer perceptron Multilayer perceptron Radial basis function network Multilayer Feed Forward ANN example
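A forward pass through a multilayer feedforward network just applies each layer's weighted sums and activations in turn, with no feedback connections. A minimal sketch, where the layer sizes and weight values are made up for illustration:

```python
import math

def forward(x, layers):
    """layers: list of (weight_matrix, bias_vector) pairs; sigmoid activations."""
    for W, b in layers:
        # Each neuron: sigmoid of (weighted sum of previous layer + bias)
        x = [1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + bi)))
             for row, bi in zip(W, b)]
    return x

# Illustrative network: 2 inputs -> 2 hidden neurons -> 1 output neuron
layers = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2]),   # hidden layer
    ([[1.0, -1.0]], [0.05]),                     # output layer
]
y = forward([1.0, 0.0], layers)
```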
Feedback ANN Recurrent ANN example
ANN learning The purpose of ANN learning is to minimize the error between the network's response and the desired result. ANN learning algorithms can be divided into three categories: Supervised Learning with a Teacher; Supervised Learning with Reinforcement; Unsupervised Learning.
Supervised Learning with a Teacher The network is provided with a set of inputs and the appropriate outputs for those inputs.
Supervised Learning with Reinforcement The network is provided with an evaluation of its output given the input, and alters its weights to try to increase the reinforcement it receives.
Unsupervised Learning The network receives no external feedback but has an internal criterion that it tries to fulfil given the inputs that it faces.
Backpropagation Backward propagation of errors
Backpropagation First apply the inputs to the network and work out the output. Next work out the errors for the neurons in the output layer. Change the output-layer weights. Calculate the errors for the hidden-layer neurons using the errors of the neurons in the succeeding layer. Change the hidden-layer weights.
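The steps above can be sketched for one hidden layer with sigmoid units and squared error; the network shape, learning rate, and variable names are illustrative assumptions:

```python
import math

def sigmoid(s):
    return 1 / (1 + math.exp(-s))

def train_step(x, target, W1, W2, lr=0.5):
    # 1. Apply the inputs and work out the output (forward pass).
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W2]
    # 2. Work out the errors (deltas) for the output-layer neurons:
    #    delta_k = (t_k - y_k) * y_k * (1 - y_k)
    d_out = [(t - yk) * yk * (1 - yk) for t, yk in zip(target, y)]
    # 4. Hidden-layer errors, from the output-layer deltas propagated back
    #    through the (still unchanged) output weights W2:
    #    delta_j = h_j * (1 - h_j) * sum_k delta_k * w_kj
    d_hid = [hj * (1 - hj) * sum(dk * W2[k][j] for k, dk in enumerate(d_out))
             for j, hj in enumerate(h)]
    # 3. Change the output-layer weights (done after step 4 so the
    #    hidden deltas use the old weights).
    for k, dk in enumerate(d_out):
        W2[k] = [w + lr * dk * hj for w, hj in zip(W2[k], h)]
    # 5. Change the hidden-layer weights.
    for j, dj in enumerate(d_hid):
        W1[j] = [w + lr * dj * xi for w, xi in zip(W1[j], x)]
    return y
```

Repeated over the training set, these updates perform gradient descent on the squared error, which is why backpropagation can get stuck in a local minimum (see the next slide).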
Problems related to ANN learning Finding local minimum instead of global minimum. Neural network overfitting to training data.
ANN application ANN applications can be divided into two main task types: Classification Regression
ANN application More detailed examples: Image and signal analysis Character recognition Sales forecasting Industrial process control Customer research Data validation Risk management Target marketing
Questions What are the two ANN architecture categories? Name at least two activation functions. Why are they needed? What are the ANN learning categories? What problems can arise during ANN learning? Name some ANN applications.
Thank you for your attention