1
Artificial Neural Nets and AI: Connectionism and Sub-symbolic Reasoning
2
Human Neural Nets
Our brains contain about 10¹¹ neurons (brain cells).
Electrochemical signals reach the nucleus (cell body) via dendrites; dendrites are nerve fibres which carry impulses towards the cell body.
The axon delivers the neuron's output to neighbouring neurons.
A neuron only fires when the collective influence of its inputs reaches a certain threshold.
3
Synapses
One neuron is connected to other neurons at synapses.
There are about 10¹⁶ synapses in our brains.
A single neuron may receive inputs from as many as 10⁵ synapses.
It is still not totally clear how we learn, remember and reason, but these abilities appear to be associated with the interconnections between neurons, i.e. the synapses.
4
The Perceptron
Artificial neural nets try to model the low-level hardware and electrochemical activity of the brain.
A single artificial neuron has inputs, a processing element and an output.
5
Perceptrons as Linear Discriminators
A perceptron with two inputs and one output solves two-class classification problems: I₁W₁ + I₂W₂ > T gives class 1, otherwise class 2.
Learning involves a known training set and adjusting the weights to get the correct result.
But NB the discriminating boundary I₁W₁ + I₂W₂ = T is a straight line, so a single perceptron can only classify linearly separable classes.
More inputs allow linear separation in a higher-dimensional space.
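As a concrete illustration (an addition to these notes, not part of the original slides), here is a minimal sketch of perceptron learning in Python; the AND training data, learning rate and epoch count are all assumed values chosen for the demo.

```python
# Minimal perceptron learning sketch (illustrative values throughout).
# A 2-input threshold unit: output 1 if I1*W1 + I2*W2 > T, else 0.
# The threshold T is folded into a bias term, as a later slide explains.

def train_perceptron(samples, lr=0.1, epochs=50):
    """samples: list of ((i1, i2), target) pairs, targets 0 or 1."""
    w1 = w2 = bias = 0.0
    for _ in range(epochs):
        for (i1, i2), target in samples:
            out = 1 if i1 * w1 + i2 * w2 + bias > 0 else 0
            error = target - out
            # Perceptron learning rule: nudge weights towards the target.
            w1 += lr * error * i1
            w2 += lr * error * i2
            bias += lr * error
    return w1, w2, bias

# AND is linearly separable, so this converges.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(and_data)
for (i1, i2), target in and_data:
    print(i1, i2, "->", 1 if i1 * w1 + i2 * w2 + b > 0 else 0)
```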
6
The XOR Problem

     XOR              AND              OR
I1  I2  Out      I1  I2  Out      I1  I2  Out
 1   1   0        1   1   1        1   1   1
 0   1   1        0   1   0        0   1   1
 1   0   1        1   0   0        1   0   1
 0   0   0        0   0   0        0   0   0

AND and OR are linearly separable, but XOR is not, so no single perceptron can compute it.
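An added sketch of why a second layer fixes this: XOR decomposes into linearly separable pieces (XOR = OR and not AND), each computable by a single threshold unit. The weights and thresholds below are hand-picked for illustration.

```python
# Illustrative sketch: XOR from two layers of threshold units.
# Hidden units compute OR and AND (both linearly separable);
# the output unit fires for "OR but not AND", which is XOR.

def step(x):
    return 1 if x > 0 else 0

def xor_net(i1, i2):
    h_or  = step(i1 + i2 - 0.5)      # OR:  fires if at least one input is 1
    h_and = step(i1 + i2 - 1.5)      # AND: fires only if both inputs are 1
    return step(h_or - h_and - 0.5)  # OR and not AND = XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```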
7
An Artificial Neural Network
A network of interconnected artificial neurons.
Many topologies, many algorithms and applications.
Multilayer perceptrons can solve non-linear pattern classification problems.
8
Multilayer Perceptron
9
Analogy between human and artificial neurons

Human        Artificial
neuron       processing element
dendrites    combining function
cell body    transfer function
axon         element output
synapses     weights

Note, the analogy is weak:
Real neuronal activity is much more complex (electrochemical in nature).
Real neurons don’t just add up inputs: complex dendritic mechanisms occur.
10
Is the use of ANNs a good idea?
Is it a retrograde step compared with symbolic approaches? The hope is that high-level reasoning is an emergent property of the activity in the network.
Advantages include:
–parallelism
–learning capacity
–self-organisation in some cases
–distributed memory, giving resistance to noise and graceful degradation
–the capacity to generalise and learn rules from examples
11
Disadvantages
Neural systems are inherently parallel but are normally simulated on sequential machines.
–Processing time can rise quickly as the size of the problem grows: the Scaling Problem.
–In consequence, neural networks are mainly used to address small problems.
The performance of a network can be sensitive to the quality and type of preprocessing of the input data.
Neural networks do not explain the results they obtain; their rules of operation are essentially hidden. This black-box nature gives rise to distrust on the part of potential users.
Some design decisions required in developing an application are not well understood.
12
Applications
Pattern classification (supervised learning), e.g. in computer vision, speech and character recognition
Clustering (unsupervised learning)
Optimisation
Market trading: stock forecasting, pension fund management
Fraud detection: cheque approval, signature verification
Protein analysis for drug development
...and many more
13
Pattern Classification
A classification task in computer vision: is the object in the image a nut, a bolt, a washer or a screw?
We need a training set of images of known objects.
A signature (feature vector) is extracted from each image and used to train the neural net.
The net is then used to classify a feature vector from an unclassified image, as sketched below.
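Purely as an illustration of this workflow (the slides specify no features or classifier details), here is a toy pipeline with invented two-number signatures and a nearest-centroid classifier standing in for the neural net.

```python
# Pipeline sketch with made-up 2-D signatures (length, hole count) and a
# nearest-centroid classifier standing in for the neural net, purely to
# show the train-then-classify workflow. All numbers are invented.

training = {
    "nut":    [(2.0, 1), (2.2, 1)],
    "bolt":   [(5.0, 0), (5.5, 0)],
    "washer": [(3.0, 1), (3.2, 1)],
    "screw":  [(4.0, 0), (4.2, 0)],
}

# "Training": one centroid (mean signature) per class.
centroids = {
    label: tuple(sum(v) / len(vecs) for v in zip(*vecs))
    for label, vecs in training.items()
}

def classify(sig):
    # Assign an unclassified signature to the nearest class centroid.
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(sig, centroids[c])))

print(classify((2.1, 1)))   # -> nut
print(classify((5.2, 0)))   # -> bolt
```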
14
Learning
Learning involves adjusting the weights and thresholds (ΣIᵢwᵢ > T) to give the required output for the given input.
These can be combined by adding an extra input to each neuron, making its signal -1 and its weight equal to the threshold; the test then becomes ΣIᵢwᵢ > 0.
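A quick numerical check of this bias trick, with invented inputs, weights and threshold:

```python
# Bias trick check with invented numbers: the threshold T becomes the
# weight on an extra constant -1 input, so the comparison is with zero.
I = [0.5, 0.8]    # inputs
w = [1.0, 2.0]    # weights
T = 1.5           # threshold

fires_original = sum(i * wi for i, wi in zip(I, w)) > T           # 2.1 > 1.5

I_aug = I + [-1.0]   # extra input fixed at -1
w_aug = w + [T]      # its weight equals the old threshold
fires_augmented = sum(i * wi for i, wi in zip(I_aug, w_aug)) > 0  # 0.6 > 0

print(fires_original, fires_augmented)   # True True: identical decision
```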
15
The transfer function need not be a step threshold; it is better to use a sigmoid function.
Learning is like hill climbing: hard thresholds are like cliffs, and it is easier to work with smooth transfer functions.
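For concreteness (an addition, not from the slides), the standard logistic sigmoid alongside a hard step:

```python
import math

def step(x, T=0.0):
    # Hard threshold: output jumps from 0 to 1 at x = T.
    return 1.0 if x > T else 0.0

def sigmoid(x):
    # Logistic sigmoid: a smooth, differentiable version of the step,
    # which is what gradient-based learning (backpropagation) needs.
    return 1.0 / (1.0 + math.exp(-x))

for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
    print(f"{x:+.1f}  step={step(x):.0f}  sigmoid={sigmoid(x):.3f}")
```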
16
Feedforward Nets
Processing proceeds from the input layer, through the hidden layers, to the output.
There are no loops, and no links between nodes in the same layer.
Used extensively for pattern classification.
Error backpropagation is a popular learning algorithm.
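A minimal forward-pass sketch for such a layered net; the layer sizes and weights below are invented for the example, and the bias is handled as a constant -1 input as described earlier:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(layers, inputs):
    """layers: one weight matrix per layer; each row holds one neuron's
    weights, with the final entry acting on a constant -1 bias input."""
    activation = list(inputs)
    for weights in layers:
        activation = [
            sigmoid(sum(w * v for w, v in zip(row, activation + [-1.0])))
            for row in weights
        ]
    return activation

# Invented example: 2 inputs -> 2 hidden -> 1 output.
net = [
    [[0.5, -0.4, 0.1], [0.3, 0.8, -0.2]],  # hidden layer (2 neurons)
    [[1.2, -0.7, 0.05]],                   # output layer (1 neuron)
]
print(forward(net, [1.0, 0.0]))
```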
17
Idea of Error Back-Propagation
Initialise the weights to small positive values.
For each feature vector in the training set, apply it to the first layer and feed forward through the net to calculate the output.
Adjust the weights to bring the output nearer to the desired output. This is done by propagating the errors back through the network a layer at a time, to work out the required adjustments to the weights.
Iterate this process until some desired criterion is achieved.
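Here is a sketch of that whole procedure on the XOR problem, added for illustration. The architecture, learning rate and epoch count are assumptions, and the update rule is the classic sigmoid/squared-error form of backpropagation rather than anything specified on the slide.

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Assumed architecture: 2 inputs -> 3 hidden -> 1 output, sigmoid units,
# squared-error loss, learning rate 0.5. Bias is a constant -1 input.
n_in, n_hid, lr = 2, 3, 0.5
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)] for _ in range(n_hid)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

for epoch in range(10000):
    for x, target in data:
        # Feed forward.
        xb = x + [-1.0]
        hid = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w_hid]
        hb = hid + [-1.0]
        out = sigmoid(sum(w * v for w, v in zip(w_out, hb)))

        # Propagate the error back: output delta, then hidden deltas.
        d_out = (target - out) * out * (1 - out)
        d_hid = [d_out * w_out[j] * hid[j] * (1 - hid[j]) for j in range(n_hid)]

        # Adjust the weights towards the desired output.
        for j in range(n_hid + 1):
            w_out[j] += lr * d_out * hb[j]
        for j in range(n_hid):
            for i in range(n_in + 1):
                w_hid[j][i] += lr * d_hid[j] * xb[i]

# Small nets can occasionally stall in a local minimum; re-run if so.
for x, target in data:
    xb = x + [-1.0]
    hid = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w_hid]
    out = sigmoid(sum(w * v for w, v in zip(w_out, hid + [-1.0])))
    print(x, target, round(out, 2))
```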
18
Using the Net
Obtain the feature vector for an unclassified image.
Apply it to the first layer of the net and feed the values forward, using the recorded weights, to obtain the output: an estimate of the class membership.
In the workshop you will have the opportunity to experiment with various nets.
Perceptron Demo