A Review: Architecture
Basic architecture: an input layer and an output layer.
[Figure: single-layer net for pattern classification. Input units X1, …, Xn feed the output unit Y through weights w1, …, wn; a bias b comes from a unit whose activation is always 1.]
Biases & Thresholds
A bias acts exactly like a weight on a connection from a unit whose activation is always 1, so increasing the bias increases the net input to the unit. If a bias is included, the activation function is typically taken to be
f(net) = 1 if net >= 0, -1 if net < 0
where net = b + Σi xi wi.
Biases & Thresholds
Some nets do not use a bias weight, but instead use a fixed threshold θ for the activation function:
f(net) = 1 if net >= θ, -1 if net < θ
where net = Σi xi wi.
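The two formulations are equivalent: taking θ = -b turns one into the other. A minimal sketch in Python (the function names and the example values are my own illustration, not from the slides):

```python
def activation_with_bias(x, w, b):
    """Bipolar step function with a bias: 1 if net >= 0, else -1."""
    net = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1 if net >= 0 else -1

def activation_with_threshold(x, w, theta):
    """Bipolar step function with a fixed threshold instead of a bias."""
    net = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if net >= theta else -1

# The two forms agree when theta = -b:
x, w, b = [1, -1], [0.5, 0.3], 0.2
assert activation_with_bias(x, w, b) == activation_with_threshold(x, w, -b)
```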
Biases & Thresholds
[Figure: single-layer net for pattern classification, with input units X1 and X2, weights w1 and w2, bias b, and output unit Y.]
Linear Separability
For a particular output unit, the desired response is 'yes' (output signal 1) if the input pattern is a member of its class, and 'no' (output signal -1) if it is not: we want one of two responses. The activation function is a step function. From the net input formula, y_in = b + Σi xi wi, the equation b + Σi xi wi = 0 defines a line, a plane, or a hyperplane.
Linear Separability
The problem is said to be 'linearly separable' if there exist weights (and a bias) such that all of the training input vectors for which the correct response is +1 lie on one side of the decision boundary and all of the training input vectors for which the correct response is -1 lie on the other side. The decision boundary is drawn using the equation b + Σi xi wi = 0.
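As a small illustration (my own sketch, not from the slides), a function that checks whether given weights and a bias place every training vector on the correct side of the boundary b + Σi xi wi = 0:

```python
def separates(weights, bias, samples):
    """Return True if (weights, bias) classify every (x, t) pair correctly,
    using the bipolar step function: output 1 if net >= 0, else -1."""
    for x, t in samples:
        net = bias + sum(xi * wi for xi, wi in zip(x, weights))
        if (1 if net >= 0 else -1) != t:
            return False
    return True

# AND with bipolar targets: only (1, 1) maps to +1.
and_samples = [((1, 1), 1), ((1, 0), -1), ((0, 1), -1), ((0, 0), -1)]
print(separates([1, 1], -1.5, and_samples))  # True: x1 + x2 - 1.5 = 0 separates AND
```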
Linear Separability
[Figures: example assignments of + and - responses to points in the (x1, x2) plane, illustrating response patterns that can and cannot be separated by a straight line.]
Linear Separability
In the case of an elementary perceptron, the n-dimensional input space is divided by a hyperplane into two decision regions. The hyperplane is defined by the linearly separable function Σi xi wi - θ = 0.
Linear Separability
A perceptron can learn the operations AND and OR, but not exclusive-OR (XOR).
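Why XOR fails (a standard argument, added here for completeness; it is not on the original slides): suppose weights w1, w2 and a bias b computed XOR with the step function (output +1 when net >= 0, -1 otherwise). The four patterns would require
b < 0 (input (0 0), response -1)
b + w1 >= 0 (input (1 0), response +1)
b + w2 >= 0 (input (0 1), response +1)
b + w1 + w2 < 0 (input (1 1), response -1)
Adding the two middle inequalities gives 2b + w1 + w2 >= 0, while adding the first and last gives 2b + w1 + w2 < 0, a contradiction. Hence no line b + x1 w1 + x2 w2 = 0 separates the XOR responses.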
The HEBB NET
The earliest and simplest learning rule. Hebb proposed that learning occurs by modification of the synapse strengths (weights): if two interconnected neurons are both 'on' or both 'off' at the same time, then the weight between those neurons should be increased. The Hebb rule is also used for training other specific nets.
The HEBB NET
A single-layer net: the interconnections run between the input units and one output unit. It is suited to the bipolar form of data (1, -1); binary data is limited (see Examples 2.5 & 2.6).
The HEBB NET
[Figure: single-layer net for pattern classification, with input units X1 and X2, weights w1 and w2, bias b, and output unit Y.]
The HEBB NET
For the bipolar form of data (1, -1), the weight update is
wi(new) = wi(old) + xi y
The Algorithm
Step 0. Initialize all weights: wi = 0 (i = 1 to n).
Step 1. For each input training vector and target output pair, s : t, do Steps 2-4.
Step 2. Set activations for the input units: xi = si (i = 1 to n).
Step 3. Set the activation for the output unit: y = t (t = target).
Step 4. Adjust the weights: wi(new) = wi(old) + Δwi, where Δwi = xi y (i = 1 to n).
        Adjust the bias: b(new) = b(old) + y.
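A direct transcription of these steps into Python may help (a minimal sketch; the function name hebb_train and the data layout, a list of (input tuple, target) pairs, are my own choices). The demo at the bottom uses the bipolar AND data worked through in the examples that follow:

```python
def hebb_train(samples):
    """Hebb rule: one pass over (input vector, target) pairs.

    Step 0: weights and bias start at zero.
    Steps 1-4: for each pair, add x_i * y to w_i and y to b."""
    n = len(samples[0][0])
    w = [0] * n           # Step 0
    b = 0
    for x, t in samples:  # Step 1: each training pair s : t
        y = t             # Steps 2-3: x_i = s_i, y = t
        for i in range(n):
            w[i] += x[i] * y  # Step 4: w_i(new) = w_i(old) + x_i * y
        b += y                # b(new) = b(old) + y
    return w, b

# AND function, bipolar inputs and bipolar targets:
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
print(hebb_train(samples))  # ([2, 2], -2)
```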
The Application
Hebb net for the AND function: binary inputs and binary targets
Hebb net for the AND function: binary inputs and bipolar targets
Hebb net for the AND function: bipolar inputs and bipolar targets
Hebb net for the AND function: binary inputs and binary targets
With weight changes Δw1 = x1 t, Δw2 = x2 t, Δb = t and starting weights (w1 w2 b) = (0 0 0):

Input (x1 x2 1)   Target   Weight changes (Δw1 Δw2 Δb)   Weights (w1 w2 b)
(1 1 1)           1        (1 1 1)                        (1 1 1)
(1 0 1)           0        (0 0 0)                        (1 1 1)
(0 1 1)           0        (0 0 0)                        (1 1 1)
(0 0 1)           0        (0 0 0)                        (1 1 1)
Hebb net for the AND function: binary inputs and binary targets
The response of the net is correct for the first input pattern but not for the 2nd, 3rd, and 4th patterns; because the target value for those patterns is 0, no learning occurs on them. Using binary target values prevents the net from learning any pattern for which the target is 'off'.
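Running the hypothetical hebb_train sketch from the algorithm slide on this encoding shows the failure directly: every zero in an input or target contributes nothing to the update.

```python
# Binary inputs and binary targets: only the first pattern changes the weights.
samples = [((1, 1), 1), ((1, 0), 0), ((0, 1), 0), ((0, 0), 0)]
print(hebb_train(samples))  # ([1, 1], 1): patterns with target 0 are never learned
```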
How a net learns
A net learns by adjusting its weights; if no weight is adjusted, no learning occurs.
Hebb net for the AND function: binary inputs and binary targets
Basic formula to draw the separating line: b + Σi xi wi = 0. With the final weights, 1 + x1(1) + x2(1) = 0, giving the separating line x2 = -x1 - 1.
[Figure: the line x2 = -x1 - 1 plotted in the (x1, x2) plane.]
Hebb net for the AND function: binary inputs and bipolar targets
With weight changes Δw1 = x1 t, Δw2 = x2 t, Δb = t and starting weights (w1 w2 b) = (0 0 0):

Input (x1 x2 1)   Target   Weight changes (Δw1 Δw2 Δb)   Weights (w1 w2 b)
(1 1 1)           1        (1 1 1)                        (1 1 1)
(1 0 1)           -1       (-1 0 -1)                      (0 1 0)
(0 1 1)           -1       (0 -1 -1)                      (0 0 -1)
(0 0 1)           -1       (0 0 -1)                       (0 0 -2)
Hebb net for the AND function: binary inputs and bipolar targets
After the first input pattern the response of the net is correct for that pattern, and learning continues for the 2nd, 3rd, and 4th patterns, since the target value is now -1. However, the final weights do not give the correct response for the first input pattern.
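The same hebb_train sketch on this encoding:

```python
# Binary inputs, bipolar targets: learning now happens on every pattern,
# but the final weights misclassify (1, 1).
samples = [((1, 1), 1), ((1, 0), -1), ((0, 1), -1), ((0, 0), -1)]
print(hebb_train(samples))  # ([0, 0], -2): net is -2 for every input, output always -1
```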
Hebb net for the AND function: binary inputs and bipolar targets
Basic formula to draw the separating line: b + Σi xi wi = 0. After the first training pattern, 1 + x1(1) + x2(1) = 0, giving the separating line x2 = -x1 - 1.
[Figure: the line x2 = -x1 - 1 plotted in the (x1, x2) plane, with the + and - training responses marked.]
Hebb net for the AND function: bipolar inputs and bipolar targets
With weight changes Δw1 = x1 t, Δw2 = x2 t, Δb = t and starting weights (w1 w2 b) = (0 0 0):

Input (x1 x2 1)   Target   Weight changes (Δw1 Δw2 Δb)   Weights (w1 w2 b)
(1 1 1)           1        (1 1 1)                        (1 1 1)
(1 -1 1)          -1       (-1 1 -1)                      (0 2 0)
(-1 1 1)          -1       (1 -1 -1)                      (1 1 -1)
(-1 -1 1)         -1       (1 1 -1)                       (2 2 -2)
Hebb net for the AND function: bipolar inputs and bipolar targets
The response of the net is correct for the first input pattern and for the 2nd, 3rd, and 4th patterns: with bipolar inputs and targets, the net learns the AND function.
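A quick check of the final weights (w1 w2 b) = (2 2 -2) against all four training patterns (plain Python, my own verification snippet):

```python
# Verify the final weights on all four bipolar AND patterns.
w1, w2, b = 2, 2, -2
for x1v, x2v, t in [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, -1)]:
    net = b + x1v * w1 + x2v * w2
    y = 1 if net >= 0 else -1
    print((x1v, x2v), "net =", net, "output =", y, "target =", t)  # all match
```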
Hebb net for the AND function: bipolar inputs and bipolar targets
Basic formula to draw the separating line: b + Σi xi wi = 0. With the final weights, -2 + x1(2) + x2(2) = 0, giving the separating line x2 = -x1 + 1.
[Figure: the line x2 = -x1 + 1 plotted in the (x1, x2) plane, separating the + response at (1, 1) from the - responses at the other three corners.]
Application: Character Recognition
A Hebb net can be trained to distinguish two 5×5 characters, here an X (Pattern 1) and an O (Pattern 2):

Pattern 1          Pattern 2
# . . . #          . # # # .
. # . # .          # . . . #
. . # . .          # . . . #
. # . # .          # . . . #
# . . . #          . # # # .

Each pattern is converted to a 25-component bipolar vector ('#' → 1, '.' → -1), read row by row:
Pattern 1: 1 -1 -1 -1 1,  -1 1 -1 1 -1,  -1 -1 1 -1 -1,  -1 1 -1 1 -1,  1 -1 -1 -1 1
Pattern 2: -1 1 1 1 -1,  1 -1 -1 -1 1,  1 -1 -1 -1 1,  1 -1 -1 -1 1,  -1 1 1 1 -1
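A self-contained sketch of the whole example in Python; the pattern strings and the target convention (+1 for Pattern 1, -1 for Pattern 2) are my assumptions:

```python
# Convert a 5x5 character drawn with '#' and '.' into a bipolar vector (# -> 1, . -> -1).
def to_bipolar(pattern):
    return [1 if ch == '#' else -1 for row in pattern for ch in row]

pattern1 = ["#...#", ".#.#.", "..#..", ".#.#.", "#...#"]  # X shape
pattern2 = [".###.", "#...#", "#...#", "#...#", ".###."]  # O shape

x1, x2 = to_bipolar(pattern1), to_bipolar(pattern2)

# Hebb training over the two (input, target) pairs: w_i += x_i * y, b += y.
samples = [(x1, 1), (x2, -1)]   # assumed convention: +1 = Pattern 1, -1 = Pattern 2
w, b = [0] * 25, 0
for x, y in samples:
    for i in range(25):
        w[i] += x[i] * y
    b += y

# Classify with the bipolar step function on net = b + sum(x_i * w_i).
for name, x in [("Pattern 1", x1), ("Pattern 2", x2)]:
    net = b + sum(xi * wi for xi, wi in zip(x, w))
    print(name, "->", 1 if net >= 0 else -1)  # Pattern 1 -> 1, Pattern 2 -> -1
```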