Start with student evals
What function does perceptron #4 represent? [Perceptron diagram; the values shown are 0.8, 0.7, and 1.3]
What function does perceptron #5 represent? [Perceptron diagram; the values shown are 0.8, 1.3, 1.2, and 0.8]
The McCulloch-Pitts "Unit": the single neuron (Perceptron)
- Each neuron has a threshold value
- Each neuron has weighted inputs from other neurons
- The input signals form a weighted sum
- If the activation level exceeds the threshold, the neuron "fires"
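A minimal sketch of this unit in Python (the function name mcp_fire is illustrative, and treating an activation exactly equal to the threshold as firing is an assumed convention):

    def mcp_fire(inputs, weights, threshold):
        # The input signals form a weighted sum
        activation = sum(i * w for i, w in zip(inputs, weights))
        # The neuron "fires" (outputs 1) when the activation level
        # reaches the threshold; otherwise it outputs 0
        return 1 if activation >= threshold else 0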
"Simple" Learning Algorithm
While epoch produces an error
    Present network with next inputs from epoch
    Error = T - O
    If Error <> 0 Then
        Wj = Wj + LR * Ij * Error
    End If
End While
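The same loop as runnable Python, a sketch only: the learning rate value 0.1 is assumed, and the bias input is fixed at -1 following the convention used in the boundary-line slides later in this deck (so W0 plays the role of the threshold).

    def train_perceptron(examples, weights, lr=0.1):
        # examples: list of (inputs, target) pairs; one full pass is one epoch
        while True:
            epoch_has_error = False
            for inputs, target in examples:
                # Prepend the fixed bias input I0 = -1
                full_inputs = [-1] + list(inputs)
                activation = sum(i * w for i, w in zip(full_inputs, weights))
                output = 1 if activation >= 0 else 0
                error = target - output  # Error = T - O
                if error != 0:  # If Error <> 0
                    epoch_has_error = True
                    # Wj = Wj + LR * Ij * Error
                    weights = [w + lr * i * error
                               for i, w in zip(full_inputs, weights)]
            if not epoch_has_error:  # epoch produced no error
                return weights

    # Training on AND from the slides below, starting from the random
    # weights shown there; AND is linearly separable, so the loop terminates.
    and_examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    print(train_perceptron(and_examples, [0.3, 0.5, -0.4]))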
Training Perceptrons
[Diagram: inputs x and y feed a unit with threshold t = 0.0 and weights W0 = ?, W1 = ?, W2 = ?]
For AND:
  x  y  Output
  0  0    0
  0  1    0
  1  0    0
  1  1    1
What are the weight values? Initialize with random weight values.
Training Perceptrons
[Diagram: the same unit with t = 0.0 and random initial weights W0 = 0.3, W1 = 0.5, W2 = -0.4]
For AND:
  x  y  Output
  0  0    0
  0  1    0
  1  0    0
  1  1    1
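As a quick check (using the bias convention I0 = -1 from the boundary-line slides below), these random initial weights do not yet compute AND: for input (1, 0) the weighted sum is (-1)(0.3) + (1)(0.5) + (0)(-0.4) = 0.2 >= 0, so the unit fires even though AND(1, 0) = 0, and for (1, 1) the sum is -0.3 + 0.5 - 0.4 = -0.2 < 0, so the unit fails to fire. Training must adjust the weights.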
The algorithm is then applied step by step: each input pattern in the epoch is presented in turn, Error = T - O is computed, and the weights are updated whenever the error is nonzero; the loop repeats until a full epoch produces no errors.
Training Perceptrons
[Diagram: t = 0.0, W0 = 0.3, W1 = 0.5, W2 = -0.4, with the AND truth table as above]
If Error <> 0 Then Wj = Wj + LR * Ij * Error
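A worked application of the rule, with an assumed learning rate LR = 0.1: presenting input (1, 0) produces output 1 against target 0, so Error = 0 - 1 = -1. With I0 = -1, I1 = 1, I2 = 0, the updates are W0 = 0.3 + 0.1(-1)(-1) = 0.4 and W1 = 0.5 + 0.1(1)(-1) = 0.4, while W2 is unchanged because I2 = 0. Each such step nudges the boundary line toward a correct classification.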
Learning in Neural Networks
We will demo the online applet at http://neuron.eng.wayne.edu/java/Perceptron/New38.html
Decision boundaries
In simple cases we can divide the feature space by drawing a hyperplane across it, known as a decision boundary. A discriminant function returns different values on opposite sides of the boundary (in two dimensions, a straight line). Problems that can be classified this way are linearly separable.
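For a two-input perceptron the discriminant function can be written out explicitly: g(x, y) = W1*x + W2*y - t. The decision boundary is the line g = 0, with g > 0 on one side and g < 0 on the other, which is exactly what lets a single threshold separate the two classes.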
Decision Surface of a Perceptron
[Two plots in the (x1, x2) plane: left, a linearly separable arrangement of + and - points split by a line; right, a non-linearly separable (XOR-like) arrangement]
A perceptron is able to represent some useful functions: for AND(x1, x2), choose weights w0 = -1.5, w1 = 1, w2 = 1. But functions that are not linearly separable (e.g. XOR) are not representable.
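A quick check of those AND weights (assuming the bias input is fixed at +1, which is the convention that makes w0 = -1.5 work): the sum w0 + w1*x1 + w2*x2 is -1.5 + 1 + 1 = 0.5 for (1, 1), -0.5 for (1, 0) and (0, 1), and -1.5 for (0, 0), so only (1, 1) pushes the sum above zero and the unit computes AND.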
How do the weights define the boundary line?
Consider the y-intercept. There the x value is zero, so I1 is zero and that term drops out; we are interested in I2, which is the y coordinate on the boundary line. The boundary line satisfies:
    I0*W0 + I2*W2 = 0
    -1*W0 + y*W2 = 0
    y*W2 = W0
    y = W0/W2
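For example, with the weights from the training slides (W0 = 0.3, W2 = -0.4), the y-intercept is y = 0.3 / (-0.4) = -0.75.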
How do the weights define the boundary line?
By similar math, the x-intercept is where the y value is zero, so I2 is zero and that term drops out; we are interested in I1, which is the x coordinate on the boundary line. The boundary line satisfies:
    I0*W0 + I1*W1 = 0
    -1*W0 + x*W1 = 0
    x*W1 = W0
    x = W0/W1
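With the same example weights (W0 = 0.3, W1 = 0.5), the x-intercept is x = 0.3 / 0.5 = 0.6.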
How do the weights define the boundary line?
Slope is defined as (y1 - y2) / (x1 - x2). Taking point one as the y-intercept (0, W0/W2) and point two as the x-intercept (W0/W1, 0) from the previous slides:
    Slope = (W0/W2 - 0) / (0 - W0/W1)
          = (W0/W2) / (-W0/W1)
          = -(W1/W2)
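A small sketch tying the three results together (the function name boundary_line is illustrative, and the bias input is again assumed fixed at -1):

    def boundary_line(w0, w1, w2):
        # Boundary: -w0 + x*w1 + y*w2 = 0
        x_intercept = w0 / w1
        y_intercept = w0 / w2
        slope = -w1 / w2
        return x_intercept, y_intercept, slope

    # Example weights from the training slides; prints approximately
    # (0.6, -0.75, 1.25), up to floating-point rounding
    print(boundary_line(0.3, 0.5, -0.4))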