Slide 1: Ch. 18 – Learning
Supplemental slides for CSE 327, Prof. Jeff Heflin
Slide 2: Decision Tree Learning

function DEC-TREE-LEARN(examples, attribs, parent_examples) returns a decision tree
  if examples is empty then return PLURALITY-VALUE(parent_examples)
  else if all examples have the same classification then return the classification
  else if attribs is empty then return PLURALITY-VALUE(examples)
  else
    A ← argmax_{a ∈ attribs} IMPORTANCE(a, examples)
    tree ← a new decision tree with root test A
    for each value v_k of A do
      exs ← {e : e ∈ examples and e.A = v_k}
      subtree ← DEC-TREE-LEARN(exs, attribs – A, examples)
      add a branch to tree with label (A = v_k) and subtree subtree
    return tree

From Figure 18.5, p. 702
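The pseudocode can be made runnable. The Python sketch below is an illustration with assumptions that go beyond the slide: examples are represented as dicts with a "label" key, IMPORTANCE is taken to be information gain (one common choice), and branches are created only for attribute values that actually occur in the examples at that node.

```python
import math
from collections import Counter

def plurality_value(examples):
    # Most common classification among the examples (ties broken arbitrarily).
    return Counter(e["label"] for e in examples).most_common(1)[0][0]

def entropy(examples):
    n = len(examples)
    return -sum(c / n * math.log2(c / n)
                for c in Counter(e["label"] for e in examples).values())

def importance(attr, examples):
    # Information gain: entropy reduction from splitting on attr.
    remainder = 0.0
    for v in {e[attr] for e in examples}:
        exs = [e for e in examples if e[attr] == v]
        remainder += len(exs) / len(examples) * entropy(exs)
    return entropy(examples) - remainder

def dec_tree_learn(examples, attribs, parent_examples):
    if not examples:
        return plurality_value(parent_examples)
    if len({e["label"] for e in examples}) == 1:
        return examples[0]["label"]
    if not attribs:
        return plurality_value(examples)
    a = max(attribs, key=lambda attr: importance(attr, examples))
    tree = {"test": a, "branches": {}}
    for v in {e[a] for e in examples}:   # values of a seen in these examples
        exs = [e for e in examples if e[a] == v]
        tree["branches"][v] = dec_tree_learn(exs, attribs - {a}, examples)
    return tree
```

On consistent training data such as the data set on the next slide, the learned tree classifies every training example correctly.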
Slide 3: Decision Tree Data Set

Example  Color   Size   Shape     Goal Predicate
X1       blue    small  square    no
X2       green   large  triangle  no
X3       red     large  circle    yes
X4       green   small  square    no
X5       yellow  small  circle    no
X6       red     small  circle    yes
X7       blue    large  triangle  no
X8       red     small  square    no
Slide 4: Decision Tree Result

Root (+: X3,X6; –: X1,X2,X4,X5,X7,X8): test Shape?
- square (+: none; –: X1,X4,X8) → No
- triangle (+: none; –: X2,X7) → No
- circle (+: X3,X6; –: X5): test Color?
  - red (+: X3,X6; –: none) → Yes
  - yellow (+: none; –: X5) → No
  - blue (no examples) → No
  - green (no examples) → No
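To see why Shape is a reasonable root test, we can compute the information gain of each attribute on the eight examples. The slides do not show these numbers; the sketch below is my own calculation. It turns out Shape and Color tie at about 0.47 bits of gain, while Size gains almost nothing, so Shape is a legitimate (if not unique) best choice.

```python
import math
from collections import Counter

# The eight training examples from the data-set slide: (color, size, shape, label).
EXAMPLES = [
    ("blue",   "small", "square",   "no"),
    ("green",  "large", "triangle", "no"),
    ("red",    "large", "circle",   "yes"),
    ("green",  "small", "square",   "no"),
    ("yellow", "small", "circle",   "no"),
    ("red",    "small", "circle",   "yes"),
    ("blue",   "large", "triangle", "no"),
    ("red",    "small", "square",   "no"),
]
ATTRS = {"color": 0, "size": 1, "shape": 2}

def entropy(examples):
    n = len(examples)
    counts = Counter(e[3] for e in examples).values()
    return -sum(c / n * math.log2(c / n) for c in counts)

def gain(attr):
    # Information gain = entropy of the whole set minus the expected
    # entropy remaining after splitting on attr.
    i = ATTRS[attr]
    remainder = 0.0
    for v in {e[i] for e in EXAMPLES}:
        subset = [e for e in EXAMPLES if e[i] == v]
        remainder += len(subset) / len(EXAMPLES) * entropy(subset)
    return entropy(EXAMPLES) - remainder
```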
Slide 5: A Neuron
Slide 6: Perceptron Learning

function PERCEPTRON-LEARNING(examples, network) returns a perceptron hypothesis
  inputs: examples, a set of examples, each with input x = x_1, …, x_n and output y
          network, a perceptron with weights W_j and activation function g
  repeat
    for each example (x, y) in examples do
      in ← Σ_j W_j x_j
      Err ← y – g(in)
      for each j in 0..n do
        W_j ← W_j + α × Err × g′(in) × x_j
  until some stopping criterion is satisfied
  return NEURAL-NET-HYPOTHESIS(network)
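As a concrete illustration, here is a minimal Python version of the update loop. It makes two assumptions beyond the slide: the activation g is a hard threshold (for which the g′(in) factor is conventionally dropped, giving the classic perceptron rule), and the stopping criterion is simply a fixed number of epochs. The AND data set is my own toy example.

```python
def perceptron_learn(examples, n_inputs, alpha=0.1, epochs=50):
    # w[0] is the bias weight, attached to a fixed input x_0 = 1.
    w = [0.0] * (n_inputs + 1)
    g = lambda s: 1 if s >= 0 else 0          # threshold activation

    for _ in range(epochs):
        for x, y in examples:
            xs = [1] + list(x)                # prepend the bias input
            in_ = sum(wj * xj for wj, xj in zip(w, xs))
            err = y - g(in_)                  # Err <- y - g(in)
            # W_j <- W_j + alpha * Err * x_j  (g'(in) dropped for threshold units)
            w = [wj + alpha * err * xj for wj, xj in zip(w, xs)]

    # Return the learned hypothesis as a function of the input vector.
    return lambda x: g(sum(wj * xj for wj, xj in zip(w, [1] + list(x))))

# Learn the AND function, which is linearly separable.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
h = perceptron_learn(and_data, 2)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule settles on a correct set of weights; here it does so within the first few epochs.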
Slide 7: NETTalk

NETTalk maps a 7-character window of text (7 × 29 input units) through one layer of 80 hidden units to 26 output units that encode the phoneme for the middle character of the window (e.g., /r/).
Slide 8: ALVINN

The input is a 30 × 32 pixel image (960 values). Each input pixel connects to 5 hidden units, which connect to 30 output units spanning steering directions from sharp left through straight ahead to sharp right.
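The layer sizes can be checked with a shape-only forward pass. The weights below are random placeholders, not ALVINN's learned weights; only the dimensions match the slide, and sigmoid activations are an assumption.

```python
import math
import random

random.seed(0)

# A fake 30x32 camera image, flattened to 960 input values.
image = [[random.random() for _ in range(32)] for _ in range(30)]
x = [p for row in image for p in row]

def layer(weights, inputs):
    # Fully connected layer with sigmoid activation.
    sigmoid = lambda s: 1 / (1 + math.exp(-s))
    return [sigmoid(sum(w * i for w, i in zip(ws, inputs))) for ws in weights]

# Placeholder weights: 960 inputs -> 5 hidden units -> 30 output units.
W_hidden = [[random.gauss(0, 0.1) for _ in range(960)] for _ in range(5)]
W_out = [[random.gauss(0, 0.1) for _ in range(5)] for _ in range(30)]

hidden = layer(W_hidden, x)         # 5 hidden activations
steering = layer(W_out, hidden)     # 30 outputs, sharp left .. sharp right
direction = steering.index(max(steering))   # index of the strongest steering unit
```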
Slide 9: SVM Kernels

A non-linear separator in 2 dimensions becomes a linear separator when the data are mapped to 3 dimensions.
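The standard illustration of this idea (the one AIMA uses) maps a 2-D point x = (x1, x2) to the 3-D feature vector (x1², x2², √2·x1·x2). A circular boundary in the input space then becomes a plane in feature space, so a linear test separates the classes. The sample points below are my own; the slide does not specify any.

```python
import math

def phi(x1, x2):
    # Feature map: 2-D input -> 3-D feature space.
    return (x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2)

# Points inside the unit circle are one class, points outside the other.
# No straight line separates them in 2-D.
inside  = [(0.2, 0.3), (-0.5, 0.1), (0.0, -0.6)]
outside = [(1.5, 0.2), (-1.1, 1.0), (0.3, -1.4)]

def linear_separator(z):
    # In feature space the circle x1^2 + x2^2 = 1 becomes the plane
    # z1 + z2 = 1, so this single linear test classifies every point.
    z1, z2, _ = z
    return z1 + z2 < 1.0   # True means "inside the circle"
```

This map also has the kernel property phi(a)·phi(b) = (a·b)², which is why an SVM never needs to compute the 3-D features explicitly.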