8/25/05 Cognitive Computations Software Tutorial Page 1 SNoW: Sparse Network of Winnows Presented by Nick Rizzolo.


1 SNoW: Sparse Network of Winnows (Presented by Nick Rizzolo)

2 Introduction
- Multi-class classification
- Infinite attribute domain
- User-configurable linear threshold unit networks
- Variety of update rules and algorithmic extensions

3 Outline
- The Basic System
- Training
- Algorithmic Extensions
- Testing
- Tuning

4 The Basic System
- Targets (concepts)
- Features
- Weighted edges, instead of weight vectors
- SNoW only represents the targets and weighted edges
- Prediction is "one vs. all"

5 Update Rules
- Winnow (mistake driven), with promotion parameter α > 1, demotion parameter β < 1, and threshold θ:
  - Promotion: if the label is positive but Σ_{i active} w_i < θ, set w_i ← α·w_i for every active feature i
  - Demotion: if the label is negative but Σ_{i active} w_i ≥ θ, set w_i ← β·w_i for every active feature i
- Perceptron (mistake driven), with learning rate η and threshold θ:
  - Promotion: if the label is positive but Σ_{i active} w_i < θ, set w_i ← w_i + η for every active feature i
  - Demotion: if the label is negative but Σ_{i active} w_i ≥ θ, set w_i ← w_i − η for every active feature i
- Naïve Bayes: statistical; its activation is compared against a threshold rather than updated on mistakes
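The Winnow and Perceptron updates can be sketched in a few lines. This is an illustrative re-implementation, not SNoW's code; it uses a sparse map-of-weights representation, and the default parameter values (α = 2, β = ½, θ = 3.5) are borrowed from the training example slide purely for illustration.

```python
def winnow_update(weights, active, positive, alpha=2.0, beta=0.5, theta=3.5):
    """Mistake-driven multiplicative update over a sparse weight map.

    `weights` maps feature ids to weights (unseen features default to 1.0);
    `active` lists the feature ids present in the example."""
    activation = sum(weights.get(f, 1.0) for f in active)
    if positive and activation < theta:         # promotion: false negative
        for f in active:
            weights[f] = weights.get(f, 1.0) * alpha
    elif not positive and activation >= theta:  # demotion: false positive
        for f in active:
            weights[f] = weights.get(f, 1.0) * beta
    return weights


def perceptron_update(weights, active, positive, eta=0.1, theta=0.0):
    """Mistake-driven additive update; unseen features default to 0.0."""
    activation = sum(weights.get(f, 0.0) for f in active)
    if positive and activation < theta:         # promotion: add eta
        for f in active:
            weights[f] = weights.get(f, 0.0) + eta
    elif not positive and activation >= theta:  # demotion: subtract eta
        for f in active:
            weights[f] = weights.get(f, 0.0) - eta
    return weights
```

Both learners update only the weights of active features, which is what makes the sparse "network of edges" representation efficient when the attribute domain is effectively infinite.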

6 Command Line Interface
General form (algorithm parameters are comma-separated, followed by a colon and the target ids they apply to):
snow -train -I <example file> -F <network file> -W <α>,<β>,<θ>,<initial weight>:<targets> -P <learning rate>,<θ>,<initial weight>:<targets>
snow -test -I <example file> -F <network file> ...
Concrete examples:
snow -train -I learnMe.snow -F learnMe.net -W 2,0.5,3.5,1:1-3 -z + -s s
snow -test -I testMe.snow -F learnMe.net -o winners

7 A Training Example
Update rule: Winnow with α = 2, β = ½, θ = 3.5
Targets 1-3 compete over features 1001-1009.
Training examples (a label followed by the active features):
1, 1001, 1006:
2, 1002, 1007, 1008:
3, 1006, 1004:
1, 1004, 1007:
3, 1004, 1005, 1009:
Test example: 1001, 1005, 1007: (the slide shows resulting target activations of 4, 2, and 1)
(The slide animates the network's weight table after each example; the intermediate weight values are not recoverable from this transcript.)

8 Bells & Whistles
- -d abs:0.1 : Discarding
- -g : Feature conjunctions
- -f : "Fixed feature"
- -b 5 : Smoothing
- -p 0.5 : Prediction threshold
- -e count:1 : Eligibility
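As one illustration of this list, the -g "feature conjunctions" option can be understood as generating new features from co-occurring active features. The pairwise sketch below is a guess at that idea, not SNoW's encoding, and the tuple-valued feature ids are illustrative (a real system would renumber or hash them into integers).

```python
from itertools import combinations

def pairwise_conjunctions(active):
    # One derived feature per pair of co-occurring active features;
    # sorting makes the generated ids order-independent.
    return list(combinations(sorted(active), 2))
```

Conjunctions let a linear unit capture simple feature interactions at the cost of a quadratic blow-up in active features per example, which is one reason options like discarding and eligibility exist alongside it.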

9 Outline
- The Basic System
- Training
- Algorithmic Extensions
- Testing
- Tuning

10 Clouds (Voting)
- Multiple target nodes per concept
- Learning still independent
- A cloud's activation is a weighted sum of its targets' activations
- A decreasing function of each target's mistake count supplies those weights
snow -train -I learnMe.snow -F learnMe.net -W :1-3 -P :1-3
(Targets 1-3 each get both a Winnow and a Perceptron node with default parameters, so each concept's prediction is a vote between the two learners.)

11 The Sequential Model
- A larger confusion set means lower prediction accuracy
- Put prior knowledge in each example: a semicolon-separated suffix restricts which targets compete
Training example: 1, 1001, 1005, 1007; 1, 3:
Testing example: 1001, 1005, 1007; 1, 3:
(Both restrict the prediction for this example to targets 1 and 3.)

12 The Thick Separator
- Theory is similar to SVMs
- Increases the "margin" around the separating hyperplane
- Improves generalization
- Enabled with -S <positive thickness>,<negative thickness>
- Almost always improves performance
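The thick-separator effect on Winnow can be sketched by widening the update condition from "mistake" to "inside the margin", so correctly classified examples that fall too close to the threshold still trigger updates. The single `thickness` half-width and all parameter values here are illustrative simplifications, not SNoW's -S semantics.

```python
def thick_winnow_update(weights, active, positive, alpha=2.0, beta=0.5,
                        theta=3.5, thickness=1.0):
    """Winnow with a thick separator: update unless the example is
    classified correctly AND lies outside the margin around theta."""
    activation = sum(weights.get(f, 1.0) for f in active)
    if positive and activation < theta + thickness:        # not safely above
        for f in active:
            weights[f] = weights.get(f, 1.0) * alpha
    elif not positive and activation > theta - thickness:  # not safely below
        for f in active:
            weights[f] = weights.get(f, 1.0) * beta
    return weights
```

Compared to plain Winnow, a positive example with activation 4.0 (already above θ = 3.5) would still be promoted here because it sits inside the margin; pushing such borderline examples away from the threshold is what improves generalization.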

13 Constraint Classification
- Promotion / demotion depends on comparisons between target activations, not on each target independently
- More expressive than independent update rules
- Sometimes difficult to realize performance improvements in practice
- Enabled with -O

14 Other Training Policies
- Regression: function approximation via (Exponentiated) Gradient Descent
- Threshold-relative updating: each update moves the hyperplane to the example
- Single target training and testing
- Relevant options: -G and -t

15 Outline
- The Basic System
- Training
- Algorithmic Extensions
- Testing
- Tuning

16 Sigmoid Functions
- How can activations from different algorithms be compared?
- Sigmoids map activations to (0, 1)
- Winnow / Perceptron: σ(Ω) = 1 / (1 + e^(θ − Ω)), where Ω is the activation and θ the threshold
- Naïve Bayes already yields values in (0, 1)
- Softmax: p_i = e^(Ω_i) / Σ_j e^(Ω_j)
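Both mappings are easy to state concretely. The sketch below uses the σ(Ω) = 1 / (1 + e^(θ − Ω)) form; the default threshold of 0.0 and the max-shift in softmax are implementation choices of this sketch, not taken from SNoW.

```python
import math

def sigmoid(activation, theta=0.0):
    # Maps an activation to (0, 1); equals 0.5 exactly at the threshold.
    return 1.0 / (1.0 + math.exp(theta - activation))

def softmax(activations):
    # Normalizes a list of target activations into values summing to 1.
    shift = max(activations)   # subtract the max for numerical stability
    exps = [math.exp(a - shift) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]
```

Because both outputs live in (0, 1), activations produced by Winnow, Perceptron, and Naïve Bayes targets become directly comparable, which is what cloud voting and the output modes on the next slide rely on.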

17 Output Modes
- -o accuracy (the default): assumes labeled testing examples; simply reports SNoW's performance
- -o winners, -o softmax, -o allactivations: report per-target scores
Example allactivations output (one row per target; the * marks the winner):
Example 47   Label: 0
0:  0.70005  4.8475  0.77582*
1:  0.30915  3.1959  0.14876
2:  0.18493  2.5167  0.075420

18 Outline
- The Basic System
- Training
- Algorithmic Extensions
- Testing
- Tuning

19 Tuning
- Parameter settings make a big difference
  - Want to pick the right algorithm
  - Don't want to over-fit
- Trial & error
- The major players: -W, -P, -S, -r, -u
- tune.pl -train -arch W: ...
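The trial-and-error loop that tune.pl automates can be sketched as a grid search. Everything below is an illustrative stand-in: a one-pass Winnow learner, toy data, and scoring on the training set itself, where a real tuner would score a held-out set to avoid over-fitting.

```python
from itertools import product

def train_and_score(examples, alpha, beta, theta):
    """One Winnow pass over (active features, label) pairs, then accuracy."""
    weights = {}
    for active, positive in examples:
        activation = sum(weights.get(f, 1.0) for f in active)
        if positive and activation < theta:         # promote on false negative
            for f in active:
                weights[f] = weights.get(f, 1.0) * alpha
        elif not positive and activation >= theta:  # demote on false positive
            for f in active:
                weights[f] = weights.get(f, 1.0) * beta
    correct = sum(
        (sum(weights.get(f, 1.0) for f in active) >= theta) == positive
        for active, positive in examples)
    return correct / len(examples)

def grid_search(examples, alphas, betas, thetas):
    # Return the best-scoring (alpha, beta, theta) combination.
    return max(product(alphas, betas, thetas),
               key=lambda p: train_and_score(examples, *p))
```

The same loop structure applies whether the inner call is a Python function or a shell invocation of snow -train / snow -test with different -W settings.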

