
Introduction of SNoW (Sparse Network of Winnows )



Presentation on theme: "Introduction of SNoW (Sparse Network of Winnows)" — Presentation transcript:

1 Introduction of SNoW (Sparse Network of Winnows)
IRLab-LA Group hjLiu

2 Introduction
The SNoW Architecture
File Formats
Using SNoW
Applying SNoW to my work

3 Introduction
Multi-class classifier
- Includes a true multi-class capability
- Standard one-vs-all training policy
- Predictions are made via a winner-take-all policy or a voted combination of several learners
Learning architecture framework
- A sparse network of sparse linear functions over a predefined or incrementally acquired feature space
- The user designs an architecture within that framework (defining many more parameters of the architecture)

4 The SNoW Architecture

5 The Basic System
A two-layer network is maintained
Input: feature layer
Output: target nodes
Target nodes are linked via weighted edges to the active input features
Prediction is positive when a target node's activation exceeds its threshold

6 The Basic System
Each feature's weight is initialized to a default value when the feature is first linked to a target
The predicted target for an example with active feature set A is the one with the highest activation: t* = argmax_t Σ_{i ∈ A} w_{t,i}
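The winner-take-all prediction can be sketched as follows (a minimal Python sketch; the `predict` helper and the toy weight table are hypothetical illustrations, not SNoW's actual code):

```python
# Minimal sketch of winner-take-all prediction over a sparse weight table.
# weights maps (target, feature) pairs to learned weights; features absent
# from the table contribute nothing, mirroring the sparse network.
def predict(weights, active_features, targets):
    def activation(t):
        return sum(weights.get((t, f), 0.0) for f in active_features)
    return max(targets, key=activation)

# Toy example: target 1 accumulates the larger activation and wins.
w = {(0, 1): 0.5, (0, 2): 0.1, (1, 1): 0.2, (1, 3): 0.9}
winner = predict(w, active_features=[1, 2, 3], targets=[0, 1])
```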

7 Basic Learning Rules: Winnow
Predicted negative while the label is positive → weights of active features are promoted (multiplied by α > 1)
Predicted positive while the label is negative → weights of active features are demoted (multiplied by β, 0 < β < 1)
Otherwise → unchanged
Sigmoid activation maps the raw activation against the threshold into (0, 1)
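The multiplicative update above can be sketched in Python (an illustrative sketch; the α, β, θ values are example settings and the helper is hypothetical):

```python
# Winnow's mistake-driven multiplicative update (illustrative sketch).
ALPHA, BETA, THETA = 1.35, 0.8, 4.0  # promotion factor, demotion factor, threshold

def winnow_update(weights, active, label_positive):
    """Multiply the weights of active features up or down on a mistake."""
    predicted_positive = sum(weights[f] for f in active) >= THETA
    if label_positive and not predicted_positive:
        for f in active:
            weights[f] *= ALPHA   # promote
    elif not label_positive and predicted_positive:
        for f in active:
            weights[f] *= BETA    # demote
    # prediction agrees with the label: weights unchanged

w = {1: 1.0, 2: 1.0}
winnow_update(w, [1, 2], label_positive=True)   # sum 2.0 < 4.0, so promote
```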

8 Basic Learning Rules: Perceptron
Predicted negative while the label is positive → weights of active features are promoted (the learning rate is added)
Predicted positive while the label is negative → weights of active features are demoted (the learning rate is subtracted)
Otherwise → unchanged
Sigmoid activation maps the raw activation against the threshold into (0, 1)
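The Perceptron rule differs from Winnow only in being additive; a comparable sketch (the learning rate and threshold are example values, and the helper is hypothetical):

```python
# Perceptron's mistake-driven additive update (illustrative sketch).
ETA, THETA = 0.1, 4.0  # learning rate and threshold

def perceptron_update(weights, active, label_positive):
    """Add or subtract the learning rate on active features after a mistake."""
    predicted_positive = sum(weights[f] for f in active) >= THETA
    if label_positive and not predicted_positive:
        for f in active:
            weights[f] += ETA   # promote
    elif not label_positive and predicted_positive:
        for f in active:
            weights[f] -= ETA   # demote

w = {1: 3.0, 2: 2.0}
perceptron_update(w, [1, 2], label_positive=False)  # sum 5.0 >= 4.0, so demote
```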

9 Basic Learning Rules Naïve Bayes

10 Extensions to the Basic Learning Rules
Options that modify the behavior of the basic update rules include: eligibility of features, discarding features, conditional prediction based on a prediction threshold, and others
Constraint Classification
Regularization
Function Approximation
Sequential Model
Voting: The Clouds Architecture
Threshold-Relative Updating

11 File Formats

12 Example Files
Each example is a comma-separated list of feature IDs terminated by a colon (the end-of-example marker); by convention the first ID is the label
A feature may carry a strength in parentheses, e.g. 7(1.5)
0, 3, 1234, , 12, 987, 234, 556:
1, 7(1.5), 5, 10(0.6), 13(-3.2):
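A reader for this format can be sketched as follows (a hypothetical helper, not part of SNoW; it assumes the colon-terminated, comma-separated layout shown above, with the first ID treated as the label):

```python
import re

def parse_example(line):
    """Parse one SNoW-style example line: 'id, id(strength), ... :'."""
    body = line.strip().rstrip(':')
    features = []
    for tok in body.split(','):
        tok = tok.strip()
        if not tok:
            continue  # tolerate stray empty fields
        m = re.match(r'^(\d+)\(([-+\d.]+)\)$', tok)
        if m:
            features.append((int(m.group(1)), float(m.group(2))))
        else:
            features.append((int(tok), 1.0))  # default strength 1.0
    return features

parsed = parse_example("1, 7(1.5), 5, 10(0.6), 13(-3.2):")
label = parsed[0][0]  # by convention, the first ID is the label
```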

13 Example Files (cont.)
1,10391,10149,10002,10003,10004,10460,10151,10044,10393,10143,10074,10046,10144,10145,10394,10015,10016,10146,10461,10462,10463,10458,10464,10399:
1,10391,10099,10002,10003,10004,10465,10157,10158,10393,10086,10074,10046,10159,10145,10394,10015,10016,10089,10432,10333,10433,10466,10467,10399:
1,10391,10001,10002,10003,10004,10418,10163,10044,10393,10073,10074,10046,10144,10145,10394,10015,10016,10078,10395,10019,10396,10458,10459,10399:
5,10391,10164,10002,10003,10004,10165,10166,10167,10393,10073,10074,10168,10169,10170,10394,10015,10016,10078,10468,10469,10470,10471,10472,10399:
1,10391,10001,10129,10070,10004,10369,10233,10044,10393,10073,10115,10046,10176,10177,10394,10015,10077,10234,10395,10079,10473,10474,10475,10476:
6,10391,10180,10129,10070,10004,10374,10238,10044,10393,10103,10115,10046,10176,10177,10394,10015,10077,10119,10477,10182,10478,10474,10479,10476:
2,10284,10001,10129,10003,10004,10480,10163,10044,10481,10073,10074,10046,10176,10177,10482,10483,10016,10078,10290,10019,10291,10383,10384,10294:
3,10284,10001,10002,10070,10004,10484,10227,10044,10481,10103,10115,10046,10485,10486,10482,10483,10077,10119,10290,10079,10370,10487,10488,10373:

14 Network Files
Target line fields: target ID priorProbability cloudConfidence activeCount nonActiveCount algorithm learnerType parameters
e.g. target winnow
Feature line fields: ID : learnerType : featureID : activeCount updates weight
e.g. 1 : 2 : 34 :

15 Network Files (cont.)
target 5 1 1 2 48 perceptron 0 0.05 4 0.48
5 : 0 : :
5 : 0 : :
5 : 0 : :
target perceptron
target perceptron
target perceptron
8 : 0 : :
8 : 0 : :
8 : 0 : :

16 Result Files
The user defines the output mode via parameters, e.g. -o softmax
Note: for error files, see the file resultP

17 Using SNoW

18 Execution Modes
Training
Testing
Interactive
Evaluation
Server mode

19 Training Mode
Command line usage: snow -train -I inputfile -F networkfile [ -AaBbcdEefGgiLlMmOoPpRrSsTtuvWwz ]
Architecture Definition Parameters
- Learning algorithms: -P, -W, -B
- Extension rules: -G, -O, -S, -t
Training Parameters
- -e, -r, -s, -u and so on

20 Training Mode (cont.)
snow -train -I train_numeric -F snow_netP -P 0.05:1-10 -S 2 -r 3
-I train_numeric: training file
-F snow_netP: network file
-P 0.05:1-10: learning rule
-S 2 -r 3: other parameters

21 Testing Mode
Command line usage: snow -test -I inputfile -F networkfile [ -abEefGgiLlmOopRSstvwz ]
Testing Parameters: -i, -w, -p and so on
Output Parameters: -o <accuracy | winners | softmax | allpredictions | allactivations | allboth> and so on
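The softmax output mode normalizes the targets' activations into confidences; a minimal sketch of that computation (illustrative only, not SNoW's exact implementation):

```python
import math

def softmax(activations):
    """Map raw target activations to confidences that sum to 1."""
    exps = [math.exp(a) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]

conf = softmax([1.0, 2.0, 0.5])  # one confidence per target node
```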

22 Testing Mode (cont.)
snow -test -I test_numeric -F snow_netP -w -S 2 -o softmax > resultP
-I test_numeric: testing file
-F snow_netP: network file
-o softmax: output mode
> resultP: save the results in a file
-w -S 2: other parameters

23 Applying SNoW to my work

24 Completed Process
Feature numericalization
- Scripts: train_Pro_toNum.pl and test_Pro_toNum.pl
- Purpose: convert the training and test examples to numeric form, and generate the class and feature files
Training examples
- About 1.7 million examples, 50+ classes, and 1.7 million features
- Training time: about 20+ minutes
Test examples
- About 60,000 examples, with every label set to NULL
- Testing time: about 5 minutes
Output format conversion
- Script: transFormat.pl
Implementation approaches
- One-step classification
- Two-step classification: (1) first a binary classification of NULL vs. NON-NULL; (2) then multi-class classification of the NON-NULL examples

25 References: Snow-Uerguide.pdf (the user manual)

26 Thanks!

