Label Tree in Large-Scale Classification
Mun Jonghwan
From small to large scale
Learning models at large scale: one-vs-all model vs. tree-structured model
Label Tree
Each node has:
- a label set
- a classifier for each child (or edge)
Each child's label set is a subset of its parent's.
Two considerations:
- how to split the label set (construction)
- how to learn the classifiers (optimization)
Example: {dog, wolf, cat, tiger} splits into {dog, wolf} and {cat, tiger}, then into the singletons {dog}, {wolf}, {cat}, {tiger}.
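A label tree like the one above can be sketched as a small data structure; greedy prediction then costs O(tree depth) rather than O(number of classes). This is an illustrative sketch only — the `Node` layout and `predict` routine are assumptions, not code from either paper:

```python
import numpy as np

class Node:
    """One node of a label tree."""
    def __init__(self, labels, children=None, weights=None):
        self.labels = set(labels)        # label set held at this node
        self.children = children or []   # child nodes (label subsets)
        self.weights = weights           # one linear classifier per child

def predict(node, x):
    """Greedily descend from the root, following the best-scoring child."""
    while node.children:
        scores = [w @ x for w in node.weights]   # score each child/edge
        node = node.children[int(np.argmax(scores))]
    return next(iter(node.labels))               # leaf holds a single label
```

Only one classifier evaluation per level is needed, which is where the speed-up over one-vs-all comes from.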
Talk outline:
- Label Embedding Trees for Large Multi-Class Tasks (NIPS 2010), Samy Bengio, Jason Weston, David Grangier
- Fast and Balanced: Efficient Label Tree Learning for Large Scale Object Recognition (NIPS 2011), Jia Deng, Sanjeev Satheesh, Alexander C. Berg, Li Fei-Fei
Label Embedding Trees for Large Multi-Class Tasks (NIPS 2010), Samy Bengio et al.
Flow of constructing a label tree: start from the full label set {pencil, pen, cat, tiger}; first learn the tree structure ({pen, pencil} and {cat, tiger}, then the singletons {pen}, {pencil}, {cat}, {tiger}); then learn the classifiers.
Tree loss
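The slide's equation did not survive transcription. As a rough sketch (notation assumed, following the usual setup of the NIPS 2010 paper), the tree loss is the misclassification rate of greedy root-to-leaf traversal:

```latex
% empirical tree loss: fraction of examples whose greedy root-to-leaf
% descent in the label tree ends at the wrong label
\hat{R}(f_{\mathrm{tree}}) = \frac{1}{n}\sum_{i=1}^{n}
    \mathbb{1}\!\left[ f_{\mathrm{tree}}(x_i) \neq y_i \right],
\qquad
f_{\mathrm{tree}}(x):\ \text{at each node } v,\ \text{follow }
c^{\star} = \arg\max_{c \,\in\, \mathrm{children}(v)} w_c^{\top} x .
```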
Learning the label tree structure: train a one-vs-all baseline, build the confusion matrix between classes, then recursively apply spectral clustering to split the label set ({pencil, pen, cat, tiger} → {pen, pencil}, {cat, tiger} → singletons).

Confusion matrix from the slide:

        Cat   Tiger  Pen   Pencil
Cat     1     0.6    0.1   0.12
Tiger   0.6   1      0.2   0.16
Pen     0.1   0.2    1     0.9
Pencil  0.12  0.16   0.9   1
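The spectral split on this confusion matrix can be reproduced with a plain-NumPy sketch. The unnormalized-Laplacian / Fiedler-vector variant used here is an assumption for illustration; the paper's exact spectral clustering recipe may differ:

```python
import numpy as np

# Confusion matrix from the slide (rows/cols: cat, tiger, pen, pencil).
C = np.array([
    [1.00, 0.60, 0.10, 0.12],
    [0.60, 1.00, 0.20, 0.16],
    [0.10, 0.20, 1.00, 0.90],
    [0.12, 0.16, 0.90, 1.00],
])
A = 0.5 * (C + C.T)          # affinity: classes confused together are "similar"

# Unnormalized graph Laplacian L = D - A.
D = np.diag(A.sum(axis=1))
L = D - A

# Fiedler vector: eigenvector for the second-smallest eigenvalue of L
# (np.linalg.eigh returns eigenvalues in ascending order).
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]

# The sign pattern of the Fiedler vector yields a 2-way split of the labels
# (which side is which is arbitrary).
labels = ["cat", "tiger", "pen", "pencil"]
left = [l for l, v in zip(labels, fiedler) if v < 0]
right = [l for l, v in zip(labels, fiedler) if v >= 0]
print(left, right)   # animals on one side, writing tools on the other
```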
Learning with a fixed label tree. Relaxation 1: independent convex problems (one per node). Relaxation 2: tree loss optimization (a joint convex problem).
Label Embedding: embed inputs (via W) and labels (via V) into a shared space.
Label Embedding. Goal: learn W and V jointly; the joint optimization is non-convex.
Label Embedding: solved as a sequence of convex problems, alternating between learning V (with W fixed) and learning W (with V fixed).
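Prediction with a learned embedding can be sketched in a few lines: score each label by the inner product between the embedded input Wx and that label's embedding row of V. The toy W, V, and the `predict` helper below are illustrative assumptions, not the paper's learned matrices:

```python
import numpy as np

# Toy embedding matrices; in the paper both are learned (by the alternating
# convex problems above), here they are fixed for illustration.
W = np.eye(3, 4)      # input embedding: R^4 -> R^3
V = np.eye(3)         # one embedding row per label; here the standard basis

def predict(x):
    z = W @ x                  # embed the input
    scores = V @ z             # inner product with each label embedding
    return int(np.argmax(scores))

print(predict(np.array([0.1, 0.9, 0.2, 0.5])))   # label 1 scores highest
```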
Experiment: results
Fast and Balanced: Efficient Label Tree Learning for Large Scale Object Recognition (NIPS 2011), Jia Deng et al.
Limitations of the first paper:
- learning one-vs-all classifiers is costly at large scale
- a disjoint partition of classes does not allow overlap
- the tree structure may be unbalanced
Goals:
- jointly learn the splits and the classifier weights
- allow class labels to overlap among children
- explicitly model the accuracy/efficiency trade-off
Construction of the label tree: recursively split each node and learn its weights. Example: {cat, fox, tiger, book, pen, note} splits into {cat, fox, tiger}, {pen, note}, and {book, tiger} (note the overlap on tiger).

Membership matrix from the slide:

          cat  fox  tiger  book  pen  note
Child 1   1    1    1      0     0    0
Child 2   0    0    0      0     1    1
Child 3   0    0    1      1     0    0
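The membership matrix above is exactly the partition matrix P that the paper optimizes over; a small sketch shows how overlap reads off it (variable names here are illustrative, not from the paper's code):

```python
import numpy as np

classes = ["cat", "fox", "tiger", "book", "pen", "note"]
# P[c, k] = 1 if class k is assigned to child c; rows from the slide's table.
P = np.array([
    [1, 1, 1, 0, 0, 0],   # child 1: {cat, fox, tiger}
    [0, 0, 0, 0, 1, 1],   # child 2: {pen, note}
    [0, 0, 1, 1, 0, 0],   # child 3: {tiger, book}
])

# Overlap: a class may appear under more than one child (tiger here),
# and children need not cover every parent class.
multiplicity = P.sum(axis=0)
overlapping = [c for c, m in zip(classes, multiplicity) if m > 1]
print(overlapping)   # tiger appears under child 1 and child 3
```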
Optimization
OP1: optimizing efficiency under an accuracy constraint. Children do not need to cover all the classes of their parent, and label sets may overlap between children.
OP2: optimizing over w given P
OP3: optimizing over P
Summary of the algorithm: Algorithm 2 is applied recursively from the root.
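The recursive skeleton can be sketched independently of the inner optimization. This is a hedged sketch, not the paper's Algorithm 2: `split_node` stands in for one round of joint split/weight learning, and the trivial `halve` splitter is only there to make the example runnable:

```python
def build_label_tree(labels, split_node, max_children=2):
    """Recursively split a label set until leaves hold single labels."""
    node = {"labels": set(labels), "children": []}
    if len(labels) <= 1:
        return node                      # leaf: nothing left to split
    for child_labels in split_node(labels, max_children):
        node["children"].append(
            build_label_tree(child_labels, split_node, max_children))
    return node

# Trivial stand-in splitter for illustration: halve the sorted label list.
def halve(labels, _max_children):
    labels = sorted(labels)
    mid = len(labels) // 2
    return [labels[:mid], labels[mid:]]

tree = build_label_tree(["cat", "tiger", "pen", "pencil"], halve)
```

Swapping `halve` for a splitter driven by OP1-OP3 (or by spectral clustering, in the first paper) gives the corresponding construction.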
Experiments
Conclusion
Samy Bengio et al.: fast multi-class classification by optimizing the overall tree loss; embedding with the labels.
Jia Deng et al.: jointly learn the splits and classifier weights; allow overlapping splits; explicitly model the accuracy/efficiency trade-off.
Thank you