Proximal Methods for Sparse Hierarchical Dictionary Learning
Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski, Francis Bach
Presented by Bo Chen, 2010.6.11
Outline
1. Structured Sparsity
2. Dictionary Learning
3. Sparse Hierarchical Dictionary Learning
4. Experimental Results
Structured Sparsity
- Lasso (R. Tibshirani, 1996)
- Group Lasso (M. Yuan & Y. Lin, 2006)
- Tree-Guided Group Lasso (Kim & Xing, 2009)
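To make the contrast concrete, here is a minimal sketch (my own illustration, not from the slides) of the two classical penalties, assuming coefficients stored in a NumPy vector and groups given as lists of indices:

```python
import numpy as np

def lasso_penalty(beta, lam):
    # Lasso: lam * sum_j |beta_j|  (Tibshirani, 1996)
    return lam * np.sum(np.abs(beta))

def group_lasso_penalty(beta, groups, lam):
    # Group Lasso: lam * sum_g ||beta_g||_2  (Yuan & Lin, 2006);
    # a whole group is driven to zero together rather than single entries.
    return lam * sum(np.linalg.norm(beta[g]) for g in groups)
```

The tree-guided variant on the next slides keeps the group-lasso form but takes the groups from a tree over the variables or tasks.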
Tree-Guided Structure Example
Tree regularization definition (Kim & Xing, 2009), in the multi-task setting. [Figure and equations from the original slide were not recovered.]
Tree-Guided Structure Penalty (Kim & Xing, 2009)
Introduce two parameters per internal node, rewrite the penalty term for the case of two tasks (K = 2), then generalize.
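The slide's equations did not survive extraction; as a hedged reconstruction from Kim & Xing (2009) (the symbols s, w_v, and G_v are my notation, not the slide's), the two-task case with a single internal node weighted by s and 1 − s, and the general tree penalty, read:

```latex
% Two tasks (K = 2), one internal node, mixing parameter s in [0, 1]:
\Omega(\beta_j) \;=\; \lambda \Big( s\,\big(|\beta_{j1}| + |\beta_{j2}|\big)
    \;+\; (1-s)\,\sqrt{\beta_{j1}^2 + \beta_{j2}^2} \Big)

% General tree: sum over all nodes v, where G_v is the set of tasks
% below node v and the weights w_v are built from the per-node parameters:
\Omega(\beta_j) \;=\; \lambda \sum_{v \in V} w_v \,\big\| \beta_j^{G_v} \big\|_2
```

With s = 1 the two-task penalty reduces to the lasso, and with s = 0 to the group lasso, so the tree parameters interpolate between the two.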
In Detail (Kim & Xing, 2009)
Some Definitions about Hierarchical Groups
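A small sketch of one common convention for hierarchical groups (this convention is my assumption, based on Jenatton et al.'s setup): each group consists of a node together with all of its descendants, so any two groups are either disjoint or nested.

```python
def subtree_groups(children, root=0):
    """Return, for every node, the group of indices in its subtree.
    `children` maps a node to the list of its children; nodes without an
    entry are leaves. Groups built this way are nested along root-to-leaf
    paths and disjoint across branches."""
    groups = {}
    def collect(v):
        g = [v]
        for c in children.get(v, []):
            g.extend(collect(c))
        groups[v] = g
        return g
    collect(root)
    return groups
```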
Hierarchical Sparsity-Inducing Norms
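A hedged sketch of the norm behind the method, in my own notation (based on Jenatton et al.'s formulation):

```latex
% alpha = code vector, \mathcal{G} = tree-structured set of groups
% (each group = a node together with its descendants),
% omega_g >= 0 per-group weights, norm = l2 or l-infinity on each group:
\Omega(\alpha) \;=\; \sum_{g \in \mathcal{G}} \omega_g \,\|\alpha_g\|
```

Because each group contains a node together with its descendants, zeroing a group zeroes a whole subtree; the nonzero pattern of a solution is therefore a rooted subtree, and a variable can be selected only if its ancestors are.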
Dictionary Learning
If structure information is introduced, the difference between dictionary learning and group lasso:
1. Group lasso is a regression problem: each feature has its own physical meaning, so the structure information must be meaningful and correct; otherwise the imposed structure will hurt the method.
2. In dictionary learning, the dictionary is unknown, so the structure information acts as a guide for learning a structured dictionary.
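The structured dictionary learning problem can be written as follows (a hedged reconstruction in my notation; the unit-norm constraint set C on the columns of D is the standard assumption):

```latex
% x_i = data samples, D = dictionary with columns constrained to the set
% \mathcal{C} (e.g., unit l2-ball), alpha_i = code of sample i,
% Omega = hierarchical sparsity-inducing norm from the previous slide:
\min_{D \in \mathcal{C},\; \{\alpha_i\}} \;\sum_{i=1}^{n}
  \tfrac{1}{2}\,\| x_i - D\,\alpha_i \|_2^2 \;+\; \lambda\,\Omega(\alpha_i)
```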
Optimization: Proximal Operator for the Structured Norm
Fix the dictionary D and minimize the objective function over the codes; each such step is transformed into a proximal problem, i.e., evaluating the proximal operator associated with the structured penalty.
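The paper's key computational result is that, for tree-structured ℓ2-groups, this proximal operator can be evaluated exactly by composing one group soft-thresholding per group, visiting the groups from the leaves up to the root. A hedged sketch (function name and interface are mine):

```python
import numpy as np

def prox_tree_l2(u, groups_bottom_up, weights, lam):
    """Proximal operator of v -> lam * sum_g w_g * ||v_g||_2 for
    tree-structured groups: apply one group soft-thresholding per group,
    in an order that visits every group before its ancestors."""
    v = u.copy()
    for g, w in zip(groups_bottom_up, weights):
        norm = np.linalg.norm(v[g])
        if norm <= lam * w:
            v[g] = 0.0                     # whole subtree is zeroed
        else:
            v[g] *= 1.0 - lam * w / norm   # shrink the group radially
    return v
```

With a single group the function reduces to the ordinary group soft-thresholding operator; the bottom-up ordering is what makes the composition exact for nested groups.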
Learning the Dictionary
Alternate between the two blocks: update D (5 times in each iteration) with the codes fixed, then update A with D fixed.
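The alternating scheme can be sketched as follows. This is a minimal illustration under my own assumptions (plain gradient steps on D with column normalization, proximal gradient steps on A), not the paper's exact update rules:

```python
import numpy as np

def dict_learning_sketch(X, D, A, lam, prox, n_iter=10, d_steps=5, lr=1e-2):
    """Alternating minimization sketch for
    min_{D,A} 0.5 * ||X - D A||_F^2 + lam * Omega(A).
    `prox` evaluates the proximal operator of t * Omega at threshold t."""
    for _ in range(n_iter):
        for _ in range(d_steps):          # update D, codes A fixed
            D -= lr * (D @ A - X) @ A.T
            # project columns onto the unit l2-ball (assumed constraint)
            D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1.0)
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)
        for _ in range(d_steps):          # update A, dictionary D fixed
            A = prox(A - step * D.T @ (D @ A - X), lam * step)
    return D, A
```

Plugging in `prox_tree_l2` in place of a generic `prox` would give the hierarchical variant; with elementwise soft-thresholding this is plain sparse coding.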
Experiments: Natural Image Patches
Use the dictionary learned on the training set to impute missing values in the test samples. Each sample is an 8×8 patch. Training set: 50,000 patches; test set: 25,000. Twenty-one balanced tree structures of depth 3 and 4 are tested, also varying the number of nodes in each layer.
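The imputation protocol can be sketched as: code the observed pixels against the corresponding rows of the learned dictionary, then fill in the missing pixels from the reconstruction. A hedged illustration (my own code; it uses plain ℓ1/ISTA coding for simplicity, where the paper uses the hierarchical penalty):

```python
import numpy as np

def impute_patch(D, x, observed, lam, n_iter=200):
    """Sparse-code the observed entries of x against the rows D[observed],
    then impute the missing entries as the corresponding rows of D @ alpha."""
    Do = D[observed]
    alpha = np.zeros(D.shape[1])
    step = 1.0 / (np.linalg.norm(Do, 2) ** 2 + 1e-12)
    for _ in range(n_iter):                       # ISTA on the observed part
        z = alpha - step * (Do.T @ (Do @ alpha - x[observed]))
        alpha = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    x_hat = x.copy()
    x_hat[~observed] = (D @ alpha)[~observed]     # keep observed pixels as-is
    return x_hat
```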
Learned Hierarchical Dictionary
Experiments: Text Documents
Key points:
Visualization of NIPS Proceedings
Documents: 1,714; words: 8,274.
Postings Classification
Documents: 1,425 (training set: 1,000; test set: 425); words: 13,312.
Goal: classify postings from the two newsgroups alt.atheism and talk.religion.misc.