Entropy-constrained overcomplete-based coding of natural images
André F. de Araujo, Maryam Daneshi, Ryan Peng
Stanford University
EE398A Project – Winter 2010/2011, Mar. 10, 2011

Outline
- Motivation
- Overcomplete-based coding: overview
- Entropy-constrained overcomplete-based coding
- Experimental results
- Conclusion
- Future work
Motivation (1)
- Study of new (and unusual) schemes for image compression.
- Recently, new methods have been developed using the overcomplete approach, but only in restricted compression scenarios that did not fully exploit this approach's characteristics for compression.
Motivation (2)
- Why overcomplete? Sparsity of the coefficients yields better overall rate-distortion (RD) performance.
Overcomplete coding: overview (1)
- K > N implies the basis functions are not linearly independent.
- Example: for 8x8 blocks, N = 64 basis functions are needed to span the space of all possible signals; an overcomplete basis could have K = 128.
- Two main tasks:
  1. Sparse coding
  2. Dictionary learning
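A minimal numpy check of the K > N point above: with K = 128 atoms in N = 64 dimensions, an exact representation of a signal is no longer unique, because the dictionary has a non-trivial null space. The random dictionary and signal here are placeholders, not the project's data:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 64, 128
D = rng.standard_normal((N, K))     # overcomplete dictionary: K > N atoms
y = rng.standard_normal(N)          # arbitrary signal to represent

x1 = np.linalg.pinv(D) @ y          # minimum-l2-norm exact representation
null_vec = np.linalg.svd(D)[2][-1]  # a direction in the null space of D
x2 = x1 + null_vec                  # a second, different exact representation

assert np.allclose(D @ x1, y) and np.allclose(D @ x2, y)
```

Sparse coding resolves this ambiguity by preferring the solution with the fewest nonzero coefficients.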
Overcomplete coding: overview (2)
- 1. Sparse coding ("atom decomposition"): compute the representation coefficients x from the given signal y and the given overcomplete dictionary D. Because D is overcomplete, Dx = y has infinitely many solutions, so a sparse approximation is sought.
- Commonly used algorithms: Matching Pursuit (MP), Orthogonal Matching Pursuit (OMP).
Overcomplete coding: overview (3)
- Sparse coding (OMP)
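The OMP procedure referenced above can be sketched as follows: greedily pick the atom most correlated with the residual, then re-fit the signal on the whole selected support by least squares (the "orthogonal" step). The function name and interface are illustrative, not the project's actual code:

```python
import numpy as np

def omp(D, y, sparsity, tol=1e-10):
    """Orthogonal Matching Pursuit sketch: greedy atom selection
    followed by a least-squares re-fit on the selected support."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(sparsity):
        # Pick the atom most correlated with the current residual.
        k = int(np.argmax(np.abs(D.T @ residual)))
        support.append(k)
        # Re-fit y on all selected atoms (the "orthogonal" step).
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coeffs
    return x
```

Because the residual is re-orthogonalized against all selected atoms at each step, OMP never re-selects an atom already in the support, unlike plain MP.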
Overcomplete coding: overview (4)
- 2. Dictionary learning: two basic stages (in analogy with K-means):
  i. Sparse coding stage: use a pursuit algorithm to compute x (OMP is usually employed).
  ii. Dictionary update stage: adopt a particular strategy for updating the dictionary.
- Convergence issues: since the first stage does not guarantee the best match, the cost can increase and convergence cannot be assured.
Overcomplete coding: overview (5)
- 2. Dictionary learning: the most relevant algorithms in the literature are K-SVD and MOD. The sparse coding stage is done in the same way; the dictionary update stage differs:
  - MOD: updates the entire dictionary using the optimal adjustment for a given coefficient matrix.
  - K-SVD: updates each atom one at a time using an SVD formulation, introducing changes in both the dictionary and the coefficients.
Entropy-constrained overcomplete-based coding (1)
Entropy-constrained overcomplete-based coding (2)
Entropy-constrained overcomplete-based coding (3)
- RD-OMP
Entropy-constrained overcomplete-based coding (4)
- EC dictionary learning, key ideas:
  - Dictionary update strategy: K-SVD modifies both the dictionary and the coefficients, so a reduction in the Lagrangian cost is not assured. We use MOD, which provides the optimal adjustment assuming fixed coefficients.
  - Introduction of a "rate cost update" stage, analogous to the ECVQ algorithm on training data. Two pmfs must be updated: atom indexes and coefficients.
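The "rate cost update" stage above can be sketched as re-estimating the two pmfs from the current encoding of the training set and converting them to ideal codeword lengths -log2(p), in the spirit of ECVQ. The function name and data layout are assumptions:

```python
import numpy as np
from collections import Counter

def update_rate_costs(index_lists, coeff_lists):
    """Rate cost update sketch: from per-block lists of coded atom
    indexes and quantized coefficient values, re-estimate the two pmfs
    and return ideal codeword lengths -log2(p) as rate costs."""
    idx_counts = Counter(i for block in index_lists for i in block)
    coef_counts = Counter(c for block in coeff_lists for c in block)
    n_idx = sum(idx_counts.values())
    n_coef = sum(coef_counts.values())
    idx_bits = {s: -np.log2(c / n_idx) for s, c in idx_counts.items()}
    coef_bits = {s: -np.log2(c / n_coef) for s, c in coef_counts.items()}
    return idx_bits, coef_bits
```

These rate tables would then feed back into the RD sparse coding stage, so frequently used atoms become cheaper to select on the next iteration.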
Entropy-constrained overcomplete-based coding (5)
- EC dictionary learning
Experiments (setup)
- Rate calculation: optimal codebook (entropy) for each subband.
- Test images: Lena, Boats, Harbour, Peppers.
- Dictionary training experiments: training data are 18 Kodak images downsampled to 128x128 (not including the images being coded). Downsampling to 128x128 was used because of the very high computational complexity; other experiments employed higher resolutions (512x512, 256x256).
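The per-subband rate model above (optimal entropy codebook) corresponds to the first-order entropy of the quantized symbols in each subband; a minimal helper to compute it:

```python
import numpy as np

def empirical_entropy(symbols):
    """First-order entropy in bits/symbol of a stream of quantized
    symbols: the idealized rate of an optimal entropy code."""
    _, counts = np.unique(np.asarray(symbols), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

Multiplying this by the number of symbols in a subband gives that subband's idealized bit budget.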
Experiments (sparse coding)
- Comparison of sparse coding methods
Experiments (dictionary learning)
- Comparison of dictionary learning methods
Experiments (compression schemes) (1)
- Scheme 1: training and coding on the same image (the dictionary is sent).
- Scheme 2: training on a set of natural images and applying the dictionary to other images.
Experiments (compression schemes) (2)
Experiments (compression schemes) (3)
Conclusion
- Improved sparse coding: RD-OMP.
- Improved dictionary learning: entropy-constrained overcomplete dictionary learning.
- Better overall performance compared to standard techniques.
Future work
- Extend the implementation to higher-resolution images.
- Further investigate the trade-off between K and N.
- Evaluate against directional transforms.
- Develop low-complexity implementations of the algorithms.
Experiments (trained dictionary)
- K = 256