DCC '99 - Adaptive Linear Prediction Lossless Image Coding

Giovanni Motta, James A. Storer
Brandeis University, Volen Center for Complex Systems, Computer Science Department, Waltham, MA 02454, USA

Bruno Carpentieri
Universita' di Salerno, Dip. di Informatica ed Applicazioni "R.M. Capocelli", Baronissi (SA), Italy
Problem
Graylevel lossless image compression, addressed from the point of view of the achievable compression ratio
Outline
  Motivations
  Main Idea
  Algorithm
  Predictor Assessment
  Entropy Coding
  Final Experimental Results
  Conclusion
Past Results / Related Work
Until TMW, the best existing lossless image compressors (CALIC, LOCO-I, etc.) seemed unable to improve compression by using image-by-image optimization techniques or more sophisticated and complex algorithms.
A year ago, B. Meyer and P. Tischer were able, with their TMW, to improve on some of the current best results by using global optimization techniques and multiple blended linear predictors.
Past Results / Related Work
In spite of its high computational complexity, TMW's results are in any case surprising, because:
  Linear predictors were considered ineffective at capturing image edges;
  Global optimization seemed to be ineffective;
  CALIC was thought to achieve a data rate close to the entropy of the image.
Motivations
Investigation of an algorithm that uses:
  Multiple adaptive linear predictors
  Pixel-by-pixel optimization
  Local image statistics
Main Idea
Explicit use of local statistics to:
  Classify the context of the current pixel;
  Select a linear predictor;
  Refine it.
Prediction Window
Statistics are collected inside the window W_{x,y}(R_p).
Not all the samples in W_{x,y}(R_p) are used to refine the predictor.
[Figure: the (R_p + 1) x (2 R_p + 1) window W_{x,y}(R_p) of already encoded pixels above and to the left of the current pixel I(x,y), with the current context highlighted]
Prediction Context
6 pixels, fixed shape; the weights w_0, ..., w_5 change to minimize the error energy inside W_{x,y}(R_p)
Prediction: I'(x,y) = int( w_0*I(x,y-2) + w_1*I(x-1,y-1) + w_2*I(x,y-1) + w_3*I(x+1,y-1) + w_4*I(x-2,y) + w_5*I(x-1,y) )
Error: Err(x,y) = I'(x,y) - I(x,y)
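As a concrete illustration, the fixed-shape 6-pixel causal context and the prediction formula above can be sketched as follows. The helper name `predict`, the row-major indexing `I[y, x]`, and the weight values are illustrative assumptions, not from the paper.

```python
import numpy as np

def predict(I, x, y, w):
    """Linear prediction I'(x,y) from the 6 causal neighbours (I indexed as I[y, x])."""
    ctx = np.array([
        I[y - 2, x],      # w0: two rows above
        I[y - 1, x - 1],  # w1: upper-left
        I[y - 1, x],      # w2: above
        I[y - 1, x + 1],  # w3: upper-right
        I[y, x - 2],      # w4: two columns left
        I[y, x - 1],      # w5: left
    ], dtype=float)
    return int(np.dot(w, ctx))

# Tiny example on a synthetic 8x8 ramp image
I = np.arange(64, dtype=float).reshape(8, 8)
w = np.array([0.0, 0.0, 0.25, 0.0, 0.0, 0.75])   # illustrative weights
p = predict(I, 3, 3, w)
err = p - I[3, 3]          # Err(x,y) = I'(x,y) - I(x,y)
```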
Predictor Refinement
Gradient descent is used to refine the predictor on the samples inside the window W_{x,y}(R_p).
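A minimal sketch of this refinement step, assuming plain batch gradient descent on the mean squared prediction error over the window samples; the step size, iteration count, and synthetic data are illustrative, not the paper's exact procedure.

```python
import numpy as np

def refine(w, contexts, targets, lr=1e-6, steps=200):
    """Refine predictor weights by gradient descent on the MSE."""
    w = w.astype(float).copy()
    for _ in range(steps):
        pred = contexts @ w                                   # predictions on the window samples
        grad = 2.0 * contexts.T @ (pred - targets) / len(targets)
        w -= lr * grad                                        # descend the MSE gradient
    return w

def mse(w, contexts, targets):
    return float(np.mean((contexts @ w - targets) ** 2))

# Synthetic window samples: 6-pixel contexts and their true pixel values
rng = np.random.default_rng(0)
contexts = rng.uniform(0.0, 255.0, size=(40, 6))
true_w = np.array([0.1, 0.1, 0.3, 0.1, 0.1, 0.3])
targets = contexts @ true_w

w0 = np.full(6, 1.0 / 6.0)      # start from a plain averaging predictor
w = refine(w0, contexts, targets)
```

The small step size keeps the descent stable for 8-bit pixel magnitudes; in practice the step would be tuned to the window statistics.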
Algorithm
for every pixel I(x,y) do begin
  /* Classification */
  Collect the samples in W_{x,y}(R_p)
  Classify the samples into n clusters (LBG on the contexts)
  Classify the context of the current pixel I(x,y) into a cluster C_k
  Let P_i = {w_0, ..., w_5} be the predictor that achieves the smallest error on the current cluster C_k
  /* Prediction */
  Refine the predictor P_i on the cluster C_k
  Encode and send the prediction error Err(x,y)
end
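The classification steps above can be sketched as follows, with a plain k-means standing in for the LBG clustering of contexts; the function names, cluster data, and candidate predictors are illustrative assumptions.

```python
import numpy as np

def classify_contexts(contexts, n_clusters=2, iters=10, seed=0):
    """Cluster the causal-context vectors (k-means standing in for LBG)."""
    rng = np.random.default_rng(seed)
    centers = contexts[rng.choice(len(contexts), n_clusters, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(contexts[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = contexts[labels == k].mean(axis=0)
    return centers, labels

def pick_predictor(predictors, contexts, targets):
    """Return the index of the predictor with the smallest squared error on a cluster."""
    errors = [np.sum((contexts @ w - targets) ** 2) for w in predictors]
    return int(np.argmin(errors))

# Two synthetic clusters of 6-pixel contexts: dark vs. bright neighbourhoods
rng = np.random.default_rng(1)
dark = rng.uniform(0.0, 50.0, size=(30, 6))
bright = rng.uniform(200.0, 255.0, size=(30, 6))
contexts = np.vstack([dark, bright])
centers, labels = classify_contexts(contexts)

# On the dark cluster, pick the better of two candidate predictors
w_good = np.array([0.0, 0.0, 0.5, 0.0, 0.0, 0.5])
w_bad = np.ones(6)
targets = dark @ w_good
best = pick_predictor([w_bad, w_good], dark, targets)
```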
Results Summary
Compression is better when structures and textures are present
Compression is worse in high-contrast zones
Locally adaptive linear prediction seems to capture features not exploited by existing systems
Test Images
Downloaded from the ftp site of X. Wu: ftp://ftp.csd.uwo.ca/pub/from_wu/images
9 "pgm" images, 720x576 pixels, 256 grey levels (8 bits):
Balloon, Barb, Barb2, Board, Boats, Girl, Gold, Hotel, Zelda
Outline
  Motivations
  Main Idea
  Algorithm
  Predictor Assessment
  Entropy Coding
  Final Experimental Results
  Conclusion
File Size vs. Number of Predictors (R_p = 6), using an adaptive AC
[Table: compressed file size in bytes for each test image (Balloon, Barb, Barb2, Board, Boats, Girl, Gold, Hotel, Zelda) and the total, as a function of the number of predictors]
File Size vs. Window Radius R_p (# of predictors = 2), using an adaptive AC
[Table: compressed file size in bytes for each test image (Balloon, Barb, Barb2, Board, Boats, Girl, Gold, Hotel, Zelda) and the total, as a function of R_p]
Prediction Error
[Chart: per-image entropy of the prediction error (balloon, barb, barb2, board, boats, girl, gold, hotel, zelda), comparing LOCO-I error entropy after context modeling, LOCO-I prediction-error entropy, and our scheme with 2 predictors, R_p = 10, single adaptive AC]
Prediction Error
Prediction Error (histogram) Test image “Hotel”
Prediction Error (magnitude and sign)
Test image "Hotel" [figures: magnitude and sign]
Prediction Error (magnitude and sign)
Test image "Board" [figures: magnitude and sign]
Outline
  Motivations
  Main Idea
  Algorithm
  Predictor Assessment
  Entropy Coding
  Final Experimental Results
  Conclusion
Entropy Coding
AC model determined in a window W_{x,y}(R_e)
Two different ACs for typical and non-typical symbols (for practical reasons)
Global determination of the cut-off point between typical and non-typical symbols
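The split between the two AC models might look like the following sketch: errors within a globally chosen cut-off T stay in the first coder's alphabet, and the rest are signalled with an escape symbol and coded by the second model. The coverage-based choice of T and all names here are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def choose_cutoff(errors, coverage=0.95):
    """Globally pick a cut-off T so that `coverage` of the |errors| are typical."""
    return int(np.quantile(np.abs(errors), coverage))

def split_symbols(errors, T):
    """Route prediction errors to the typical / non-typical AC models."""
    ESC = 2 * T + 1                   # escape symbol: last index of the typical alphabet
    typical, non_typical = [], []
    for e in errors:
        if abs(e) <= T:
            typical.append(e + T)     # map [-T, T] onto [0, 2T]
        else:
            typical.append(ESC)       # escape, then code e with the second model
            non_typical.append(e)
    return typical, non_typical

errors = [0, 1, -1, 2, -2, 40]
typ, non_typ = split_symbols(errors, T=2)
# typ = [2, 3, 1, 4, 0, 5], non_typ = [40]
```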
Compressed File Size vs. Error Window Radius R_e (# of predictors = 2, R_p = 10)
[Table: compressed file size in bytes for each test image (balloon, barb, barb2, board, boats, girl, gold, hotel, zelda) and the total, as a function of R_e]
Outline
  Motivations
  Main Idea
  Algorithm
  Predictor Assessment
  Entropy Coding
  Final Experimental Results
  Conclusion
Comparisons
Compression rate in bits per pixel (# of predictors = 2, R_p = 10)
[Table: bits per pixel for each test image (balloon, barb, barb2, board, boats, girl, gold, hotel, zelda) and the average, for SUNSET, LOCO-I, UCM, our scheme, CALIC, and TMW]
Comparisons
Conclusion
Compression is better when structures and textures are present
Compression is worse in high-contrast zones
Locally adaptive linear prediction seems to capture features not exploited by existing systems
Future Research
Compression:
  Better context classification (to improve on high-contrast zones)
  Adaptive windows
  MAE minimization (instead of MSE minimization)
Complexity:
  Gradient descent
  More efficient entropy coding
Additional experiments:
  On different test sets