Context-based, Adaptive, Lossless Image Coding (CALIC)
Authors: Xiaolin Wu and Nasir Memon
Source: IEEE TRANSACTIONS ON COMMUNICATIONS, VOL. 45, NO. 4, APRIL 1997
Speaker: Guu-In Chen
Date:
Where lossless compression is used
–medical imaging
–remote sensing
–print spooling
–fax
–document & image archiving
–the last step in a lossy image compression system
Some methods for lossless compression
–Run-length encoding
–statistical methods: Huffman coding, arithmetic coding, ...
–dictionary-based models: LZW (UNIX compress, GIF, V.42 bis), PKZIP, ARJ, ...
–predictive coding: DPCM, Lossless JPEG, CALIC, JPEG-LS (LOCO-I), FELICS, ...
–wavelet transform: S+P, ...
System overview
Raster-scan the original image (pixel values I)
→ context-based prediction (prediction Î, error e)
→ grouping and prediction modification (modified prediction Î', modified error ê)
→ encode the errors with arithmetic coding
Prediction
d_h ~ gradient in the horizontal direction ~ large near a vertical edge
d_v ~ gradient in the vertical direction ~ large near a horizontal edge
d = d_v − d_h,  t = (N + W)/2 + (NE − NW)/4

d > 80: sharp horizontal edge → Î = W
32 < d ≤ 80: horizontal edge → Î = (t + W)/2
8 < d ≤ 32: weak horizontal edge → Î = (3t + W)/4
−8 ≤ d ≤ 8: homogeneous → Î = t
−32 ≤ d < −8: weak vertical edge → Î = (3t + N)/4
−80 ≤ d < −32: vertical edge → Î = (t + N)/2
d < −80: sharp vertical edge → Î = N

Ideal example
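The decision rule above can be sketched in Python. The slide only names d_h and d_v; the gradient formulas and the thresholds 80/32/8 below follow the CALIC paper's GAP predictor, except that d_v is simplified to use only the six causal neighbours named on the slide (the paper adds an |NE − NNE| term).

```python
def gap_predict(N, W, NE, NW, NN, WW):
    """Gradient-Adjusted Prediction for the current pixel.

    N, W, NE, NW, NN, WW are the causal neighbours in raster-scan order.
    Simplified sketch: d_v omits the paper's |NE - NNE| term.
    """
    d_h = abs(W - WW) + abs(N - NW) + abs(N - NE)  # horizontal gradient
    d_v = abs(W - NW) + abs(N - NN)                # vertical gradient
    d = d_v - d_h

    if d > 80:        # sharp horizontal edge: predict from the west pixel
        return W
    if d < -80:       # sharp vertical edge: predict from the north pixel
        return N

    t = (N + W) / 2 + (NE - NW) / 4  # base prediction for smooth regions
    if d > 32:        # horizontal edge
        return (t + W) / 2
    if d > 8:         # weak horizontal edge
        return (3 * t + W) / 4
    if d < -32:       # vertical edge
        return (t + N) / 2
    if d < -8:        # weak vertical edge
        return (3 * t + N) / 4
    return t          # homogeneous region
```

In a flat region all neighbours agree and the predictor returns their common value; across a strong horizontal edge (large d_v, small d_h) it falls back to W.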
Prediction--continued
A more realistic example (an inclined edge): the predictor gives Î = 75.
Prediction error e = I − Î. In the example above, if I = 100 then e = 100 − 75 = 25.
How to improve the error distribution
The shape of the error distribution p(e) is influenced by the context:
1. texture pattern => C = {N, W, NW, NE, NN, WW, 2N−NN, 2W−WW}
2. variability => d_h, d_v
and by the previous prediction error.
=> Group pixels by context; each group gets its own modified prediction.
Why? Because the mean prediction error differs from group to group.
Grouping
Texture pattern: C = {N, W, NW, NE, NN, WW, 2N−NN, 2W−WW} = {x_0, x_1, x_2, x_3, x_4, x_5, x_6, x_7}
b_k = 0 if x_k ≥ Î, 1 if x_k < Î
α = b_7 b_6 ... b_0
Example: Î = 75, C = {100, 100, 200, 100, 200, 200, 0, 0}
⇒ (b_0, ..., b_7) = (0, 0, 0, 0, 0, 0, 1, 1) ⇒ α = 11000000₂ = 192
What do 2N−NN and 2W−WW mean?
2N−NN = N + (N − NN) extrapolates the vertical trend through NN and N to the current pixel I.
How many cases can α take?
If NN ≥ Î (b_4 = 0) and N < Î (b_0 = 1), then 2N−NN < N < Î, so b_6 must be 1.
If NN < Î (b_4 = 1) and N ≥ Î (b_0 = 0), then 2N−NN > N ≥ Î, so b_6 must be 0.
So (b_0, b_4, b_6) can never be (1, 0, 0) or (0, 1, 1): 8 − 2 = 6 cases. The same holds for (b_1, b_5, b_7).
With b_2 and b_3 unconstrained (4 cases), α has 6 × 6 × 4 = 144 cases, not 2^8 = 256.
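The texture number α from the two slides above can be sketched as:

```python
def texture_context(N, W, NW, NE, NN, WW, pred):
    """Form the 8-bit texture number alpha.

    pred is the GAP prediction I-hat for the current pixel.
    Bit k is 1 iff x_k < pred, and alpha = b7 b6 ... b0.
    """
    # Texture pattern C = {N, W, NW, NE, NN, WW, 2N-NN, 2W-WW} = {x0..x7}.
    C = [N, W, NW, NE, NN, WW, 2 * N - NN, 2 * W - WW]
    alpha = 0
    for k, x in enumerate(C):
        if x < pred:           # b_k = 1 when the neighbour is below I-hat
            alpha |= 1 << k    # place b_k at bit position k
    return alpha
```

Running the slide's example (Î = 75, C = {100, 100, 200, 100, 200, 200, 0, 0}) sets only b_6 and b_7, giving α = 11000000₂ = 192.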
Grouping--continued
Variability: Δ = d_h + d_v + 2|e_w|, where e_w is the previous prediction error (at the west neighbour).
Quantize Δ to [0, 3]: β = Q(Δ), expressed as a 2-bit binary number.
Example: Δ = 70 ⇒ Q(Δ) = 2 ⇒ β = 10₂
Grouping--continued
Compound α and β => C(α, β).
Example: α = 11000000₂, β = 10₂ ⇒ C(α, β) = (11000000₂, 10₂).
Number of cases in C(α, β) = 144 × 4 = 576.
Pixels are grouped according to their C(α, β).
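A sketch of forming the compound context. The quantizer thresholds (15, 42, 85) are illustrative assumptions chosen so that Δ = 70 falls in bin 2 as on the slide; the paper derives its own thresholds.

```python
def error_energy(d_h, d_v, e_w):
    """Delta = d_h + d_v + 2|e_w|, with e_w the previous (west) prediction error."""
    return d_h + d_v + 2 * abs(e_w)

def quantize_delta(delta, thresholds=(15, 42, 85)):
    """Quantize Delta to beta in {0, 1, 2, 3}.

    Thresholds are assumed for illustration, not taken from the paper.
    """
    for beta, t in enumerate(thresholds):
        if delta < t:
            return beta
    return len(thresholds)  # top bin, beta = 3

def compound_context(alpha, beta):
    """C(alpha, beta) as a single index: 144 texture patterns x 4 energy levels."""
    return alpha * 4 + beta
```

With the slide's running example, error_energy(30, 20, 10) = 70, quantize_delta(70) = 2, and compound_context(192, 2) picks one of the 576 groups.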
Modify prediction
For each group C(α, β): take the mean ē of all prediction errors e in the group;
modified prediction Î' = Î + ē; modified error ê = I − Î'.
Example:
I  = 10, 11, 13, 15, 18
Î  =  8, 10, 13, 16, 14
e  =  2,  1,  0, −1,  4   (mean ē = 6/5 ≈ 1)
Î' =  9, 11, 14, 17, 15
ê  =  1,  0, −1, −2,  3
Î' is closer to I than Î is.
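The per-group error feedback can be sketched as follows; the running mean and the integer floor division are simplifications of the paper's update procedure.

```python
from collections import defaultdict

# Per-context running statistics of prediction errors.
err_sum = defaultdict(int)  # sum of errors seen in each context
err_cnt = defaultdict(int)  # number of errors seen in each context

def update(ctx, actual, pred):
    """Record the prediction error (actual - pred) for context ctx."""
    err_sum[ctx] += actual - pred
    err_cnt[ctx] += 1

def corrected_prediction(pred, ctx):
    """Add the context's mean error to the raw prediction.

    Floor division is a simplification of the paper's rounding.
    """
    if err_cnt[ctx] == 0:
        return pred
    return pred + err_sum[ctx] // err_cnt[ctx]

# Slide example: I = 10, 11, 13, 15, 18 predicted as 8, 10, 13, 16, 14.
for actual, pred in zip([10, 11, 13, 15, 18], [8, 10, 13, 16, 14]):
    update(0, actual, pred)
# Mean error is 6/5 ~ 1, so later predictions in this context are raised by 1.
```

After seeing the five slide errors, a raw prediction of 12 in the same context is corrected to 13, mirroring Î' = Î + 1 on the slide.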
Experimental result
Comments
1. CALIC balances bit rate against complexity.
2. There seem to be redundancies in C = {N, W, NW, NE, NN, WW, 2N−NN, 2W−WW} and Δ = d_h + d_v + 2|e_w|; they might be simplified.
3. Arithmetic coding needs further study.
4. Lossless or near-lossless compression could be another field for our laboratory.