Increasing Watermarking Robustness using Turbo Codes

Presentation on theme: "Increasing Watermarking Robustness using Turbo Codes"— Presentation transcript:

1 Increasing Watermarking Robustness using Turbo Codes
Corina Nafornita, Alexandru Isar, Maria Kovaci
Politehnica University of Timisoara
{corina.nafornita, alexandru.isar,
WISP 2009, Budapest, Hungary

2 Goal: Duo-binary turbo codes. Watermarking requirements: robustness, capacity, imperceptibility.
Watermarking has been proposed as a means of identifying the owner by secretly embedding an imperceptible image into the host image. Important properties of an image watermarking system are perceptual transparency, robustness, security, and data-hiding capacity. Imperceptibility and high capacity can be ensured by watermarking in the wavelet domain. The methods previously proposed by the authors also have good robustness, which can be improved further by encoding the watermark, usually with a repetition code or an error-correcting code. Despite their efficient use in telecommunications, turbo codes have rarely been used in watermarking. In this paper we present a watermarking system that uses the biorthogonal discrete wavelet transform (DWT) and encodes the message before embedding. The operations performed are: 1. turbo coding; 2. embedding the turbo-coded message into the host image using a perceptual mask proposed by the authors; 3. extraction of the turbo-coded message from the watermarked, possibly corrupted image; 4. turbo decoding. The perceptual mask is defined in the DWT domain.
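A minimal sketch of this four-step pipeline follows, with the encoder, embedder, extractor, and decoder passed in as functions; all of these names are hypothetical placeholders for the components described on the following slides.

```python
# Hypothetical end-to-end sketch: the four callables are placeholders for the
# components detailed on the next slides (DBTC encoder/decoder, masked DWT
# embedding, and extraction from the possibly attacked image).
def watermarking_pipeline(host, message_bits,
                          turbo_encode, embed, extract, turbo_decode,
                          attack=None):
    coded = turbo_encode(message_bits)            # 1. turbo coding of the message
    marked = embed(host, coded)                   # 2. embedding with a perceptual mask
    received = attack(marked) if attack else marked
    soft = extract(received, host)                # 3. extraction of the coded watermark
    return marked, turbo_decode(soft)             # 4. turbo decoding
```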

3 Barni et al., 2001: Daubechies-6 (block-diagram labels: original, DWT, mask, watermark, IDWT, marked)
WATERMARKING IN THE DWT DOMAIN. The watermark is either embedded in coefficients of known robustness (usually large coefficients) or in perceptually significant regions, such as contours and textures of the image. This can be done empirically, by selecting larger coefficients or by using a thresholding scheme in the transform domain. Another approach is to insert the watermark in all coefficients of a transform, using a variable strength for each coefficient. We build on the approach of Barni et al. To embed the watermark, Barni et al. first compute the orthogonal Discrete Wavelet Transform (DWT) of the host image. Next, the watermark is masked for each subband. The masking is performed according to the characteristics of the human visual system (HVS), taking into account the texture and luminance content of all image subbands, at all resolution levels of the DWT except the low-frequency one. A higher strength is used for wavelet coefficients corresponding to contours of the image, a medium strength for textures, and a lower strength for regions with high regularity, in accordance with the water-filling analogy for watermarking. The masked watermarks are then added to the corresponding subband coefficients, and the marked image is obtained by computing the Inverse Discrete Wavelet Transform (IDWT).
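A simplified sketch of this additive DWT-domain embedding, using PyWavelets with the Daubechies-6 wavelet named on the slide. The HVS mask is replaced here by a crude magnitude-based proxy, so this illustrates only the structure, not Barni et al.'s actual masking model.

```python
# Simplified additive embedding in the DWT domain, in the spirit of Barni et al.
# The real HVS mask combines texture and luminance sensitivity; here a crude
# magnitude-based proxy is used purely for illustration.
import numpy as np
import pywt

def embed_barni_like(host, alpha=0.2, wavelet="db6", levels=4, seed=0):
    rng = np.random.default_rng(seed)                 # the key selects the watermark
    coeffs = pywt.wavedec2(host.astype(float), wavelet, level=levels)
    out = [coeffs[0]]                                 # low-frequency band is not marked
    for detail in coeffs[1:]:                         # all resolution levels, all orientations
        marked = []
        for band in detail:
            mask = np.abs(band) / (np.abs(band).max() + 1e-9)   # proxy perceptual mask
            w = rng.choice([-1.0, 1.0], size=band.shape)        # pseudo-random watermark
            marked.append(band + alpha * mask * w)              # masked watermark is added
        out.append(tuple(marked))
    return pywt.waverec2(out, wavelet)
```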

4 Proposed solution: artifacts (low-resolution mask) → high-resolution mask; watermark easily erased → all resolution levels; new approach: duo-binary turbo codes
The watermarking system proposed by Barni et al. has some drawbacks. First, it produces artifacts due to the low resolution of the mask used; these artifacts can be reduced with the aid of a high-resolution mask. Second, Barni et al.'s watermark can be easily erased; this drawback can be diminished by using all resolution levels of the DWT. The proposed embedding method therefore uses a high-resolution mask and exploits all resolution levels of the DWT. Moreover, to increase robustness, it uses duo-binary turbo codes.

5 New method: Bior2.2, DBTC (block-diagram labels: message, DBTC, encoded watermark, new mask, original, DWT, IDWT, marked)
The proposed method adds several ingredients to Barni et al.'s method. First, the message is duo-binary turbo coded, yielding an encoded watermark. Second, a biorthogonal DWT is computed using the Bior2.2 mother wavelets. Third, the watermark is embedded in all detail subbands. Fourth, a higher-resolution mask is generated. Fifth, the encoded watermark is embedded only in the larger detail coefficients, i.e., only coefficients above a threshold are selected.
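A sketch of the modified embedding step under two assumptions: the duo-binary turbo coded bits are already available, and the high-resolution mask is replaced by a constant strength so that only the coefficient-selection logic is shown.

```python
# Sketch of the proposed embedding: Bior2.2 DWT over all resolution levels, with
# the coded watermark added only to detail coefficients whose magnitude exceeds
# a threshold. The high-resolution perceptual mask is omitted and replaced by a
# constant strength alpha, for illustration only.
import numpy as np
import pywt

def embed_coded_watermark(host, coded_bits, alpha=9.0, threshold=10.0,
                          wavelet="bior2.2", levels=4):
    bits = np.where(np.asarray(coded_bits) > 0, 1.0, -1.0)   # antipodal watermark
    coeffs = pywt.wavedec2(host.astype(float), wavelet, level=levels)
    out, k = [coeffs[0]], 0                                   # approximation left untouched
    for detail in coeffs[1:]:                                 # every resolution level is used
        marked = []
        for band in detail:
            flat = band.astype(float).ravel().copy()
            idx = np.flatnonzero(np.abs(flat) > threshold)    # only significant coefficients
            take = idx[: max(0, len(bits) - k)]
            flat[take] += alpha * bits[k:k + len(take)]
            k += len(take)
            marked.append(flat.reshape(band.shape))
        out.append(tuple(marked))
    return pywt.waverec2(out, wavelet)
```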

6 Mask estimation Low resolution vs. High resolution
Barni et al.'s mask and the proposed mask are compared: low resolution (Barni et al., 2001) vs. high resolution (Nafornita et al., 2006). It can be observed that the proposed mask has a higher resolution.

7 Duo-Binary Turbo Code: 8-state duo-binary RSC encoder, rate 2/3.
(Encoder block-diagram labels: RSC, r-ary interleaver, input sequence, u1, u2, c1, c2, state registers S1, S2, S3.) Encoder polynomials: 15 (feedback) and 13 (redundancy).
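A small illustrative encoder using the polynomials quoted on the slide, read as octal 15 (feedback, 1 + D + D^3) and 13 (redundancy, 1 + D^2 + D^3); the octal reading is the usual convention and is an assumption here. For brevity this is a single-input (binary) rate-1/2 RSC, not the full duo-binary circuit that takes two bits per trellis step.

```python
# Binary rate-1/2 RSC encoder with feedback polynomial 15 (octal) and parity
# polynomial 13 (octal). Illustrative only: the code on the slide is duo-binary
# (two input bits per step, rate 2/3), but the recursive structure is the same.
def rsc_encode(bits):
    s1 = s2 = s3 = 0                       # 8 states: three memory elements
    systematic, parity = [], []
    for u in bits:
        fb = u ^ s1 ^ s3                   # feedback taps D and D^3 (poly 1 + D + D^3)
        p = fb ^ s2 ^ s3                   # parity taps 1, D^2, D^3 (poly 1 + D^2 + D^3)
        systematic.append(u)
        parity.append(p)
        s1, s2, s3 = fb, s1, s2            # shift-register update
    return systematic, parity
```

A turbo encoder would run two such constituent encoders, the second on an interleaved copy of the input, and transmit the systematic bits together with both parity streams, punctured to the desired rate.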

8 Experiments: test image Lena; message blocks of 768 bits (7 blocks); threshold T = 10; watermark strength α = 9; PSNR of the watermarked image 29.95 dB
Simulation results obtained with the image Lena (512×512) are reported. The image is decomposed into a four-level decomposition with a biorthogonal mother wavelet (bior2.2). A pseudo-random binary message m with values {−1, 1} is turbo coded with a DBTC, resulting in a coded watermark message. The block size is 768 bits and the number of blocks for the image Lena is 7. The coded watermark is embedded into each subband, in coefficients with magnitude greater than a threshold T, for levels 0, 1 and 2. In all simulations this threshold was experimentally set to 10. The embedding strength is set to α = 9, resulting in a watermarked image with a peak signal-to-noise ratio PSNR = 29.95 dB. Two experiments were performed: addition of white Gaussian noise (AWGN) and JPEG compression. (Figures: original and watermarked Lena.)
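A short helper for the PSNR figure quoted above, assuming 8-bit grayscale images stored as NumPy arrays.

```python
# Peak signal-to-noise ratio between two 8-bit grayscale images.
import numpy as np

def psnr(reference, test, peak=255.0):
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```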

9 AWGN: σ = 3.25…15 (figures: BER vs. σ and BER vs. PSNR)
In the first experiment, we added noise with mean 0 and variance σ² to the watermarked image. The experiment was repeated for σ ranging from 3.25 to 15 in steps of 0.25, and the BER was plotted without coding the watermark and with a DBTC. The left figure shows the BER for different values of σ, while the right figure shows the BER as a function of the PSNR between the attacked image and the watermarked image, for the uncoded as well as the coded sequence. For values of σ below 8, the added noise does not degrade the image much and the PSNR remains high. For a PSNR above 29 dB, the watermark is reconstructed without error using turbo decoding. For PSNR values below 29 dB, the attacked images are visually impaired by the added noise, making them useless.
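A sketch of this noise sweep; extract_and_decode() is a hypothetical placeholder for the watermark detector followed by the turbo decoder, and the PSNR is computed against the watermarked image, as in the right-hand plot.

```python
# AWGN attack sweep: sigma from 3.25 to 15 in steps of 0.25. The callable
# extract_and_decode() is a hypothetical placeholder; it should return the
# decoded message bits for the attacked image.
import numpy as np

def awgn_sweep(marked, message_bits, extract_and_decode, seed=0):
    rng = np.random.default_rng(seed)
    message_bits = np.asarray(message_bits)
    results = []
    for sigma in np.arange(3.25, 15.0 + 1e-9, 0.25):
        attacked = marked + rng.normal(0.0, sigma, size=marked.shape)
        mse = np.mean((attacked - marked) ** 2)
        psnr_db = 10.0 * np.log10(255.0 ** 2 / mse)            # PSNR vs. watermarked image
        ber = np.mean(np.asarray(extract_and_decode(attacked)) != message_bits)
        results.append((sigma, psnr_db, ber))                  # points for both BER plots
    return results
```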

10 JPEG compression: Q=10…100 BER vs. quality factor BER vs. PSNR
Next we studied the robustness of the watermark under the JPEG compression attack. We compressed the watermarked image with different quality factors Q, from 100 down to 10, and plotted the BER with and without turbo coding of the watermark. The left figure shows the BER for different quality factors, for the uncoded as well as the coded sequence. For a quality factor higher than 50, the reconstruction of the turbo-coded watermark is almost perfect. The right figure shows the BER as a function of the PSNR between the attacked image and the watermarked image.
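A sketch of the JPEG attack using Pillow; the quality factor maps directly to Pillow's quality parameter, and the attacked pixels are returned for the same BER measurement as above.

```python
# JPEG compression attack: encode the watermarked image at quality Q and return
# the decompressed pixel array.
import io
import numpy as np
from PIL import Image

def jpeg_attack(marked, quality):
    img = Image.fromarray(np.clip(marked, 0, 255).astype(np.uint8))
    buffer = io.BytesIO()
    img.save(buffer, format="JPEG", quality=int(quality))      # Q = 10 ... 100
    buffer.seek(0)
    return np.asarray(Image.open(buffer), dtype=float)
```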

11 Coded vs. Uncoded: attack vs. BER
Attacks compared: JPEG compression (Q = 50, PSNR = 32.94 dB) and AWGN (σ = 8, PSNR = 29.29 dB). Reported BER values: 0.0885 uncoded vs. 3.25·10⁻⁴ with the DBTC.

12 Conclusions BER decreases in both cases when using turbo codes
For less severe attacks the watermark is perfectly reconstructed; such an attack can be modeled as an AWGN channel with high SNR. The coding gain of the DBTC is higher than 2 dB.

