A Standardized Workflow for Illumination-Invariant Image Extraction
Mark S. Drew, Muntaseer Salahuddin, Alireza Fathi
Simon Fraser University, Vancouver, Canada
{mark,msalahud,alirezaf}@cs.sfu.ca
www.cs.sfu.ca/~mark
Introduction
Illumination-invariant image extraction is an interesting and open problem in vision. The illustration shows the objective: the "intrinsic image".
Introduction (cont.)
To obtain (b) from (a), we take the logarithm of band-ratio chromaticity colour coordinates and then project in a special direction [Finlayson and Hordley, 2001]. The resultant grey-scale image is illumination invariant.
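To make this step concrete, here is a minimal NumPy sketch (not the authors' code): it forms log band-ratio chromaticities by dividing R and B by G and taking logs, then projects each pixel onto the direction orthogonal to an assumed lighting-change angle theta. The angle is a placeholder; later slides discuss how it is obtained (calibration or entropy minimization).

```python
import numpy as np

def invariant_image(rgb, theta, eps=1e-6):
    """Grey-scale illumination-invariant image from a linear-RGB array.

    rgb   : H x W x 3 float array of linear sensor responses
    theta : angle (radians) of the lighting-change direction in the
            2-D log-chromaticity plane (camera-dependent; assumed known here)
    """
    rgb = np.maximum(rgb.astype(float), eps)
    # Band-ratio chromaticities: divide R and B by G, then take logs.
    log_chroma = np.stack([np.log(rgb[..., 0] / rgb[..., 1]),
                           np.log(rgb[..., 2] / rgb[..., 1])], axis=-1)
    # Project onto the direction orthogonal to the lighting-change direction.
    e_perp = np.array([-np.sin(theta), np.cos(theta)])
    return log_chroma @ e_perp
```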
Introduction (cont.)
Objective: we argue that sharpening sRGB allows us to find the invariant image as a generic workflow for images from unknown cameras, with an unknown actual special direction, and with no complex algorithm using evidence in each image. This works well (but not as well as knowing the camera or using internal evidence in the image!).
Shadow Removal
The illumination invariant is a crucial step in shadow removal.
Finding the direction
The direction of projection is crucial.
Finding the direction (cont.)
One could calibrate the camera to find the invariant direction [Finlayson et al. (2002)]. HP912 Digital Still Camera: log-chromaticities of 24 patches (6 of them shown), imaged under 9 illuminants.
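As an illustration of what such a calibration yields, the sketch below (an assumption about the procedure, not the authors' code) mean-centres each patch's log-chromaticities across the illuminants and takes the principal direction of the pooled residuals as the lighting-change direction; the invariant direction is orthogonal to it.

```python
import numpy as np

def lighting_change_direction(log_chroma):
    """Estimate the lighting-change direction from calibration data.

    log_chroma : P x L x 2 array of log band-ratio chromaticities for
                 P colour patches imaged under L illuminants.
    Returns the angle (radians) of the common direction along which each
    patch's chromaticities move as the illuminant changes.
    """
    # Subtract each patch's mean so only illumination-induced motion remains.
    centred = log_chroma - log_chroma.mean(axis=1, keepdims=True)
    # Principal direction of the pooled residuals = lighting-change direction.
    _, _, vt = np.linalg.svd(centred.reshape(-1, 2), full_matrices=False)
    direction = vt[0]
    return np.arctan2(direction[1], direction[0])
```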
Finding the direction (cont.)
Without calibrating the camera, one can use the entropy of the projection to find the invariant direction [Finlayson et al. (2004)]: the correct direction gives smaller entropy, a wrong direction gives higher entropy. This uses internal evidence in the image.
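A minimal sketch of the entropy idea, under simplifying assumptions (a fixed-width histogram rather than the data-driven bin width of the original method): try each projection angle, histogram the projected values, and keep the angle with the lowest entropy.

```python
import numpy as np

def min_entropy_angle(log_chroma, n_angles=180, n_bins=64):
    """Pick the projection angle that minimizes the entropy of the result.

    log_chroma : N x 2 array of log band-ratio chromaticities (image pixels).
    """
    best_angle, best_entropy = 0.0, np.inf
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        # Project all pixels onto the candidate direction.
        proj = log_chroma @ np.array([np.cos(theta), np.sin(theta)])
        hist, _ = np.histogram(proj, bins=n_bins)
        p = hist[hist > 0] / hist.sum()
        entropy = -(p * np.log2(p)).sum()
        if entropy < best_entropy:
            best_angle, best_entropy = theta, entropy
    return best_angle
```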
This paper: Sharpening Helps
An argument at AIC05 [Finlayson et al. 2005] recommended that if we sharpen the values in XYZ space, we get a better invariant. HOWEVER…
Proposed Approach: Sharpen sRGB
However, going from sRGB to XYZ is a broadening transform, which makes that a counter-intuitive approach. Therefore we propose to sharpen the sRGB space itself. It works better!
Old: Sharpen XYZ; New: Sharpen sRGB
Old: assume the input is in nonlinear sRGB space; linearize; transform to XYZ; sharpen XYZ; form chromaticity; project orthogonally to the lighting-change direction.
New: assume the input is in nonlinear sRGB space; linearize; sharpen linear sRGB using synthetic data, producing a standardized transform for all images; form chromaticity; project orthogonally to the lighting-change direction.
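Both pipelines start by undoing the sRGB nonlinearity. The sketch below follows the standard sRGB transfer-curve inversion; the 3x3 sharpening matrix applied afterwards is left as an argument, standing in for the standardized transform T introduced on a later slide.

```python
import numpy as np

def linearize_srgb(srgb):
    """Invert the standard sRGB transfer curve (inputs in [0, 1])."""
    srgb = np.clip(srgb.astype(float), 0.0, 1.0)
    return np.where(srgb <= 0.04045,
                    srgb / 12.92,
                    ((srgb + 0.055) / 1.055) ** 2.4)

def sharpen(rgb_linear, T):
    """Apply a 3x3 sharpening matrix T to linear-RGB pixels (H x W x 3)."""
    return rgb_linear @ T.T
```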
sRGB to XYZ is a Broadening Transform
Comparing XYZ to sRGB: no sharpening
Log-chromaticity coordinates for the Macbeth patches as the light changes (panels: XYZ and sRGB).
Comparing sharpening XYZ to sharpening sRGB
After sharpening (with mean subtraction): XYZ, R = 0.764; sRGB, R = 0.920 ✓. Synthetic data: Macbeth patches + Planckian illuminants.
Standard colour transform
The colour transform is "data-based" sharpening, optimizing lighting-invariance of the output (with positivity enforced) [Drew et al. 2002]. The transformation matrix T that does so is shown on the slide.
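As a rough illustration of data-based sharpening (a simplified two-illuminant version, without the positivity constraint or the synthetic-data optimization behind the standardized T): fit the 3x3 map between responses of the same surfaces under two lights, then diagonalize it; the eigenvector basis is the sharpened space.

```python
import numpy as np

def data_based_sharpening(rgb_a, rgb_b):
    """Simplified two-illuminant data-based sharpening sketch.

    rgb_a, rgb_b : N x 3 responses of the same surfaces under two lights.
    Returns a 3x3 matrix T such that the illumination change becomes (close
    to) a diagonal scaling in the sharpened space T @ rgb.
    """
    # Least-squares 3x3 map taking responses under light A to light B.
    X, *_ = np.linalg.lstsq(rgb_a, rgb_b, rcond=None)
    M = X.T                       # rgb_b ≈ M @ rgb_a for column vectors
    # Diagonalize: in the eigenvector basis the map is purely diagonal.
    _, vecs = np.linalg.eig(M)
    T = np.linalg.inv(vecs)
    # Complex parts, if any, are discarded in this simplification.
    return np.real(T)
```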
Standard colour transform (cont.)
Sharpen, form chromaticities, then project in the standard direction.
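Putting the pieces together, here is a compact end-to-end sketch of the workflow under the same assumptions as the earlier snippets; the standardized matrix T and the standard projection angle are placeholders supplied by the caller, since their numerical values appear only on the slides.

```python
import numpy as np

def illumination_invariant(srgb, T, theta, eps=1e-6):
    """Standardized-workflow sketch: nonlinear sRGB image -> invariant grey-scale.

    srgb  : H x W x 3 array, nonlinear sRGB in [0, 1]
    T     : 3 x 3 standardized sharpening matrix (from the slides)
    theta : angle (radians) of the standard lighting-change direction
    """
    # 1. Linearize sRGB (standard transfer-curve inversion).
    s = np.clip(srgb.astype(float), 0.0, 1.0)
    lin = np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)
    # 2. Sharpen with the standardized matrix.
    sharp = np.maximum(lin @ T.T, eps)
    # 3. Log band-ratio chromaticities (R/G and B/G).
    chi = np.stack([np.log(sharp[..., 0] / sharp[..., 1]),
                    np.log(sharp[..., 2] / sharp[..., 1])], axis=-1)
    # 4. Project orthogonally to the lighting-change direction.
    return chi @ np.array([-np.sin(theta), np.cos(theta)])
```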
The Standardized Workflow
Applying the standardized method to measured chart data: 105 illuminants, Nikon D70. Plot legend: best fit (dashed); standardized sRGB sharpening (solid).
Macbeth chart under 14 different daylights, HP912 camera. Applying the standardized method yields the invariant.
Best possible: invariant image formed by calibrating the camera. Average of standard deviation across illuminants = 4.42%.
Standardized method: invariant image formed by sharpening sRGB. Average of standard deviation across illuminants = 6.11%. Not as good as the calibrated version, of course, but usable.
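The figure of merit quoted on these two slides can be computed roughly as follows; normalizing the per-patch standard deviation by the overall spread of invariant values to obtain a percentage is an assumption, since the slides do not state the normalization.

```python
import numpy as np

def avg_std_across_illuminants(invariant):
    """Average, over patches, of the std. dev. across illuminants (in %).

    invariant : P x L array of invariant grey-scale values for P chart
                patches under L illuminants.
    """
    per_patch_std = invariant.std(axis=1)
    spread = invariant.max() - invariant.min()   # assumed normalization
    return 100.0 * per_patch_std.mean() / spread
```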
Input chromaticity (segmented for display); output (segmented): shadow gone ✓.
Conclusion
The sharpening transform does a good enough job of finding an invariant, given that it does not depend on any information specific to the camera or even the image. It can serve as a preprocessing step for many different vision problems.
Thanks!
To the Natural Sciences and Engineering Research Council of Canada.