1 Extracting Depth and Matte using a Color-Filtered Aperture. Yosuke Bando (TOSHIBA / The University of Tokyo), Bing-Yu Chen (National Taiwan University), Tomoyuki Nishita (The University of Tokyo)

2 Outline: Background, Related Work, Our Method, Results, Conclusion

3 Computational Cameras. Capture various scene properties: high dynamic range, high resolution, large field of view, reflectance, depth, and more. With elaborate imaging devices: camera arrays, additional optical elements [Wilburn 2005] [Nayar 1997].

4 Compact Computational Cameras. Small devices, simple optical elements [Ng 2005] [Levin 2007].

5 Our Goal. Capture scene properties with minimal modification to the camera.

6 Our Goal. Capture scene properties (depth maps and alpha mattes) with minimal modification to the camera. (Figure: captured image, depth, matte.)

7 Our Goal. Capture scene properties (depth maps and alpha mattes) with minimal modification to the camera: put color filters in a camera lens aperture. (Figure: captured image, depth, matte, lens with color filters.)

8 Our Goal. Capture scene properties (depth maps and alpha mattes) with minimal modification to the camera: put color filters in a camera lens aperture. This idea itself is not new. (Figure: captured image, depth, matte, lens with color filters.)

9 Contents: Background, Related Work, Our Method, Results, Conclusion

10 Previous Color-Filter Methods. Extract (only) depth maps, with low precision or requiring a specialized flashbulb; they spoil the visual quality of captured images [Amari 1992] [Chang 2002].

11 Coded Aperture. A patterned mask in the aperture changes the frequency characteristics of defocus and facilitates blur identification/removal [Levin 2007] [Veeraraghavan 2007]. (Figure: lens with a mask, captured image, amount of defocus blur (∝ depth).)

12 Single-Lens Multi-View Capture. Records light rays separately depending on their incident angle, enabling light field rendering [Adelson 1992] [Ng 2005] [Georgiev 2006] [Veeraraghavan 2007] [Liang 2008].

13 Matting. Automatic matting by multiple cameras [McGuire 2005] [Joshi 2006]. (Figure: 3 cameras with half mirrors, array of 8 cameras, video matting.)

14 Our Method. Features: automatic depth and matte extraction, single hand-held camera, single shot. Contributions: (1) improved depth estimation, and (2) a novel matting algorithm for images captured through a color-filtered aperture.

15 Outline: Background, Related Work, Our Method, Results, Conclusion

16 Our Method: Color-filtered aperture, Depth estimation, Matting

17 Our Method: Color-filtered aperture, Depth estimation, Matting

18 Our Prototype Camera Lens. Canon EF 50mm f/1.8 II lens.

19 Our Prototype Camera Lens. Canon EF 50mm f/1.8 II lens; aperture part of the disassembled lens.

20 Our Prototype Camera Lens. Canon EF 50mm f/1.8 II lens; Fujifilm filters SC-58, BPB-53, and BPB-45.

21 Our Prototype Camera Lens. Canon EF 50mm f/1.8 II lens; Fujifilm filters SC-58, BPB-53, and BPB-45; our prototype lens with color filters.

22 Our Prototype Camera Lens. Took me just a few hours to fabricate, using a micro-screwdriver and a box cutter. Canon EF 50mm f/1.8 II lens; Fujifilm filters SC-58, BPB-53, and BPB-45; our prototype lens with color filters.

23 Captured Image

24 Red Plane

25 Green Plane

26 Blue Plane

27 Captured Image. Has depth-dependent color misalignment, NOT due to chromatic aberration.

28–31 Why Do Colors Get Misaligned? (Diagrams: image sensor, lens with color filters, foreground object, background.) Each color filter covers a different part of the aperture, so the red, green, and blue planes effectively view the scene from slightly different viewpoints; out-of-focus points are therefore shifted by different amounts in each plane, producing depth-dependent color misalignment.

32 Our Method: Color-filtered aperture, Depth estimation, Matting

33 Depth Estimation. Our camera captures 3 views in the RGB planes → a stereo reconstruction problem. (Figure: red plane, green plane, blue plane.)

34 Depth Estimation. Our camera captures 3 views in the RGB planes → a stereo reconstruction problem. However, their intensities don't match. Contribution 1: an improved correspondence measure between the RGB planes. (Figure: red plane, green plane, blue plane.)

35 Original Image

36 Disparity = 1

37 Disparity = 2

38 Disparity = 3

39 Disparity = 4

40 Disparity = 5

41 Disparity = 6

42 When Is the Color Aligned? (Figure axis: disparity.)

43–46 Color-Alignment Measure. The local color distribution of natural images tends to form a line [Omer 2004, Levin 2006]. (Build-up slides illustrating the color-line model.)

47–48 Color-Alignment Measure. Misalign by 1 pixel: disparity = 0 vs. disparity = 1. (When the planes are misaligned, the local color distribution spreads away from a line.)

49–50 Color-Alignment Measure. Misalign by 3 pixels: disparity = 0, 1, 3. (Larger misalignment spreads the distribution further.)

51 Color-Alignment Measure. Compare the variances along the principal axes (eigenvalues) with the variances along the RGB axes. (Figure: local color distributions at disparity = 0, 1, 3.)

52 Color-Alignment Measure. Variances along the principal axes (eigenvalues) vs. variances along the RGB axes: L = 0.003, 0.11, and 0.39 at disparity = 0, 1, and 3, respectively.
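
To make the measure concrete, here is a minimal sketch of one way such a score L could be computed for a local window: the product of the variances along the principal axes (the eigenvalues of the local RGB covariance matrix) divided by the product of the variances along the R, G, and B axes. The function name and exact normalization are ours; the precise formulation is in the paper.

```python
import numpy as np

def color_alignment_measure(window):
    """Score how well the RGB samples of a local window lie on a line.

    window: array of pixels, reshaped internally to (N, 3) RGB samples.
    Small values mean the samples form a line in RGB space (colors are
    well aligned); larger values mean the distribution has spread out.
    """
    samples = np.asarray(window, dtype=np.float64).reshape(-1, 3)
    cov = np.cov(samples, rowvar=False)       # 3x3 covariance of R, G, B
    eigvals = np.linalg.eigvalsh(cov)         # variances along principal axes
    channel_vars = np.diag(cov)               # variances along the RGB axes
    eps = 1e-8                                # guard against flat regions
    return float(np.prod(eigvals + eps) / np.prod(channel_vars + eps))
```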

53 Depth Estimation. Solve for the disparity that minimizes the color-alignment measure. (Figure: captured image, pixel-wise estimates (intensity ∝ depth).)

54 Depth Estimation. Solve for the disparity that minimizes the color-alignment measure, with smoothness constraints via graph-cut optimization [Boykov 2001]. (Figure: captured image, pixel-wise estimates (intensity ∝ depth), result after graph cuts.)
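
A rough per-pixel version of this search, reusing the color_alignment_measure sketch above. It assumes, purely for illustration, a horizontal filter arrangement in which the red and blue planes shift in opposite horizontal directions relative to green; the prototype's actual arrangement, sub-pixel handling, and the graph-cut smoothing step are not reproduced here.

```python
import numpy as np

def estimate_disparity_map(img, max_disp=8, win=7):
    """Pixel-wise disparity by brute-force search over the alignment measure.

    img: (H, W, 3) float array. For each hypothesized disparity d, the red
    and blue planes are shifted back by d pixels in opposite directions, and
    the d giving the smallest color-alignment measure wins at each pixel.
    """
    h, w, _ = img.shape
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    half = win // 2
    best_score = np.full((h, w), np.inf)
    best_disp = np.zeros((h, w), dtype=np.int32)
    for d in range(-max_disp, max_disp + 1):
        # Undo the misalignment hypothesized by disparity d.
        aligned = np.stack(
            [np.roll(r, d, axis=1), g, np.roll(b, -d, axis=1)], axis=-1)
        for y in range(half, h - half):
            for x in range(half, w - half):
                patch = aligned[y - half:y + half + 1, x - half:x + half + 1]
                score = color_alignment_measure(patch)
                if score < best_score[y, x]:
                    best_score[y, x] = score
                    best_disp[y, x] = d
    return best_disp  # a graph-cut pass over these costs would add smoothness
```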

55 Our Method: Color-filtered aperture, Depth estimation, Matting

56 Matting. The problem of estimating the foreground opacity α (in the standard compositing model, I = αF + (1 − α)B). (Figure: input image, matte, foreground color, background color.)

57 Trimap. Assigns each pixel to one of 3 labels: strictly foreground (α = 1), strictly background (α = 0), or unknown (α to be computed). (Figure: captured image and trimap with strictly-foreground, strictly-background, and unknown regions.)

58 Trimap. Assigns each pixel to one of 3 labels: strictly foreground (α = 1), strictly background (α = 0), or unknown (α to be computed). Generated from the depth map. (Figure: captured image and trimap.)
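
A small sketch of one way to derive the trimap from the estimated disparity map, as the slide describes: threshold the disparities into foreground and background, then widen the boundary into an unknown band for matting to resolve. The threshold, band width, and label encoding are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def trimap_from_disparity(disp, fg_thresh, band=10):
    """Trimap: 1.0 = strictly foreground, 0.0 = strictly background, 0.5 = unknown."""
    fg = disp >= fg_thresh                      # assume foreground has larger disparity
    struct = np.ones((band, band), dtype=bool)  # width of the uncertain boundary band
    sure_fg = binary_erosion(fg, struct)        # shrink foreground away from the edge
    sure_bg = binary_erosion(~fg, struct)       # shrink background away from the edge
    trimap = np.full(disp.shape, 0.5)
    trimap[sure_fg] = 1.0
    trimap[sure_bg] = 0.0
    return trimap
```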

59 Trimap-Based Matting. (Figure: captured image, trimap.)

60 Trimap-Based Matting. Errors remain where the foreground and background colors are similar [Levin 2006]. (Figure: captured image, trimap.)

61 Trimap-Based Matting. Errors remain where the foreground and background colors are similar [Levin 2006]. Contribution 2: matte error correction using color misalignment cues. (Figure: captured image, trimap.)

62 Basic Idea. Estimate the foreground and background colors based on the current matte. (Figure: captured image, current matte.)

63 Basic Idea. Estimate the foreground and background colors based on the current matte. (Figure: captured image, current matte, estimated foreground color, estimated background color.)

64 Basic Idea. Estimate the foreground and background colors based on the current matte, then detect inconsistent color misalignments. (Figure: captured image, current matte, estimated foreground color, estimated background color.)

65 Synthetic Toy Example. (Figure: ground truth matte, synthesized input image.)

66 Synthetic Toy Example. (Diagram: camera and background; ground truth matte, synthesized input image.)

67 Synthetic Toy Example. (Diagram: camera, foreground, and background; ground truth matte, synthesized input image.)

68 Synthetic Toy Example. A hard example: similar foreground and background colors. (Figure: foreground, background.)

69 Synthetic Toy Example. A hard example with similar foreground and background colors, but solvable: color misalignment cues come from the 'x' textures. (Figure: foreground, background.)

70 Trimap-Based Matting. (Figure: input image, trimap; foreground, background.)

71 Trimap-Based Matting. (Figure: input image, trimap, estimated matte; foreground, background.)

72 Trimap-Based Matting. (Figure: input image, trimap, estimated matte, ground truth; foreground, background.)

73 Matting Algorithm. (Figure: input image, current matte; foreground, background.)

74 Matting Algorithm. (Figure: input image, current matte, estimated foreground color, estimated background color.)

75 Matting Algorithm. (Figure: input image, current matte, estimated foreground color, estimated background color, detected color misalignment.)

76 Matting Algorithm. (Figure: input image, current matte, estimated foreground color, estimated background color, detected color misalignment, detected color alignment.)

77 Matting Algorithm. (Figure: input image, current matte, estimated foreground color, estimated background color, detected color misalignment, detected color alignment.)

78 Matting Algorithm. (Figure: input image, final matte; foreground, background.)

79 Matting Algorithm. (Figure: input image, final matte, ground truth; foreground, background.)
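
To make slides 73–79 easier to follow, here is a rough illustration of the kind of misalignment cue they describe: a pixel currently labeled foreground should look color-aligned once the foreground disparity is undone, and a background pixel once the background disparity is undone; where the opposite holds, the matte is suspect and can be sent back for re-estimation. This is only a sketch of the cue under the same horizontal-arrangement assumption as before, reusing the color_alignment_measure sketch from earlier; it is not the paper's actual error-correction algorithm.

```python
import numpy as np

def undo_misalignment(img, d):
    """Shift the red and blue planes back by disparity d (horizontal assumption)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return np.stack([np.roll(r, d, axis=1), g, np.roll(b, -d, axis=1)], axis=-1)

def suspect_matte_errors(img, alpha, d_fg, d_bg, win=7):
    """Flag pixels whose color-misalignment cue contradicts the current matte."""
    h, w, _ = img.shape
    half = win // 2
    aligned_fg = undo_misalignment(img, d_fg)   # channels re-aligned at the foreground disparity
    aligned_bg = undo_misalignment(img, d_bg)   # channels re-aligned at the background disparity
    errors = np.zeros((h, w), dtype=bool)
    for y in range(half, h - half):
        for x in range(half, w - half):
            s_fg = color_alignment_measure(
                aligned_fg[y - half:y + half + 1, x - half:x + half + 1])
            s_bg = color_alignment_measure(
                aligned_bg[y - half:y + half + 1, x - half:x + half + 1])
            looks_fg = s_fg < s_bg              # which disparity aligns the colors better?
            errors[y, x] = (alpha[y, x] > 0.9 and not looks_fg) or \
                           (alpha[y, x] < 0.1 and looks_fg)
    return errors
```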

80 Outline: Background, Related Work, Our Method, Results, Conclusion

81 Results of Depth & Matte Extraction. (Figure: captured image, depth, matte.)

82 Comparison: Depth Estimation. Compared with the previous color-filter methods [Amari 1992] [Chang 2002], using local estimation to show raw performance. (Figure: captured image, our method.)

83 Comparison: Matting. Compared with the trimap-based matting methods [Levin 2006] [Wang 2007]; the trimaps were generated by our method. (Figure: trimap, [Levin 2006], [Wang 2007], our method.)

84 Comparison with Ground Truth Mattes. (Figure: ground truth, composite image (color-aligned), composite image (color-misaligned), our method, [Levin 2006], [Wang 2007].)

85 Image Editing: image composition, color-alignment reconstruction, novel view synthesis, refocusing, video matting.

86 Example 1: Composition. (Figure: captured image, matte, different background, composite.)
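
Given the extracted matte and foreground colors, composition over a different background is the standard over operation; a minimal sketch (array names are ours):

```python
import numpy as np

def composite(alpha, foreground, new_background):
    """Composite the extracted foreground over a different background.

    alpha: (H, W) matte in [0, 1]; foreground and new_background: (H, W, 3).
    """
    a = alpha[..., np.newaxis]                  # broadcast the matte over RGB
    return a * foreground + (1.0 - a) * new_background
```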

87 Example 2: Color-Alignment Reconstruction. Cancel the estimated misalignment, then re-compose. (Figure: matte, foreground color, background color.)
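
A sketch of how this step could be realized, under the same horizontal-arrangement assumption as before: cancel the estimated misalignment separately in the estimated foreground and background colors using their disparities, then re-compose them with the matte. The paper's reconstruction may differ in detail.

```python
import numpy as np

def reconstruct_aligned_image(alpha, fg, bg, d_fg, d_bg):
    """Re-compose the image after cancelling each layer's color misalignment."""
    def undo(img, d):                            # horizontal-arrangement assumption
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        return np.stack([np.roll(r, d, axis=1), g, np.roll(b, -d, axis=1)], axis=-1)
    fg_aligned = undo(fg, d_fg)                  # cancel the foreground's misalignment
    bg_aligned = undo(bg, d_bg)                  # cancel the background's misalignment
    a = alpha[..., np.newaxis]
    return a * fg_aligned + (1.0 - a) * bg_aligned
```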

88 Reconstructed Image

89 Captured Image

90 Examples 3 & 4: View/Focus Synthesis

91 Example 5: Video Matting

92 Outline: Background, Related Work, Our Method, Results, Conclusion

93 Conclusion. Automatic depth and matte extraction using a color-filtered aperture: improved depth estimation and a novel matting algorithm.

94 Conclusion. Automatic depth and matte extraction using a color-filtered aperture: improved depth estimation and a novel matting algorithm. Easy-to-use computational photography: put color filters in a camera lens and take a single photo with a hand-held camera.

95 Conclusion. Automatic depth and matte extraction using a color-filtered aperture: improved depth estimation and a novel matting algorithm. Easy-to-use computational photography: put color filters in a camera lens and take a single photo with a hand-held camera. Limitation: entirely red objects cannot be handled.

96 Thank You! Any questions? Acknowledgments: Takeshi Naemura, Yusuke Iguchi, Takuya Saito, Johanna Wolf, Zoltan Szego, Paulo Silva, Saori Horiuchi.

97 Other Possible Filter Arrangements. (Figure: our prototype, simple horizontal arrangement, multi-aperture, combination with coded aperture, arbitrary shapes; annotated as more light-efficient, even more light-efficient but with lower depth resolution, and higher depth resolution.)

