Boundary Matting for View Synthesis
Samuel W. Hasinoff (Dept. of Computer Science, University of Toronto)
Sing Bing Kang, Richard Szeliski (Interactive Visual Media Group, Microsoft Research)
2nd Workshop on Image and Video Registration, July 2, 2004
Motivation
- Superior view synthesis & 3D editing from N-view stereo
- Key approach: occlusion boundaries as 3D curves
  - more suitable for view synthesis
  - boundaries estimated to sub-pixel accuracy
- Two major limitations of pixel-level matting, even with perfect stereo!
  - resampling blur
  - boundary artifacts
Matting from Stereo
- Matting problem: unmix the foreground F and backgrounds B1, B2, B3
- Triangulation matting (Smith & Blinn, 1996): multiple known backgrounds, fixed viewpoint & object
- Extension to stereo: Lambertian assumption; a single composite per view leaves the problem underdetermined
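The triangulation matting idea referenced above can be sketched per pixel: with the same fixed foreground photographed against two known backgrounds, the compositing equation C_k = F' + (1 - alpha) B_k (with premultiplied foreground F' = alpha F) gives alpha in closed form by subtracting the two composites. A minimal sketch; the function name is illustrative, not from the paper.

```python
import numpy as np

def triangulation_matte(c1, c2, b1, b2, eps=1e-8):
    """Smith & Blinn (1996) triangulation matting for one RGB pixel.
    c1, c2: observed composites against known backgrounds b1, b2.
    Returns (alpha, premultiplied foreground F' = alpha * F)."""
    d = b1 - b2
    # C1 - C2 = (1 - alpha) * (B1 - B2)  ->  solve for (1 - alpha)
    one_minus_alpha = np.dot(c1 - c2, d) / max(np.dot(d, d), eps)
    alpha = np.clip(1.0 - one_minus_alpha, 0.0, 1.0)
    f_premult = c1 - (1.0 - alpha) * b1
    return alpha, np.clip(f_premult, 0.0, 1.0)
```

With only one background per view (the stereo setting), this system has more unknowns than equations, which is why the slide calls it underdetermined.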
Occlusion Boundaries in 3D
- Model boundaries as 3D splines (currently linear)
- Assumptions:
  - boundaries are relatively sharp
  - relatively large-scale objects
  - no internal transparency
[Figure: views 1-3 of the 3D world, with view 2 as reference]
Geometric View of Alpha
- alpha = partial pixel coverage on the F side of the boundary
- simulate blurring by convolving with a 2D Gaussian
- alpha depends only on the projected 3D curve, x
- integration over each pixel
[Figure: foreground F and background B meeting inside pixel j]
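The "partial pixel coverage" view of alpha can be illustrated numerically: for a locally linear boundary, alpha at a pixel is the fraction of that pixel's area on the foreground side of the projected curve. The paper integrates this analytically (optionally after Gaussian blur); the supersampling below is only an assumed stand-in to make the geometry concrete.

```python
import numpy as np

def coverage_alpha(p, q, px, py, n=16):
    """Fraction of the unit pixel (px, py)..(px+1, py+1) lying on the
    foreground (left) side of the directed line p -> q, estimated by
    n x n supersampling."""
    xs = (np.arange(n) + 0.5) / n + px
    ys = (np.arange(n) + 0.5) / n + py
    gx, gy = np.meshgrid(xs, ys)
    # 2D cross product: positive means the sample is left of p -> q
    cross = ((q[0] - p[0]) * (gy - p[1]) - (q[1] - p[1]) * (gx - p[0]))
    return float(np.mean(cross > 0))
```

A vertical boundary through the middle of a pixel gives alpha = 0.5, matching the intuition that alpha measures sub-pixel coverage rather than per-pixel opacity.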
Related Work
- Natural image matting [Chuang et al., 2001]: based on color statistics
- Intelligent scissors [Mortensen, 2000]: geometric view of alpha
- Limitations: single image, user-assisted
Related Work
- Bayesian layer estimation [Wexler and Fitzgibbon, 2002]: matting from multiple images using triangulation + priors
- Limitations: requires very high-quality stereo; alpha calculated at pixel level, only for the reference view; not suitable for view synthesis
Boundary Matting Algorithm
- Find occlusion boundary in the reference view
- Backproject to 3D using stereo depth
- Project to the other views
- Initial guess for B_i and F
- Optimize matting
[Figure: views 1-3 of the 3D world, with view 2 as reference]
Initial Boundaries from Stereo
- Find depth discontinuities
- Greedily segment longest four-connected curves
- Spline control points evenly spaced along curve
- Tweak: snap to strongest nearby edge
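The "snap to strongest nearby edge" tweak can be sketched as a local search over a gradient-magnitude image: each control point moves to the strongest edge response within a small window. The window radius and function name are assumptions for illustration, not values from the paper.

```python
import numpy as np

def snap_to_edges(points, grad_mag, radius=2):
    """Snap each (row, col) control point to the pixel with the largest
    gradient magnitude within `radius` pixels."""
    snapped = []
    h, w = grad_mag.shape
    for (r, c) in points:
        r0, r1 = max(r - radius, 0), min(r + radius + 1, h)
        c0, c1 = max(c - radius, 0), min(c + radius + 1, w)
        win = grad_mag[r0:r1, c0:c1]
        dr, dc = np.unravel_index(np.argmax(win), win.shape)
        snapped.append((r0 + dr, c0 + dc))
    return snapped
```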
Background Estimation
- Use stereo to grab corresponding background-depth pixels from nearby views (if possible)
- Color consistency check to avoid mixed pixels
[Figure: foreground F, backgrounds B1-B3; B3 is occluded]
Foreground Estimation
- Invert the matting equation, given the 3D curve and B
- Aggregate F estimates over all views
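Inverting the matting equation C = alpha F + (1 - alpha) B gives F = (C - (1 - alpha) B) / alpha per view; the division is unstable where alpha is small, so one reasonable way to aggregate across views is an alpha-weighted average. The weighting scheme and threshold below are assumptions for illustration, not necessarily the paper's exact aggregation rule.

```python
import numpy as np

def estimate_foreground(comps, backs, alphas, eps=1e-3):
    """Per-view inversion of C = alpha*F + (1-alpha)*B, aggregated
    over views with alpha weighting (near-transparent pixels, where
    F is poorly observed, count less)."""
    num = np.zeros(3)
    den = 0.0
    for c, b, a in zip(comps, backs, alphas):
        if a < eps:          # F is unobservable where alpha ~ 0
            continue
        f_view = (c - (1.0 - a) * b) / a
        num += a * f_view
        den += a
    return num / den if den > 0 else num
```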
Optimization
- Objective: minimize inconsistency with the matting equation over curve parameters, x, and foreground colors, F
- Pixels with unknown B are not included
- Non-linear least squares, using forward differencing for the Jacobian
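The solver structure named above (non-linear least squares with a forward-difference Jacobian) can be sketched as a plain Gauss-Newton loop; this is a generic sketch of the technique, not the paper's implementation, which jointly optimizes the spline control points x and foreground colors F.

```python
import numpy as np

def forward_diff_jacobian(resid, x, h=1e-6):
    """Forward-difference approximation of the Jacobian of resid(x)."""
    r0 = resid(x)
    J = np.empty((r0.size, x.size))
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        J[:, i] = (resid(xp) - r0) / h
    return J

def gauss_newton(resid, x0, iters=20):
    """Minimal Gauss-Newton loop: linearize, solve the normal system
    via least squares, step, repeat until the step is tiny."""
    x = np.asarray(x0, float).copy()
    for _ in range(iters):
        J = forward_diff_jacobian(resid, x)
        step, *_ = np.linalg.lstsq(J, -resid(x), rcond=None)
        x += step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```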
Additional Penalty Terms
- Favor control points at strong edges: define a potential field around each edgel
- Discourage large motions (> 2 pixels): helps avoid degenerate curves
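One way these penalties could enter the least-squares objective is as extra residuals per control point: an edge term read from a precomputed potential field (e.g. a distance transform of the edgels), and a motion term that is zero within the 2-pixel budget and grows beyond it. The weights and exact functional forms below are assumptions for illustration.

```python
import numpy as np

def penalty_residuals(points, init_points, dist_to_edge,
                      w_edge=1.0, w_move=1.0, max_move=2.0):
    """Extra residuals appended to the matting objective (a sketch):
    - edge term: distance from each control point to the nearest edgel
      (dist_to_edge is an assumed precomputed potential field);
    - motion term: zero within max_move pixels of the initial position,
      linear beyond it, discouraging degenerate curves."""
    res = []
    for p, p0 in zip(points, init_points):
        res.append(w_edge * dist_to_edge(p))
        d = np.linalg.norm(np.asarray(p, float) - np.asarray(p0, float))
        res.append(w_move * max(d - max_move, 0.0))
    return np.array(res)
```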
Naïve object insertion (no matting)
Object insertion with Boundary Matting
Comparison: naïve object insertion (no matting) vs. object insertion with Boundary Matting; boundaries calculated with subpixel accuracy
Samsung commercial sequence: naïve object insertion (no matting) vs. object insertion with Boundary Matting
Boundary Matting vs. naïve method
Synthetic Noise
[Figure panels: background, composite, no matting, boundary matting, boundary matting (sigma = 13), boundary matting (sigma = 26)]
Concluding Remarks
- Boundary Matting
  - better view synthesis
  - refines stereo at occlusion boundaries
  - subpixel boundary estimation
- Future work
  - incorporate color statistics
  - extend to dynamic setting
Pixel-level Matting for View Synthesis?
- Resampling for view synthesis can lead to blurring artifacts at boundaries
- This example can be represented exactly using a sub-pixel boundary model instead