Multi-Aperture Photography Paul Green – MIT CSAIL Wenyang Sun – MERL Wojciech Matusik – MERL Frédo Durand – MIT CSAIL

Motivation: Depth of Field Control. A large aperture gives a shallow depth of field (portrait); a small aperture gives a large depth of field (landscape).

Depth and Defocus Blur. Rays from a subject point on the plane of focus converge to a single pixel on the sensor; rays from a point off that plane spread over a circle of confusion, so defocus blur depends on distance from the plane of focus.

Defocus Blur & Aperture. For the same subject and plane of focus, the size of the circle of confusion also depends on the aperture size: a larger aperture produces a larger defocus blur.

Goals. Aperture size is a critical parameter for photographers. We want: ■ post-exposure depth of field control ■ extrapolation of shallow depth of field beyond the physical aperture

Outline Multi-Aperture Camera ■ New camera design ■ Capture multiple aperture settings simultaneously Applications ■ Depth of field control ■ Depth of field extrapolation ■ (Limited) refocusing

Related Work. Computational Cameras: ■ Plenoptic Cameras (Adelson and Wang '92; Ng et al. '05; Georgiev et al. '06) ■ Split-Aperture Camera (Aggarwal and Ahuja '04) ■ Optical Splitting Trees (McGuire et al. '07) ■ Coded Aperture (Levin et al. '07; Veeraraghavan et al. '07) ■ Wavefront Coding (Dowski and Cathey '95). Depth from Defocus: ■ Pentland '87

Plenoptic Cameras. A lenslet array in front of the sensor captures the 4D light field: 2D spatial (x, y) plus 2D angular (u, v, the position within the lens aperture). They trade spatial resolution for post-capture flexibility: ■ refocusing ■ depth of field control ■ improved noise characteristics

1D vs. 2D Aperture Sampling. A plenoptic camera samples the (u, v) aperture on a 2D grid (45 samples in the example shown); our design instead uses a 1D "ring" sampling of the aperture, with concentric annuli (4 samples).

Optical Splitting Trees (McGuire et al. '07). A general framework for sampling imaging parameters: beamsplitters divide the incoming light among multiple cameras, e.g. a large-aperture camera and a small-aperture camera.

Goals: ■ post-exposure depth of field control ■ extrapolation of shallow depth of field ■ (limited) refocusing. Our design choices: ■ 1D aperture sampling ■ no beamsplitters ■ a single sensor ■ removable optics

Outline Multi-Aperture Camera ■ New camera design ■ Capture multiple aperture settings simultaneously Applications ■ Depth of field control ■ Depth of field extrapolation ■ Refocusing

Optical Design Principles. Sample the aperture in 3D: ■ 2D spatial ■ 1D aperture size ■ one image on the sensor for each "ring" of the aperture

Aperture Splitting. Goal: split the aperture into 4 separate optical paths using ■ concentric tilted mirrors ■ placed at the aperture plane

Aperture Splitting. Incoming light strikes the tilted concentric mirrors, and focusing lenses redirect each ring of the aperture onto a separate region of the sensor.

Aperture Splitting. Ideally the splitting mirrors would sit at the aperture plane of the photographic lens, but placing them there is not physically possible. Solution: a relay system creates a new, virtual aperture plane, and the aperture-splitting optics are placed there.

Optical Prototype. The prototype mounts on an SLR camera: main lens, relay optics, tilted mirrors (shown in close-up), and focusing lenses.

Sample Data Raw data from our camera

Point Spread Function. The measured PSFs of the four optical paths (inner, ring 1, ring 2, outer) and their combination. Ideally each would be a ring; the gaps are caused by occlusion between the tilted mirrors.

Outline Multi-Aperture Camera ■ New camera design ■ Capture multiple aperture settings simultaneously Applications ■ Depth of field control ■ Depth of field extrapolation ■ Refocusing

DOF Navigation

DOF Extrapolation. Approximate defocus blur as a convolution: each captured image is the sharp image convolved with a circular-aperture blurring kernel whose size depends on depth and aperture size. The question: what is the blur kernel size at each pixel?
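To make the convolution model concrete, here is a minimal sketch in Python (NumPy/SciPy). The helper names disc_kernel and blur are illustrative, not from the paper; the unknown per-pixel blur size is what the following slides estimate.

```python
import numpy as np
from scipy.signal import fftconvolve

def disc_kernel(radius):
    """Circular-aperture (pillbox) blurring kernel, normalized to sum to 1."""
    r = max(int(np.ceil(radius)), 1)
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    k = (x**2 + y**2 <= radius**2).astype(float)
    return k / k.sum()

def blur(I0, radius):
    """Model a defocused image as the sharp image I0 convolved with a disc kernel."""
    return fftconvolve(I0, disc_kernel(radius), mode="same")
```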

DOF Extrapolation Roadmap. Plotting blur size against aperture diameter: capture images I_0 … I_3 up to the largest physical aperture, estimate the blur in each, fit a model, and extrapolate the blur to a virtual image I_E beyond the largest physical aperture.

Defocus Gradient. Blur size is proportional to aperture diameter: for a given scene point, the blur sizes measured in I_0 … I_3 lie on a line through the origin when plotted against aperture diameter D, and the extrapolated image I_E continues that line past the largest physical aperture. The defocus gradient G is the slope of this line; it depends on the focal length, the sensor distance, and the object distance. Storing G at every pixel gives the defocus gradient map.
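The proportionality can be reconstructed from the thin-lens model (a standard derivation, sketched here rather than quoted from the paper): for focal length f, sensor distance d_s, and object distance d_o, the blur-circle diameter is

```latex
\sigma \;=\; D\, d_s \left| \frac{1}{f} - \frac{1}{d_o} - \frac{1}{d_s} \right|
\;=\; G\,D,
\qquad
G \;=\; d_s \left| \frac{1}{f} - \frac{1}{d_o} - \frac{1}{d_s} \right|,
```

so for a fixed scene point the blur grows linearly in the aperture diameter D, with slope G independent of D.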

Optimization. Solve for a discrete defocus gradient value G at each pixel with graph cuts, combining a data term (how well each candidate G, applied to the smallest-aperture image, explains the captured images) with a spatial regularization term. The result is the defocus gradient map.
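A sketch of how the data term could be computed, reusing the blur helper above. It assumes the ring images have been summed into full-aperture images I_0 … I_3 of increasing diameter (grayscale, for simplicity), and it replaces the paper's graph-cut minimization with a per-pixel argmin, omitting the spatial regularization; all names are illustrative.

```python
def gradient_data_term(images, diameters, labels):
    """Cost volume: how badly each candidate defocus gradient G explains the data.

    images:    full-aperture images I_0..I_3 (smallest to largest aperture)
    diameters: aperture diameter D for each image
    labels:    candidate discrete defocus gradient values G
    """
    I0 = images[0]  # smallest aperture, treated as the sharp image
    cost = np.zeros((len(labels),) + I0.shape)
    for li, G in enumerate(labels):
        for I, D in zip(images[1:], diameters[1:]):
            predicted = blur(I0, G * D / 2.0)  # predicted blur radius = G * D / 2
            cost[li] += (predicted - I) ** 2
    return cost

# Per-pixel choice; the paper instead adds a smoothness term and solves with graph cuts.
# G_map = np.asarray(labels)[np.argmin(cost, axis=0)]
```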

Depth of Field Extrapolation

Synthetic Refocusing. Modify the gradient labels and re-synthesize the image: from the gradient map we obtain an extrapolated f/1.8 image, and from a "refocused" gradient map a synthetic "refocused" f/1.8 image.
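A sketch of the re-synthesis step under the same assumptions: blur the sharp image once per label and composite by the gradient map. Setting D_virtual beyond the largest physical aperture gives extrapolation; the G_focus shift is one reading of "modify gradient labels" for refocusing, not the paper's exact rule.

```python
def synthesize(I0, G_map, labels, D_virtual, G_focus=0.0):
    """Re-render with a virtual aperture of diameter D_virtual.

    G_focus selects which defocus gradient is rendered sharp
    (0.0 reproduces the original plane of focus, i.e. pure DOF extrapolation).
    """
    out = np.zeros_like(I0)
    for G in labels:
        radius = abs(G - G_focus) * D_virtual / 2.0
        layer = blur(I0, radius) if radius > 0 else I0
        out[G_map == G] = layer[G_map == G]  # composite pixels with this label
    return out
```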

Synthetic Refocusing Video

Depth-Guided Deconvolution. Deconvolve (deblur) the smallest-aperture image with the kernel given at each pixel by the defocus gradient map (before/after comparison shown).
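One plausible implementation, sketched with a per-label Wiener filter. The regularization constant eps and the choice of Wiener deconvolution are assumptions, and the circular-shift alignment of the zero-padded kernel is glossed over.

```python
def wiener_deconv(I, kernel, eps=1e-2):
    """Frequency-domain Wiener deconvolution of image I by the given kernel."""
    K = np.fft.fft2(kernel, s=I.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(I) * np.conj(K) / (np.abs(K) ** 2 + eps)))

def depth_guided_deconv(I, G_map, labels, D):
    """Deblur each region with the disc kernel its defocus gradient predicts."""
    out = I.copy()
    for G in labels:
        if G > 0:  # G == 0 means in focus: nothing to deblur
            sharp = wiener_deconv(I, disc_kernel(G * D / 2.0))
            out[G_map == G] = sharp[G_map == G]
    return out
```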

Discussion. ■ Occlusion between the mirrors could actually help depth discrimination, as in coded-aperture methods ■ The alignment process is difficult, mostly because this is a prototype ■ Refocusing is limited by the depth of field of the captured images, but is helped by depth-guided deconvolution ■ Texture is required for an accurate defocus gradient map, though this is not critical for depth of field control and refocusing

Summary. ■ Multi-aperture camera: 1D sampling of the aperture, removable ■ Post-exposure depth of field control ■ Depth of field extrapolation ■ Limited refocusing ■ Depth-guided deconvolution

Thanks. People: ■ John Barnwell ■ Jonathan Westhues ■ SeBaek Oh ■ Daniel Vlasic ■ Eugene Hsu ■ Tom Mertens ■ Britton Bradley ■ Jane Malcolm ■ MIT Graphics Group. Funding: ■ NSF CAREER award ■ Ford Foundation Predoctoral Fellowship ■ Microsoft Research New Faculty Fellowship ■ Sloan Fellowship