1 Diffusion Coded Photography for Extended Depth of Field SIGGRAPH 2010 Oliver Cossairt, Changyin Zhou, Shree Nayar Columbia University Supported by ONR and NSF.

Similar presentations
MIT Media Lab Camera Culture Image Destabilization: Programmable Defocus using Lens and Sensor Motion Ankit Mohan, Douglas Lanman, Shinsaku Hiura, Ramesh.

IMAGE QUALITY.
Chapter 5 Lithography Introduction and application.
LIGHT AND THE RETINAL IMAGE: KEY POINTS Light travels in (more or less) straight lines: the pinhole camera’s inverted image Enlarging the pinhole leads.
Fast Separation of Direct and Global Images Using High Frequency Illumination Shree K. Nayar Gurunandan G. Krishnan Columbia University SIGGRAPH Conference.
Procam and Campro Shree K. Nayar Computer Science Columbia University Support: NSF, ONR Procams 2006 PROCAMS Shree K. Nayar,
Micro Phase Shifting Mohit Gupta and Shree K. Nayar Computer Science Columbia University Supported by: NSF and ONR.
--- some recent progress Bo Fu University of Kentucky.
Measuring BRDFs. Why bother modeling BRDFs? Why not directly measure BRDFs? True knowledge of surface properties Accurate models for graphics.
Light Fields PROPERTIES AND APPLICATIONS. Outline  What are light fields  Acquisition of light fields  from a 3D scene  from a real world scene 
Hemispherical Confocal Imaging using Turtleback Reflector
Small f/number, “fast” system, little depth of focus, tight tolerances on placement of components Large f/number, “slow” system, easier tolerances,

Apertureless Scanning Near-field Optical Microscopy: a comparison between homodyne and heterodyne approaches Journal Club Presentation – March 26th, 2007.
Diffusion Coding Photography for Extended Depth of Field SIGGRAPH 2010 Ollie Cossairt, Changyin Zhou, Shree Nayar Columbia University.
When Does a Camera See Rain? Department of Computer Science Columbia University Kshitiz Garg Shree K. Nayar ICCV Conference October 2005, Beijing, China.
What are Good Apertures for Defocus Deblurring? Columbia University ICCP 2009, San Francisco Changyin Zhou Shree K. Nayar.
Photorealistic Rendering of Rain Streaks Department of Computer Science Columbia University Kshitiz Garg Shree K. Nayar SIGGRAPH Conference July 2006,
Structured Light in Scattering Media Srinivasa Narasimhan Sanjeev Koppal Robotics Institute Carnegie Mellon University Sponsor: ONR Shree Nayar Bo Sun.
Projection Defocus Analysis for Scene Capture and Image Display Li Zhang Shree Nayar Columbia University IIS SIGGRAPH Conference July 2006, Boston,
Linear View Synthesis Using a Dimensionality Gap Light Field Prior
Lensless Imaging with A Controllable Aperture Assaf Zomet and Shree K. Nayar Columbia University IEEE CVPR Conference June 2006, New York, USA.
7. Optical instruments 1) Cameras
Removing Weather Effects from Monochrome Images Srinivasa Narasimhan and Shree Nayar Computer Science Department Columbia University IEEE CVPR Conference.
Integral Photography A Method for Implementing 3D Video Systems.
Lecture 33: Computational photography CS4670: Computer Vision Noah Snavely.
Multi-Aperture Photography Paul Green – MIT CSAIL Wenyang Sun – MERL Wojciech Matusik – MERL Frédo Durand – MIT CSAIL.
Northeastern University, Fall 2005 CSG242: Computational Photography Ramesh Raskar Mitsubishi Electric Research Labs Northeastern University September.
Joel Willis. Photography = Capturing Light Best Light Sources and Directions Basics: Aperture, Shutter Speed, ISO, Focal Length, White Balance Intro to.
Depth from Diffusion Supported by ONR Changyin ZhouShree NayarOliver Cossairt Columbia University.
MERL, MIT Media Lab Reinterpretable Imager Agrawal, Veeraraghavan & Raskar Amit Agrawal, Ashok Veeraraghavan and Ramesh Raskar Mitsubishi Electric Research.
Introduction to Computational Photography. Computational Photography Digital Camera What is Computational Photography? Second breakthrough by IT First.
Shedding Light on the Weather
Waveguide High-Speed Circuits and Systems Laboratory B.M.Yu High-Speed Circuits and Systems Laboratory 1.
W. Thomas Cathey and Edward R. Dowski W. T. Cathey is with the Department of Electrical and Computer Engineering, University of Colorado. E. R. Dowski.
Computational photography CS4670: Computer Vision Noah Snavely.
Macro and Close-up Photography Digital Photography DeCal 2010 Nathan Yan Kellen Freeman Some slides adapted from Zexi Eric Yan Photo by Daniel Schwen.
Nonphotorealistic rendering, and future cameras Computational Photography, Bill Freeman Fredo Durand May 11, 2006.
Optics Real-time Rendering of Physically Based Optical Effects in Theory and Practice Masanori KAKIMOTO Tokyo University of Technology.
Specialization module TTM5 Part: Collaboration Space Building Block 2 Item/NTNU October L A Rønningen.
Austin Roorda, Ph.D. University of Houston College of Optometry
What Does Motion Reveal About Transparency ? Moshe Ben-Ezra and Shree K. Nayar Columbia University ICCV Conference October 2003, Nice, France This work.
High Throughput Microscopy
Yu-Wing Tai, Hao Du, Michael S. Brown, Stephen Lin CVPR’08 (Longer Version in Revision at IEEE Trans PAMI) Google Search: Video Deblurring Spatially Varying.
Tutorial on Computational Optical Imaging University of Minnesota September David J. Brady Duke University
A New Definition of Refraction: Basics and Beyond Austin Roorda, Ph.D. University of Houston College of Optometry.
Motion Deblurring Using Hybrid Imaging Moshe Ben-Ezra and Shree K. Nayar Columbia University IEEE CVPR Conference June 2003, Madison, USA.
Mitsubishi Electric Research Labs (MERL) Super-Res from Single Motion Blur PhotoAgrawal & Raskar Amit Agrawal and Ramesh Raskar Mitsubishi Electric Research.
CS348B Lecture 7Pat Hanrahan, 2005 Camera Simulation EffectCause Field of viewFilm size, stops and pupils Depth of field Aperture, focal length Motion.
October 13, IMAGE FORMATION. October 13, CAMERA LENS IMAGE PLANE OPTIC AXIS LENS.
EG 2011 | Computational Plenoptic Imaging STAR | VI. High Speed Imaging1 Computational Plenoptic Imaging Gordon Wetzstein 1 Ivo Ihrke 2 Douglas Lanman.
Film/Sensor Where the light is recorded Lens Bends the light Trajectory of light Subject Source of light Focusing A look at the overall camera system.
 Marc Levoy  Light Field = Array of (virtual) Cameras Sub-aperture Virtual Camera = Sub-aperture View.
Extracting Depth and Matte using a Color-Filtered Aperture Yosuke Bando TOSHIBA + The University of Tokyo Bing-Yu Chen National Taiwan University Tomoyuki.
On the Evaluation of Optical Performance of Observing Instruments Y. Suematsu (National Astronomical Observatory of Japan) ABSTRACT: It is useful to represent.
Removing motion blur from a single image
Optical Sciences CenterThe University of Arizona ERROR ANALYSIS FOR CGH OPTICAL TESTING Yu-Chun Chang and James Burge Optical Science Center University.
Characterization of “Bulk Lithography” Process for Fabrication of Three-Dimensional.
Bo Sun Kalyan Sunkavalli Ravi Ramamoorthi Peter Belhumeur Shree Nayar Columbia University Time-Varying BRDFs.
IMAGE QUALITY. SPATIAL RESOLUTION CONTRAST RESOLUTION NOISE IMAGE ARTIFACTS RADIATION DOSE.
Introduction Computational Photography Seminar: EECS 395/495
Extended Depth of Field For Long Distance Biometrics
Radon Transform Imaging
Multiplexed Illumination
Deconvolution , , Computational Photography
Rob Fergus Computer Vision
Unit 57 – Photography Depth of field
Deblurring Shaken and Partially Saturated Images
Fig. 1 Experimental setup.
Presentation transcript:

1 Diffusion Coded Photography for Extended Depth of Field SIGGRAPH 2010 Oliver Cossairt, Changyin Zhou, Shree Nayar Columbia University Supported by ONR and NSF

2 Conventional Camera (F/1.8)

3 Conventional Camera (F/18)

4 Camera Blur Model. Spatial domain: Captured Image = Focused Image ⊗ PSF + Image Noise. Frequency domain: Captured Image = Focused Image × MTF + Image Noise, where the MTF (Modulation Transfer Function) is the Fourier-domain counterpart of the PSF.
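To make the blur model concrete, here is a minimal numpy sketch of the two domains on this slide: convolution with the PSF plus noise in the spatial domain, implemented as multiplication by the OTF (whose magnitude is the MTF) in the frequency domain. All arrays and noise levels are hypothetical stand-ins, not data from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
focused = rng.random((128, 128))        # stand-in for the focused image
psf = np.zeros((128, 128))
psf[60:68, 60:68] = 1.0                 # toy 8x8 box blur as the PSF
psf /= psf.sum()                        # a PSF integrates to 1

# Blur computed via FFTs (circular convolution; fine for a sketch)
H = np.fft.fft2(np.fft.ifftshift(psf))  # OTF; the MTF is its magnitude |H|
noise = 0.01 * rng.standard_normal((128, 128))
captured = np.real(np.fft.ifft2(np.fft.fft2(focused) * H)) + noise

mtf = np.abs(H)                         # Modulation Transfer Function
```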

5 Deblurring Problems. Problem 1: low MTF values in the captured image yield low SNR after deblurring. Problem 2: the PSF varies with depth (ray diagram: lens and sensor, with object points P and Q at different depths).
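The sketch below illustrates Problem 1 using Wiener deconvolution, a standard deblurring choice (assumed here; the slide does not prescribe a method). The filter effectively divides by the OTF, so frequencies where the MTF is small receive large gain and the image noise at those frequencies is amplified.

```python
import numpy as np

def wiener_deblur(captured, H, nsr=1e-3):
    """Wiener deconvolution; nsr is an assumed noise-to-signal power ratio."""
    G = np.fft.fft2(captured)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # ~1/H where the MTF is high
    return np.real(np.fft.ifft2(G * W))

# Where |H| is near zero, |W| peaks: noise at those frequencies is
# boosted, which is exactly the low-MTF -> low-SNR problem on the slide.
```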

6 Extending Depth of Field (EDOF): Previous Work. Focal sweep cameras [Häusler ’72] [Nagahara et al. ’08] (diagram: lens, focal plane, sensor). Wavefront coding cameras [Dowski and Cathey ’95] [Chi and George ’01] [Garcia-Guerrero et al. ’07] (diagram: lens, cubic phase plate, sensor). Other related work: [Levin et al. ’07] [Veeraraghavan et al. ’07] [Levin et al. ’09].

7 Focal sweep vs. wavefront coding. Diagram: focus vs. depth, from near to far, for a conventional camera, focal sweep, and wavefront coding. Note: only a single PSF will be used to deblur the whole image.

8 Focal sweep vs. wavefront coding. Plot: deblurring error vs. depth for wavefront coding and focal sweep, with the noise floor marked.
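One way to reproduce a plot like this is to score each depth by the expected error of deblurring that depth's OTF with a single fixed filter, combining a blur-mismatch term and an amplified-noise term. This is an illustrative error model under stated assumptions, not the paper's exact metric; H_depth, H_deblur, and the noise parameters are hypothetical.

```python
import numpy as np

def deblurring_error(H_depth, H_deblur, noise_var=1e-4, nsr=1e-3):
    """Expected per-pixel MSE when one Wiener filter deblurs a given depth's OTF."""
    W = np.conj(H_deblur) / (np.abs(H_deblur) ** 2 + nsr)  # fixed deblurring filter
    model_err = np.abs(W * H_depth - 1.0) ** 2             # blur-mismatch term
    noise_err = noise_var * np.abs(W) ** 2                 # amplified-noise term
    return float(np.mean(model_err + noise_err))

# errors = [deblurring_error(H_d, H_mid_depth) for H_d in otfs_across_depth]
# A flat curve across depth means depth-invariant deblurring quality.
```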

9 Can we achieve the performance of focal sweep without any moving parts?

10 Optical Diffusers. A diffuser sheet placed in front of the sensor scatters each incoming light ray according to a scatter function of width w (shown: diffuser sheets, ray diagram, scatter function, SEM image).

11 Diffuser Kernels. The light field of a lens with aperture A, parameterized by sensor coordinate x and aperture coordinate u ∈ [−A/2, A/2]. Without a diffuser, a scene point maps to a tilted segment in the (x, u) plane.

12 Diffuser Kernels. With a diffuser, the light field is convolved with a diffuser kernel in the (x, u) plane.

13 Diffuser Kernels. The diffuser kernel spreads the light field along x (the scatter direction) while leaving u unchanged.

14 Diffusion Kernels. Side-by-side light fields: without the diffuser, with the diffuser, and the diffuser kernel itself.

15 Diffusion Kernels. Projecting a light field along u gives the camera PSF; projecting the diffused light field gives the diffused PSF, which equals the camera PSF convolved with the scatter function.
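Slides 11-15 can be summarized numerically: discretize the light field L(u, x), project it along the aperture coordinate u to get the PSF, and model the diffuser as convolution of the light field with its scatter function along x. In this toy setup (the sizes, the defocus shear, and the Gaussian scatter function are all assumptions) the diffused PSF comes out equal to the original PSF convolved with the scatter function, which is the slide's punchline.

```python
import numpy as np

nx, nu = 256, 64
L = np.zeros((nu, nx))                               # light field L(u, x)
for i, u in enumerate(np.linspace(-0.5, 0.5, nu)):   # aperture coord, A = 1
    L[i, int(nx / 2 + 40 * u)] = 1.0                 # defocus: x shifts with u

psf = L.sum(axis=0)                                  # project along u -> camera PSF

scatter = np.exp(-np.linspace(-3, 3, 31) ** 2)       # toy Gaussian scatter function
scatter /= scatter.sum()
L_diffused = np.apply_along_axis(np.convolve, 1, L, scatter, mode="same")
psf_diffused = L_diffused.sum(axis=0)                # == np.convolve(psf, scatter, "same")
```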

16-22 Radially Symmetric Diffuser (animation sequence). Ray diagrams of lens, sensor, and the resulting PSF, comparing the camera without a diffuser and with the radially symmetric diffuser as the object depth varies.

23 Radially Symmetric Diffuser PSF. Plots across depth: the scatter function, 1D slices of the camera PSF (−50 px to 50 px), and 1D slices of the MTF (normalized frequency).
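A small sketch of how one might verify the depth behavior shown on this slide from measured data: normalize each 1D PSF slice and compare the resulting MTF slices across depth. psf_stack is a hypothetical array with one PSF slice per row (one row per depth).

```python
import numpy as np

def mtf_slices(psf_stack):
    """1D MTF per depth from a stack of 1D PSF slices (rows = depths)."""
    psfs = psf_stack / psf_stack.sum(axis=1, keepdims=True)  # unit energy per slice
    return np.abs(np.fft.rfft(psfs, axis=1))                 # MTF = |FFT(PSF)|

# Depth invariance shows up as a small spread across depths at each frequency:
# spread = mtf_slices(psf_stack).std(axis=0)
```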

24 Diffusion Coding Performance. Plot: deblurring error vs. depth for wavefront coding, focal sweep, diffusion coding (light field analysis), and diffusion coding (wave optics analysis), with the noise floor marked. Diffusion coding achieves performance similar to focal sweep without moving parts.

25 Diffuser Implementation. Plots: diffuser scatter function vs. r (mm) and diffuser surface profile (thickness in µm vs. r in mm); also shown are the diffuser height map and the fabricated diffuser (made by RPC Photonics).
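For intuition about the implementation, here is one illustrative way to turn a target scatter function into a radial surface profile: in the thin-element approximation a ray's deflection is proportional to the local surface slope, so drawing slopes from the target scatter distribution and integrating gives a candidate height profile h(r). This is only a sketch of the idea; the actual element was designed and fabricated by RPC Photonics, and every number below is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dr = 2000, 1e-3                       # sample count and radial step (mm); assumed
slopes = rng.normal(0.0, 0.02, size=n)   # slopes drawn to match an (assumed
                                         # Gaussian) scatter-function width
height = np.cumsum(slopes) * dr          # integrate slope -> height profile h(r)
height -= height.min()                   # report thickness from zero (µm scale)
r = np.arange(n) * dr                    # radius axis in mm
```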

26 Comparison with Prior Work. Plot: deblurring error vs. depth for diffusion coding and the Garcia-Guerrero diffuser. Diffusion coding significantly outperforms prior work.

27 Diffusion Coding Experiments. Experimental setup: fabricated diffuser, Canon 50mm EF lens, Canon 450D sensor. Measured PSFs across depth, without and with the diffuser.

28 Examples

29 Stuffed Toys. Conventional Camera: f-number = 1.8, exposure time = 16 ms.

30 Stuffed Toys. Conventional Camera: f-number = 18, exposure time = 16 ms.

31 Stuffed Toys. Diffusion Coding Camera, captured: f-number = 1.8, exposure time = 16 ms.

32 Stuffed Toys. Diffusion Coding Camera, deblurred: f-number = 1.8, exposure time = 16 ms.
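The deblurred results on these slides come from a single deconvolution of the captured image with one measured PSF (as noted earlier, no per-depth processing is needed). A minimal sketch, assuming Wiener filtering and a measured PSF already centered and padded to the image size:

```python
import numpy as np

def deblur_with_measured_psf(captured, psf, nsr=1e-3):
    """Wiener deconvolution with one measured PSF (pre-centered, image-sized)."""
    H = np.fft.fft2(np.fft.ifftshift(psf / psf.sum()))
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(captured) * W))
```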

33 Statues. f-number = 1.8, exposure time = 10 ms. Captured and deblurred.

34 People and Flowers. Conventional Camera: f-number = 1.8, exposure time = 16 ms.

35 People and Flowers. Diffusion Coding Camera, captured: f-number = 1.8, exposure time = 16 ms.

36 People and Flowers. Diffusion Coding Camera, deblurred: f-number = 1.8, exposure time = 16 ms.

37 Limitations (Conventional Camera vs. Diffusion Coding): loss of image texture, loss of contrast, occlusion errors.

38 Conclusions: diffusion coding theory, radially symmetric diffusers, diffusion coding implementation, diffusion coding examples.

39 Diffusion Coded Photography for Extended Depth of Field SIGGRAPH 2010 Oliver Cossairt, Changyin Zhou, Shree Nayar Columbia University Supported by ONR and NSF