Depth from Diffusion. Supported by ONR. Changyin Zhou, Shree Nayar, Oliver Cossairt. Columbia University.

Optical Diffuser

Micrograph of a Holographic Diffuser (RPC Photonics), ~10 micron. [Gray, 1978] [Chang et al., 2006] [Garcia-Guerrero et al., 2007]

Diffusers as Accessories: diffusers to preview the image (B&H); diffusers to soften the image; diffusers for illumination (B&H).

Diffusion Encodes Depth. [Figure: Camera, Diffuser, Object, shown at two object depths] The amount of diffusion varies with depth.

Geometry of Diffusion: A Pinhole Camera. [Figure: Sensor, Pinhole, Object point P; a ray through Q misses the pinhole]

Geometry of Diffusion: A Pinhole Camera. [Figure: Sensor, Pinhole, Diffuser with scattering angle θ, Object P]

Geometry of Diffusion: A Pinhole Camera. [Figure: Pinhole, Sensor, Diffuser, Object P; rays scattered within angle θ reach the sensor between A and B]

Geometry of Diffusion: A Pinhole Camera. Diffusion Law. [Figure: pinhole at O, sensor at distance V, diffuser at distance U, object P at depth Z behind the diffuser; scattering angle θ spreads P over a region from A to B of width 2r]

Geometry of Diffusion: A Pinhole Camera. Diffusion Size and Depth. [Figure: same geometry, with O, U, V, Z, θ, and blur width 2r]

Geometry of Diffusion: A Pinhole Camera. Diffusion Size and Depth: the diffuser acts as a proxy object. [Figure: O, U, V, Z, blur width 2r on the diffuser plane]
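The geometry on these slides can be sketched numerically. Assuming the figure's setup (diffuser with scattering angle θ at distance U from the pinhole, sensor at distance V, object point P at depth Z behind the diffuser), the blur radius on the diffuser plane is roughly r ≈ Z·tan θ, which the pinhole images onto the sensor with magnification V/U. A minimal sketch; the function names are mine, not from the paper:

```python
import math

def sensor_blur_radius(Z_mm, theta_deg, U_mm, V_mm):
    """Blur radius on the sensor for a point at depth Z_mm behind the
    diffuser: r = Z * tan(theta) on the diffuser plane, scaled by the
    pinhole magnification V/U (the diffuser acts as a proxy object)."""
    return Z_mm * math.tan(math.radians(theta_deg)) * V_mm / U_mm

def depth_from_blur(r_sensor_mm, theta_deg, U_mm, V_mm):
    """Invert the relation above: recover depth Z from measured blur."""
    r_diffuser_mm = r_sensor_mm * U_mm / V_mm   # back to the diffuser plane
    return r_diffuser_mm / math.tan(math.radians(theta_deg))
```

The two functions are exact inverses, which is the point of the method: a measured blur radius maps directly to a depth behind the diffuser.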

Diffusion as Convolution: A Pinhole Camera. Assuming field angle and depth are constant over small image patches, the captured image is the latent clear image convolved with the diffusion PSF, whose support is set by the diffusion size.
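The per-patch convolution model can be illustrated with a short sketch. The Gaussian PSF shape here is a hypothetical choice (the slides only state that the PSF support grows with the diffusion size); within a small patch the PSF is held constant:

```python
import numpy as np

def gaussian_psf(radius_px):
    # Hypothetical Gaussian diffusion PSF; its support (radius_px)
    # plays the role of the slide's "diffusion size".
    x = np.arange(-radius_px, radius_px + 1, dtype=float)
    k = np.exp(-(x / max(radius_px / 2.0, 1.0)) ** 2)
    return k / k.sum()

def diffuse_patch(latent_patch, radius_px):
    # Captured patch = latent patch convolved with the diffusion PSF,
    # assuming field angle and depth are constant over the small patch.
    return np.convolve(latent_patch, gaussian_psf(radius_px), mode='same')
```

Because the PSF is normalized, diffusion redistributes intensity within the patch without changing its total energy (away from the patch borders).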

Geometry of Diffusion: A Lens Camera

[Figure: the pinhole geometry repeated for reference: O, U, V, Z, Sensor, Object P, blur width 2r; the diffuser acts as a proxy object]

Geometry of Diffusion: A Lens Camera. [Figure: Lens, Sensor at V, Diffuser at U, Object P at depth Z, blur width 2r; the diffuser acts as a proxy object] The captured image can be further blurred due to defocus.

Diffusion as Convolution: A Lens Camera. For a lens camera with a diffuser, the final PSF is the diffusion PSF convolved with the defocus PSF, where the diffusion PSF is the PSF that would result if a pinhole were used, and the defocus PSF is the PSF that would result if the diffuser were removed.
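This composition is easy to sketch in 1-D: the final PSF is simply the convolution of the two component kernels (any normalized kernels will do; the helper name is mine):

```python
import numpy as np

def final_psf(diffusion_psf, defocus_psf):
    # Final PSF = diffusion PSF (as if a pinhole were used)
    # convolved with the defocus PSF (as if the diffuser were removed).
    h = np.convolve(diffusion_psf, defocus_psf, mode='full')
    return h / h.sum()   # keep the composed kernel normalized
```

A sanity check: when the lens is perfectly focused, the defocus PSF is a delta and the final PSF reduces to the diffusion PSF alone, matching the pinhole case.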

Depth from Diffusion (DFDiff) Algorithm: 1. Capture two images, with and without a diffuser. 2. Estimate blur size r (same form as in DFD). 3. Compute depth Z.
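Step 2 has the same form as in depth from defocus: find the diffusion kernel that, applied to the diffuser-free image, best predicts the image captured with the diffuser. A schematic 1-D version; the Gaussian PSF model and brute-force search are my simplifications, not the paper's estimator:

```python
import numpy as np

def diffusion_psf(r):
    # Hypothetical Gaussian diffusion PSF of radius r pixels.
    x = np.arange(-r, r + 1, dtype=float)
    k = np.exp(-(x / max(r / 2.0, 1.0)) ** 2)
    return k / k.sum()

def estimate_blur_radius(img_plain, img_diffused, candidate_radii):
    # Step 2: pick the radius whose predicted diffused image best
    # matches the image actually captured with the diffuser.
    errors = []
    for r in candidate_radii:
        pred = np.convolve(img_plain, diffusion_psf(r), mode='same')
        errors.append(np.mean((pred - img_diffused) ** 2))
    return candidate_radii[int(np.argmin(errors))]
```

Step 3 then converts the estimated sensor-plane radius to a diffuser-plane radius via the magnification U/V and applies Z = r / tan θ, as in the geometry slides.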

Depth from Diffusion vs. Depth from Defocus. Depth from Defocus: [Pentland, 1987] [Subbarao, 1988] [Watanabe & Nayar, 1996] [Chaudhuri & Rajagopalan, 1999] [Favaro & Soatto, 2005] [Schechner & Kiryati, 2000]. [Figure: Depth from Defocus uses the aperture pattern of a lens (Lens, Focal Plane, Sensor, depth Z, blur r); Depth from Diffusion uses the diffusion pattern of a diffuser (Pinhole, Sensor, Diffuser, angle θ, depth Z, blur r)]

Depth from Diffusion vs. Depth from Defocus. Depth from Diffusion: suppose a 22.5 x 15 mm sensor, 10 um pixels, 100 mm EFL, and object distance = 1000 mm. With a diffuser of 21.8°, depth precision is about 0.1 mm. Any lens is fine!

Depth from Diffusion vs. Depth from Defocus. Depth from Defocus: same setup (22.5 x 15 mm sensor, 10 um pixels, 100 mm EFL, object distance = 1000 mm). What aperture diameter is needed for depth precision of about 0.1 mm?

Depth from Diffusion vs. Depth from Defocus. Depth from Defocus: for the same setup (object distance = 1000 mm), achieving about 0.1 mm depth precision would require an aperture diameter of 800 mm.

Depth from Diffusion vs. Depth from Defocus. Depth from Diffusion: same sensor and lens (22.5 x 15 mm sensor, 10 um pixels, 100 mm EFL) but object distance = 5000 mm. With a diffuser of 11.2°, depth precision is about 1.0 mm. Any lens is fine!

Depth from Diffusion vs. Depth from Defocus. Depth from Defocus: same setup, object distance = 5000 mm. What aperture diameter is needed for depth precision of about 1.0 mm?

Depth from Diffusion vs. Depth from Defocus. Depth from Defocus: at object distance = 5000 mm, achieving about 1.0 mm depth precision would require an aperture diameter of 2000 mm.
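The aperture figures on these slides can be roughly sanity-checked with a simple thin-lens defocus model: a depth change δZ near a focused distance Z shifts the defocus blur by about A·δZ·f / (Z·(Z−f)), so requiring that shift to exceed one pixel p gives A ≈ p·Z·(Z−f) / (δZ·f). This is my back-of-envelope simplification, not the slides' derivation, so it only reproduces their 800 mm and 2000 mm figures to within rough agreement:

```python
def required_aperture_mm(pixel_mm, Z_mm, f_mm, dZ_mm):
    # Aperture needed so a depth change dZ_mm shifts the defocus blur
    # by about one pixel (thin-lens, small-defocus approximation).
    return pixel_mm * Z_mm * (Z_mm - f_mm) / (dZ_mm * f_mm)

# Slides' settings: 10 um pixel, 100 mm EFL.
a1 = required_aperture_mm(0.01, 1000.0, 100.0, 0.1)  # ~900 mm (slides quote 800 mm)
a2 = required_aperture_mm(0.01, 5000.0, 100.0, 1.0)  # ~2450 mm (slides quote 2000 mm)
```

Either way, the conclusion on these slides stands: the apertures required by depth from defocus at these precisions are physically implausible, while depth from diffusion achieves them with an ordinary lens.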

PSF Measurement: A Pinhole Camera. F/22, Field Angle = 0°; Z = 2 mm and Z = 5 mm; Captured vs. Modeled. Setup: Canon EOS T1i; EF 50mm F/1.8 Lens; Luminit Holographic Diffuser (10° Gaussian); Diffuser distance U = 1 m.

PSF Measurement: A Pinhole Camera. F/22, Field Angle = 10°; Z = 2 mm and Z = 5 mm; Captured vs. Modeled. Setup: Canon EOS T1i; EF 50mm F/1.8 Lens; Luminit Holographic Diffuser (10° Gaussian); Diffuser distance U = 1 m.

PSF Measurement: A Lens Camera. F/1.8, Field Angle = 10°; Captured vs. Modeled. Setup: Canon EOS T1i; EF 50mm F/1.8 Lens; Luminit Holographic Diffuser (10° Gaussian); Diffuser distance U = 1 m.

Experiments. Five playing cards, 0.29 mm thick each. Canon 20D + 50mm Lens; Luminit Diffuser (20°).

Experiments. Captured WITHOUT a Diffuser / Captured WITH a Diffuser.

Experiments. Computed Depth Map (~0.1 mm precision, in mm). Five playing cards, 0.29 mm thick each.

Experiments. A small sculpture about 4 mm thick. Canon G5 Compact Camera; Luminit Diffuser (5°).

Experiments. Captured WITHOUT a Diffuser / Captured WITH a Diffuser.

Experiments. Computed Depth Map and a 3D View of the Depth Map. A small sculpture about 4 mm thick.

Experiments. Canon 20D; Gaussian Diffuser (10°); 450 mm to 650 mm.

Experiments. Stitched Depth Map (precision, in mm).

Summary. Formulated the image formation with optical diffusers. Proposed Depth from Diffusion: + high-precision depth estimation; + works for distant objects; + less sensitive to lens aberrations; - requires a diffuser on the object side. Demonstrated high-precision depth estimation. [Figure: Camera, Diffuser, Object]
