Fourier Depth of Field Cyril Soler, Kartic Subr, Frédo Durand, Nicolas Holzschuch, François Sillion INRIA, UC Irvine, MIT CSAIL.


Defocus blur is important in photography

Defocus is due to integration over aperture Image Aperture Pixel p Lens

Defocus Image Aperture Pixel p Lens Scene

Monte Carlo estimate of the aperture integral Image Aperture N_A primary rays per pixel Integrate at p

Aperture integration is costly Image Aperture N_P pixels, N_A aperture samples: N_P × N_A primary rays

Aperture integration is costly: 64 × the number of primary rays of the pinhole image. Paradox: the blurrier the image, the costlier it is to compute!
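The cost paradox can be made concrete with a minimal sketch of naive aperture integration. All names here (`estimated_radiance`, `render_naive`) are hypothetical stand-ins, not the paper's code; a real renderer would trace each primary ray against the scene.

```python
import random

def estimated_radiance(x, y):
    # Stand-in for tracing a primary ray from lens point y through pixel x.
    # A real renderer would intersect the scene and shade the hit point.
    return 1.0

def render_naive(num_pixels, num_aperture_samples):
    """Naive defocus: every pixel pays the full aperture sampling cost."""
    image = []
    rays_traced = 0
    for x in range(num_pixels):
        total = 0.0
        for _ in range(num_aperture_samples):
            y = (random.uniform(-1, 1), random.uniform(-1, 1))  # lens sample
            total += estimated_radiance(x, y)
            rays_traced += 1
        image.append(total / num_aperture_samples)
    return image, rays_traced

image, rays = render_naive(num_pixels=100, num_aperture_samples=64)
# 100 pixels x 64 aperture samples = 6400 primary rays,
# versus 100 primary rays for the pinhole image.
```

Every pixel pays the same 64-sample cost regardless of how blurry it is, which is exactly what the adaptive scheme below avoids.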

Key observations

Observation 1: Image sampling Blurry regions should not require dense sampling of the image

Observation 2: Lens sampling Regions in focus should not require profuse sampling of the lens for diffuse objects

Observation 2: Aperture sampling Plane in focus At “sharp” pixels, all lens rays come from the same scene point Lens Image Regions in focus should not require profuse sampling of the lens for diffuse objects

Observation 2: Aperture sampling Plane in focus Variance depends on reflectance Lens Image Regions in focus should not require profuse sampling of the lens for diffuse objects

Goal: Adaptive sampling  Reduce number of primary rays  Adapt image and lens sampling rates based on Fourier bandwidth prediction

Sampling: 1) Image Sample blurry image regions sparsely Reference Our image samples

Sampling: 2) Aperture Sample aperture sparsely for objects in focus

Contributions  Fourier analysis of depth of field for image synthesis – Account for different transport phenomena  Mechanism for propagating local frequency content  Adaptive sampling of image and lens

Related work

Related work: Sampling approach [Cook et al. 84]  Trace multiple rays per pixel  Correctly account for phenomena  Costly [Cook et al. 87]

Related work: Image space approach [Kraus and Strengert 07] [Kass et al. 06]  Post process pinhole image using depth map  Fast although approximate  Correct handling of occlusion is a challenge [Potmesil and Chakravarty 81]

Related work: Frequency domain analysis [Chai et al. 2000] [Durand et al. 05] [Ramamoorthi and Hanrahan 04] [Ng 05]

Algorithm

Typical algorithm for estimating defocus

P = {uniformly distributed image samples}
N_A  // number of aperture samples
for each pixel x in P
    L ← SampleLens(N_A)
    for each sample y in L
        Sum ← Sum + EstimatedRadiance(x, y)
    Image(x) = Sum / N_A

Our adaptive sampling

(P, A) ← BandwidthEstimation()
P = {bandwidth-dependent image samples}
A = {aperture variance estimate}
for each pixel x in P
    N_A proportional to A(x)
    L ← SampleLens(N_A)
    for each sample y in L
        Sum ← Sum + EstimatedRadiance(x, y)
    Image(x) = Sum / N_A
Reconstruct(Image, P)
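The N_A ∝ A(x) rule on this slide can be sketched as a simple proportional allocator. This is an illustrative helper with hypothetical names, not the paper's implementation:

```python
def aperture_sample_counts(variance, total_budget, n_min=1):
    """Allocate aperture samples per pixel proportionally to the
    predicted aperture variance A(x): N_A(x) proportional to A(x),
    with a small floor so every pixel gets at least one ray."""
    total_var = sum(variance)
    if total_var == 0:
        return [n_min] * len(variance)
    return [max(n_min, round(total_budget * v / total_var)) for v in variance]

counts = aperture_sample_counts([0.0, 1.0, 3.0], total_budget=400)
# → [1, 100, 300]: the in-focus pixel gets the minimum, blurry pixels more.
```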

Algorithm Bandwidth Estimation Sampling rates over image and lens Estimate radiance rays through image and lens samples Reconstruct image from scattered radiance estimates

Theory: Propagation of light field spectra

Review: Local light field parametrization Space Angle [Durand05]

Local light field [Durand05] Review: Local light field as a density Angle Space 1D Lambertian emitter

Review: Local light field spectrum Local light field Power spectrum Fourier Transform [Durand05] Angular frequencies Spatial frequencies

Review: Transport & Local light field spectra [Durand05] Transport (free space) Processes Occlusion Reflectance Shear (angle) Operations Convolution Product

Our sampled representation  Samples in frequency space  Updated through light transport  Provides bandwidth (max frequency) and variance (sum of squared frequencies) Spatial frequencies Angular frequencies Light field spectrum Sampled light field spectrum

Our sampled representation  Samples in frequency space  Updated through light transport  Provides bandwidth (max frequency) and variance (sum of squared frequencies) High spatial frequency High angular frequency Sampled light field spectrum Spatial freq. Angular freq. Light field spectrum

Our sampled representation  Samples in frequency space  Updated through light transport  Provides bandwidth (max frequency) and variance (sum of squared frequencies) Sampled light field spectrum High spatial frequency Low angular frequency Angular freq. Light field spectrum

Our sampled representation  Samples in frequency space  Updated through light transport  Provides bandwidth (max frequency) and variance (sum of squared frequencies) Sampled light field spectrum Light field spectrum Spatial freq. Angular freq. Max angular freq.
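The sampled representation these slides describe can be sketched as a small container of (spatial, angular) frequency samples exposing the two quantities the method needs. Class and method names are hypothetical:

```python
import math

class SampledSpectrum:
    """Point-sampled local light-field power spectrum.
    Each sample is a (spatial_freq, angular_freq) pair."""

    def __init__(self, samples):
        self.samples = list(samples)

    def bandwidth(self):
        # Conservative estimate: the largest frequency magnitude present.
        return max(math.hypot(sx, sa) for sx, sa in self.samples)

    def variance(self):
        # Sum of squared frequencies, as on the slide.
        return sum(sx * sx + sa * sa for sx, sa in self.samples)

spec = SampledSpectrum([(0.0, 0.0), (3.0, 4.0), (1.0, 0.0)])
# spec.bandwidth() → 5.0, spec.variance() → 26.0
```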

Propagating light field spectra Aperture Sensor First intersection point P Primary ray through lens center

Propagating light field spectra Aperture Sensor Coarse depth image Scene Propagate spectra

Propagating light field spectra Aperture Sensor Coarse depth image Propagate spectra Scene Aperture variance Image-space bandwidth

Propagating light field spectra Aperture Sensor Coarse depth image Propagate spectra Aperture variance Image-space bandwidth Scene Sparse radiance Trace rays

Propagating light field spectra Aperture Sensor Primary ray through lens center Light field incident at P First intersection point P Local image bandwidth Variance over aperture

Propagating light field spectra Aperture Sensor Primary ray through lens center First intersection point P Reflection Light field incident at P

Propagating light field spectra Aperture Sensor Primary ray through lens center First intersection point P Reflection Transport through free space Light field incident at P

Propagating light field spectra Aperture Sensor Primary ray through lens center First intersection point P Reflection Transport through free space Aperture effect Light field incident at P

Propagating light field spectra Aperture Sensor Primary ray through lens center First intersection point P Reflection Transport through free space Aperture effect Light field incident at P

Propagating light field spectra

Incident light field  Assume full spectrum  Conservatively expect all frequencies  Simple, no illumination dependence Spatial frequency Angular frequency

Reflection: Last bounce to the eye  Convolution by BRDF  Fourier domain: Product of spectra Incident light field spectrum BRDF spectrum =x Light field spectrum after reflection

Reflection: Last bounce to the eye  Convolution by BRDF  Fourier domain: Product of spectra Incident light field spectrum BRDF spectrum =x Light field spectrum after reflection
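On a sampled spectrum, the product with the BRDF spectrum can be approximated by a hard angular band-limit: this sketch uses a simple cutoff in place of the true spectrum product, and all names are hypothetical:

```python
def reflect_band_limit(samples, brdf_angular_bandwidth):
    """Approximate multiplication by the BRDF spectrum with a hard
    angular band-limit: a diffuse BRDF has near-zero angular bandwidth,
    a mirror is unbounded. Samples are (spatial_freq, angular_freq)."""
    return [(sx, sa) for sx, sa in samples
            if abs(sa) <= brdf_angular_bandwidth]

# A diffuse surface (bandwidth 0) keeps only zero angular frequencies:
out = reflect_band_limit([(1.0, 0.0), (2.0, 3.0)], brdf_angular_bandwidth=0.0)
# → [(1.0, 0.0)]
```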

Transport to aperture  Transport through free space: angular shear of the light field spectrum [Durand05] Spatial frequency Angular frequency

Transport to aperture Occluder  Transport through free space  Occlusion: Convolution with blocker spectrum [Durand05] = * Light field spectrum after occlusion Light field before occlusion Blocker spectrum

Occlusion test Occluder  To find occluders for ray through pixel p  Test if depth value at q is in cone of rays Image p

Occlusion test Occluder  To find occluders for ray through pixel p  Test if depth value at q is in cone of rays Image p q

Occlusion test Occluder  To find occluders for ray through pixel p  Test if depth value at q is in cone of rays Image p q

Occlusion test Occluder  To find occluders for ray through pixel p  Test if depth value at q is in cone of rays Image p q
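The cone-of-rays test on these slides can be sketched under simplified thin-lens geometry (an assumption of this sketch: depth measured from the lens, with the cone converging at the plane in focus; function names are hypothetical):

```python
def cone_radius(z, aperture_radius, focus_dist):
    """Radius of the cone of rays through one image point at depth z:
    it equals the aperture radius at the lens (z = 0) and shrinks to
    zero at the plane in focus (z = focus_dist)."""
    return aperture_radius * abs(1.0 - z / focus_dist)

def is_potential_occluder(z_q, lateral_offset_q, aperture_radius, focus_dist):
    """Depth sample q can block some ray of the bundle through pixel p
    if it lies inside the cone at its own depth."""
    return lateral_offset_q <= cone_radius(z_q, aperture_radius, focus_dist)

# Halfway to the focus plane the cone radius is half the aperture:
hit = is_potential_occluder(5.0, 0.4, aperture_radius=1.0, focus_dist=10.0)   # True
miss = is_potential_occluder(10.0, 0.1, aperture_radius=1.0, focus_dist=10.0) # False
```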

Operations on sampled spectra: to convolve f(x) and g(x), draw samples X from f and Y from g; the sum X + Y is distributed according to the convolution.

Operations on sampled spectra: f(x) * g(x). Simply add frequency samples to sampled occluder spectra.
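This sample-summing trick rests on a standard fact: the distribution of the sum of two independent random variables is the convolution of their distributions. A minimal sketch with a hypothetical helper name:

```python
import random

def convolve_sampled(f_samples, g_samples, n, rng=None):
    """Monte Carlo convolution of two point-sampled spectra: if X is
    drawn from f and Y independently from g, then X + Y is distributed
    according to the convolution f * g."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    return [rng.choice(f_samples) + rng.choice(g_samples) for _ in range(n)]

samples = convolve_sampled([0.0, 1.0], [10.0], n=4)
# Every resulting sample is either 10.0 or 11.0.
```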

Occlusion test Occluder  To find occluders for ray through pixel p  Test if depth value at q is in cone of rays Image p q

Transport to aperture Occluder  Transport through free space  Occlusion: Convolution with blocker spectrum [Durand05] = * Light field spectrum after occlusion Light field before occlusion Blocker spectrum

Transport to aperture Occluder  Transport through free space  Occlusion  Transport through free space Angular frequency Spatial frequency

Effect of finite aperture  Model integration of rays at the aperture as convolution in ray space

Fourier depth of field analysis Ray integration modeled as convolution Image Aperture Lens Plane in focus

 Model integration of rays at the aperture as convolution in ray space Fourier depth of field analysis Image Aperture Lens Plane in focus Ray integration modeled as convolution Dirac in space Box in angle Ray space

 Model integration of rays at the aperture as convolution in ray space Fourier depth of field analysis Image Aperture Lens Plane in focus Ray integration modeled as convolution Dirac in space Box in angle Ray space Fourier Constant in space Sinc in angle Parametrization at plane in focus

 Model integration of rays at the aperture as convolution in ray space Fourier depth of field analysis Image Aperture Lens Plane in focus Ray integration modeled as convolution Dirac in space Box in angle Ray space Fourier Constant in space Sinc in angle Shear by distance to lens

Effect of finite aperture  Model integration as convolution in ray space  Hence product in Fourier space  See paper for details = x Incident spectrum at aperture Aperture response spectrum Light field spectrum after DOF effect

Transport to sensor  Angular shear [Durand05]  Usually small

Estimating sampling rates Local image bandwidth? Variance over aperture?

Propagating light field spectra Aperture Sensor Coarse depth image Propagate spectra Image-space bandwidth Scene ? Aperture variance ?

Spatial frequency Angular frequency

Aperture variance  Project obliquely onto angular axis

Aperture variance  Variance = (power spectrum) 2 – (DC) 2 [Parseval]

Local image-space bandwidth  Project horizontally onto angular axis

Local image-space bandwidth  See paper for details Bandwidth Max angular frequency
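A conservative reading of this slide is that the local image-space bandwidth is bounded by the maximum angular frequency of the spectrum at the sensor (angle maps to image position through the lens); see the paper for the exact derivation. A minimal sketch under that assumption, with hypothetical names:

```python
def local_image_bandwidth(samples):
    """Conservative local image-space bandwidth: the largest angular
    frequency among the spectrum samples at the sensor. By Nyquist,
    the image must then be sampled at >= 2x this frequency."""
    return max(abs(sa) for _sx, sa in samples)

bw = local_image_bandwidth([(1.0, 0.5), (2.0, -3.0)])
# → 3.0
```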

Summary: Operations on light field spectra Reflection Transport Aperture Transport Processes

Operations on light field spectra — Processes and operations on spectra: Reflection → Product; Transport → Shear (angle) and Convolution (occlusion); Aperture → Product; Transport → Shear (angle)

Operations on sampled spectra  Product (band-limit)  Shear (angle)  Convolution  Product (band-limit)

Operations on sampled spectra  Product (band-limit): implemented by rejecting samples
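Rejection is a natural way to apply a product to a point-sampled spectrum: keep each frequency sample with probability given by the multiplying spectrum's value there. A sketch with hypothetical names, assuming the response is normalized to [0, 1]:

```python
import random

def band_limit_product(samples, response, rng=None):
    """Product with a spectrum whose values lie in [0, 1], implemented
    on a sampled spectrum by rejection: keep each frequency sample with
    probability equal to the response at that frequency."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    return [s for s in samples if rng.random() < response(s)]

# A hard band-limit is the special case where the response is 0 or 1:
kept = band_limit_product([0.5, 1.5, 2.5], lambda f: 1.0 if f < 2.0 else 0.0)
# → [0.5, 1.5]
```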

Operations on sampled spectra Product Shear (angle) Convolution Product Shear (angle) P P’  Angular shear, distance s  Update P(x, t) to P’(x, t-sx)
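The shear update on this slide applies directly to the frequency samples; here is a one-line sketch (hypothetical function name):

```python
def shear_angle(samples, s):
    """Free-space transport by distance s is an angular shear of the
    spectrum: each sample (x, t) moves to (x, t - s*x), i.e. the
    update P(x, t) -> P'(x, t - sx) from the slide."""
    return [(x, t - s * x) for (x, t) in samples]

sheared = shear_angle([(1.0, 2.0), (0.0, 1.0)], s=0.5)
# → [(1.0, 1.5), (0.0, 1.0)]: purely angular samples are unchanged.
```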

Operations on sampled spectra (Convolution): draw samples X from f(x) and Y from g(x); X + Y is a sample of the convolution.

Operations on sampled spectra (Convolution): the summed samples X + Y follow f(x) * g(x).

Bandwidth estimation is useful  Easy modification to existing algorithms  Efficient allocation of samples  Independent of method used to estimate radiance

Typical algorithm for estimating defocus

Algorithm SimDOF(P, N_A)
P = {uniformly distributed image samples}
for each pixel x in P
    L ← SampleLens(N_A)
    for each sample y in L
        Sum ← Sum + EstimatedRadiance(x, y)
    Image(x) = Sum / N_A

Simple modification to sampling

(P, A) ← BandwidthEstimation()
P = {bandwidth-dependent image samples}
A = {aperture variance estimate}
Algorithm SimDOF(P, N_A)
for each pixel x in P
    N_A proportional to A(x)
    L ← SampleLens(N_A)
    for each sample y in L
        Sum ← Sum + EstimatedRadiance(x, y)
    Image(x) = Sum / N_A
Reconstruct(Image, P)

Summary Bandwidth Estimation Sample generation over image and lens Estimate radiance rays through image and lens samples Reconstruct image from scattered radiance estimates

Summary Bandwidth Estimation Sample generation over image and lens Estimate radiance rays through image and lens samples Reconstruct image from scattered radiance estimates < 2% of total time ~ 0.5% of total time

Bandwidth estimation is fast

(P, A) ← BandwidthEstimation()   // < 2% of total time
P = {bandwidth-dependent image samples}
A = {aperture variance estimate}
Algorithm SimDOF(P, N_A)
for each pixel x in P
    N_A proportional to A(x)
    L ← SampleLens(N_A)
    for each sample y in L
        Sum ← Sum + EstimatedRadiance(x, y)
    Image(x) = Sum / N_A
Reconstruct(Image, P)   // ~ 0.5% of total time

Results Local bandwidth (image space) Aperture variance

Results: Computation time (seconds) Bandwidth estimation Raytracing Image reconstruction

Results: Quality comparison (similar cost) Stratified lens sampling (70 lens samp/pixel) Our algorithm Adaptive sampling

Results: Cost comparison (similar quality) Speedup = (#primary rays using standard technique) / (#primary rays using bandwidth prediction) Speedups: 17.3, 14.7, 24.0

Results: Variance estimate Our estimated variance Computed reference variance

Results: Variance estimate Our estimated variance Computed reference variance

Summary of phenomena Reference Aperture variance Image-space bandwidth  Defocus  Reflectance  Occlusion

Reference Aperture variance Image-space bandwidth  Defocus  Reflectance  Occlusion Defocus Summary of phenomena

Reference Aperture variance Image-space bandwidth  Defocus  Reflectance  Occlusion Reflection Summary of phenomena

Reference  Defocus  Reflectance  Occlusion Aperture variance Image-space bandwidth Occlusion Summary of phenomena

Limitations  Approximate bandwidth – We do not account for illumination – Coarse approximation of band-limiting operations

Conclusion  Reduced number of primary rays for depth of field  Fourier analysis  New mechanism for propagating local bandwidth  Considered several transport phenomena

Acknowledgements   ANR "HFIBMR" (ANR-07-BLAN-0331)   INRIA Equipe Associée with MIT "Flexible Rendering"   INRIA post-doctoral program   NSF   Microsoft New Faculty Fellowship   Sloan Fellowship

Thank you Reference Aperture variance Image-space bandwidth

Results: Similar cost Without bandwidth prediction With our bandwidth prediction

Thin lens system: Object in focus Plane in focus LensAperture Image

Thin lens system: Object out of focus Plane in focus Image LensAperture Blurry image

Thin lens system: Object out of focus Plane in focus Image LensAperture Circle of confusion

Depth of field (DOF) Image LensAperture Maximum acceptable circle of confusion Depth of Field

Blurry pixels

Focusing on a subject

Depth of field  Range of depths that appear acceptably sharp Aperture Sensor Scene Depth of field

Physically based image synthesis  Mimic the effect of a shallow depth of field  Costly affair

Our speedup 17.3x 24.0x

Pinhole camera Image Pinhole aperture Pixel p

Aperture integration is costly Image Aperture N_P pixels N_P × N_A primary rays Pinhole image N_P primary rays

Sampling: 1) Image Image Aperture Which pixels to trace primary rays through? ?

Sampling: 2) Aperture How many primary rays through a given p? Image Aperture Pixel p ?

Review: Local light field [Durand05] 4D light field around a central ray Central ray [Durand05]

Our approach Aperture Sensor Scene Transport local-light field spectral information

Our approach Aperture Sensor Scene Local bandwidth Aperture variance

Our approach  Transport spectra of local light field to sensor  Predict image-space bandwidth and aperture variance  Derive sampling rates Aperture Sensor Scene Estimate sampling rates