Real-Time Lens Blur Effects and Focus Control
Sungkil Lee, Elmar Eisemann, and Hans-Peter Seidel
Presented by Sunyeong Kim, Nov. 23rd, 2010

2 Goal
● A real-time rendering system for defocus blur and lens effects
● An efficient algorithm for depth-of-field (DOF) and lens blur effects
● An interactive and intuitive focus control system
● A generalized method for expressive DOF rendering

3 Overview
● Rendering Algorithm
● Optical Aberration
● Controlling Focus and Lens Effects

4 Realistic Lens Blur
● Two steps:
  1. Build an image-based layered representation of the scene using a modified depth-peeling strategy
  2. Trace several lens rays for each sensor pixel

5 Layer Construction
● Layered, image-based scene representation built with depth peeling
● Important observation:
  ● Layer pixels that cannot be reached by any lens ray do not need to be extracted
  ● A depth-peeled representation is point-sampled at the pixel centers
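
A minimal CPU-side sketch of the depth-peeling idea over per-pixel fragment lists; this is illustrative only (the function name `peel_layers` and the fragment-list input are assumptions), whereas the paper's version runs on the GPU as multi-pass rasterization and additionally skips pixels that no lens ray can reach:

```python
def peel_layers(fragments_per_pixel, max_layers):
    """Peel per-pixel fragments into front-to-back depth layers.

    fragments_per_pixel: dict mapping (x, y) -> list of fragment depths.
    Returns a list of layers; each layer maps (x, y) -> depth.
    """
    layers = []
    # Depth already peeled at each pixel; start in front of everything.
    last_depth = {p: float("-inf") for p in fragments_per_pixel}
    for _ in range(max_layers):
        layer = {}
        for p, depths in fragments_per_pixel.items():
            # Nearest fragment strictly behind the previously peeled layer.
            behind = [d for d in depths if d > last_depth[p]]
            if behind:
                layer[p] = min(behind)
        if not layer:
            break  # nothing left to peel
        for p, d in layer.items():
            last_depth[p] = d
        layers.append(layer)
    return layers
```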

6 Lens Ray Tracing
1. Intersection test of the aperture and the lens surface
2. Intersection test of the ray against the layers
  ● Using a footprint
  ● All lens rays are processed in parallel
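
As a rough sketch of how lens rays can be generated per pixel, the following uses the standard thin-lens sampling from distributed ray tracing; the paper supports more general lens models, so the function `thin_lens_rays` and its parameters are illustrative assumptions only:

```python
import math
import random

def thin_lens_rays(focus_point, aperture_radius, n_rays):
    """Generate lens rays for one pixel under a thin-lens model.

    focus_point: (x, y, z) point on the plane of focus hit by the pixel's
                 center (pinhole) ray; all lens rays converge there.
    Returns a list of (origin, direction) pairs; origins lie on the aperture.
    """
    rays = []
    for _ in range(n_rays):
        # Uniformly sample the circular aperture at z = 0.
        r = aperture_radius * math.sqrt(random.random())
        phi = 2.0 * math.pi * random.random()
        origin = (r * math.cos(phi), r * math.sin(phi), 0.0)
        # Unnormalized direction toward the shared in-focus point.
        direction = tuple(f - o for f, o in zip(focus_point, origin))
        rays.append((origin, direction))
    return rays
```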

7 Lens Ray Tracing
● Two challenges:
  ● Bounding the footprint of all lens rays within a depth interval
  ● Computing the min/max depth values inside a footprint region

8 Bounding the Footprint
● For a depth interval [d1, d2], it is enough to take the maximum of the CoCs at d1 and d2
● Collect the intersection points and compute the bounding quad in image space
● Given the bounding quad, the min/max depth values can be computed
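
A sketch of that bound using the standard thin-lens circle-of-confusion formula; the paper derives its footprint from the chosen lens model, so the formula and helper names below are stand-in assumptions:

```python
def coc_diameter(d, focus_dist, focal_length, aperture_diameter):
    """Sensor-space circle-of-confusion diameter for an object at depth d,
    using the standard thin-lens formula."""
    return (aperture_diameter * focal_length * abs(d - focus_dist)
            / (d * (focus_dist - focal_length)))

def footprint_radius(d1, d2, focus_dist, focal_length, aperture_diameter,
                     pixels_per_sensor_unit):
    """Conservative screen-space footprint radius for a depth interval [d1, d2].

    The CoC is monotone on each side of the focus distance, so its maximum
    over the interval is attained at one of the two endpoints.
    """
    coc = max(coc_diameter(d1, focus_dist, focal_length, aperture_diameter),
              coc_diameter(d2, focus_dist, focal_length, aperture_diameter))
    return 0.5 * coc * pixels_per_sensor_unit
```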

9 Min/Max Depth Value
● Using N-buffers [Décoret 2005]
  ● A set of textures {T_i} of identical resolution
  ● T_0 is the original image
  ● Pixel p in T_i contains the minimum and maximum values of T_0 inside a square of size 2^i x 2^i around p
● Empty pixels store a depth of 0
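
A possible CPU-side reconstruction of N-buffer construction and querying (NumPy sketch; the actual technique builds one texture per level on the GPU, and the corner-anchored windows, border clamping, and function names here are assumptions):

```python
import numpy as np

def build_nbuffers(depth, levels):
    """Level i stores, per pixel p, the min/max of the base image over the
    2^i x 2^i window whose corner is p; out-of-range samples clamp to the edge."""
    h, w = depth.shape
    mins, maxs = [depth.copy()], [depth.copy()]
    for i in range(1, levels + 1):
        off = 1 << (i - 1)
        prev_min, prev_max = mins[-1], maxs[-1]
        cur_min, cur_max = prev_min.copy(), prev_max.copy()
        for dy in (0, off):
            for dx in (0, off):
                # Combine the previous level shifted by (dx, dy).
                ys = np.clip(np.arange(h) + dy, 0, h - 1)
                xs = np.clip(np.arange(w) + dx, 0, w - 1)
                cur_min = np.minimum(cur_min, prev_min[np.ix_(ys, xs)])
                cur_max = np.maximum(cur_max, prev_max[np.ix_(ys, xs)])
        mins.append(cur_min)
        maxs.append(cur_max)
    return mins, maxs

def query_min_max(mins, maxs, x0, y0, size):
    """Min/max over the size x size square with corner (x0, y0), via four
    lookups in the largest level whose window fits inside the query."""
    h, w = mins[0].shape
    i = min(len(mins) - 1, max(0, size.bit_length() - 1))  # 2^i <= size
    win = 1 << i
    lo, hi = float("inf"), float("-inf")
    for oy in (0, size - win):
        for ox in (0, size - win):
            y = min(max(y0 + oy, 0), h - 1)
            x = min(max(x0 + ox, 0), w - 1)
            lo = min(lo, mins[i][y, x])
            hi = max(hi, maxs[i][y, x])
    return lo, hi
```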

10 Multi-Layer Packing
● Pack four layer depth values into a single RGBA texture
● All four layers share one N-buffer
● All four depth values are tested simultaneously
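
Conceptually, the packing amounts to something like the following sketch; the real version stores the packed depths in an RGBA texture and compares against all four channels in a single shader operation, and the exact per-texel test shown here is an assumption:

```python
import numpy as np

def pack_layers(layer_depths):
    """Pack up to four per-layer depth maps into one RGBA-style array so a
    single fetch (and a single shared N-buffer) serves all four layers."""
    assert 1 <= len(layer_depths) <= 4
    h, w = layer_depths[0].shape
    packed = np.zeros((h, w, 4), dtype=np.float32)
    for c, depths in enumerate(layer_depths):
        packed[..., c] = depths
    return packed

def test_all_layers(packed_texel, depth_lo, depth_hi):
    """Test a lens ray's depth interval against all four packed layer depths
    at once; returns a boolean per channel."""
    return (packed_texel >= depth_lo) & (packed_texel <= depth_hi)
```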

11 Optical Aberration
● Spherical aberration
● Curvature of field (COF)
● Chromatic aberration
  ● Modeled with the empirical Sellmeier equation
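
The Sellmeier equation gives the wavelength-dependent refractive index n(lambda), which is what makes the red, green, and blue channels refract differently. A small sketch with commonly tabulated BK7 glass coefficients (the glass and the sample wavelengths are example choices, not necessarily the paper's):

```python
import math

# Sellmeier coefficients for Schott BK7 glass (example material only).
BK7_B = (1.03961212, 0.231792344, 1.01046945)
BK7_C = (0.00600069867, 0.0200179144, 103.560653)  # micrometers^2

def sellmeier_index(wavelength_um, B=BK7_B, C=BK7_C):
    """n(lambda) from the empirical Sellmeier equation:
    n^2 = 1 + sum_i B_i * lambda^2 / (lambda^2 - C_i), lambda in micrometers."""
    l2 = wavelength_um * wavelength_um
    return math.sqrt(1.0 + sum(b * l2 / (l2 - c) for b, c in zip(B, C)))

# Per-channel indices drive chromatic aberration: refract (or shift) the
# R, G and B channels with their own index.
n_red, n_green, n_blue = (sellmeier_index(w) for w in (0.6563, 0.5876, 0.4861))
```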

12 Controlling Focus and Lens Effects
● Controlling focus for standard lens models
  ● Thin lens
  ● Spherical thick lens
  ● Tilt-shift photography
● Expressive focus control
  ● Focal surface
  ● DOF interpolation
  ● Expressive aberration effects
● Less accurate simulation, but more intuitive control
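
For the thin-lens case, focus control reduces to the Gaussian lens equation; a minimal sketch (the thick-lens, tilt-shift, and focal-surface controls listed above go beyond this):

```python
def sensor_distance_for_focus(focal_length, focus_dist):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the sensor
    (image) distance d_i that puts objects at focus_dist in sharp focus."""
    assert focus_dist > focal_length, "focus distance must exceed the focal length"
    return 1.0 / (1.0 / focal_length - 1.0 / focus_dist)

# Example: a 50 mm lens focused at 2 m places the sensor about 51.3 mm
# behind the lens.
d_i = sensor_distance_for_focus(0.050, 2.0)
```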

13 Result

14 Result

15 Result

16 Result
● Result video

17 Discussion
● Extended umbra peeling
  ● Artifact-free rendering
● Compared to the single-pass decomposition:
  ● Slower depth peeling
  ● More cache-efficient
  ● Processes multiple layers in parallel
  ● Fewer and more predictable arithmetic operations

18 Conclusion
● A real-time lens-blur rendering system
● Handles many lens aberration effects
● Provides simple control of DOF blur

19 Thank You
Questions or comments?