01/28/05 © 2005 University of Wisconsin

Last Time
– Improving Monte Carlo Efficiency

Today
– Improving Efficiency with Monte Carlo Integration

Cameras
The camera’s task is to take a pixel and compute a ray out into the scene
– The camera is given an image point, a lens sample point, and a time
– The output is a ray in world space, with normalized direction
There are many types of camera
– Orthographic
– Perspective
– Spherical
– Define your own …
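As a concrete illustration of the pixel-to-ray mapping, here is a minimal sketch of a perspective camera in camera space. The conventions (camera at the origin looking down −z, vertical field of view, pixel centers at half-integer coordinates) are illustrative assumptions, not PBRT’s actual API:

```python
import math

def generate_perspective_ray(px, py, width, height, fov_deg):
    # Camera sits at the origin looking down -z; fov_deg is the vertical field of view.
    aspect = width / height
    tan_half = math.tan(math.radians(fov_deg) / 2.0)
    # Pixel centers sit at half-integer coordinates; map them to [-1, 1]
    x = (2.0 * (px + 0.5) / width - 1.0) * tan_half * aspect
    y = (1.0 - 2.0 * (py + 0.5) / height) * tan_half
    # Normalize the direction, as the slide requires
    length = math.sqrt(x * x + y * y + 1.0)
    return (0.0, 0.0, 0.0), (x / length, y / length, -1.0 / length)
```

A world-space camera would additionally transform the origin and direction by the camera-to-world matrix.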

Depth of Field
Details on constructing rays are in PBR Ch. 6
All cameras in PBRT take parameters for depth of field
– A lens radius parameter
– A focal distance parameter
When asked for a ray, the camera gets sample values to use to compute a point on the lens

Realistic Lens System
Aperture: its size controls how many world rays project to a pixel

Simplified Lens Model (1)
[Figure: image point, near clip plane, lens plane, focal plane, aperture size]
All rays through a single point on the focal plane land at the same pixel – things at the focal plane are in focus
– The aperture controls how big the solid angle is that gets through

Simplified Lens Model (2)
Rays through a point off the focal plane land at multiple pixel locations – the circle of confusion
Or, a single pixel sees multiple points at a given depth
[Figure: image point, near clip plane, lens plane, focal plane, aperture size]
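In this simplified model the diameter of the circle of confusion follows from similar triangles on the cone between the scene point and the aperture. A sketch under the model’s assumptions (the aperture value is a diameter, depths are measured from the lens plane; the function name is illustrative):

```python
def circle_of_confusion(aperture, focal_distance, depth):
    # Cone with apex at the point (at `depth`) and base the aperture disk,
    # cut by the focal plane: diameter scales by |depth - focal_distance| / depth
    return aperture * abs(depth - focal_distance) / depth
```

Points exactly at the focal distance give a zero-diameter circle, i.e. they are in focus.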

Adjusting Rays for Depth of Field
The lens radius is the size of the little circle on the lens
– NOT the aperture size – it accounts for both the aperture and the lens position
Compute a sample within this circle
– The circle is centered at the same (x, y) as the image point
– It will be the “start” of our adjusted ray

Adjusting Ray (1)
[Figure: input ray; image point, near clip plane, lens plane, focal plane]
Start with the ray that has its origin at the near clip plane and passes through the “focal point”
– Regardless of where they hit the lens, all rays should hit the focal plane at the same location as this ray

Adjusting Ray (2)
Compute the hit point with the focal plane
Regardless of where we hit the lens, we should hit this same point
[Figure: image point, near clip plane, lens plane, focal plane, aperture size]

Adjusting Ray (3)
The new ray uses the lens point as its origin, and passes through the focal plane point
PBR is actually a little fuzzy on exactly where the ray starts
– It only makes sense if the lens plane is the same as the near clip plane
[Figure: image point, near clip plane, lens plane, focal plane, aperture size]
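The three adjustment steps can be sketched as follows, assuming camera space with the lens plane at z = 0 and the focal plane at z = focal_distance. The function and helper names are hypothetical, not PBRT’s interface:

```python
import math

def sample_disk(u1, u2):
    # Hypothetical helper: uniform sample on the unit disk (r = sqrt keeps it uniform)
    r = math.sqrt(u1)
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

def adjust_ray_for_dof(origin, direction, lens_radius, focal_distance, u1, u2):
    # 1. Find where the unperturbed ray hits the focal plane z = focal_distance
    t = (focal_distance - origin[2]) / direction[2]
    focus = tuple(o + t * d for o, d in zip(origin, direction))
    # 2. Sample a point on the lens disk, centered at the ray's (x, y)
    lx, ly = sample_disk(u1, u2)
    new_origin = (origin[0] + lens_radius * lx,
                  origin[1] + lens_radius * ly,
                  origin[2])
    # 3. New ray: from the lens point through the shared focal-plane point
    d = tuple(f - o for f, o in zip(focus, new_origin))
    length = math.sqrt(sum(c * c for c in d))
    return new_origin, tuple(c / length for c in d)
```

With lens_radius = 0 this degenerates to the original pinhole ray, which is a useful sanity check.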

Depth of Field Effect

Realistic Cameras
Kolb et al. describe a more realistic camera model
– Craig Kolb, Don Mitchell and Pat Hanrahan, “A Realistic Camera Model for Computer Graphics”, SIGGRAPH ’95
– Model all parts of the lens system, including the sizes and shapes of all sub-parts, the distances between surfaces, the indices of refraction, etc.

Using Kolb’s Model
Sample in the solid angle out of the pixel
Trace the ray through the lens system
– Fast – the sequence of intersections is known in advance
It is also important to know the “exit pupil”, the range of solid angle that passes through the lens

Thick Lens Approximation
[Figure: thin lens vs. thick lens, world and camera sides]
Can be done with a 4×4 transformation

Effect of Lens
The incorrect field of view of the standard model is apparent
Not in these images:
– Distortion around the edge of the image: “coma”, “pincushion”, “barrel” distortion
– “Vignetting”: darkening around the edge due to rays from the edge of the image hitting obstacles inside the lens
[Figure: Accurate | Thick approx | Standard graphics]

Sampling the Lens (PBR)
The camera is passed two canonical random variables for the lens sample
These must be converted into samples on a disk
First method:

But …
The regions on the right all have equal area – which is the requirement for a uniform distribution
But why is it still a problem?

Shirley’s Method
The regions are more similar in shape
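The two disk mappings can be compared directly in code. A sketch, assuming the polar mapping (r = √ξ₁, θ = 2πξ₂) is the “first method” above; the concentric mapping is Shirley and Chiu’s standard square-to-disk map, which keeps neighboring square regions compact on the disk:

```python
import math

def sample_disk_polar(u1, u2):
    # "First method": equal-area, but stretches square regions into thin arcs
    r = math.sqrt(u1)
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

def sample_disk_concentric(u1, u2):
    # Shirley's concentric mapping: map [0,1]^2 to [-1,1]^2, then pick the
    # octant by the larger coordinate so regions stay similar in shape
    sx, sy = 2.0 * u1 - 1.0, 2.0 * u2 - 1.0
    if sx == 0.0 and sy == 0.0:
        return 0.0, 0.0
    if abs(sx) > abs(sy):
        r = sx
        theta = (math.pi / 4.0) * (sy / sx)
    else:
        r = sy
        theta = (math.pi / 2.0) - (math.pi / 4.0) * (sx / sy)
    return r * math.cos(theta), r * math.sin(theta)
```

Both produce uniformly distributed points on the unit disk; the difference is how well stratified square samples stay stratified on the disk.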

Image Sampling and Reconstruction (PBR Chap. 7)
No time for a review of this topic
– See the CS559 notes
We have several samples of the image at points scattered over the image plane
– They are not uniformly arranged, which means most reconstruction theory, particularly frequency-domain methods, is useless
The problem is to combine them to determine each pixel’s final color
– We want to do this as each sample comes in, because we may have many, many more samples than pixels

Filtering
We will use weighted interpolation to reconstruct; f is the filter function
– The filter has a width and height – its area of support
– The sum is over samples falling inside the support
We can compute this as each sample comes in – each weighted sample is that sample’s contribution to the pixel
– The same sample may contribute to many pixels
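The weighted interpolation – accumulate a filter-weighted sum of sample radiances and divide by the sum of weights – can be computed incrementally, one sample at a time. A sketch with a box filter standing in for f (the class and function names are illustrative):

```python
class Pixel:
    """Running numerator and denominator of the weighted average."""
    def __init__(self):
        self.weighted_sum = 0.0
        self.weight_sum = 0.0

    def add_sample(self, weight, radiance):
        self.weighted_sum += weight * radiance
        self.weight_sum += weight

    def value(self):
        return self.weighted_sum / self.weight_sum if self.weight_sum > 0.0 else 0.0

def box_filter(dx, dy, radius):
    # Simplest filter: constant weight inside the support, zero outside
    return 1.0 if abs(dx) <= radius and abs(dy) <= radius else 0.0

def splat(pixels, width, height, sx, sy, radiance, filt, radius):
    # Add one image sample to every pixel whose filter support contains it
    for py in range(max(0, int(sy - radius)), min(height, int(sy + radius) + 1)):
        for px in range(max(0, int(sx - radius)), min(width, int(sx + radius) + 1)):
            w = filt(px + 0.5 - sx, py + 0.5 - sy, radius)
            if w > 0.0:
                pixels[py][px].add_sample(w, radiance)
```

Dividing by the accumulated weight sum is what makes the unnormalized filters mentioned below still produce correct pixel values.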

Filtering and Sampling
Filters and samples can interact in strange ways
[Figure: different sampler, same filter]

Different Filters
[Figure: Box | Gaussian | Mitchell]

Box and Triangle
Note: the code in PBR ignores normalization

Gaussian and Mitchell
The Gaussian tends to blur too much
Mitchell enhances edges a little, which is perceptually pleasing
– Parameterized filter with parameters B and C; keep B + 2C = 1 for good results
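A sketch of the 1D Mitchell–Netravali cubic on its support |x| ≤ 2; the common choice B = C = 1/3 satisfies the B + 2C = 1 constraint:

```python
def mitchell_1d(x, B=1.0 / 3.0, C=1.0 / 3.0):
    # Piecewise cubic: one polynomial on [0, 1), another on [1, 2), zero beyond
    x = abs(x)
    if x < 1.0:
        return ((12 - 9 * B - 6 * C) * x ** 3
                + (-18 + 12 * B + 6 * C) * x ** 2
                + (6 - 2 * B)) / 6.0
    if x < 2.0:
        return ((-B - 6 * C) * x ** 3
                + (6 * B + 30 * C) * x ** 2
                + (-12 * B - 48 * C) * x
                + (8 * B + 24 * C)) / 6.0
    return 0.0
```

The 2D filter is the product mitchell_1d(dx) * mitchell_1d(dy); the small negative lobes near |x| = 1.5 are what give the mild edge enhancement.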

Windowed Sinc
We would like a sinc function, but without infinite support
The solution is to multiply it by another function that has finite support
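One common windowing function is the Lanczos window: the sinc multiplied by a copy of itself stretched to the support width τ, and truncated beyond it. A sketch (τ = 3 is just an example value):

```python
import math

def sinc(x):
    # Normalized sinc: sin(pi x) / (pi x), with the removable singularity at 0
    return 1.0 if x == 0.0 else math.sin(math.pi * x) / (math.pi * x)

def lanczos_sinc(x, tau=3.0):
    # sinc windowed by a sinc stretched to width tau; zero outside |x| <= tau
    x = abs(x)
    if x > tau:
        return 0.0
    return sinc(x) * sinc(x / tau)
```

The window tapers the sinc smoothly to zero at |x| = τ, which reduces the ringing that hard truncation would cause.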

Windowed Sinc

More Filters
There is a wealth of research on filter design
In images with noisy samples, as we will frequently see, a common idea is to limit the effect of an outlier
– Or view it as the one sample that found the really bright spot, and smooth it over many samples

Next Time
– Reflectance Functions