Importance Resampling for Global Illumination Justin F. Talbot Master’s Thesis Defense Brigham Young University Provo, UT.

Global Illumination Goal: Create realistic images from virtual scenes.

Global Illumination Goal: Create realistic images from virtual scenes. Approach: Treat each pixel as an integral.

Monte Carlo Integration
Approximation:
- Generate random samples {y_1, …, y_N} from density q
- Evaluate f at each sample
- Compute the estimate (1/N) Σ_i f(y_i) / q(y_i)
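The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not code from the thesis; `sample_q` and `q_pdf` are hypothetical callables standing in for the density q.

```python
import random

def mc_estimate(f, sample_q, q_pdf, n):
    """Importance-sampled Monte Carlo estimate of the integral of f.

    sample_q draws a point from density q; q_pdf evaluates q at a point.
    """
    total = 0.0
    for _ in range(n):
        y = sample_q()          # 1) generate a sample from q
        total += f(y) / q_pdf(y)  # 2) evaluate f (divided by the density)
    return total / n            # 3) average to form the estimate

# Example: integrate f(x) = x^2 on [0, 1] with uniform q (true value 1/3).
random.seed(0)
est = mc_estimate(lambda x: x * x, random.random, lambda x: 1.0, 100_000)
```

With a uniform q the weights f(y)/q(y) reduce to plain averaging; a q closer in shape to f would reduce the variance, which is the point of the next slide.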

Monte Carlo Integration
Importance Sampling:
- Choose q to be nearly proportional to f
- Restrictions:
  - q must be normalized (integrate to 1)
  - q must be easy to sample

Thesis Question
Can we generalize importance sampling to allow
- unnormalized q?
- difficult-to-sample q?
Motivation: if so, then we can pick a q that is more nearly proportional to f, giving more variance reduction.

Thesis Contributions
- Resampled Importance Sampling (RIS)
- Proofs:
  - RIS is unbiased
  - RIS variance
  - Efficiency-optimal parameters
  - Robust approximate parameters
- RIS combined with:
  - Stratified sampling
  - Multiple importance sampling
- Application to the direct lighting problem

Resampled Importance Sampling
A generalization of importance sampling that permits
- unnormalized q and
- difficult-to-sample q.
Based on a sampling technique called importance resampling.

Importance Resampling
Goal: generate samples from q
Problems:
- q may not be normalized.
- q can't be sampled using simpler techniques.
Solution: use two-stage sampling (resampling).

Importance Resampling

1) Generate proposals from density p.
p should be easy to sample.
Proposals = {x_1, …, x_M}

Importance Resampling
1) Generate proposals from density p.
2) Compute weights w(x_i) = q(x_i) / p(x_i).
The weighted proposals form a discrete approximation of q.

Importance Resampling
1) Generate proposals from density p.
2) Compute weights.
3) Draw samples from the proposals with probability proportional to weight.
The samples are approximately distributed according to q!
Samples = {y_1, …, y_N}

Importance Resampling
Provides a way to generate samples from a "difficult" distribution.
Limitations:
- The distribution is only an approximation for any finite number of proposals, M.
- Samples may be repeated if drawn from the same set of proposals.

Resampled Importance Sampling
How do we combine
- proposals {x_1, …, x_M},
- weights {w(x_1), …, w(x_M)}, and
- samples {y_1, …, y_N}
to create an unbiased estimate of the integral of f?

Resampled Importance Sampling
How do we combine the proposals, weights, and samples to create an unbiased estimate of the integral of f?
Same as the standard Monte Carlo integration estimate (except q is not normalized).

Resampled Importance Sampling
How do we combine the proposals, weights, and samples to create an unbiased estimate of the integral of f?
An additional term corrects for
- the importance resampling approximation and
- the unnormalized q.
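A sketch of the full RIS estimate, combining the previous slides. The correction factor (1/M) Σ_i w(x_i), which multiplies the ordinary average of f(y)/q(y), is my reading of the "additional term" the slide mentions; treat the exact form as an assumption and see the thesis for the proof of unbiasedness.

```python
import bisect
import random

def ris_estimate(f, sample_p, p_pdf, q_unnorm, m, n):
    """Resampled importance sampling estimate of the integral of f (sketch)."""
    proposals = [sample_p() for _ in range(m)]
    weights = [q_unnorm(x) / p_pdf(x) for x in proposals]
    total_w = sum(weights)
    cum, acc = [], 0.0
    for w in weights:
        acc += w
        cum.append(acc)
    est = 0.0
    for _ in range(n):
        u = random.random() * total_w
        y = proposals[bisect.bisect_left(cum, u)]
        est += f(y) / q_unnorm(y)      # standard MC term, unnormalized q
    return (est / n) * (total_w / m)   # correction: (1/M) Σ w(x_i)

# Hypothetical example: ∫_0^1 x^2 dx with q(x) ∝ x, uniform proposals.
random.seed(2)
val = ris_estimate(lambda x: x * x, random.random, lambda x: 1.0,
                   lambda x: x, m=20_000, n=2_000)
```

Intuitively, (1/M) Σ w(x_i) is a Monte Carlo estimate of the normalization constant of q, which is exactly what the unnormalized f/q term is missing.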

Thesis Question
Can we generalize importance sampling to allow
- unnormalized q?
- difficult-to-sample q?
Motivation: if so, then we can pick a q that is more nearly proportional to f, giving more variance reduction.
YES!

Resampled Importance Sampling
The variance of RIS (derived in the thesis) shows that, to give more variance reduction than standard importance sampling:
- proposals must be computationally cheaper than samples, AND
- q must be more nearly proportional to f than p is (i.e., a better importance sampling density).

Resampled Importance Sampling
We also have to choose M (the number of proposals) and N (the number of samples).
Under a fixed time constraint, we have to trade them off.

Example - Choosing M and N
N=100, M=100 (better shadows, color) ↔ N=1, M=450 (better direct lighting)

Resampled Importance Sampling
We could directly minimize the variance equation, but that is too hard, so we approximate.

Resampled Importance Sampling
M* = 0.5 · T_total / T_proposal
N* = 0.5 · T_total / T_sample
- Simple: gives equal time to proposals and samples.
- Robust: results in no more than twice the variance of the true optimal values.
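The equal-time heuristic is a one-liner. The timings below are hypothetical, chosen so the output matches the M* = 218, N* = 64.8 that appear in the results slides; in practice T_proposal and T_sample would be measured.

```python
def split_budget(t_total, t_proposal, t_sample):
    """Equal-time heuristic: spend half the time budget on proposals
    and half on samples."""
    m_star = 0.5 * t_total / t_proposal
    n_star = 0.5 * t_total / t_sample
    return m_star, n_star

# Hypothetical per-item costs (seconds) for a 1-second budget.
m_star, n_star = split_budget(t_total=1.0,
                              t_proposal=0.5 / 218.0,
                              t_sample=0.5 / 64.8)
```

In an implementation M and N would then be rounded to integers; the robustness result quoted above means this crude 50/50 split costs at most a factor of two in variance versus the true optimum.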

Results - Direct Lighting
RIS using estimated optimal values, M* = 218 and N* = 64.8: variance reduction at equal time.

Results – Direct Lighting N=100, M=100 N=64.8, M=218N=1, M=450

Results II 34% variance reduction

Results III 33% variance reduction

Stratifying RIS
Stratified sampling

Stratifying RIS
Stratified sampling:
- Divide the domain into strata
- Take a single sample in each stratum
- Avoids clustering of samples

Stratifying RIS
In RIS:
- Stratify proposals
  - Avoids clustering
  - Apply standard techniques

Stratifying Proposals
RIS without stratification vs. stratified proposals only: 34% variance reduction

Stratifying RIS
In RIS:
- Stratify proposals
  - Avoids clustering
  - Apply standard techniques
- Stratify samples
  - Avoids clustering
  - Avoids duplicates

Stratifying RIS
How do we stratify samples?
- Equal-proposals
- Equal-weights

Stratifying Samples
Proposals only: 34% | Equal-proposals: 37% | Equal-weights: 42% variance reduction
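One plausible reading of the equal-weights strategy (an assumption on my part; the thesis gives the exact scheme) is systematic resampling: split the total weight into N equal intervals and draw one jittered sample per interval, which avoids both clustering and duplicate samples better than N independent draws.

```python
import bisect
import random

def stratified_resample(proposals, weights, n):
    """Equal-weight stratified resampling (sketch): one jittered draw
    inside each of N equal slices of the total weight."""
    cum, acc = [], 0.0
    for w in weights:
        acc += w
        cum.append(acc)
    total = cum[-1]
    samples = []
    for j in range(n):
        # Jittered point inside the j-th equal-weight stratum.
        u = (j + random.random()) / n * total
        samples.append(proposals[bisect.bisect_left(cum, u)])
    return samples

# Hypothetical example: four proposals, one carrying most of the weight.
random.seed(3)
picks = stratified_resample(["a", "b", "c", "d"], [1.0, 1.0, 6.0, 2.0], n=10)
```

With these weights the counts of each proposal are essentially deterministic ("c" is chosen in six of the ten strata), whereas independent roulette-wheel draws would fluctuate around those counts.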

Multiple Importance Sampling
We can often generate proposals from multiple densities. How can we combine them?
Start at surface | Start at light

Multiple Importance Sampling
We can often generate proposals from multiple densities. How can we combine them? With multiple importance sampling.
Start at surface | Start at light

Multiple Importance Sampling
1) Generate proposals from densities p_1, …, p_K.
Each p_k should be easy to sample, e.g., using CDF inversion or rejection sampling.
Proposals = {x_1, …, x_M}

Multiple Importance Sampling
1) Generate proposals from densities p_1, …, p_K.
2) Compute weights.

Multiple Importance Sampling
1) Generate proposals from densities p_1, …, p_K.
2) Compute weights.
3) Draw samples from the proposals with probability proportional to weight.
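The three steps can be sketched as below. The weight formula is an assumption: I treat the K techniques as an equal mixture and weight each proposal by q over the mixture density, in the spirit of the balance heuristic; the slide's exact weighting is in the thesis. `samplers` and `pdfs` are hypothetical 1-D stand-ins for the "start at surface" / "start at light" strategies.

```python
import bisect
import random

def mis_resample(samplers, pdfs, q_unnorm, m, n):
    """Resampling with proposals drawn from K densities p_1,…,p_K,
    combined via balance-heuristic-style mixture weights (sketch)."""
    k = len(samplers)
    # 1) Round-robin: equal numbers of proposals from each technique.
    proposals = [samplers[i % k]() for i in range(m)]
    # 2) Weight by q over the equal-mixture density (1/K) Σ p_k.
    def mix_pdf(x):
        return sum(p(x) for p in pdfs) / k
    weights = [q_unnorm(x) / mix_pdf(x) for x in proposals]
    # 3) Resample proportionally to weight.
    cum, acc = [], 0.0
    for w in weights:
        acc += w
        cum.append(acc)
    samples = []
    for _ in range(n):
        u = random.random() * cum[-1]
        samples.append(proposals[bisect.bisect_left(cum, u)])
    return samples

# Two toy techniques on [0, 1]: uniform, and pdf 2x (sampled as U^0.5).
random.seed(4)
ys = mis_resample([random.random, lambda: random.random() ** 0.5],
                  [lambda x: 1.0, lambda x: 2.0 * x],
                  lambda x: x, m=5_000, n=500)
```

Because the mixture density is positive wherever any technique can sample, no single bad p_k can make the weights blow up, which is the usual argument for this style of combination.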

Multiple Importance Sampling
Start at surface | Start at light

Multiple Importance Sampling MIS without RISMIS with RIS 30% variance reduction

Thesis Contributions
- Resampled Importance Sampling (RIS)
- Proofs:
  - RIS is unbiased
  - RIS variance
  - Efficiency-optimal parameters
  - Robust approximate parameters
- RIS combined with:
  - Stratified sampling
  - Multiple importance sampling
- Application to the direct lighting problem

Concluding Thoughts
RIS is better than IS when:
- q is a better importance sampling density than p, AND
- computing proposals is much cheaper than computing samples.
Intuition: RIS takes advantage of differences in variance or computational expense.

Concluding Thoughts
Future Work:
- Application to other problems in global illumination
- Application to other fields
- Development of better choices of q and p
- Examining the trade-off between computational expense and importance sampling quality

Questions