Week 9 - Wednesday

▪ What did we talk about last time?
  ▪ Fresnel reflection
  ▪ Snell's Law
  ▪ Microgeometry effects
  ▪ Implementing BRDFs
  ▪ Image-based rendering

▪ So far, we have only been talking about lighting as coming from a particular source
▪ Lighting like that happens mostly in space
▪ On Earth, area lighting has a huge impact
  ▪ Sky light comes from the sun's light scattering through the atmosphere
  ▪ Indoor lighting is usually indirect (because a bare bulb hurts the eyes)

▪ Area lights are complex
  ▪ The book describes the 3D integration over a hemisphere of angles needed to properly quantify radiance
▪ No lights in reality are point lights
  ▪ All lights have an area that has some effect

▪ The simplest model of indirect light is ambient light
▪ This is light that has a constant value
  ▪ It doesn't change with direction
  ▪ It doesn't change with distance
▪ Without modeling occlusion (which usually ends up looking like shadows), ambient lighting can look very bad
▪ We can add ambient lighting to our existing BRDF formulation with a constant term, as sketched below
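One common way to write that constant term on top of the punctual-light sum used earlier in the course; the symbols L_A (constant ambient radiance) and c_diff (diffuse surface color) are my notation, not necessarily the slide's, and the cosines are clamped to zero as usual:

```latex
% Ambient light added as a constant term to the reflectance sum
% (L_A: assumed constant ambient radiance, c_diff: diffuse color,
%  f: the BRDF, E_{L_k}: irradiance arriving from light k)
L_o(\mathbf{v}) = \mathbf{c}_{\text{diff}} \otimes \mathbf{L}_A
  + \sum_{k} f(\mathbf{l}_k, \mathbf{v}) \otimes \mathbf{E}_{L_k} \cos\theta_{i_k}
```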

▪ A more complicated tool for area lighting is environment mapping (EM)
▪ The key assumption of EM is that only direction matters
  ▪ Light sources must be far away
  ▪ The object does not reflect itself
▪ In EM, we make a 2D table of the incoming radiance based on direction
▪ Because the table is 2D, we can store it in an image

▪ The radiance reflected by a mirror is based on the reflected view vector r = 2(n·v)n − v
▪ The reflectance equation is L_o(v) = R_F(θ_o) L_i(r), where R_F is the Fresnel reflectance and L_i is the incoming radiance from direction r
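A small sketch of the reflected view vector computation, assuming n is the unit surface normal and v is the unit vector from the surface point toward the eye (the function name is illustrative):

```python
def reflect_view(n, v):
    """Reflected view vector r = 2(n.v)n - v, with n and v as unit-length 3-tuples."""
    ndotv = n[0]*v[0] + n[1]*v[1] + n[2]*v[2]
    return tuple(2.0 * ndotv * n[i] - v[i] for i in range(3))

# Example: looking straight down the normal gives r == v,
# i.e. the mirror reflects the viewer back at themselves.
print(reflect_view((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # (0.0, 0.0, 1.0)
```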

▪ Steps (sketched in code after this list):
  1. Generate or load a 2D image representing the environment
  2. For each pixel that contains a reflective object, compute the normal at the corresponding location on the surface
  3. Compute the reflected view vector from the view vector and the normal
  4. Use the reflected view vector to compute an index into the environment map
  5. Use that texel as the incoming radiance
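A per-pixel sketch of steps 3 through 5, assuming a reflect_view function like the one above, a pluggable direction_to_uv mapping (one concrete choice, the latitude/longitude mapping, appears later in these notes), and an environment image stored as a plain list of rows; all names are illustrative:

```python
def shade_mirror_pixel(n, v, direction_to_uv, env_image):
    """Look up incoming radiance for a mirror-like surface from an environment map.

    n, v            -- unit normal and unit view vector at the shaded point
    direction_to_uv -- maps a unit direction to (u, v) texture coordinates in [0, 1]
    env_image       -- 2D list of radiance texels, env_image[row][col]
    """
    r = reflect_view(n, v)                 # step 3: reflected view vector
    u, t = direction_to_uv(r)              # step 4: index into the environment map
    height = len(env_image)
    width = len(env_image[0])
    col = min(int(u * width), width - 1)   # nearest-texel lookup
    row = min(int(t * height), height - 1)
    return env_image[row][col]             # step 5: texel used as incoming radiance
```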

▪ It doesn't work well with flat surfaces
  ▪ The reflected direction doesn't vary much, mapping a lot of the surface to a narrow part of the environment map
  ▪ Normal mapping combined with EM helps a lot
▪ The range of values in an environment map may be large (to cover many light intensities)
  ▪ As a consequence, the space requirements may be higher than for normal textures

▪ Blinn and Newell used a longitude/latitude system with a projection like Mercator
  ▪ The longitude φ goes from 0 to 2π
  ▪ The latitude θ goes from 0 to π
▪ We can compute these from the normalized reflected view vector:
  ▪ θ = arccos(−r_z)
  ▪ φ = atan2(r_y, r_x)
▪ Problems
  ▪ There are too many texels near the poles
  ▪ The seam between the left and right halves cannot easily be interpolated across
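A concrete sketch of that latitude/longitude lookup, usable as the direction_to_uv argument in the earlier pixel sketch; shifting atan2's (−π, π] output into [0, 1] is my assumption about where the seam sits, not something stated on the slide:

```python
import math

def latlong_direction_to_uv(r):
    """Map a normalized direction r to (u, v) coordinates for a longitude/latitude map."""
    rx, ry, rz = r
    theta = math.acos(max(-1.0, min(1.0, -rz)))  # latitude in [0, pi]
    phi = math.atan2(ry, rx)                     # longitude in (-pi, pi]
    u = (phi + math.pi) / (2.0 * math.pi)        # shift longitude into [0, 1]
    v = theta / math.pi                          # scale latitude into [0, 1]
    return u, v
```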

▪ Imagine the environment is viewed through a perfectly reflective sphere
▪ The resulting sphere map (also called a light probe) is what you'd see if you photographed such a sphere (like a Christmas ornament)
▪ The sphere map has a basis giving its own coordinate system (h, u, f)
  ▪ The image was generated by looking along the f axis, with h to the right and u up (all normalized)

▪ To use the sphere map, convert the surface normal n and the view vector v to sphere space by multiplying by the basis matrix sketched below
▪ Sphere mapping only shows the environment on the front of the sphere
▪ It is view dependent
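One way to write that change of basis, assuming h, u, and f are unit row vectors expressed in the same space as n and v; this is a sketch of the standard basis-matrix form, not necessarily the slide's exact matrix:

```latex
% Change of basis into sphere-map space; rows are the basis vectors h, u, f
M = \begin{pmatrix}
      h_x & h_y & h_z \\
      u_x & u_y & u_z \\
      f_x & f_y & f_z
    \end{pmatrix},
\qquad
\mathbf{n}' = M\mathbf{n}, \quad \mathbf{v}' = M\mathbf{v}
```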

▪ Cubic environment mapping is the most popular current method
  ▪ Fast
  ▪ Flexible
▪ Take a camera and render the scene facing in all six directions
  ▪ Generate six textures
▪ For each point on the surface of the object you're rendering, map to the appropriate texel in the cube
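A sketch of that direction-to-cube lookup, returning a face name and (u, v) on that face; the face names and (u, v) orientations here are illustrative assumptions (real graphics APIs fix their own conventions):

```python
def cube_direction_to_face_uv(r):
    """Map a nonzero direction r to (face, u, v) for a cube map.

    The face is chosen by the axis with the largest absolute component;
    the other two components are projected onto that face.
    """
    rx, ry, rz = r
    ax, ay, az = abs(rx), abs(ry), abs(rz)
    if ax >= ay and ax >= az:
        face, ma, sc, tc = ("+x" if rx > 0 else "-x"), ax, (-rz if rx > 0 else rz), -ry
    elif ay >= az:
        face, ma, sc, tc = ("+y" if ry > 0 else "-y"), ay, rx, (rz if ry > 0 else -rz)
    else:
        face, ma, sc, tc = ("+z" if rz > 0 else "-z"), az, (rx if rz > 0 else -rx), -ry
    u = 0.5 * (sc / ma + 1.0)   # remap the projected coordinate from [-1, 1] to [0, 1]
    v = 0.5 * (tc / ma + 1.0)
    return face, u, v

# Example: a direction pointing mostly along +x lands on the +x face.
print(cube_direction_to_face_uv((1.0, 0.1, 0.0)))
```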

 Pros  Fast, supported by hardware  View independent  Shader Model 4.0 can generate a cube map in a single pass with the geometry shader  Cons  It has better sampling uniformity than sphere maps, but not perfect (isocubes improve this)  Still requires high dynamic range textures (lots of memory)  Still only works for distant objects

▪ We have talked about using environment mapping for mirror-like surfaces
▪ The same idea can be applied to glossy (but not perfect) reflections
▪ By blurring the environment map texture, the surface will appear rougher
▪ For surfaces with varying roughness, we can simply access different mipmap levels of the cube map texture
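One simple way to choose the level, assuming mip 0 is the sharpest level and roughness lies in [0, 1]; the linear mapping is an assumption for illustration, not a formula from the book:

```python
def roughness_to_mip_level(roughness, num_mip_levels):
    """Pick a mip level of a pre-blurred environment map from a roughness in [0, 1]."""
    roughness = max(0.0, min(1.0, roughness))
    return roughness * (num_mip_levels - 1)  # fractional level; trilinear filtering blends

# Example: a 9-level cube map, moderately rough surface
print(roughness_to_mip_level(0.25, 9))  # 2.0
```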

▪ Environment mapping can be used for diffuse colors as well
▪ Such maps are called irradiance environment maps
▪ Because the viewing angle is not important for diffuse colors, only the surface normal is used to decide what part of the irradiance map is used
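A minimal sketch of how this differs from the mirror case: the lookup direction is the surface normal itself rather than the reflected view vector, reusing the hypothetical direction_to_uv helper from the earlier sketches:

```python
def shade_diffuse_pixel(n, direction_to_uv, irradiance_image):
    """Diffuse lookup into an irradiance environment map: index by the normal n, not by r."""
    u, t = direction_to_uv(n)
    height, width = len(irradiance_image), len(irradiance_image[0])
    col = min(int(u * width), width - 1)
    row = min(int(t * height), height - 1)
    return irradiance_image[row][col]
```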

▪ Global illumination basics

▪ Keep reading Chapter 8
▪ Start reading Chapter 9