Week 9 - Monday
What did we talk about last time?
- BRDFs
- Texture mapping and bump mapping in shaders
Fresnel reflectance is an ideal mathematical description of how perfectly smooth materials reflect light
- The angle of reflection is the same as the angle of incidence, so the reflection vector can be computed as $\mathbf{r}_i = 2(\mathbf{n} \cdot \mathbf{l})\mathbf{n} - \mathbf{l}$
- The transmitted (visible) radiance $L_t$ is based on the Fresnel reflectance and the angle of refraction of light into the material: $L_t = (1 - R_F(\theta_i))\,\frac{\sin^2\theta_i}{\sin^2\theta_t}\,L_i$
The angle of refraction into the material is related to the angle of incidence by Snell's law, through the refractive indices of the material below the interface ($n_2$) and above it ($n_1$): $\sin\theta_t = \frac{n_1}{n_2}\sin\theta_i$
We can combine this identity with the previous equation: $L_t = (1 - R_F(\theta_i))\,\frac{n_2^2}{n_1^2}\,L_i$
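To make the math above concrete, here is a minimal C sketch of the exact (unpolarized) Fresnel reflectance for a smooth dielectric interface, using Snell's law to find the refraction angle; the function name and parameter order are my own, not the book's.

```c
#include <math.h>

/* Exact unpolarized Fresnel reflectance for a smooth dielectric
 * interface. n1 is the refractive index above the interface,
 * n2 the index below it, and theta_i the angle of incidence in
 * radians. Returns the fraction of incident light reflected. */
double fresnel_reflectance(double n1, double n2, double theta_i)
{
    double sin_t = (n1 / n2) * sin(theta_i);   /* Snell's law */
    if (sin_t >= 1.0)
        return 1.0;                            /* total internal reflection */

    double cos_i = cos(theta_i);
    double cos_t = sqrt(1.0 - sin_t * sin_t);

    /* Amplitude reflection coefficients for the two polarizations */
    double r_s = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t);
    double r_p = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t);

    /* Unpolarized light: average the two reflected powers */
    return 0.5 * (r_s * r_s + r_p * r_p);
}
```

At normal incidence this reduces to $R_F(0°) = \left(\frac{n_1 - n_2}{n_1 + n_2}\right)^2$, the value tabulated for Schlick's approximation below.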
Reflectance is obviously dependent on angle
- Perpendicular incidence (0°) gives essentially the specular color of the material
- Higher angles become more reflective
- The function $R_F(\theta_i)$ is also dependent on the material (and the light color)
Because $R_F(\theta_i)$ is non-linear and the full Fresnel equations are expensive, Schlick gives an approximation that works for most substances: $R_F(\theta_i) \approx R_F(0°) + (1 - R_F(0°))(1 - \cos\theta_i)^5$
We can use a table of $R_F(0°)$ values for common materials
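A minimal C sketch of Schlick's approximation (the function names, and the helper computing $R_F(0°)$ from refractive indices instead of a table, are my own):

```c
/* R_F(0 degrees) for a dielectric interface, usable in place of a
 * table lookup: ((n1 - n2) / (n1 + n2))^2. */
double r0_from_indices(double n1, double n2)
{
    double r = (n1 - n2) / (n1 + n2);
    return r * r;
}

/* Schlick's approximation: R_F(theta) ~= r0 + (1 - r0)(1 - cos theta)^5.
 * cos_i is the cosine of the angle of incidence (n . l), clamped to
 * [0, 1] by the caller. */
double schlick(double r0, double cos_i)
{
    double m  = 1.0 - cos_i;
    double m2 = m * m;
    return r0 + (1.0 - r0) * m2 * m2 * m;   /* (1 - cos_i)^5 */
}
```

For example, water against air ($n \approx 1.33$) gives $R_F(0°) \approx 0.02$, matching the standard tables.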
External reflection (light arriving from the optically less dense medium) needs to be modeled more often than internal reflection
Modeling internal reflection is the same, except that arriving from the medium with higher optical density can cause total internal reflection
Diffuse light is usually not as complex as specular light
We can measure a value ρ that gives the ratio between light escaping a surface and light entering that surface
- ρ is called the scattering albedo
Because of conservation of energy, the more light that is reflected through Fresnel reflection, the less there is left to be reflected diffusely
Thus, a simple approximation for the diffuse reflectance scales the albedo by whatever the specular term did not reflect: $(1 - R_F(\theta_i))\,\rho$
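A minimal sketch of that energy split (the struct and function names are my own, and a single scalar channel stands in for RGB):

```c
/* Energy-conserving split between Fresnel specular reflection and
 * diffuse scattering. albedo is the scattering albedo rho; r_f is
 * the Fresnel reflectance R_F(theta_i) for the current angle. */
typedef struct { double specular; double diffuse; } ShadingSplit;

ShadingSplit split_energy(double albedo, double r_f)
{
    ShadingSplit s;
    s.specular = r_f;                    /* reflected at the surface */
    s.diffuse  = (1.0 - r_f) * albedo;   /* what remains scatters diffusely */
    return s;
}
```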
The cause of many lighting effects is microgeometry
The smoother the surface, the tighter (and brighter) the reflections are
Glancing angles can minimize the impact of surface roughness, making even rough surfaces reflective at very high angles
Most surfaces are isotropic (symmetrical) in the way they are rough
Anisotropic surfaces like brushed metal have directional blurring
The book gives a number of BRDF equations
It is also possible to sample materials (from every angle, at every color of light) to measure a BRDF of your own
Once you've got such a model, how do you implement it?
The shader will use the following equation: $L_o(\mathbf{v}) = \sum_{k=1}^{n} f(\mathbf{l}_k, \mathbf{v}) \otimes E_{L_k} \cos\theta_{i_k}$ (with the cosine clamped to zero)
- The cosine term is found with the dot product $\mathbf{n} \cdot \mathbf{l}_k$
- Most BRDFs contain a 1/π term
- Many systems pre-divide $E_L$ by π, so make sure you don't double divide (or double multiply)
- If some value is computed repeatedly, consider putting it in a texture for lookup
- Mipmapping may not work for non-linear BRDFs
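As a hedged illustration of that sum (plain C standing in for a shader, one color channel, and a Lambertian placeholder in place of a measured BRDF; all names here are my own):

```c
typedef struct { double x, y, z; } Vec3;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Outgoing radiance toward the viewer: sum over n point lights of
 * BRDF * irradiance * clamped cosine. irradiance[k] is E_L for
 * light k, and l[k] is the unit vector toward light k. The
 * Lambertian BRDF is albedo / pi; if your engine pre-divides E_L
 * by pi, drop the division here so you don't divide twice. */
double shade(Vec3 normal, const Vec3 *l, const double *irradiance,
             int n, double albedo)
{
    const double PI = 3.14159265358979323846;
    double lo = 0.0;
    for (int k = 0; k < n; k++) {
        double cos_i = dot(normal, l[k]);
        if (cos_i > 0.0)                     /* clamp the cosine to zero */
            lo += (albedo / PI) * irradiance[k] * cos_i;
    }
    return lo;
}
```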
It may be expensive to compute the shading based on all the light sources
Also, many APIs (and various graphics cards) limit the number of light sources
For performance reasons, some lights must be averaged together into a single light
Shading is usually done at the same time as z-buffer testing
It's possible to do all the z-buffer testing first (a depth prepass) and then go back and shade only those fragments that contribute to the final scene
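A hedged sketch of that two-pass idea using standard OpenGL state (the two draw_scene_* helpers are hypothetical placeholders for your own scene-drawing code):

```c
#include <GL/gl.h>

/* Hypothetical scene-drawing helpers, assumed to exist elsewhere. */
void draw_scene_depth_only(void);  /* cheap or empty fragment shader */
void draw_scene_shaded(void);      /* full, expensive BRDF shading  */

void render_with_depth_prepass(void)
{
    /* Pass 1: fill the depth buffer only, with color writes off and
     * no expensive shading. */
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    draw_scene_depth_only();

    /* Pass 2: redraw with full shading. GL_LEQUAL plus a read-only
     * depth buffer means only fragments that survived pass 1 (the
     * visible ones) run the expensive shader. */
    glDepthFunc(GL_LEQUAL);
    glDepthMask(GL_FALSE);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    draw_scene_shaded();
}
```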
A great deal of graphics research deals with rendering real scenes
Don't cameras do that? Sure, but these graphics guys couldn't publish papers if the stuff weren't hard for some reason:
- Reconstructing novel viewpoints
- Walkthroughs with user-controlled paths
- Introducing synthetic objects into real scenes
- Re-lighting real scenes with new light sources
I would be remiss if I didn't mention these topics, even though they usually have nothing to do with video games and often cannot be rendered in real time
Although the research is old now, Daniel Aliaga et al. produced an impressive system for recreating real scenes in real time, in which a user can control the path he or she takes
- A robot records thousands and thousands of omnidirectional images, along with its location when each one was taken
- The images are then merged together to create a novel view for the current location and orientation
- Rendering the images in real time isn't hard
- Knowing the robot's position for all the images is surprisingly difficult
- Storing and loading the next images that will be needed for reconstruction is a huge caching and compression problem
- Getting the robot to walk around and scan a scene automatically ended up being too hard
Some of these ideas were used for Google Street View, which neither runs in real time nor allows arbitrary locations
Synthetic objects can be rendered using a BRDF based on measurements of real-world materials
Alternatively, we could sample a real-world object from many different directions and gather enough information to re-light it
You can also capture lighting from the real world using a mirrored ball
Then you can re-light:
- A real image with a different set of real lights
- Synthetic objects with realistic real light
Next time:
- Area lighting
- Environment mapping
- Work on Assignment 4, due this Friday, March 20
- Start working on Project 3, due April 2
- Keep reading Chapter 8