CS-378: Game Technology
Lecture #7: More Mapping
Prof. Okan Arikan, University of Texas, Austin
Thanks to James O’Brien, Steve Chenney, Zoran Popovic, Jessica Hodgins
V2005-08-1.1

Today
- More on mapping: environment mapping, light mapping, bump mapping
- Buffers

Environment Mapping
- Environment mapping produces reflections on shiny objects
- Texture is transferred onto the object from the environment map in the direction of the reflected ray
- The lookup uses a ray with the same direction, but starting at the object center
- The map contains a view of the world as seen from the center of the object
(Figure: viewer, object, reflected ray, and lookup ray into the environment map)
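The reflected direction itself is simple to compute. A minimal standalone C++ sketch (the Vec3 type and helpers are mine, not the slides'):

```cpp
// Reflect the view ray about the surface normal; environment mapping
// then indexes the map with this direction, treated as if the ray
// started at the object center.
struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// I: normalized direction from the viewer to the surface point,
// N: normalized surface normal.  Returns R = I - 2 (N . I) N.
Vec3 reflect(Vec3 I, Vec3 N) {
    float d = 2.0f * dot(N, I);
    return { I.x - d * N.x, I.y - d * N.y, I.z - d * N.z };
}
```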

Examples
(Images: Need For Speed Underground, Far Cry; environment maps from www.debevec.org)

Lat/Long Mapping
- The original algorithm (1976) placed the map on a sphere centered on the object
- The mapping functions assume that the (s,t) texture coordinates equate to latitude and longitude on the sphere (see the sketch below)
- What is bad about this method?
  - Sampling
  - Map generation
  - Complex texture coordinate computations
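The slide's formula did not survive the transcript; one common latitude/longitude convention (an assumption, since axis and sign conventions vary) maps the normalized reflection vector r = (r_x, r_y, r_z) as:

```latex
% Longitude and latitude of the reflection direction r
\phi   = \operatorname{atan2}(r_x, r_z) \in (-\pi, \pi], \qquad
\theta = \arcsin(r_y) \in [-\tfrac{\pi}{2}, \tfrac{\pi}{2}]

% Texture coordinates
s = \frac{\phi + \pi}{2\pi}, \qquad t = \frac{\theta + \pi/2}{\pi}
```

The inverse trigonometric functions here are exactly the "complex texture coordinate computations" the slide complains about.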

Cube Mapping
- Put the object at the center of a cube
- Represent the environment on the six cube faces
- What assumptions does this make?
- Hardware supported
(Figure: view ray and reflection ray hitting the cube faces)
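A minimal sketch of using the hardware support in fixed-function OpenGL, assuming a texture object cubeTex whose six faces have already been loaded:

```cpp
// Cube-map environment mapping via fixed-function texture-coordinate
// generation (OpenGL 1.3+).
glEnable(GL_TEXTURE_CUBE_MAP);
glBindTexture(GL_TEXTURE_CUBE_MAP, cubeTex);

// Let the hardware derive the lookup direction by reflecting the view
// ray about the interpolated surface normal.
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_REFLECTION_MAP);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glEnable(GL_TEXTURE_GEN_R);
```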

Sphere Mapping
- Again the map lives on a sphere, but now the coordinate mapping is simplified
- To generate the map: take a map point (s,t), cast a ray onto a sphere in the -Z direction, and record what is reflected
- Equivalent to photographing a reflective sphere with an orthographic camera (long lens, big distance)
- Again, this makes the method suitable for film special effects

A Sphere Map

Indexing Sphere Maps
- Given the reflection vector, compute (s,t) directly (see the formula below)
- Implemented in hardware
- Problems:
  - Highly non-uniform sampling
  - Highly non-linear mapping
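The formula on the slide is not preserved; the standard OpenGL sphere-map indexing (stated here as an assumption about what the slide showed) is, for reflection vector r:

```latex
m = 2\sqrt{r_x^2 + r_y^2 + (r_z + 1)^2}, \qquad
s = \frac{r_x}{m} + \frac{1}{2}, \qquad
t = \frac{r_y}{m} + \frac{1}{2}
```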

Non-uniform Sampling

Non-linear Mapping
- Linear interpolation of per-vertex texture coordinates picks up the wrong texture pixels
- Use small polygons!
(Figure: correct result vs. linear interpolation)

Example

Other Env. Map Tricks
- Partially reflective objects:
  - First stage applies the color texture
  - Second stage does environment mapping, alpha-blended with the existing color
- Just put the lights in the environment map: what does this simulate?
- Recursive reflections
- What are bad cases for environment maps?

Light Maps
- Speed up lighting calculations by pre-computing lighting and storing it in maps
- Allows complex illumination models to be used in generating the map (e.g., shadows, radiosity)
- Used in complex rendering systems (Radiance), not just games
- Issues:
  - How is the mapping determined?
  - How are the maps generated?
  - How are they applied at run-time?

Example
(Images: www.flipcode.com; Call of Duty)

Choosing a Mapping
- Problem: in a preprocessing phase, points on polygons must be associated with points in maps
- One solution:
  - Find groups of polygons that are “near” co-planar and do not overlap when projected onto a plane
  - The result is a mapping from polygons to planes
  - Combine sections of the chosen planes into larger maps
  - Store texture coordinates at polygon vertices
- Lighting tends to change quite slowly (except when?), so the map resolution can be poor

Generating the Map
- Problem: what value should go in each pixel of the light map?
- Solution:
  - Map texture pixels back into world space (using the inverse of the texture mapping)
  - Take the illumination of the polygon at that point and put it in the pixel
- Advantages of this approach:
  - Choosing “good” planes means that texture pixels map to roughly square pieces of polygon - good sampling
  - Not too many maps are required, and not much memory is wasted

Example
(Images: nearest interpolation vs. linear interpolation of the light map)
- What types of lighting (diffuse, specular, reflections) can the map store?

Example
(Images: no light maps vs. with light maps)

Applying Light Maps
- Use multi-texturing hardware:
  - First stage: apply the color texture map
  - Second stage: modulate with the light map (see the sketch below)
- Modulation can only make points darker; DirectX also allows a texture stage to make points brighter
- Pre-lighting textures: apply the light map to the texture maps as a pre-process - why is this less appealing?
- Multi-stage rendering: same effect as multi-texturing, but modulating in the frame buffer
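A minimal sketch of the two texture stages in fixed-function OpenGL (colorTex and lightTex are hypothetical texture objects; the geometry carries a second set of texture coordinates for the light map):

```cpp
// Stage 0: base color texture.
glActiveTexture(GL_TEXTURE0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

// Stage 1: multiply by the light map.
glActiveTexture(GL_TEXTURE1);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, lightTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
```

Because GL_MODULATE multiplies colors in [0,1], this stage can only darken the base texture, which matches the slide's point about needing DirectX (or another mechanism) to brighten.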

Dynamic Light Maps
- Light maps are a preprocessing step, so they can only capture static lighting
- Texture transformations allow some effects
- What is required to recompute a light map at run-time? How might we make this tractable?
- Spatial subdivision algorithms let us identify nearby objects, which helps with this process
- Compute a separate, dynamic light map at run-time, using the same mapping as the static light map
- Add an additional texture pass to apply the dynamic map

Fog Maps
- Dynamic modification of light maps
- Put fog objects into the scene
- Compute where they intersect the geometry and paint the fog density into a dynamic light map
- Use the same mapping as the static light map
- Apply the fog map as with a light map: an extra texture stage

Fog Map Example

Bump Mapping
- Bump mapping modifies the surface normal vector according to information in the map
- Light dependent: the appearance of the surface depends on the lighting direction
- View dependent: the effect of the bumps may depend on the direction from which the surface is viewed
- Bump mapping can be implemented with multi-texturing, multi-pass rendering, or pixel shaders

Storing the Bump Map
- Several options for what to store in the map:
  - The normal vector to use
  - An offset to the default normal vector
  - Data derived from the normal vector
  - Illumination changes for a fixed view

Embossing
- Apply the height field as a modulating texture map
- First application: apply it in place
- Second application: shift it by an amount that depends on the light direction, and subtract it (see the sketch below)
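A minimal two-pass sketch in fixed-function OpenGL, assuming a hypothetical drawSceneWithHeightMap(du, dv) that draws the geometry with the height map's texture coordinates offset by (du, dv), where (du, dv) is the tangent-space light direction scaled by the bump strength:

```cpp
// Pass 1: render the height field in place.
glDisable(GL_BLEND);
drawSceneWithHeightMap(0.0f, 0.0f);

// Pass 2: render the shifted copy and subtract it from the frame
// buffer, approximating diffuse bump shading as h(u,v) - h(u+du,v+dv).
// (Requires OpenGL 1.4 / EXT_blend_subtract for the blend equation.)
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_REVERSE_SUBTRACT);  // result = dest - src
glBlendFunc(GL_ONE, GL_ONE);
drawSceneWithHeightMap(du, dv);
glBlendEquation(GL_FUNC_ADD);               // restore the default
```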

Dot Product Bump Mapping
- Store normal vectors in the bump map
- Specify light directions instead of colors at the vertices
- Apply the bump map using the dot3 operator, which takes a per-pixel dot product (see the sketch below)
- Lots of details:
  - Light directions must be normalized: this can be done with a cubic environment map
  - How do you get the color in?
  - How do you do specular highlights?
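A minimal sketch of the dot3 stage in fixed-function OpenGL (normalTex is a hypothetical normal map with normals packed as RGB in [0,1]; the per-vertex primary color encodes the tangent-space light direction the same way):

```cpp
glActiveTexture(GL_TEXTURE0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, normalTex);

// Combine the texture (normal) with the primary color (light direction)
// via a per-pixel dot product: the stage outputs N . L at every fragment.
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);
// A second stage would modulate this result with the color texture.
```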

Dot Product Results
(Images: www.nvidia.com)

Normal Mapping
(Images: DOOM 3; James Hastings-Trew)

Environment Bump Mapping
- Perturb the environment map lookup directions with the bump map
(Images: NVIDIA, Far Cry)

Multi-Pass Rendering
- The pipeline takes one triangle at a time, so only local information and pre-computed maps are available
- Multi-pass techniques render the scene, or parts of the scene, multiple times
- They make use of auxiliary buffers to hold information, and of tests and logical operations on the values in those buffers
- Really a set of functionality that can be used to achieve a wide range of effects: mirrors, shadows, bump maps, anti-aliasing, compositing, …

Buffers
- Buffers let you store global information about the rendered scene - like scratch work space, or extra screen memory
- They are only cleared when you say so
- This functionality is fundamentally different from that of vertex or pixel shaders
- Buffers are defined by:
  - The type of values they store
  - The logical operations that they influence
  - The way they are accessed (written and read)

OpenGL Buffers
- Color buffers: store RGBA color information for each pixel; OpenGL actually defines four or more color buffers: front/back (double buffering), left/right (stereo), and auxiliary color buffers
- Depth buffer: stores depth information for each pixel
- Stencil buffer: stores some number of bits for each pixel
- Accumulation buffer: like a color buffer, but with higher precision and different operations

Fragment Tests
- A fragment is a pixel-sized piece of shaded polygon, with color and depth information, produced after pixel shaders and/or texturing
- The tests and operations performed on the fragment on its way to the color buffer are essential to understanding multi-pass techniques
- The most important, in order, are: alpha test, stencil test, depth test, blending
- Tests must be explicitly enabled
- As the fragment passes through, some of the buffers may also have values stored into them

Alpha Test
- The alpha test either allows a fragment to pass, or stops it, depending on the outcome of a test:

    if ( fragment op reference ) pass fragment on

- Here, fragment is the fragment’s alpha value, and reference is a reference alpha value that you specify
- op is one of: <, <=, =, !=, >, >=
- There are also the special tests Always and Never: always let the fragment through, or never let it through
- What is a sensible default?

Billboards
- Billboards are texture-mapped polygons, typically used for things like trees
- An image-based rendering method: complex geometry (the tree) is replaced with an image placed in the scene (the textured polygon)
- The texture has alpha values associated with it: 1 where the tree is, and 0 where it isn’t
- So you can see through the polygon in places where the tree isn’t

Alpha Test and Billboards
- You can use texture blending to make the polygon see-through, but there is a big problem
- What happens if you draw the billboard and then draw something behind it? (Hint: think about the depth buffer values)
- This is one reason why transparent objects must be rendered back to front
- The best way to draw billboards is with an alpha test: do not let alpha < 0.5 pass through (see the sketch below)
- The depth buffer is never set for fragments that are see-through
- This doesn’t work for partially transparent polygons - more later
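A minimal sketch in fixed-function OpenGL (the 0.5 threshold is the slide's; everything else is standard API):

```cpp
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);  // op = >, reference = 0.5: fragments with
                                // alpha <= 0.5 are discarded before the
                                // depth test, so they never write depth
// ... draw the billboard polygon with its alpha texture ...
glDisable(GL_ALPHA_TEST);
```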

Stencil Buffer
- The stencil buffer acts like a paint stencil - it lets some fragments through but not others
- It stores multi-bit values; you have some control over the number of bits
- You specify two things:
  - The test that controls which fragments get through
  - The operations to perform on the buffer when the test passes or fails
- All tests and operations look at the value in the stencil buffer that corresponds to the fragment’s pixel location
- Typical usage: one rendering pass sets values in the stencil, which control how various parts of the screen are drawn in a second pass

Stencil Tests
- You give an operation, a reference value, and a mask
- Operations:
  - Always let the fragment through
  - Never let the fragment through
  - Logical comparisons between the reference value and the value in the buffer: <, <=, =, !=, >, >=

    ( reference & mask ) op ( buffer & mask )

- The mask is used to select particular bit-planes for the operation

Stencil Operations
- Specify three different operations:
  - If the stencil test fails
  - If the stencil test passes but the depth test fails
  - If the stencil test passes and the depth test passes
- The operations are:
  - Keep the current stencil value
  - Zero the stencil
  - Replace the stencil with the reference value
  - Increment the stencil
  - Decrement the stencil
  - Invert the stencil (bitwise)
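A minimal two-pass sketch in OpenGL, e.g. for restricting a reflection to a mirror polygon (drawMirrorSurface and drawReflectedScene are hypothetical scene routines):

```cpp
glClear(GL_STENCIL_BUFFER_BIT);
glEnable(GL_STENCIL_TEST);

// Pass 1: always pass the stencil test, and replace the stencil
// value with the reference (1) wherever the mirror is drawn.
glStencilFunc(GL_ALWAYS, 1, 0xFF);          // test, reference, mask
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);  // sfail, zfail, zpass
drawMirrorSurface();

// Pass 2: draw only where the stencil equals 1; leave the buffer alone.
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawReflectedScene();
```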

Depth Test and Operation
- The depth test compares the depth of the fragment with the depth in the buffer; depth increases with distance from the viewer
- The tests are: Always, Never, <, <=, =, !=, >, >=
- The depth operation is to write the fragment’s depth to the buffer, or to leave the buffer unchanged
- Why do the test but leave the buffer unchanged? Each buffer stores different information about the pixel, so a test on one buffer may be useful in managing another
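A minimal sketch of "test but don't write" in OpenGL, assuming the common case of drawing transparent geometry after the opaque pass:

```cpp
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);   // test: pass if fragment depth <= stored depth
glDepthMask(GL_FALSE);    // operation: leave the depth buffer unchanged
// ... draw transparent geometry back to front ...
glDepthMask(GL_TRUE);     // restore depth writes for the next frame
```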

Copy to Texture
- You can copy the frame buffer contents to a texture
- Very powerful - why?
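A minimal sketch in OpenGL (sceneTex is a hypothetical texture object; the 256x256 region is an assumed size). One use is rendering the scene from an object's position and copying the result into an environment or reflection map:

```cpp
// Copy the lower-left 256x256 region of the frame buffer into a texture.
glBindTexture(GL_TEXTURE_2D, sceneTex);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,  // target, level, format
                 0, 0, 256, 256,            // x, y, width, height
                 0);                        // border
```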

Multi-Pass Algorithms
- Designing a multi-pass algorithm is a non-trivial task; at least one person I know of has received a PhD for developing such algorithms
- References for multi-pass algorithms:
  - Real-Time Rendering has them indexed by problem
  - The OpenGL Programming Guide discusses many multi-pass techniques in a reasonably understandable manner
  - Game Programming Gems has some
  - Watt and Policarpo have others
  - Several have been published as academic papers

Multi-Pass Examples
- Transparent objects

Reading
- Core Techniques & Algorithms in Game Programming, Chapter 18, pages 565-600