
Last Time
- Presentation of your game and idea
- Environment mapping
- Light maps
03/16/10, Spring 2010 - NTUST

Today
- Light maps
- Bump maps
- Multi-pass rendering

Light Maps
- Speed up lighting calculations by pre-computing lighting and storing it in maps
- Allows complex illumination models (e.g. shadows, radiosity) to be used when generating the map
- Used in complex rendering systems (e.g. Radiance), not just games
- Issues: How is the mapping determined? How are the maps generated? How are they applied at run-time?

Example

Choosing a Mapping
- Problem: in a preprocessing phase, points on polygons must be associated with points in maps
- One solution: find groups of polygons that are nearly co-planar and do not overlap when projected onto a plane
  - Result is a mapping from polygons to planes
  - Combine sections of the chosen planes into larger maps
  - Store texture coordinates at polygon vertices
- Lighting tends to change quite slowly (except when?), so the map resolution can be low

Generating the Map
- Problem: what value should go in each pixel of the light map?
- Solution: map texture pixels back into world space (using the inverse of the texture mapping), then take the illumination of the polygon at that point and put it in the pixel
- Advantages of this approach:
  - Choosing "good" planes means that texture pixels map to roughly square pieces of polygon, giving good sampling
  - Not too many maps are required, and not much memory is wasted
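The baking loop above can be sketched as follows. This is a minimal illustration, not the course's actual code: `texel_to_world` (the inverse texture mapping) and `illuminate` (the lighting model) are hypothetical callbacks supplied by the caller, and the toy example bakes a unit quad lit by a single point light.

```python
import math

def bake_light_map(width, height, texel_to_world, illuminate):
    """Fill each texel of a light map by inverse-mapping its center to a
    world-space point and evaluating the lighting model there."""
    lightmap = [[0.0] * width for _ in range(height)]
    for row in range(height):
        for col in range(width):
            # Texel center in [0,1] texture coordinates.
            u = (col + 0.5) / width
            v = (row + 0.5) / height
            point, normal = texel_to_world(u, v)  # inverse texture mapping
            lightmap[row][col] = illuminate(point, normal)
    return lightmap

# Toy setup (assumed, for illustration): a unit quad in the z=0 plane.
def texel_to_world(u, v):
    return (u, v, 0.0), (0.0, 0.0, 1.0)

def illuminate(point, normal):
    light = (0.5, 0.5, 1.0)                      # point light above the quad
    d = [l - p for l, p in zip(light, point)]    # direction to the light
    dist = math.sqrt(sum(c * c for c in d))
    n_dot_l = sum(n * c / dist for n, c in zip(normal, d))
    return max(n_dot_l, 0.0) / (dist * dist)     # diffuse with falloff

lm = bake_light_map(4, 4, texel_to_world, illuminate)
```

Texels near the light come out brightest, and the map is symmetric about the light's position, which is a quick sanity check on the inverse mapping.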

Example

Example
- Nearest interpolation vs. linear interpolation
- What type of lighting (diffuse, specular, reflections) can the map store?

Example
- No light maps vs. with light maps

Applying Light Maps
- Use multi-texturing hardware:
  - First stage: apply the color texture map
  - Second stage: modulate with the light map (this can only make points darker; DirectX also allows a texture stage to make points brighter)
- Pre-lighting textures: apply the light map to the texture maps as a pre-process. Why is this less appealing?
- Multi-pass rendering: same effect as multi-texturing, but modulating in the frame buffer
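The modulate stage is just a per-channel multiply. A minimal sketch of what the second texture stage computes for one texel (the function name and value ranges are assumptions for illustration):

```python
def apply_light_map(color_texel, light_texel):
    """Second texture stage: modulate (multiply) the color map by the
    light map. A light map value in [0,1] can only darken the base
    color, matching the 'make points darker' behaviour noted above."""
    return tuple(min(c * light_texel, 1.0) for c in color_texel)

base = (0.8, 0.6, 0.4)
lit = apply_light_map(base, 1.0)     # fully lit: color unchanged
shade = apply_light_map(base, 0.25)  # in shadow: darkened
```

Note that no value can exceed the base color, which is exactly why a brightening stage (as DirectX offers) requires a different combine mode.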

Dynamic Light Maps
- Light maps are a preprocessing step, so they can only capture static lighting
- Texture transformations allow some effects
- What is required to recompute a light map at run-time? How might we make this tractable?
  - Spatial subdivision algorithms let us identify nearby objects, which helps with this process
  - Compute a separate, dynamic light map at run-time using the same mapping as the static light map
  - Add an additional texture pass to apply the dynamic map

Fog Maps
- Dynamic modification of light maps
- Put fog objects into the scene
- Compute where they intersect the geometry and paint the fog density into a dynamic light map
- Use the same mapping as the static light map
- Apply the fog map as with a light map (an extra texture stage)

Fog Map Example

Irradiance Map
- Diffuse surfaces: a light map does not depend on the view direction
- Specular surfaces: currently an environment map, indexed by the reflection direction
- For diffuse reflection, the same reflected view vector may correspond to different lighting situations, so index by the surface normal instead => irradiance environment map
- Use the surface normal to look up the lighting
- Equivalent to applying a very wide filter to the original environment map

Irradiance Map
The cosine-weighted hemisphere around the surface normal is sampled from the environment texture and summed to obtain the irradiance, which is view-independent.
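The cosine-weighted hemisphere sum can be sketched on a discrete set of environment samples. This is a toy stand-in for filtering a real environment map, under the assumption that the environment is given as a list of (unit direction, radiance) pairs; the normalization by the summed weights is a simplification for illustration.

```python
import math

def irradiance(normal, env_samples):
    """Cosine-weight every environment sample in the hemisphere around
    `normal` and sum; the result depends only on the normal, not the
    view direction."""
    total, weight = 0.0, 0.0
    for direction, radiance in env_samples:
        cos_theta = sum(n * d for n, d in zip(normal, direction))
        if cos_theta > 0.0:                # only the upper hemisphere
            total += radiance * cos_theta  # cosine weighting
            weight += cos_theta
    return total / weight if weight > 0.0 else 0.0

# Six axis-aligned samples; a uniform environment of radiance 1.0.
env = [((0, 0, 1), 1.0), ((0, 0, -1), 1.0), ((1, 0, 0), 1.0),
       ((-1, 0, 0), 1.0), ((0, 1, 0), 1.0), ((0, -1, 0), 1.0)]
bright_sky = [((0, 0, 1), 2.0)] + env[1:]   # brighter straight up

e_uniform = irradiance((0.0, 0.0, 1.0), env)
s = math.sqrt(0.5)
e_tilted = irradiance((s, 0.0, s), bright_sky)
```

A uniform environment gives back the uniform radiance for any normal, and tilting the normal blends the bright and dim directions, which is why the filtered map varies smoothly with the normal.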

Irradiance Environment Map (II)

Irradiance EM (III)
- Generally stored and accessed separately from the specular environment or reflection map, usually in a view-independent EM representation
- The surface normal is used to access the map
- Values retrieved from the irradiance environment map are multiplied by the diffuse reflectance
- Hard to create efficiently on the fly

Spherical Harmonics
- A set of orthonormal basis functions on the sphere, similar to the Fourier basis; they can be weighted and summed to approximate a general space of functions
- An infinite number of coefficients gives an exact representation; a few give a good approximation
- Some nice properties:
  - Rotationally invariant: rotating the projection of a function gives the same result as rotating the function and then projecting it
  - Inexpensive to evaluate, and compact to store
  - Lighting is easy to project into SH, convolve, and reconstruct
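A minimal sketch of projecting lighting into SH and reconstructing it, using the standard real SH basis for bands 0 and 1 (four coefficients). The uniform quadrature over six axis directions is an assumption chosen so the example stays exact for constant-plus-linear lighting; a real implementation would integrate over the whole environment map.

```python
import math

def sh_basis(d):
    """Real spherical-harmonic basis, bands 0 and 1."""
    x, y, z = d
    return [0.282095,       # Y_0^0  = sqrt(1/4pi)
            0.488603 * y,   # Y_1^-1 = sqrt(3/4pi) * y
            0.488603 * z,   # Y_1^0  = sqrt(3/4pi) * z
            0.488603 * x]   # Y_1^1  = sqrt(3/4pi) * x

def sh_project(f, directions):
    """Project f(direction) onto the basis with uniform quadrature
    over the given unit directions (each weighted 4*pi/N)."""
    coeffs = [0.0] * 4
    for d in directions:
        val = f(d)
        for i, y in enumerate(sh_basis(d)):
            coeffs[i] += val * y * (4 * math.pi / len(directions))
    return coeffs

def sh_eval(coeffs, d):
    """Weighted sum of basis functions: the reconstructed lighting."""
    return sum(c * y for c, y in zip(coeffs, sh_basis(d)))

axes = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
# Lighting that is brighter toward +z: f(d) = 1 + 0.5*z.
coeffs = sh_project(lambda d: 1.0 + 0.5 * d[2], axes)
```

Four coefficients reproduce this lighting exactly because it is constant plus linear; higher-frequency lighting would need more bands, but a few coefficients already give a good approximation of smooth (diffuse) lighting.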

Bump Mapping
- Bump mapping modifies the surface normal vector according to information in the map
- Light dependent: the appearance of the surface depends on the lighting direction
- View dependent: the effect of the bumps may depend on the direction from which the surface is viewed
- Bump mapping can be implemented with multi-texturing, multi-pass rendering, or pixel shaders

Storing the Bump Map
Several options for what to store in the map:
- The normal vector to use
- An offset to the default normal vector
- Data derived from the normal vector
- Illumination changes for a fixed view

View Dependent Bump Maps
- Store four maps (or more) showing the illumination effects of the bumps from four (or more) view directions
- Bump maps on diffuse surfaces just make them lighter or darker; they don't change the color
- At run time:
  - Compute the dot product of the view direction with the ideal view direction for each bump map; maps computed for views near the current one will have large dot products
  - Use the computed dot product as a blend factor when applying each bump map
  - Must be able to specify the blend function to the texture unit
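The blend-factor computation can be sketched as below. The clamping of negative dot products and the normalization so the weights sum to one are assumptions for illustration; the slide only specifies using the dot products as blend factors.

```python
def blend_weights(view_dir, map_view_dirs):
    """Dot the current view direction against each bump map's ideal view
    direction; clamp negatives to zero and normalize. Maps baked for
    views near the current one get the largest weights."""
    dots = [max(sum(v * m for v, m in zip(view_dir, d)), 0.0)
            for d in map_view_dirs]
    total = sum(dots)
    return [d / total for d in dots] if total > 0.0 else dots

# Four maps baked from four view directions around the surface.
map_dirs = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
w_exact = blend_weights((1.0, 0.0, 0.0), map_dirs)    # matches map 0
w_between = blend_weights((0.7071, 0.7071, 0.0), map_dirs)  # between 0 and 1
```

Viewing exactly along a baked direction selects that map alone; a view between two baked directions blends them evenly.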

Embossing
- Apply the height field as a modulating texture map
- First application: apply it in place
- Second application: shift it by an amount that depends on the light direction, and subtract it
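The shift-and-subtract step can be sketched on a 2D height field. The integer-texel shift and clamped borders are simplifications for illustration; real embossing shifts by a sub-texel amount derived from the light direction.

```python
def emboss(height, shift_x, shift_y):
    """Subtract a copy of the height field, shifted toward the light,
    from the original: the two texture applications described above.
    The result approximates the slope along the light direction."""
    h, w = len(height), len(height[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            rs = min(max(r + shift_y, 0), h - 1)  # clamp at the border
            cs = min(max(c + shift_x, 0), w - 1)
            out[r][c] = height[r][c] - height[rs][cs]
    return out

# A ramp rising along x: the difference picks out the slope.
ramp = [[float(c) for c in range(4)] for _ in range(4)]
edges = emboss(ramp, 1, 0)
flat = emboss([[1.0] * 4 for _ in range(4)], 1, 0)
```

A flat field embosses to zero everywhere; only slopes relative to the light direction produce a signal, which is what gives embossed surfaces their lit-bump look.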

Dot Product (dot3) Bump Mapping
- Store normal vectors in the bump map
- Specify light directions instead of colors at the vertices
- Apply the bump map using the dot3 operator, which takes a dot product
- Lots of details:
  - Light directions must be normalized; this can be done with a cubic environment map
  - How do you get the color in? How do you do specular highlights?
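What the dot3 stage computes per texel can be sketched as below. Modulating the result with the base color is one possible answer to "how do you get the color in?", shown here as an assumption; also note real dot3 hardware stores the vectors biased into [0,1], which this sketch omits.

```python
def dot3_bump(map_normal, light_dir, base_color):
    """dot3 stage: the bump map stores a unit normal per texel, and the
    interpolated per-vertex 'color' is really the normalized light
    direction. The hardware dots them; here we then modulate the base
    color by the clamped result."""
    n_dot_l = max(sum(n * l for n, l in zip(map_normal, light_dir)), 0.0)
    return tuple(c * n_dot_l for c in base_color)

red_brick = (1.0, 0.5, 0.25)
lit = dot3_bump((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), red_brick)  # facing light
dark = dot3_bump((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), red_brick) # grazing light
```

Texels whose stored normal faces the light keep the full base color; texels tilted away darken, producing the bumpy shading without any extra geometry.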

Dot Product Results

Environment Bump Mapping
Perturb the environment map lookup directions with the bump map

Environment Bump Map Results

Multi-Pass Rendering
- The pipeline takes one triangle at a time, so only local information and pre-computed maps are available
- Multi-pass techniques render the scene, or parts of the scene, multiple times
  - They make use of auxiliary buffers to hold information, and of tests and logical operations on values in those buffers
- Really, a set of functionality that can be used to achieve a wide range of effects: mirrors, shadows, bump maps, anti-aliasing, compositing, …

Buffers
- Buffers let you store global information about the rendered scene, like scratch work space or extra screen memory
- They are only cleared when you say so
- This functionality is fundamentally different from that of vertex or pixel shaders
- Buffers are defined by:
  - The type of values they store
  - The logical operations that they influence
  - The way they are accessed (written and read)

OpenGL Buffers
- Color buffers: store RGBA color information for each pixel; OpenGL actually defines four or more color buffers: front/back (double buffering), left/right (stereo), and auxiliary color buffers
- Depth buffer: stores depth information for each pixel
- Stencil buffer: stores some number of bits for each pixel
- Accumulation buffer: like a color buffer, but with higher precision and different operations

Fragment Tests
- A fragment is a pixel-sized piece of shaded polygon, with color and depth information (produced after pixel shaders and/or texturing)
- The tests and operations performed on the fragment on its way to the color buffer are essential to understanding multi-pass techniques
- The most important are, in order: alpha test, stencil test, depth test, blending
- Tests must be explicitly enabled
- As the fragment passes through, some of the buffers may also have values stored into them

Alpha Test
- The alpha test either lets a fragment pass or stops it, depending on the outcome of a test:
  if ( fragment op reference ) pass fragment on
- Here, fragment is the fragment's alpha value, and reference is a reference alpha value that you specify
- op is one of: <, <=, =, !=, >, >=
- There are also two special tests, Always and Never, which always or never let the fragment through
- What is a sensible default?
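The test above can be sketched as a small dispatch table. The string keys are an illustrative encoding, not an actual graphics API; the "always" entry is the sensible default, since it leaves every fragment untouched when no test is wanted.

```python
# The comparisons an alpha test can use, plus the special Always/Never.
ALPHA_OPS = {
    "<":  lambda a, r: a < r,   "<=": lambda a, r: a <= r,
    "=":  lambda a, r: a == r,  "!=": lambda a, r: a != r,
    ">":  lambda a, r: a > r,   ">=": lambda a, r: a >= r,
    "always": lambda a, r: True,
    "never":  lambda a, r: False,
}

def alpha_test(fragment_alpha, op, reference):
    """if ( fragment op reference ): pass the fragment on."""
    return ALPHA_OPS[op](fragment_alpha, reference)

# Billboard-style usage: only fragments with alpha >= 0.5 survive.
opaque_passes = alpha_test(0.9, ">=", 0.5)
cutout_fails = alpha_test(0.2, ">=", 0.5)
```

This ">= 0.5" configuration is exactly the billboard trick discussed below: fragments in the transparent part of the texture never reach the depth buffer at all.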

Billboards
- Billboards are texture-mapped polygons, typically used for things like trees
- An image-based rendering method: complex geometry (the tree) is replaced with an image placed in the scene (the textured polygon)
- The texture has alpha values associated with it: 1 where the tree is, and 0 where it isn't, so you can see through the polygon in places where the tree isn't

Alpha Test and Billboards
- You can use texture blending to make the polygon see-through, but there is a big problem: what happens if you draw the billboard and then draw something behind it? (Hint: think about the depth buffer values)
- This is one reason why transparent objects must be rendered back to front
- The best way to draw billboards is with an alpha test: do not let fragments with alpha < 0.5 pass through
  - The depth buffer is never set for fragments that are see-through
  - This doesn't work for partially transparent polygons (more later)

Stencil Buffer
- The stencil buffer acts like a paint stencil: it lets some fragments through but not others
- It stores multi-bit values; you have some control over the number of bits
- You specify two things:
  - The test that controls which fragments get through
  - The operations to perform on the buffer when the test passes or fails
- All tests and operations look at the stencil value corresponding to the fragment's pixel location
- Typical usage: one rendering pass sets values in the stencil, which control how various parts of the screen are drawn in a second pass

Stencil Tests
- You give an operation, a reference value, and a mask; the test is:
  ( reference & mask ) op ( buffer & mask )
- Operations: always let the fragment through; never let the fragment through; or a comparison between the masked reference and masked buffer values: <, <=, =, !=, >, >=
- The mask is used to select particular bit-planes for the operation
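The masked comparison can be sketched directly from the formula above. The string-keyed operation table is an illustrative encoding, not an actual graphics API.

```python
STENCIL_OPS = {
    "<":  lambda a, b: a < b,   "<=": lambda a, b: a <= b,
    "=":  lambda a, b: a == b,  "!=": lambda a, b: a != b,
    ">":  lambda a, b: a > b,   ">=": lambda a, b: a >= b,
    "always": lambda a, b: True,
    "never":  lambda a, b: False,
}

def stencil_test(op, reference, mask, buffer_value):
    """( reference & mask ) op ( buffer & mask ), per the slide."""
    return STENCIL_OPS[op](reference & mask, buffer_value & mask)

# The mask selects bit-planes: with mask 0x01 only the low bit matters,
# so reference 1 matches buffer value 3 (binary 11); with mask 0xFF the
# full values are compared and the test fails.
low_bit_match = stencil_test("=", 1, 0x01, 3)
full_compare = stencil_test("=", 1, 0xFF, 3)
```

Masking lets different multi-pass effects share one stencil buffer by claiming different bit-planes.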

Stencil Operations
- Specify three different operations, one for each case:
  - The stencil test fails
  - The stencil test passes but the depth test fails
  - The stencil test passes and the depth test passes
- Each operation is one of:
  - Keep the current stencil value
  - Zero the stencil
  - Replace the stencil with the reference value
  - Increment the stencil
  - Decrement the stencil
  - Invert the stencil (bitwise)

Depth Test and Operation
- The depth test compares the depth of the fragment with the depth in the buffer; depth increases with greater distance from the viewer
- Tests are: Always, Never, <, <=, =, !=, >, >=
- The depth operation is to write the fragment's depth to the buffer, or to leave the buffer unchanged
- Why do the test but leave the buffer unchanged? Each buffer stores different information about the pixel, so a test on one buffer may be useful in managing another
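The test-versus-write distinction can be sketched for one pixel. Hard-coding the common "<=" comparison is a simplification for illustration; returning the (possibly unchanged) buffer value models the write-disabled mode described above.

```python
def depth_test(fragment_depth, buffer_depth, write_depth=True):
    """Common '<=' depth test for one pixel: pass if the fragment is at
    least as close as the stored depth. Write the buffer only if
    requested, modeling 'do the test but leave the buffer unchanged'."""
    passed = fragment_depth <= buffer_depth
    new_buffer = fragment_depth if (passed and write_depth) else buffer_depth
    return passed, new_buffer

near_wins = depth_test(0.3, 0.7)                      # closer: pass + write
far_loses = depth_test(0.9, 0.3)                      # farther: fail
test_only = depth_test(0.3, 0.7, write_depth=False)   # pass, buffer untouched
```

The test-only mode is what multi-pass techniques rely on: a later pass can re-test against the depth laid down by an earlier pass (for example, to drive stencil updates) without disturbing it.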

Todo
- Stage two: give life to your game
- March 30: demonstrate in class
  - Show us your user interface
  - Show us how you import the main character and other scene objects
  - Show us some basic functions of your game