CS 638, Fall 2001 Admin Grad student TAs may have had their accounts disabled –Please check and email the lab if there is a problem If you plan on graduating with any degree in the coming year, you should see Lorene and collect a questionnaire –A list compiled from the questionnaires will be provided to employers Sooner or later, LithTech install CDs will be available to borrow overnight –Arrangements yet to be finalized

CS 638, Fall 2001 NTSC vs. PAL Two major differences: –Vertical resolution: 525 lines (NTSC) vs. 625 lines (PAL), not all usable –Frame rate: approximately 60 Hz (NTSC) vs. 50 Hz (PAL) Issues: –Artwork appearance, particularly for menus and other 2D art –Animation timing: Detach the animation clock from the frame rate clock, which is good practice anyway (see the sketch below)
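A minimal C sketch of that decoupling: animation advances by elapsed wall-clock time rather than by "one frame", so the same content plays at the same speed on 50 Hz and 60 Hz displays. getTimeSeconds(), updateAnimation(), and renderFrame() are hypothetical placeholders for whatever the engine provides.

```c
/* Hypothetical engine hooks. */
double getTimeSeconds(void);
void   updateAnimation(double dt);
void   renderFrame(void);

/* Detach the animation clock from the frame-rate clock: advance animation
   state by the measured time between frames. */
void gameFrame(void)
{
    static double lastTime = -1.0;
    double now = getTimeSeconds();
    double dt  = (lastTime < 0.0) ? 0.0 : now - lastTime;  /* seconds elapsed */
    lastTime = now;

    updateAnimation(dt);   /* advance clips by dt, not by "one frame" */
    renderFrame();
}
```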

CS 638, Fall 2001 Graphics Review Recall the standard graphics pipeline (OpenGL in this case):

CS 638, Fall 2001 Normal Vectors The surface normal vector describes the orientation of the surface at a point –Mathematically: the vector that is perpendicular to the tangent plane of the surface What's the problem with this definition? –Usually just called "the normal vector" or "the normal" –We will use N to denote it Many lighting calculations are parameterized by the normal vector –Later, we will see how to exploit this

CS 638, Fall 2001 Local Shading Models Local shading models provide a way to determine the intensity and color of a point on a surface –The models are local because they don't consider other objects at all –We use them because they are fast and simple to compute –They do not require knowledge of the entire scene, only the current piece of surface This works well for pipelined architectures, because the pipeline only knows about one piece of geometry at a time

CS 638, Fall 2001 “Traditional” Shading Model What it captures: –Direct illumination from light sources –Diffuse and Specular components –(Very) Approximate effects of global lighting What it doesn’t do: –Shadows –Mirrors –Refraction –Lots of other stuff …

CS 638, Fall 2001 “Standard” Lighting Model Consists of three terms linearly combined: –Diffuse component for the amount of incoming light reflected equally in all directions –Specular component for the amount of light reflected in a mirror-like fashion –Ambient term to approximate light arriving via other surfaces It doesn’t do shadows, mirrors, refraction, lots of other stuff …

CS 638, Fall 2001 Diffuse Illumination Incoming light, I_i, from direction L, is reflected equally in all directions –No dependence on viewing direction Amount of light reflected depends on: –Angle of surface with respect to light source Actually, this determines how much light is collected by the surface, to then be reflected –Diffuse reflectance coefficient of the surface, k_d We don't want to illuminate the back side, so clamp the dot product: use max(N·L, 0)
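In the slide's notation, the diffuse term can be written as follows; the max is the back-side clamp just mentioned:

```latex
I_d = k_d \, I_i \, \max(N \cdot L,\; 0)
```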

CS 638, Fall 2001 Diffuse Example Where is the light?

CS 638, Fall 2001 Specular Reflection (Phong Model) Incoming light is reflected primarily in the mirror direction, R (H is the half vector, N the normal) –Perceived intensity depends on the relationship between the viewing direction, V, and the mirror direction –The bright spot is called a specularity Intensity is controlled by: –The specular reflectance coefficient, k_s –The exponent, n, which controls the apparent size of the specularity Higher n, smaller highlight (diagram of the L, R, V, and H vectors)
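In the same notation, the Phong specular term, and the common half-vector (Blinn-Phong) variant, look like:

```latex
I_s = k_s \, I_i \, \max(R \cdot V,\; 0)^n
\qquad \text{or} \qquad
I_s = k_s \, I_i \, \max(N \cdot H,\; 0)^n,
\quad H = \frac{L + V}{\lVert L + V \rVert}
```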

CS 638, Fall 2001 Specular Example

CS 638, Fall 2001 Putting It Together Global ambient intensity, I_a: –A gross approximation to light bouncing off all other surfaces –Modulated by the ambient reflectance k_a Just sum all the terms (see the combined equation below) If there are multiple lights, sum the contributions from each light Several variations and approximations exist
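Summing the three terms over all lights gives the usual combined form; for each light, L and R depend on that light's direction:

```latex
I \;=\; k_a I_a \;+\; \sum_{\text{lights}} I_i \Big( k_d \, \max(N \cdot L,\; 0) \;+\; k_s \, \max(R \cdot V,\; 0)^n \Big)
```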

CS 638, Fall 2001 Flat Shading Compute shading at a representative point and apply it to the whole polygon –OpenGL uses one of the vertices Advantages: –Fast - one shading value per polygon Disadvantages: –Inaccurate –Discontinuities at polygon boundaries

CS 638, Fall 2001 Gouraud Shading Shade each vertex with its own location and normal Linearly interpolate the results across the face Advantages: –Fast - incremental calculations when rasterizing –Much smoother - use one normal per shared vertex to get continuity between faces Disadvantages: –Specularities get lost
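A minimal fixed-function OpenGL sketch of the two modes just described (flat vs. Gouraud); the vertex positions and unit normals are assumed to be supplied by the application:

```c
#include <GL/gl.h>

/* Draw one lit triangle under either shading model.  v holds three vertex
   positions and n three unit normals (hypothetical application data layout). */
void draw_triangle(const float v[3][3], const float n[3][3], int smooth)
{
    /* GL_FLAT: one shading value for the whole polygon (taken from a
       representative vertex).  GL_SMOOTH: shade each vertex and let the
       rasterizer interpolate the results (Gouraud shading). */
    glShadeModel(smooth ? GL_SMOOTH : GL_FLAT);

    glBegin(GL_TRIANGLES);
    for (int i = 0; i < 3; ++i) {
        glNormal3fv(n[i]);
        glVertex3fv(v[i]);
    }
    glEnd();
}
```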

CS 638, Fall 2001 Phong Interpolation Interpolate normals across faces Shade each pixel Advantages: –High quality, narrow specularities Disadvantages: –Expensive –Still an approximation for most surfaces Not to be confused with Phong’s specularity model
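Phong interpolation is not available in the fixed-function pipeline of this era, but a software-rasterizer-style sketch makes the per-pixel work explicit. All vectors are assumed to be float[3] already expressed in a common space, and light/view handling is deliberately simplified:

```c
#include <math.h>

static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Shade one pixel from an interpolated normal n, light direction l,
   and view direction v, using the diffuse + specular model above. */
float shade_pixel(float n[3], const float l[3], const float v[3],
                  float kd, float ks, float shininess, float light_intensity)
{
    /* Interpolated normals are no longer unit length; renormalize. */
    float len = sqrtf(dot3(n, n));
    n[0] /= len;  n[1] /= len;  n[2] /= len;

    float ndotl = dot3(n, l);
    if (ndotl < 0.0f) ndotl = 0.0f;          /* don't light the back side */

    /* Mirror direction R = 2(N.L)N - L */
    float r[3] = { 2.0f*ndotl*n[0] - l[0],
                   2.0f*ndotl*n[1] - l[1],
                   2.0f*ndotl*n[2] - l[2] };
    float rdotv = dot3(r, v);
    if (rdotv < 0.0f) rdotv = 0.0f;

    return light_intensity * (kd * ndotl + ks * powf(rdotv, shininess));
}
```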


Texture Mapping The problem: Colors, normals, etc. are only specified at vertices. How do we add detail between vertices? Solution: Specify the details in an image (the texture) and specify how to apply the image to the geometry (the map) Works for shading parameters other than color, as we shall see –The basic underlying idea is the mapping

CS 638, Fall 2001 Basic Mapping The texture lives in a 2D space –Parameterize points in the texture with 2 coordinates: (s,t) –These are just what we would call (x,y) if we were talking about an image, but we wish to avoid confusion with the world (x,y,z) Define the mapping from (x,y,z) in world space to (s,t) in texture space With polygons: –Specify (s,t) coordinates at vertices –Interpolate (s,t) for other points based on the values given at the vertices
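A minimal fixed-function sketch of the mapping: give an (s,t) coordinate at each vertex and let the rasterizer interpolate and look up the texture per fragment. tex_id is assumed to be a texture object created elsewhere.

```c
#include <GL/gl.h>

/* Draw a unit quad in the z=0 plane, mapped with the full texture. */
void draw_textured_quad(GLuint tex_id)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex_id);

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f);  glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f);  glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f);  glVertex3f( 1.0f,  1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f);  glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();
}
```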

CS 638, Fall 2001 Basic Mapping

CS 638, Fall 2001 I assume you recall… Texture sampling (aliasing) is a big problem –Mipmaps and other filtering techniques are the solution The texture value for points that map outside the texture image can be generated in various ways –Repeat, Clamp, … Texture coordinates are specified at vertices and interpolated across triangles The width and height of texture images are constrained (powers of two, sometimes must be square)
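A typical setup sketch from this era, assuming a 256x256 RGB image: GLU builds the mipmap chain, the minification filter uses it (trilinear), and the wrap mode decides what happens to coordinates outside [0,1].

```c
#include <GL/gl.h>
#include <GL/glu.h>

/* Upload a power-of-two texture with mipmaps, filtering, and wrap modes.
   'pixels' is assumed to be a 256x256 RGB image owned by the caller. */
void setup_texture(const unsigned char *pixels)
{
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, 256, 256,
                      GL_RGB, GL_UNSIGNED_BYTE, pixels);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);            /* trilinear mipmapping */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT); /* or GL_CLAMP */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
}
```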

CS 638, Fall 2001 Textures in Games The game engine provides some amount of texture support Artists are supplied with tools to exploit this support –They design the texture images –They specify how to apply the image to the object Commonly, textures are supplied at varying resolutions to support different hardware performance –Note that the texture mapping code does not need to be changed - just load different sized maps at run time Textures are, without doubt, the most important part of a game’s look

CS 638, Fall 2001 Example Texture Tool

CS 638, Fall 2001 Packing Textures Problem: The limits on texture width/height make it inefficient to store many textures separately –For example: long, thin objects Solution: Artists pack the textures for many objects into one image –The texture coordinates for a given object may only index into a small part of the image (see the remapping below) –Care must be taken at the boundary between sub-images to achieve correct blending –Mipmapping is restricted –Best for objects that will be rendered at a known resolution (weapons, for instance)
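One way to state the remapping: if an object's sub-image occupies the rectangle starting at (s_0, t_0) with size (Δs, Δt) inside the packed image, then its original coordinates (s,t) in [0,1]² become:

```latex
(s', t') \;=\; \big(s_0 + s\,\Delta s,\;\; t_0 + t\,\Delta t\big)
```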

CS 638, Fall 2001 Combining Textures

CS 638, Fall 2001 Texture Matrix Normally, the texture coordinates given at vertices are interpolated and directly used to index the texture The texture matrix applies a homogeneous transform to the texture coordinates before indexing the texture What use is this? –Two examples in this lecture: Animating textures and projective texturing

CS 638, Fall 2001 Animating Texture (method 1) The texture matrix can be used to translate or rotate the texture If the texture matrix is changed from frame to frame, the texture will appear to move on the object This is particularly useful for things like flame, or swirling vortices, or pulsing entrances, …
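A minimal sketch of the idea: translate the texture matrix a little each frame so the image appears to move across the object. time_seconds is the application's animation clock (hypothetical), and the scroll speed is arbitrary.

```c
#include <GL/gl.h>

/* Scroll the texture along s as a function of time (e.g. for flame or a
   swirling vortex).  Call once per frame before drawing the object. */
void apply_scrolling_texture_matrix(double time_seconds)
{
    glMatrixMode(GL_TEXTURE);
    glLoadIdentity();
    glTranslatef((float)(time_seconds * 0.25), 0.0f, 0.0f);  /* scroll in s */
    glMatrixMode(GL_MODELVIEW);   /* restore the usual matrix mode */
}
```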

CS 638, Fall 2001 Demo

CS 638, Fall 2001 Projective Texturing The texture should appear to be projected onto the scene, as if from a slide projector Solution: –Equate texture coordinates with world coordinates –Think about it from the projector’s point of view: wherever a world point appears in the projector’s view, it should pick up the texture –Use a texture matrix equivalent to the projection matrix for the projector – maps world points into texture image points Details available in many places Problems? What else could you do with it?
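One way to write that texture matrix, assuming world-space coordinates are fed into it: the projector's view and projection matrices carry the point into the projector's clip space, and a scale-and-bias matrix remaps [-1,1] to [0,1] texture space; the perspective divide by q happens at lookup.

```latex
\begin{pmatrix} s \\ t \\ r \\ q \end{pmatrix}
=
\underbrace{\begin{pmatrix}
\tfrac12 & 0 & 0 & \tfrac12 \\
0 & \tfrac12 & 0 & \tfrac12 \\
0 & 0 & \tfrac12 & \tfrac12 \\
0 & 0 & 0 & 1
\end{pmatrix}}_{\text{scale and bias}}
\; P_{\text{projector}} \; V_{\text{projector}}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix},
\qquad
(s/q,\; t/q) \text{ indexes the texture image}
```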

CS 638, Fall 2001 Multitexturing Some effects are easier to implement if multiple textures can be applied –Future lectures: Light maps, bump maps, shadows, … Multitexturing hardware provides a pipeline of texture units, each of which applies a standard texture map operation –Fragments are passed through the pipeline with each step working on the result of the previous stage –Texture parameters are specified independently for each unit, further improving functionality –For example, the first stage applies a color map, the next modifies the illumination to simulate bumps, the third modifies opacity –Not the same as multi-pass rendering - all applied in one pass
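A two-unit sketch in OpenGL 1.3 / ARB_multitexture style (in 2001 the same calls would carry the ARB suffix): unit 0 applies the base color map, unit 1 modulates the result with a second map such as a light map. base_tex and light_tex are texture objects created elsewhere.

```c
#include <GL/gl.h>

/* Configure two texture units; both are applied in a single pass. */
void bind_multitexture(GLuint base_tex, GLuint light_tex)
{
    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, base_tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, light_tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}

/* Each vertex then needs one coordinate pair per unit, e.g.:
   glMultiTexCoord2f(GL_TEXTURE0, s0, t0);
   glMultiTexCoord2f(GL_TEXTURE1, s1, t1);
   glVertex3f(x, y, z); */
```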

CS 638, Fall 2001 What’s in a Texture? The graphics hardware doesn’t know what is in a texture –It applies a set of operations using values it finds in the texture, the existing value of the fragment (pixel), and maybe another color –The programmer gets to decide what the operations are, within some set of choices provided by the hardware –Examples: the texture may contain scalar “luminance” information, which simply multiplies the fragment color. What use is this? the texture might contain “alpha” data that multiplies the fragment’s alpha channel but leaves the fragment color alone. What use is this? –Future lectures will look at creative interpretations of textures
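For instance, a single-channel luminance map combined with GL_MODULATE simply scales the fragment color (a cheap light or shadow map), while a GL_ALPHA texture scales only the alpha channel. A sketch of the upload and environment setup for the luminance case:

```c
#include <GL/gl.h>

/* Upload a single-channel map and set the environment so it multiplies
   the fragment color.  'lum' holds w*h luminance bytes (hypothetical data). */
void upload_luminance_map(const unsigned char *lum, int w, int h)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, lum);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}
```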