Week 8 - Monday CS361

Last time
What did we talk about last time?
- Mipmapping
- Summed area tables
- Anisotropic filtering

Questions?

Project 2

MonoGame Shader Examples

Effect files
- We have been using BasicEffect to achieve most of our shading
- BasicEffect gets so much done that it's tempting not to move any further
- But more complex effects can be achieved by writing shader code ourselves
- To start, open the MonoGame Pipeline tool, right-click on the Content folder, and choose Add > New Item
- Choose Effect from the wizard and give it a name that ends with .fx

Shader declarations
- We need to declare the variables we are going to use at the top of the file
- These usually include at least the following (which are already given in the template)
- We're also going to add an ambient color and an ambient intensity for this shader

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

Shader data structures
- We also have to define structures to take the input and the output
- These vary because different vertex formats include different data (position, normals, colors, texture coordinates)
- The simplest possible input and output would have position only
- The POSITION0 is called a semantic
- Semantics are used to tell the shader what the purpose of a variable is so that it can pass the right data in and out

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

Vertex shader
- The job of the vertex shader is, at the very least, to transform a vertex from model space to world space to view space to clip space
- It can also do normal and color calculations

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    // The diffuse values below must also be declared at the top of the
    // .fx file, for example:
    //   float3 DiffuseLightDirection = float3(1, 0, 0);
    //   float4 DiffuseColor = float4(1, 1, 1, 1);
    //   float DiffuseIntensity = 1.0;
    float4 normal = normalize(mul(input.Normal, WorldInverseTranspose));
    float lightIntensity = dot(normal.xyz, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    return output;
}

Pixel shader
- The pixel shader must find the final color of the pixel fragment
- This pixel shader uses a diffuse shading model
- The computed lighting is added to the ambient lighting

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return saturate(input.Color + AmbientColor * AmbientIntensity);
}

Techniques
- You're allowed to name your vertex and pixel shaders anything you want
- You specify which you're going to use in a technique
- At this level, techniques only have one pass, but it is possible to use multiple techniques to achieve interesting effects

technique Diffuse
{
    pass Pass1
    {
        VertexShader = compile VS_SHADERMODEL VertexShaderFunction();
        PixelShader = compile PS_SHADERMODEL PixelShaderFunction();
    }
}

Using the shader in the game
- You have to load the shader like other content
- Then, we run a loop similar to the earlier one, setting the parameters for the effect object

effect = Content.Load<Effect>("Diffuse");

foreach (ModelMesh mesh in model.Meshes)
{
    foreach (ModelMeshPart part in mesh.MeshParts)
    {
        part.Effect = effect;
        effect.Parameters["World"].SetValue(mesh.ParentBone.Transform * world);
        effect.Parameters["View"].SetValue(view);
        effect.Parameters["WorldInverseTranspose"].SetValue(
            Matrix.Transpose(Matrix.Invert(mesh.ParentBone.Transform * world)));
        effect.Parameters["Projection"].SetValue(projection);
    }
    mesh.Draw();
}

Other Texturing Issues

Volume textures
- Image textures are the most common, but 3D volume textures can be used
- These textures store data in a (u, v, w) coordinate space
- Even volume textures can be mipmapped
- Quadrilinear interpolation!
- In practice, volume textures are usually used for fog, smoke, or explosions
- 3D effects that are inconsistent over the volume
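A minimal sketch of sampling a volume texture in HLSL, assuming a hypothetical VolumeTexture bound by the application (the density-to-color mapping is made up for illustration):

texture VolumeTexture;
sampler3D VolumeSampler = sampler_state { Texture = <VolumeTexture>; };

float4 SampleFog(float3 uvw)
{
    // One lookup uses all three coordinates; trilinear filtering blends the
    // 8 surrounding texels (quadrilinear once mip levels are blended too)
    float density = tex3D(VolumeSampler, uvw).r;
    return float4(0.8, 0.8, 0.8, density);
}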

Cube maps
- A cube map is a kind of texture map with 6 faces
- Cube maps are used to texture surfaces based on direction
- They are commonly used in environment mapping
- A ray is made from the center of the cube out to the surface
- The component with the largest magnitude selects which of the 6 faces
- The other components are used for (u, v) coordinates
- Cube maps can cause awkward seams when jumping between faces
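In shader code the hardware does this lookup for you (texCUBE in HLSL), but a hedged sketch of the selection rule itself, with made-up names and ignoring the per-face sign flips real hardware applies, looks like this:

// Hypothetical sketch: the axis with the largest magnitude picks the
// face, and the other two components become (u, v)
int SelectCubeFace(float3 dir, out float2 uv)
{
    float3 a = abs(dir);
    if (a.x >= a.y && a.x >= a.z)
    {
        uv = dir.zy / a.x;            // remaining components, scaled to [-1, 1]
        return dir.x > 0 ? 0 : 1;     // +X or -X face
    }
    if (a.y >= a.z)
    {
        uv = dir.xz / a.y;
        return dir.y > 0 ? 2 : 3;     // +Y or -Y face
    }
    uv = dir.xy / a.z;
    return dir.z > 0 ? 4 : 5;         // +Z or -Z face
}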

Texture caching
- You will never need to worry about this in this class, but texture memory space is a huge problem
- There are many different caching strategies, similar to the ones used for RAM:
- Least Recently Used (LRU): Swap out the least recently used texture; very commonly used
- Most Recently Used (MRU): Swap out the most recently used texture; use only during thrashing
- Prefetching can be useful to maintain consistent frame rates

Texture compression
- JPEG and PNG are common compression techniques for regular images
- In graphics hardware, these are too complicated to be decoded on the fly
- That's why finished MonoGame projects have pre-processed .xnb files
- Most DirectX texture compression divides textures into 4 x 4 tiles
- Two 16-bit RGB values are recorded for each tile
- Each texel uses 2 bits to select one of the two colors or two interpolated values between them
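A hedged sketch of the scheme just described (this is the opaque mode of the BC1/DXT1 format; function names are made up, and the integer operations assume shader model 4 or later):

// Expand a 16-bit RGB565 endpoint into a float3 color
float3 DecodeRGB565(uint c)
{
    return float3((c >> 11) & 31, (c >> 5) & 63, c & 31) / float3(31, 63, 31);
}

// Each texel's 2-bit index selects one of four palette entries:
// the two stored endpoints or two blends between them
float3 BC1PaletteEntry(uint c0, uint c1, uint index)
{
    float3 a = DecodeRGB565(c0);
    float3 b = DecodeRGB565(c1);
    if (index == 0) return a;
    if (index == 1) return b;
    if (index == 2) return lerp(a, b, 1.0 / 3.0);
    return lerp(a, b, 2.0 / 3.0);
}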

More texture compression
- Ericsson texture compression (ETC) is used in OpenGL
- It breaks texels into 2 x 4 blocks with a single color
- It uses per-pixel luminance information to add detail to the blocks
- Normal maps (normals stored as textures) allow for interesting compression approaches
- Only the x and y components are needed, since the z component can be calculated
- The x and y can then be stored using the BC5 format for two channels of color data
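Because a normal is unit length, z follows from x and y as z = sqrt(1 - x^2 - y^2). A minimal sketch of the reconstruction after sampling a two-channel normal map:

float3 ReconstructNormal(float2 xy)
{
    float3 n;
    n.xy = xy * 2.0 - 1.0;                        // unpack [0, 1] to [-1, 1]
    n.z = sqrt(saturate(1.0 - dot(n.xy, n.xy)));  // enforce x^2 + y^2 + z^2 = 1
    return normalize(n);
}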

Procedural texturing
- A procedural texture is made by computing a function of u and v instead of looking up a texel in an image
- Noise functions are often used to give an appearance of randomness
- Volume textures can be generated on the fly
- Values can be returned based on distance to certain feature points (redder colors near heat, for example)
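As one hedged example of computing a color from (u, v) rather than sampling an image, a checkerboard in the pixel shader (names and colors are illustrative; assumes non-negative uv):

float4 Checker(float2 uv, float scale)
{
    float2 cell = floor(uv * scale);
    float parity = fmod(cell.x + cell.y, 2.0);   // alternates 0, 1 across cells
    return lerp(float4(0.1, 0.1, 0.1, 1), float4(0.9, 0.9, 0.9, 1), parity);
}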

Texture animation
- Textures don't have to be static
- The application can alter them over time
- Alternatively, u and v values can be remapped to make the texture appear to move
- Matrix transformations can be used for zoom, rotation, shearing, etc.
- Video textures can be used to play back a movie in a texture
- Blending between textures can allow an object to transform like a chameleon
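A hedged sketch of the (u, v) remapping idea, assuming the application sets a hypothetical Time parameter each frame (for example with effect.Parameters["Time"].SetValue(...) on the C# side):

float Time;   // seconds elapsed, updated by the application

float2 AnimateUV(float2 uv)
{
    uv += Time * float2(0.1, 0.0);                       // scroll horizontally
    float c = cos(Time), s = sin(Time);
    return mul(uv - 0.5, float2x2(c, -s, s, c)) + 0.5;   // rotate about the center
}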

Material Mapping
- The lighting we have discussed is based on material properties
  - Diffuse color
  - Specular color
  - Smoothness coefficient m
- A texture can be used to modify these values on a per-pixel basis
- A normal image texture can be considered a diffuse color map
- One that affects specular colors is a specular color map (usually grayscale)
- One that affects m is a gloss map
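A hedged sketch of per-pixel material lookups in a pixel shader; the sampler names, the scale on m, and the specular model are illustrative assumptions, not from the lecture:

sampler2D DiffuseMapSampler;
sampler2D SpecularMapSampler;
sampler2D GlossMapSampler;

float3 ShadePoint(float2 uv, float3 n, float3 l, float3 h)
{
    float3 diffuseColor = tex2D(DiffuseMapSampler, uv).rgb;   // diffuse color map
    float3 specColor = tex2D(SpecularMapSampler, uv).rgb;     // specular color map
    float m = tex2D(GlossMapSampler, uv).r * 128.0;           // gloss map scales m
    float3 diffuse = diffuseColor * saturate(dot(n, l));
    float3 specular = specColor * pow(saturate(dot(n, h)), m);
    return diffuse + specular;
}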

Alpha Mapping
- Alpha values allow for interesting effects
- Decaling is when you apply a texture that is mostly transparent to a (usually already textured) surface
- Cutouts can be used to give the impression of a much more complex underlying polygon
- 1-bit alpha doesn't require sorting
- Cutouts are not always convincing from every angle
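A minimal sketch of a 1-bit cutout in HLSL: clip() discards the fragment entirely when its argument is negative, which is why no back-to-front sorting is needed (sampler name is illustrative):

sampler2D TextureSampler;

float4 CutoutPixelShader(float2 texCoord : TEXCOORD0) : COLOR0
{
    float4 texColor = tex2D(TextureSampler, texCoord);
    clip(texColor.a - 0.5);   // discard fragments below the alpha threshold
    return texColor;
}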

Student Lecture: Bump Mapping

Bump Mapping

Bump mapping
- Bump mapping refers to a wide range of techniques designed to increase small scale detail
- Most bump mapping is implemented per-pixel in the pixel shader
- The 3D effects of bump mapping are greater than textures alone, but less than full geometry

Macro, meso, and micro
- Macro-geometry is made up of vertices and triangles
  - Limbs and head of a body
- Micro-geometry consists of characteristics shaded in the pixel shader, often with texture maps
  - Smoothness (specular color and m parameter) based on microscopic smoothness of a material
- Meso-geometry is the stuff in between: too complex for macro-geometry but large enough to change over several pixels
  - Wrinkles
  - Folds
  - Seams
- Bump mapping techniques are primarily concerned with mesoscale effects

Blinn's methods
- James Blinn proposed the offset vector bump map or offset map
  - Stores b_u and b_v values at each texel, giving the amount that the normal should be changed at that point
- Another method is a heightfield, a grayscale image that gives the varying heights of a surface
  - Normal changes can be computed from the heightfield
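A hedged sketch of computing the normal change from a heightfield with central differences (sampler and parameter names are made up; tex2Dlod samples an explicit mip level so the lookups also work outside a pixel shader):

sampler2D HeightSampler;

float3 NormalFromHeightfield(float2 uv, float2 texelSize, float strength)
{
    float hL = tex2Dlod(HeightSampler, float4(uv - float2(texelSize.x, 0), 0, 0)).r;
    float hR = tex2Dlod(HeightSampler, float4(uv + float2(texelSize.x, 0), 0, 0)).r;
    float hD = tex2Dlod(HeightSampler, float4(uv - float2(0, texelSize.y), 0, 0)).r;
    float hU = tex2Dlod(HeightSampler, float4(uv + float2(0, texelSize.y), 0, 0)).r;
    // Central differences approximate the slope in u and v;
    // the slopes tilt a tangent-space normal away from straight up
    return normalize(float3((hL - hR) * strength, (hD - hU) * strength, 1.0));
}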

Normal maps
- The results are the same, but these kinds of deformations are usually stored in normal maps
- Normal maps give the full 3-component normal change
- Normal maps can be in world space (uncommon)
  - Only usable if the object never moves
- Or object space
  - Requires the object only to undergo rigid body transforms
- Or tangent space
  - Relative to the surface, can assume positive z
- Lighting and the surface have to be in the same space to do shading
- Filtering normal maps is tricky
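A hedged sketch of shading with a tangent-space normal map: the sampled normal is unpacked from [0, 1] to [-1, 1] and rotated into world space with the tangent-bitangent-normal (TBN) basis. All names are illustrative:

sampler2D NormalMapSampler;

float3 WorldNormalFromMap(float2 uv, float3 tangent, float3 bitangent, float3 normal)
{
    // Unpack: the texture stores each component remapped into [0, 1]
    float3 n = tex2D(NormalMapSampler, uv).xyz * 2.0 - 1.0;
    // The rows of the TBN matrix take tangent space to world space
    float3x3 tbn = float3x3(normalize(tangent), normalize(bitangent), normalize(normal));
    return normalize(mul(n, tbn));
}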

Parallax mapping
- Bump mapping doesn't change what can be seen, just the normal
- High enough bumps should block each other
- Parallax mapping approximates the part of the image you should see by moving from the height back to the view vector and taking the value at that point
- The final point used is:
  p_adj = p + (h · v_xy) / v_z
- Here p is the original texture coordinate, h is the height stored at p, and v_xy and v_z are the components of the normalized view vector in tangent space

Parallax mapping continued
- At shallow viewing angles, the previous approximation can look bad
- A small change results in a big texture change
- To improve the situation, the offset is limited (by not scaling by the z component)
- It flattens the bumpiness at shallow angles, but it doesn't look crazy
- New equation:
  p_adj = p + h · v_xy
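A hedged sketch of both variants in HLSL, assuming the view vector has already been transformed into tangent space (the sampler and HeightScale parameter are illustrative assumptions):

sampler2D HeightSampler;
float HeightScale;

float2 ParallaxUV(float2 uv, float3 viewTS, bool limitOffset)
{
    float h = tex2D(HeightSampler, uv).r * HeightScale;
    if (limitOffset)
        return uv + h * viewTS.xy;          // offset limiting: no divide by z
    return uv + h * viewTS.xy / viewTS.z;   // standard parallax offset
}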

Relief mapping
- The weakness of parallax mapping is that it can't tell where it first intersects the heightfield
- Samples are made along the view vector into the heightfield
- Three different research groups proposed the idea at the same time, all with slightly different techniques for doing the sampling
- There is still active research here
- Polygon boundaries are still flat in most models
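A hedged sketch of the sampling idea: step along the view ray in texture space until the ray dips below the stored height. This is a fixed-step linear search; real implementations typically refine the hit afterward with a binary search. Names are illustrative:

sampler2D HeightSampler;
float HeightScale;

float2 ReliefMappedUV(float2 uv, float3 viewTS, int numSteps)
{
    // March from the surface down into the heightfield
    float2 uvStep = -viewTS.xy / viewTS.z * HeightScale / numSteps;
    float rayDepth = 0.0;
    float depthStep = 1.0 / numSteps;
    [loop]
    for (int i = 0; i < numSteps; i++)
    {
        // Stored height is in [0, 1]; treat 1 as the original surface
        float surfaceDepth = 1.0 - tex2Dlod(HeightSampler, float4(uv, 0, 0)).r;
        if (rayDepth >= surfaceDepth)
            break;                           // first intersection found
        uv += uvStep;
        rayDepth += depthStep;
    }
    return uv;
}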

Heightfield texturing
- Yet another possibility is to change vertex positions based on texture values
- This is called displacement mapping
- With the geometry shader, new vertices can be created on the fly
- Occlusion, self-shadowing, and realistic outlines are possible and fast
- Unfortunately, collision detection becomes more difficult
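A hedged sketch of the displacement step as it might run in a vertex shader (tex2Dlod is required there because vertex shaders have no automatic mip selection; the sampler and DisplacementScale parameter are illustrative):

sampler2D HeightSampler;
float DisplacementScale;

float4 DisplacePosition(float4 position, float3 normal, float2 texCoord)
{
    float height = tex2Dlod(HeightSampler, float4(texCoord, 0, 0)).r;
    // Push the vertex out along its normal by the stored height
    position.xyz += normalize(normal) * height * DisplacementScale;
    return position;   // then transform by World, View, Projection as usual
}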

Upcoming

Next time…
- Radiometry
- Photometry
- Colorimetry
- BRDFs

Reminders
- Start reading Chapter 7
- Finish Project 2, due on Friday