
Week 8 - Monday

 What did we talk about last time?
 Workday
 Before that:
 Image texturing
▪ Magnification
▪ Minification
 Mipmapping
 Summed area tables

 Typically a chain of mipmaps is created, each half the size of the previous
 That's why graphics cards like square power-of-two textures
 Often the filtered version is made with a box filter, but better filters exist
 The trick is figuring out which mipmap level to use
 The level d can be computed based on the change in (u, v) relative to a change in screen (x, y)
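As a sketch of how d might be computed, assuming screen-space derivatives of the texel coordinates are available and using the common longer-gradient rule (the function name and clamping are illustrative, not from any particular API):

```python
import math

def mipmap_level(du_dx, dv_dx, du_dy, dv_dy, max_level):
    """Pick a mipmap level d from texel-space derivatives.

    The footprint of a screen pixel in texture space is approximated by
    the longer of the two gradient vectors; d = log2 of that length.
    """
    rho = max(math.hypot(du_dx, dv_dx), math.hypot(du_dy, dv_dy))
    d = math.log2(max(rho, 1.0))  # rho < 1 means magnification: use level 0
    return min(d, max_level)
```

A footprint that covers about two texels per pixel in each direction gives d = 1, i.e., the half-size mipmap.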

 One way to improve quality is to interpolate the filtered (u, v) texel values from the nearest two d levels
 The choice of d can be adjusted by a level of detail bias term, which may vary with the kind of texture being used

 Sometimes we are magnifying in one axis of the texture and minifying in the other
 Summed area tables are another method to reduce the resulting overblurring
 They sum up the relevant pixel values in the texture
 They work by precomputing sums that cover all possible rectangles, so any axis-aligned rectangle can be averaged with a few lookups
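A minimal Python sketch of a summed area table (names are illustrative): building the table is one pass over the texture, and any rectangle average afterward costs at most four lookups, regardless of rectangle size:

```python
def build_sat(tex):
    """tex: 2D list of values. Returns sat where sat[y][x] is the sum of
    tex over rows 0..y and columns 0..x, inclusive."""
    h, w = len(tex), len(tex[0])
    sat = [[0.0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0.0
        for x in range(w):
            row_sum += tex[y][x]
            sat[y][x] = row_sum + (sat[y - 1][x] if y > 0 else 0.0)
    return sat

def box_average(sat, x0, y0, x1, y1):
    """Average over the inclusive rectangle [x0..x1] x [y0..y1] in O(1)."""
    total = sat[y1][x1]
    if x0 > 0:
        total -= sat[y1][x0 - 1]
    if y0 > 0:
        total -= sat[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += sat[y0 - 1][x0 - 1]
    return total / ((x1 - x0 + 1) * (y1 - y0 + 1))
```

This is why a long, thin rectangular footprint can be averaged as cheaply as a square one.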

 Summed area tables work poorly for non-rectangular projections into texture space
 Modern hardware uses unconstrained anisotropic filtering
 The shorter side of the projected area determines d, the mipmap index
 The longer side of the projected area is the line of anisotropy
 Multiple samples are taken along this line
 Memory requirements are no greater than regular mipmapping

 Image textures are the most common, but 3D volume textures can be used
 These textures store data in a (u, v, w) coordinate space
 Even volume textures can be mipmapped
 Quadrilinear interpolation!
 In practice, volume textures are usually used for fog, smoke, or explosions
 3D effects that are inconsistent over the volume

 A cube map is a kind of texture map with 6 faces
 Cube maps are used to texture surfaces based on direction
 They are commonly used in environment mapping
 A ray is made from the center of the cube out to the surface
 The component with the largest magnitude selects which of the 6 faces to use
 The other two components are used for the (u, v) coordinates
 Cube maps can cause awkward seams when jumping between faces
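A minimal Python sketch of the face selection and (u, v) derivation; the face labels and sign conventions here are one plausible choice, not those of any particular graphics API:

```python
def cubemap_lookup(x, y, z):
    """Map a direction (x, y, z) to (face, u, v) with u, v in [0, 1].

    The component with the largest magnitude picks the face; the other
    two components, divided by that magnitude, become (u, v)."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, sc, tc, ma = ('+X' if x > 0 else '-X'), (-z if x > 0 else z), -y, ax
    elif ay >= az:
        face, sc, tc, ma = ('+Y' if y > 0 else '-Y'), x, (z if y > 0 else -z), ay
    else:
        face, sc, tc, ma = ('+Z' if z > 0 else '-Z'), (x if z > 0 else -x), -y, az
    u = (sc / ma + 1) / 2  # remap [-1, 1] to [0, 1]
    v = (tc / ma + 1) / 2
    return face, u, v
```

Directions straddling an edge (e.g., x and z nearly equal) flip between faces, which is the source of the seam artifacts mentioned above.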

 You will never need to worry about this in this class, but texture memory space is a huge problem
 There are many different caching strategies, similar to the ones used for RAM:
 Least Recently Used (LRU): Swap out the least recently used texture; very commonly used
 Most Recently Used (MRU): Swap out the most recently used texture; use only during thrashing
 Prefetching can be useful to maintain consistent frame rates
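The LRU strategy above can be sketched with an ordered dictionary; `TextureCache`, its capacity-in-textures model, and the `load` callback are hypothetical simplifications, not how driver-level caching actually works:

```python
from collections import OrderedDict

class TextureCache:
    """Minimal LRU texture cache sketch: evicts the least recently used
    texture when the number of resident textures exceeds capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.resident = OrderedDict()  # texture id -> texel data, LRU first

    def bind(self, tex_id, load):
        """Return texel data, loading (and possibly evicting) on a miss.
        `load` is a hypothetical callback that fetches the texture."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark as most recently used
        else:
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)  # evict the LRU texture
            self.resident[tex_id] = load(tex_id)
        return self.resident[tex_id]
```

Swapping `popitem(last=False)` for `popitem(last=True)` would give the MRU policy instead.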

 JPEG and PNG are common compression techniques for regular images
 In graphics hardware, these are too complicated to be decoded on the fly
 That's why the finished SharpDX projects have pre-processed .tkb files
 Most DirectX texture compression divides textures into 4 x 4 tiles
 Two 16-bit RGB values are recorded for each tile
 Each texel uses 2 bits to select one of the two colors or two interpolated values between them
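The 4 x 4 tile scheme described above can be sketched as follows, assuming RGB565 endpoint colors and the 16 two-bit selectors packed into a 32-bit integer with texel 0 in the low bits; the real BC1/DXT1 format also has a punch-through alpha mode (signaled by endpoint ordering) that this sketch ignores:

```python
def decode565(c):
    """Expand a 16-bit RGB565 color to 8-bit-per-channel RGB."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_bc1_tile(c0, c1, selectors):
    """Decode a 4x4 tile into 16 RGB texels.

    Each texel's 2-bit selector picks one of: endpoint 0, endpoint 1,
    or one of two interpolated colors between them."""
    e0, e1 = decode565(c0), decode565(c1)
    palette = [
        e0,
        e1,
        tuple((2 * a + b) // 3 for a, b in zip(e0, e1)),  # 2/3 toward e0
        tuple((a + 2 * b) // 3 for a, b in zip(e0, e1)),  # 2/3 toward e1
    ]
    return [palette[(selectors >> (2 * i)) & 3] for i in range(16)]
```

Note the compression ratio this buys: 64 bits per tile instead of 384 bits for 16 uncompressed RGB565 texels.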

 Ericsson texture compression (ETC) is used in OpenGL
 It breaks texels into 2 x 4 blocks with a single color
 It uses per-pixel luminance information to add detail to the blocks
 Normal maps (normals stored as textures) allow for interesting compression approaches
 Only the x and y components are needed, since the z component can be calculated
 The x and y can then be stored using the BC5 format for two channels of color data
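Recovering the z component from the stored x and y is a one-liner, assuming the normal is unit length and tangent-space z is non-negative:

```python
import math

def reconstruct_normal(x, y):
    """Rebuild a unit tangent-space normal from its stored x and y.

    Since x^2 + y^2 + z^2 = 1 and tangent-space z points away from the
    surface, z = sqrt(1 - x^2 - y^2)."""
    z2 = max(0.0, 1.0 - x * x - y * y)  # clamp against rounding error
    return (x, y, math.sqrt(z2))
```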

 A procedural texture is made by computing a function of u and v instead of looking up a texel in an image
 Noise functions are often used to give an appearance of randomness
 Volume textures can be generated on the fly
 Values can be returned based on distance to certain feature points (redder colors near heat, for example)
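A minimal value-noise sketch in the spirit of the noise functions mentioned above: a function of (u, v) that needs no stored image. The hash constants are arbitrary choices, and real engines typically use Perlin or simplex noise instead:

```python
import math

def value_noise(u, v, seed=0):
    """Smoothly interpolated pseudo-random values on a unit lattice."""

    def hash2(ix, iy):
        # Arbitrary integer hash mapping a lattice point to [0, 1].
        h = (ix * 374761393 + iy * 668265263 + seed * 1442695041) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF

    def smooth(t):
        # Smoothstep fade so the gradient vanishes at cell boundaries.
        return t * t * (3 - 2 * t)

    iu, iv = math.floor(u), math.floor(v)
    fu, fv = smooth(u - iu), smooth(v - iv)
    a = hash2(iu, iv)
    b = hash2(iu + 1, iv)
    c = hash2(iu, iv + 1)
    d = hash2(iu + 1, iv + 1)
    top = a + (b - a) * fu
    bottom = c + (d - c) * fu
    return top + (bottom - top) * fv
```

Summing several octaves of this at doubling frequencies gives the familiar fractal "cloud" look.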

 Textures don't have to be static
 The application can alter them over time
 Alternatively, u and v values can be remapped to make the texture appear to move
 Matrix transformations can be used for zoom, rotation, shearing, etc.
 Video textures can be used to play back a movie in a texture
 Blending between textures can allow an object to transform like a chameleon
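Remapping u and v over time can be sketched as a small per-frame transform; the parameter names and defaults here are illustrative, and a real implementation would express this as a texture matrix applied in the shader:

```python
import math

def animate_uv(u, v, t, scroll=(0.1, 0.0), angle_per_sec=0.5, zoom=1.0):
    """Remap (u, v) at time t: rotate about the texture center, zoom,
    then scroll, wrapping the result back into [0, 1)."""
    a = angle_per_sec * t
    cu, cv = u - 0.5, v - 0.5  # rotate about the center (0.5, 0.5)
    ru = (cu * math.cos(a) - cv * math.sin(a)) * zoom
    rv = (cu * math.sin(a) + cv * math.cos(a)) * zoom
    return ((ru + 0.5 + scroll[0] * t) % 1.0,
            (rv + 0.5 + scroll[1] * t) % 1.0)
```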

 The lighting we have discussed is based on material properties
 Diffuse color
 Specular color
 Smoothness coefficient m
 A texture can be used to modify these values on a per-pixel basis
 An ordinary image texture can be considered a diffuse color map
 One that affects specular colors is a specular color map (usually grayscale)
 One that affects m is a gloss map

 Alpha values allow for interesting effects
 Decaling is when you apply a texture that is mostly transparent to a (usually already textured) surface
 Cutouts can be used to give the impression of a much more complex underlying polygon
 1-bit alpha doesn't require sorting
 Cutouts are not always convincing from every angle

 Bump mapping refers to a wide range of techniques designed to increase small-scale detail
 Most bump mapping is implemented per-pixel in the pixel shader
 The 3D effects of bump mapping are greater than textures alone, but less than full geometry

 Macro-geometry is made up of vertices and triangles
 Limbs and head of a body
 Micro-geometry consists of characteristics shaded in the pixel shader, often with texture maps
 Smoothness (specular color and m parameter) based on the microscopic smoothness of a material
 Meso-geometry is the stuff in between: too complex for macro-geometry but large enough to change over several pixels
 Wrinkles
 Folds
 Seams
 Bump mapping techniques are primarily concerned with mesoscale effects

 James Blinn proposed the offset vector bump map or offset map
 It stores b_u and b_v values at each texel, giving the amount that the normal should be changed at that point
 Another method is a heightfield, a grayscale image that gives the varying heights of a surface
 Normal changes can be computed from the heightfield
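Computing normal changes from a heightfield can be sketched with central differences; the `scale` parameter (which exaggerates or flattens the bumps) and the edge clamping are illustrative choices:

```python
import math

def normal_from_heightfield(height, x, y, scale=1.0):
    """Estimate the surface normal at texel (x, y) of a 2D heightfield
    (a list of rows) from the local slope, clamping at the edges."""
    h, w = len(height), len(height[0])
    x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
    y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
    dhdx = (height[y][x1] - height[y][x0]) / (x1 - x0) if x1 > x0 else 0.0
    dhdy = (height[y1][x] - height[y0][x]) / (y1 - y0) if y1 > y0 else 0.0
    # The unnormalized normal tilts against the slope in each direction.
    nx, ny, nz = -scale * dhdx, -scale * dhdy, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)
```

Running this over every texel is essentially how a heightfield is baked into a tangent-space normal map.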

 The results are the same, but these kinds of deformations are usually stored in normal maps
 Normal maps give the full 3-component normal change
 Normal maps can be in world space (uncommon)
 Only usable if the object never moves
 Or object space
 Requires the object only to undergo rigid-body transforms
 Or tangent space
 Relative to the surface; can assume positive z
 Lighting and the surface have to be in the same space to do shading
 Filtering normal maps is tricky
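Bringing a tangent-space normal into the same space as the lighting is a change of basis using the surface's tangent, bitangent, and geometric normal (the TBN basis); a minimal sketch, assuming all three basis vectors are already in world space:

```python
def tangent_to_world(n_ts, tangent, bitangent, normal):
    """Transform a tangent-space normal into world space.

    Treats (tangent, bitangent, normal) as the columns of a 3x3 matrix
    and multiplies it by the sampled normal n_ts."""
    return tuple(
        n_ts[0] * tangent[i] + n_ts[1] * bitangent[i] + n_ts[2] * normal[i]
        for i in range(3)
    )
```

The alternative, transforming the light and view vectors into tangent space instead, does the same shading with the spaces swapped.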

 Bump mapping doesn't change what can be seen, just the normal
 High enough bumps should block each other
 Parallax mapping approximates the part of the surface you should see by offsetting the texture coordinate along the view vector according to the height at that point
 The final point used is p_adj = p + (h · v_xy) / v_z, where p is the original texture coordinate, h is the height there, and v_xy and v_z are the components of the tangent-space view vector
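A sketch of the parallax offset, with the offset-limited variant as an option; it assumes a tangent-space view vector with positive z (pointing from the surface toward the eye), and the function name is illustrative:

```python
def parallax_offset(u, v, h, view, limit=False):
    """Shift texture coordinates (u, v) by the sampled height h along
    the tangent-space view vector view = (vx, vy, vz), vz > 0.

    With limit=True, the divide by vz is skipped (offset limiting),
    which flattens bumps at grazing angles instead of smearing them."""
    vx, vy, vz = view
    if limit:
        return u + h * vx, v + h * vy
    return u + h * vx / vz, v + h * vy / vz
```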

 At shallow viewing angles, the previous approximation can look bad
 A small change results in a big texture change
 To improve the situation, the offset is limited (by not scaling by the z component)
 This flattens the bumpiness at shallow angles, but it doesn't look crazy
 New equation: p_adj = p + h · v_xy

 The weakness of parallax mapping is that it can't tell where the view ray first intersects the heightfield
 Instead, samples are made along the view vector into the heightfield
 Three different research groups proposed the idea at the same time, all with slightly different techniques for doing the sampling
 There is still active research here
 Polygon boundaries are still flat in most models
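The sampling along the view vector can be sketched as a fixed-step march through the heightfield, one of the simpler variants of the techniques above; `sample_height` is a hypothetical callback, and the step count and depth conventions (0 at the surface, 1 at the deepest point) are illustrative:

```python
def march_heightfield(u, v, view, sample_height, steps=32):
    """Step along the tangent-space view vector, returning the first
    (u, v) at which the ray dips below the sampled heightfield.

    view = (vx, vy, vz) with vz > 0; sample_height(u, v) returns a
    depth in [0, 1]."""
    vx, vy, vz = view
    du, dv = (vx / vz) / steps, (vy / vz) / steps  # uv advance per step
    depth_step = 1.0 / steps
    depth = 0.0
    while depth < 1.0:
        if sample_height(u, v) <= depth:  # ray has gone below the surface
            return u, v
        u, v = u + du, v + dv
        depth += depth_step
    return u, v
```

The research variants differ mainly in how they refine the hit point, e.g., by binary search or by cone-stepping instead of fixed steps.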

 Yet another possibility is to change vertex position based on texture values
 Called displacement mapping
 With the geometry shader, new vertices can be created on the fly
 Occlusion, self-shadowing, and realistic outlines are possible and fast
 Unfortunately, collision detection becomes more difficult

 Radiometry
 Photometry
 Colorimetry
 BRDFs

 Start reading Chapter 7
 Finish Project 2
 Due on Friday