1
CS559: Computer Graphics. Lecture 20: Project 2 Review and Texture Mapping. Li Zhang, Spring 2010. Many slides from Ravi Ramamoorthi (Columbia Univ.), Greg Humphreys (UVA), and Rosalee Wolfe's (DePaul) tutorial on teaching texture mapping visually.
2
Sample Code by Chi Man Liu
3
Other basic features How to change speed? – Add a slider with a callback to change timePerFrame How to orient a car? How to draw trees? – A triangle + a rectangle – A cone + a cylinder Using gluQuadrics
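One hedged way to build such a tree with GLU quadrics (a sketch with illustrative names and dimensions, not the project's sample code): a cylinder for the trunk and a cone (a cylinder with top radius 0) for the foliage.
#include <GL/glu.h>

void drawTree(float trunkHeight, float trunkRadius, float coneHeight, float coneRadius) {
    GLUquadric* quad = gluNewQuadric();
    glPushMatrix();
    glRotatef(-90.0f, 1.0f, 0.0f, 0.0f);          // gluCylinder extrudes along +z; point it up (+y)
    glColor3f(0.5f, 0.35f, 0.1f);
    gluCylinder(quad, trunkRadius, trunkRadius, trunkHeight, 16, 1);   // trunk
    glTranslatef(0.0f, 0.0f, trunkHeight);
    glColor3f(0.1f, 0.6f, 0.1f);
    gluCylinder(quad, coneRadius, 0.0f, coneHeight, 16, 1);            // cone: top radius 0
    glPopMatrix();
    gluDeleteQuadric(quad);
}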
4
Piecewise Cubics [Figure: a closed curve through control points P1, P2, P3, P4.]
5
Piecewise Cubics
Task 1: draw the whole curve
DrawCubic(P4,P1,P2,P3); DrawCubic(P1,P2,P3,P4); DrawCubic(P2,P3,P4,P1); DrawCubic(P3,P4,P1,P2);
DrawCubic(Q1,Q2,Q3,Q4): for (t = 0; t < 1; t += 0.01) { A = computePoint(Q1,Q2,Q3,Q4, t); B = computePoint(Q1,Q2,Q3,Q4, t + 0.01); drawLine(A, B); }
Task 2: find where the train is at time t:
if (0 <= t < 1) return computePoint(P4,P1,P2,P3, t);
if (1 <= t < 2) return computePoint(P1,P2,P3,P4, t - 1);
if (2 <= t < 3) return computePoint(P2,P3,P4,P1, t - 2);
if (3 <= t < 4) return computePoint(P3,P4,P1,P2, t - 3);
Fine with Hermite, Catmull-Rom, and Cardinal splines. How about Bezier?
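One plausible implementation of computePoint() for the Catmull-Rom case (a sketch, not the course's sample code): treat the four arguments as consecutive control points and evaluate the segment that runs from q2 to q3.
struct Pt { float x, y, z; };
static Pt operator+(Pt a, Pt b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Pt operator-(Pt a, Pt b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Pt operator*(Pt a, float s) { return { a.x * s, a.y * s, a.z * s }; }

// Catmull-Rom segment between q2 and q3, with q1 and q4 as neighbors; t in [0, 1].
Pt computePoint(Pt q1, Pt q2, Pt q3, Pt q4, float t) {
    float t2 = t * t, t3 = t2 * t;
    return (q2 * 2.0f
          + (q3 - q1) * t
          + (q1 * 2.0f - q2 * 5.0f + q3 * 4.0f - q4) * t2
          + (q2 * 3.0f - q1 - q3 * 3.0f + q4) * t3) * 0.5f;
}
With this convention, DrawCubic(P4,P1,P2,P3) draws the piece between P1 and P2, which matches the wraparound calls above.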
6
Arc-Length Parameterization: Arbitrary curves? [Figure: the same control points P1-P4.]
7
Arc-length parameterization of a freeform curve. [Figure: a curve in the (x, y) plane; the free parameter t runs over [0, 1] while the arc length s runs over [0, L].]
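A sketch of one common table-based approach (an assumption about the implementation, not the handout's required method): sample the curve densely, accumulate chord lengths to build a t-to-s table, then invert s back to t by a table lookup. CurveFn stands for any callable that evaluates the track at parameter t, for instance a wrapper around computePoint above; Pt is the same minimal struct as in the earlier sketch.
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { float x, y, z; };

struct ArcTable { std::vector<float> t, s; };    // parallel arrays: parameter and cumulative length

template <class CurveFn>                          // CurveFn: float t -> Pt on the whole track
ArcTable buildArcTable(CurveFn curve, int samples) {
    ArcTable tab;
    Pt prev = curve(0.0f);
    float len = 0.0f;
    for (int i = 0; i <= samples; ++i) {
        float ti = float(i) / float(samples);
        Pt p = curve(ti);
        float dx = p.x - prev.x, dy = p.y - prev.y, dz = p.z - prev.z;
        len += std::sqrt(dx * dx + dy * dy + dz * dz);   // chord-length approximation of arc length
        tab.t.push_back(ti);
        tab.s.push_back(len);
        prev = p;
    }
    return tab;
}

// Map a target arc length s in [0, L] back to the curve parameter t.
float arcLengthToParam(const ArcTable& tab, float s) {
    for (std::size_t i = 1; i < tab.s.size(); ++i)
        if (tab.s[i] >= s) {
            float f = (s - tab.s[i - 1]) / (tab.s[i] - tab.s[i - 1]);   // interpolate inside the bin
            return tab.t[i - 1] + f * (tab.t[i] - tab.t[i - 1]);
        }
    return 1.0f;
}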
8
Correct Orientation in 3D Define and interpolate up vector
9
Implement simple physics Energy conservation
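A hedged sketch of the energy-conservation idea: the train's speed at height h follows from (1/2)v^2 + g*h = constant. maxHeight and baseSpeed are illustrative parameters, not values from the project handout.
#include <algorithm>
#include <cmath>

// Speed of the train at height h if it just maintains baseSpeed at the track's highest point.
float speedFromEnergy(float h, float maxHeight, float baseSpeed, float g = 9.8f) {
    float e = 0.5f * baseSpeed * baseSpeed + g * maxHeight;   // fixed total energy per unit mass
    float v2 = 2.0f * (e - g * h);
    return std::sqrt(std::max(v2, 0.0f));                     // clamp against numerical wiggle
}
Each frame, advance the train by speed * dt in arc length s, then convert s back to t with an arc-length table like the sketch above.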
10
Multiple Cars Each has its own parameter t, assuming arc- length parameterization
11
Hack Shadow: [Figure: a point light at (xL, yL, zL) projects each object point (x, y, z) onto the ground to form a fake shadow.]
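A minimal sketch of that projection, assuming the ground is the plane y = 0 (coordinate names follow the figure; nothing here is from the sample code):
// Project vertex (x, y, z) away from the light at (xL, yL, zL) onto y = 0.
void projectToGround(float xL, float yL, float zL,
                     float x, float y, float z,
                     float& sx, float& sz) {
    float t = yL / (yL - y);      // parameter where the light->vertex ray meets y = 0 (needs yL > y)
    sx = xL + t * (x - xL);
    sz = zL + t * (z - zL);
}
// Draw the flattened geometry at (sx, epsilon, sz) in a dark, blended color;
// the small epsilon lifts it off the ground to avoid z-fighting.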
12
Train smoke? Balls moving upward and dissipating
13
Texture Mapping Important topic: nearly all objects textured – Wood grain, faces, bricks and so on – Adds visual detail to scenes Polygonal model With surface texture
14
Adding Visual Detail Basic idea: use images instead of more polygons to represent fine scale color variation
15
Parameterization: geometry + image (texture map) = textured surface. Q: How do we decide where on the geometry each color from the image should go?
16
Option: Varieties of mappings [Paul Bourke]
17
Option: unfold the surface [Piponi2000]
18
Option: make an atlas [Sander2001] (charts, atlas, surface)
19
Outline Types of mappings Interpolating texture coordinates Broader use of textures
20
How to map object to texture? To each vertex (x,y,z in object coordinates), must associate 2D texture coordinates (s,t) So texture fits “nicely” over object
21
Implementing texture mapping A texture lives in its own abstract image coordinates parameterized by (u,v) in the range ([0..1], [0..1]): It can be wrapped around many different surfaces: Note: if the surface moves/deforms, the texture goes with it.
22
How to map object to texture? To each vertex (x,y,z in object coordinates), must associate 2D texture coordinates (s,t) So texture fits “nicely” over object
23
Planar mapping Like projections, drop z coord (u,v) = (x/W,y/H) Problems: what happens near silhouettes?
24
Cylindrical Mapping Cylinder: r, θ, z with (u,v) = (θ/(2π),z) – Note seams when wrapping around (θ = 0 or 2π)
25
Basic procedure First, map (square) texture to basic map shape Then, map basic map shape to object – Or vice versa: Object to map shape, map shape to square Usually, this is straightforward – Maps from square to cylinder, plane, … – Maps from object to these are simply coordinate transform
26
Spherical Mapping Convert to spherical coordinates: use latitude/long. – Singularities at north and south poles
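For concreteness, here is a hedged sketch of the cylindrical and spherical coordinate assignments described above; it assumes the object is centered at the origin and that zMin/zMax bound it along the cylinder axis.
#include <cmath>

const float kPi = 3.14159265358979f;

// Cylindrical mapping: angle around the axis -> u, height along the axis -> v.
void cylindricalUV(float x, float y, float z, float zMin, float zMax, float& u, float& v) {
    float theta = std::atan2(y, x);          // [-pi, pi]
    u = (theta + kPi) / (2.0f * kPi);        // wrap the angle into [0, 1]; expect a seam at theta = +-pi
    v = (z - zMin) / (zMax - zMin);
}

// Spherical mapping: longitude -> u, latitude -> v (singular at the poles).
void sphericalUV(float x, float y, float z, float& u, float& v) {
    float r = std::sqrt(x * x + y * y + z * z);
    float theta = std::atan2(y, x);          // longitude
    float phi = std::acos(z / r);            // colatitude: 0 at the north pole, pi at the south
    u = (theta + kPi) / (2.0f * kPi);
    v = 1.0f - phi / kPi;
}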
27
Cube Mapping
29
Piecewise Mapping From Steve Marschner
30
Slides from Leonard McMillan
31
Outline Types of projections Interpolating texture coordinates Broader use of textures
32
1st idea: Gouraud interp. of texcoords, using barycentric coordinates
33
1st idea: Gouraud interp. of texcoords Scan line
34
Artifacts McMillan’s demo of this is at http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide05.html Another example: http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide06.html What artifacts do you see? Why? Hint: the problem is in interpolating parameters
35
Interpolating Parameters The problem turns out to be fundamental to interpolating parameters in screen space – Uniform steps in screen space ≠ uniform steps in world space Texture image
36
Slides from Jingyi Yu Linear Interpolation in Screen Space Compare linear interpolation in screen space. Without loss of generality, let’s assume that the viewport is located 1 unit away from the center of projection, i.e., the image plane is at z = 1.
37
Linear Interpolation in 3-Space to interpolation in 3-space: Slides from Jingyi Yu
38
How to Make Them Mesh Still need to scan convert in screen space... so we need a mapping from t values to s values. We know that all the points on the 3-space edge project onto our screen-space line. Thus we can set up the following equality: and solve for s in terms of t, giving: Unfortunately, at this point in the pipeline (after projection) we no longer have z1 and z2 lingering around (Why? Efficiency: we don’t want to compute 1/z all the time). However, we do have w1 = 1/z1 and w2 = 1/z2. Slides from Jingyi Yu
39
Interpolating Parameters We can now use this expression for s to interpolate arbitrary parameters, such as texture indices (u, v), over our 3-space triangle. This is accomplished by substituting our solution for s given t into the parameter interpolation. Therefore, if we premultiply all parameters that we wish to interpolate in 3-space by their corresponding w value and add a new plane equation to interpolate the w values themselves, we can interpolate the numerators and denominator in screen space. We then need to perform a divide at each step to map the screen-space interpolants to their corresponding 3-space values. This is a simple modification to the triangle rasterizer that we developed in class. Slides from Jingyi Yu
40
1st idea: Gouraud interp. of texcoords Scan line Replace I with uw, vw, and w, then compute (uw/w, vw/w)
41
1st idea: Gouraud interp. of texcoords Scan line Do the same thing for point b. From a and b, interpolate for s.
42
Perspective-Correct Interpolation In short… – Rather than interpolating u and v directly, interpolate u/z and v/z These do interpolate correctly in screen space Also need to interpolate z and multiply per-pixel – Problem: we don’t keep z anymore – Solution: we do keep w = 1/z – So… interpolate uw, vw, and w, and compute u = uw/w and v = vw/w for each pixel This unfortunately involves a divide per pixel http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
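A small sketch of that per-scanline recipe (illustrative names; not the course rasterizer): each endpoint carries (u*w, v*w, w) with w = 1/z, those values are interpolated linearly in screen space, and a divide per pixel recovers (u, v).
void perspectiveCorrectSpan(int xStart, int xEnd,
                            float uwStart, float vwStart, float wStart,
                            float uwEnd,   float vwEnd,   float wEnd) {
    for (int x = xStart; x <= xEnd; ++x) {
        float f  = (xEnd == xStart) ? 0.0f : float(x - xStart) / float(xEnd - xStart);
        float uw = uwStart + f * (uwEnd - uwStart);   // linear in screen space
        float vw = vwStart + f * (vwEnd - vwStart);
        float w  = wStart  + f * (wEnd  - wStart);
        float u  = uw / w;                            // the per-pixel divide
        float v  = vw / w;
        // look up the texel at (u, v) and shade the pixel here
        (void)u; (void)v;
    }
}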
43
Texture Mapping Linear interpolation of texture coordinates Correct interpolation with perspective divide Hill Figure 8.42 http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
44
Why don’t we notice? Traditional screen-space Gouraud shading is wrong. However, you usually will not notice because the transition in colors is very smooth (and we don't know what the right color should be anyway; all we care about is a pretty picture). There are some cases where the errors in Gouraud shading become obvious. When do we notice? – When switching between different levels-of-detail representations – At "T" joints.
45
Dealing with Incorrect Interpolation You can reduce the perceived artifacts of non-perspective correct interpolation by subdividing the texture-mapped triangles into smaller triangles (why does this work?). But, fundamentally the screen-space interpolation of projected parameters is inherently flawed. http://groups.csail.mit.edu/graphics/classes/6.837/F98/Lecture21/Slide15.html
46
Outline Types of mappings Interpolating texture coordinates Texture Resampling Broader use of textures
47
Texture Map Filtering Naive texture mapping aliases badly Look familiar? int uval = round(u * W); int vval = round(v * H); int pix = texture.getPixel(uval, vval); Nearest Neighbor Sampling
48
Texture Map Filtering Naive texture mapping aliases badly Look familiar? int uval = round(u * W); int vval = round(v * H); int pix = texture.getPixel(uval, vval); Actually, each pixel maps to a region in texture – |PIX| < |TEX| Easy: interpolate (bilinear) between texel values – |PIX| > |TEX| Hard: average the contribution from multiple texels – |PIX| ~ |TEX| Still need interpolation!
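For the |PIX| < |TEX| case, bilinear filtering is a weighted average of the four surrounding texels. A minimal sketch, reusing the slide's hypothetical Texture/getPixel interface:
#include <algorithm>
#include <cmath>

float bilinearSample(const Texture& texture, float u, float v, int W, int H) {
    float fu = u * W - 0.5f, fv = v * H - 0.5f;        // put sample points at texel centers
    int i0 = (int)std::floor(fu), j0 = (int)std::floor(fv);
    float a = fu - i0, b = fv - j0;                    // fractional offsets within the texel cell
    int i1 = std::min(i0 + 1, W - 1), j1 = std::min(j0 + 1, H - 1);
    i0 = std::max(i0, 0); j0 = std::max(j0, 0);
    return (1 - a) * (1 - b) * texture.getPixel(i0, j0)
         + a       * (1 - b) * texture.getPixel(i1, j0)
         + (1 - a) * b       * texture.getPixel(i0, j1)
         + a       * b       * texture.getPixel(i1, j1);
}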
49
Mipmap Let d = |PIX| be a measure of pixel size (in texels). Option 1: use the longer edge of the quadrilateral formed by the pixel's cell to approximate the pixel's coverage. Option 2: use the max of |du/dx|, |du/dy|, |dv/dx|, |dv/dy|. Then take the logarithm to get the level d, and sample (u, v, d) across levels d = 0, 1, 2, 3, … using tri-linear interpolation. What’s the memory overhead?
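A hedged sketch of the level selection (option 2 above), with illustrative parameter names:
#include <algorithm>
#include <cmath>

// Pixel footprint in texels -> mipmap level d, clamped to the pyramid.
float mipLevel(float dudx, float dudy, float dvdx, float dvdy, int texSize, int numLevels) {
    float footprint = std::max(std::max(std::fabs(dudx), std::fabs(dudy)),
                               std::max(std::fabs(dvdx), std::fabs(dvdy))) * texSize;
    float d = std::log2(std::max(footprint, 1.0f));    // "then take logarithm"
    return std::min(d, float(numLevels - 1));
}
// Trilinear lookup: bilinear samples at floor(d) and ceil(d), blended by the fractional part of d.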
50
Storing MIP Maps One convenient method of storing a MIP map is shown below. (It also nicely illustrates the 1/3 overhead of maintaining the MIP map.)
51
Nearest Neighbor Sampling Mipmap Sampling
52
Ripmap Using trilinear => quadrilinear. Sample(u, v, du, dv). What’s the memory overhead?
53
Summed Area Table What’s the memory overhead?
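A summed-area table stores, at each entry, the sum of all texels above and to the left; any axis-aligned box average (i.e., a box filter over the pixel's footprint) then costs four lookups. A minimal sketch:
#include <vector>

std::vector<std::vector<float>> buildSAT(const std::vector<std::vector<float>>& tex) {
    int H = (int)tex.size(), W = (int)tex[0].size();
    std::vector<std::vector<float>> sat(H, std::vector<float>(W, 0.0f));
    for (int j = 0; j < H; ++j)
        for (int i = 0; i < W; ++i)
            sat[j][i] = tex[j][i]
                      + (i > 0 ? sat[j][i - 1] : 0.0f)
                      + (j > 0 ? sat[j - 1][i] : 0.0f)
                      - (i > 0 && j > 0 ? sat[j - 1][i - 1] : 0.0f);
    return sat;
}

// Average over the inclusive box [i0, i1] x [j0, j1] (a box filter over the footprint).
float boxAverage(const std::vector<std::vector<float>>& sat, int i0, int j0, int i1, int j1) {
    float sum = sat[j1][i1]
              - (i0 > 0 ? sat[j1][i0 - 1] : 0.0f)
              - (j0 > 0 ? sat[j0 - 1][i1] : 0.0f)
              + (i0 > 0 && j0 > 0 ? sat[j0 - 1][i0 - 1] : 0.0f);
    return sum / float((i1 - i0 + 1) * (j1 - j0 + 1));
}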
54
Nearest Neighbor Sampling Mipmap Sampling Summed Area Table
55
Summed-Area Tables How much storage does a summed-area table require? Does it require more or less work per pixel than a MIP map? What sort of low-pass filter does a summed-area table implement? No filtering / MIP mapping / Summed-area table
56
For other filtering methods, see Real-Time Rendering, chapter 6.
57
Outline Types of mappings Interpolating texture coordinates Texture Resampling Texture mapping OpenGL Broader use of textures
58
Slides from Jingyi Yu Simple OpenGL Example Specify a texture coordinate at each vertex (s, t) Canonical coordinates where s and t are between 0 and 1 public void Draw() { glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glLoadIdentity(); glTranslated(centerx, centery, depth); glMultMatrixf(Rotation); : // Draw Front of the Cube glEnable(GL_TEXTURE_2D); glBegin(GL_QUADS); glTexCoord2d(0, 1); glVertex3d( 1.0, 1.0, 1.0); glTexCoord2d(1, 1); glVertex3d(-1.0, 1.0, 1.0); glTexCoord2d(1, 0); glVertex3d(-1.0,-1.0, 1.0); glTexCoord2d(0, 0); glVertex3d( 1.0,-1.0, 1.0); glEnd(); glDisable(GL_TEXTURE_2D); : glFlush(); } glTexCoord works like glColor
59
Initializing Texture Mapping
static GLubyte image[64][64][4];
static GLuint texName;
void init(void) {
  glClearColor(0.0, 0.0, 0.0, 0.0);
  glShadeModel(GL_FLAT);
  glEnable(GL_DEPTH_TEST);
  // load in or generate image; …
  glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
  glGenTextures(1, &texName);
  glBindTexture(GL_TEXTURE_2D, texName);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, image);
}
(The second argument of glTexImage2D, 0, is the level index in the pyramid.)
60
OpenGL Texture Peculiarities The width and height of Textures in OpenGL must be powers of 2 The parameter space of each dimension of a texture ranges from [0,1) regardless of the texture’s actual size. The behavior of texture indices outside of the range [0,1) is determined by the texture wrap options. glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP); // Draw Front of the Cube glEnable(GL_TEXTURE_2D); glBegin(GL_QUADS); glTexCoord2d(-1, 2); glVertex3d( 1.0, 1.0, 1.0); glTexCoord2d(2, 2); glVertex3d(-1.0, 1.0, 1.0); glTexCoord2d(2, -1); glVertex3d(-1.0,-1.0, 1.0); glTexCoord2d(-1, -1); glVertex3d( 1.0,-1.0, 1.0); glEnd(); glDisable(GL_TEXTURE_2D);
61
http://www.xmission.com/~nate/tutors.html
62
Slides from Jingyi Yu Simple OpenGL Example Specify a texture coordinate at each vertex (s, t) Canonical coordinates where s and t are between 0 and 1 public void Draw() { glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glLoadIdentity(); glTranslated(centerx, centery, depth); glMultMatrixf(Rotation); : // Draw Front of the Cube glEnable(GL_TEXTURE_2D); glBegin(GL_QUADS); glTexCoord2d(0, 1); glVertex3d( 1.0, 1.0, 1.0); glTexCoord2d(1, 1); glVertex3d(-1.0, 1.0, 1.0); glTexCoord2d(1, 0); glVertex3d(-1.0,-1.0, 1.0); glTexCoord2d(0, 0); glVertex3d( 1.0,-1.0, 1.0); glEnd(); glDisable(GL_TEXTURE_2D); : glFlush(); }
63
OpenGL Texture Peculiarities The width and height of Textures in OpenGL must be powers of 2 The parameter space of each dimension of a texture ranges from [0,1) regardless of the texture’s actual size. The behavior of texture indices outside of the range [0,1) is determined by the texture wrap options. glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP); // Draw Front of the Cube glEnable(GL_TEXTURE_2D); glBegin(GL_QUADS); glTexCoord2d(-1, 2); glVertex3d( 1.0, 1.0, 1.0); glTexCoord2d(2, 2); glVertex3d(-1.0, 1.0, 1.0); glTexCoord2d(2, -1); glVertex3d(-1.0,-1.0, 1.0); glTexCoord2d(-1, -1); glVertex3d( 1.0,-1.0, 1.0); glEnd(); glDisable(GL_TEXTURE_2D);
64
http://www.xmission.com/~nate/tutors.html
65
Initializing Texture Mapping Generate an artificial pattern or load in an image
#define checkImageWidth 64
#define checkImageHeight 64
static GLubyte checkImage[checkImageHeight][checkImageWidth][4];
void makeCheckImage(void) {
  int i, j, c;
  for (i = 0; i < checkImageHeight; i++) {
    for (j = 0; j < checkImageWidth; j++) {
      c = ((((i&0x8)==0)^((j&0x8)==0)))*255;
      checkImage[i][j][0] = (GLubyte) c;
      checkImage[i][j][1] = (GLubyte) c;
      checkImage[i][j][2] = (GLubyte) c;
      checkImage[i][j][3] = (GLubyte) 255;
    }
  }
}
66
Initializing Texture Mapping
static GLuint texName;
void init(void) {
  glClearColor(0.0, 0.0, 0.0, 0.0);
  glShadeModel(GL_FLAT);
  glEnable(GL_DEPTH_TEST);
  makeCheckImage();
  glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
  glGenTextures(1, &texName);
  glBindTexture(GL_TEXTURE_2D, texName);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, checkImage);
}
(The 0 passed to glTexImage2D is the level index in the pyramid.)
67
Initializing Texture Mapping Red book chapter on Texture Mapping, Example 9-1. All Red book example source code can be found at http://www.opengl.org/resources/code/samples/redbook/ Course Tutorial 10: http://pages.cs.wisc.edu/~cs559-1/Tutorial10.htm
68
Slides from Jingyi Yu OpenGL Mipmap Incorporating MIPmapping into OpenGL applications is surprisingly easy. The gluBuild2DMipmaps() utility routine will automatically construct a mipmap from a given texture buffer. It will filter the texture using a simple box filter and then subsample it by a factor of 2 in each dimension. It repeats this process until one of the texture’s dimensions is 1. Each texture ID can have multiple levels associated with it. GL_LINEAR_MIPMAP_LINEAR trilinearly interpolates between texture indices and MIPmap levels. Other options include GL_NEAREST_MIPMAP_NEAREST, GL_NEAREST_MIPMAP_LINEAR, and GL_LINEAR_MIPMAP_NEAREST.
// Boilerplate texture setup code
glTexImage2D(GL_TEXTURE_2D, 0, 4, texWidth, texHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
gluBuild2DMipmaps(GL_TEXTURE_2D, 4, texWidth, texHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
OpenGL also provides a facility for specifying the MIPmap image at each level using multiple calls to the glTexImage*D() function. This approach provides more control over filtering in the MIPmap construction.
69
Other Issues with Textures Tedious to specify texture coordinates for every triangle Textures are attached to the geometry Can't use just any image as a texture – the "texture" can't have projective distortions Reminder: linear interpolation in image space is not equivalent to linear interpolation in 3-space (this is why we need "perspective-correct" texturing). The converse is also true, which makes it hard to use pictures as textures.
70
Projective Textures Treat the texture as a light source (like a slide projector) No need to specify texture coordinates explicitly A good model for shading variations due to illumination (cool spotlights) A fair model for view-dependent reflectance (can use pictures)
71
The Mapping Process
During the illumination process: for each vertex of the triangle (in world or lighting space), compute the ray from the projective texture's origin to the point, and compute the homogeneous texture coordinate [t_i, t_j, t]^T.
During scan conversion (in projected screen space): interpolate all three texture coordinates in 3-space (premultiply by the vertex's w), do the normalization at each rendered pixel (i = t_i / t, j = t_j / t), and access the projected texture.
This is the same process, albeit with an additional transform, as perspective-correct texture mapping. Thus, we can do it for free! Almost.
72
Another Frame: "Texture Space" OpenGL is able to insert this extra projection transformation for textures by including another matrix stack, called GL_TEXTURE. The transform we want is: This matrix undoes the world-to-eye transform on the MODELVIEW matrix stack, so the projective texture can be specified in world coordinates. (Note: if you specify your lights in eye space, then this matrix is the identity.) This matrix positions the projector in the world, much like the viewing matrix positions the eye within the world. (Hint: you can use gluLookAt() to set this up if you want.) This matrix specifies the frustum of the projector. It is a non-affine projection matrix; you can use any of the projection transformations to establish it, such as glFrustum(), glOrtho(), or gluPerspective(). This extra matrix maps from normalized device coordinates ranging over [-1,1] to valid texture coordinates ranging over [0,1].
73
OpenGL Example Here is a code fragment implementing projective textures in OpenGL:
// The following information is associated with the current active texture.
// Basically, the first group of settings says that we will not be supplying texture coordinates.
// Instead, they will be automatically established based on the vertex coordinates in EYE-SPACE
// (after application of the MODELVIEW matrix).
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, (int) GL_EYE_LINEAR);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, (int) GL_EYE_LINEAR);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, (int) GL_EYE_LINEAR);
glTexGeni(GL_Q, GL_TEXTURE_GEN_MODE, (int) GL_EYE_LINEAR);
// These calls initialize the texture-mapping function to identity. We will be using
// the texture matrix stack to establish this mapping indirectly.
float [] eyePlaneS = { 1.0f, 0.0f, 0.0f, 0.0f };
float [] eyePlaneT = { 0.0f, 1.0f, 0.0f, 0.0f };
float [] eyePlaneR = { 0.0f, 0.0f, 1.0f, 0.0f };
float [] eyePlaneQ = { 0.0f, 0.0f, 0.0f, 1.0f };
glTexGenfv(GL_S, GL_EYE_PLANE, eyePlaneS);
glTexGenfv(GL_T, GL_EYE_PLANE, eyePlaneT);
glTexGenfv(GL_R, GL_EYE_PLANE, eyePlaneR);
glTexGenfv(GL_Q, GL_EYE_PLANE, eyePlaneQ);
74
OpenGL Example (cont) The following code fragment is inserted into Draw() or Display():
if (projTexture) {
  glEnable(GL_TEXTURE_2D);
  glEnable(GL_TEXTURE_GEN_S);
  glEnable(GL_TEXTURE_GEN_T);
  glEnable(GL_TEXTURE_GEN_R);
  glEnable(GL_TEXTURE_GEN_Q);
  projectTexture();
}
// … draw everything that the texture is projected onto
if (projTexture) {
  glDisable(GL_TEXTURE_2D);
  glDisable(GL_TEXTURE_GEN_S);
  glDisable(GL_TEXTURE_GEN_T);
  glDisable(GL_TEXTURE_GEN_R);
  glDisable(GL_TEXTURE_GEN_Q);
}
75
OpenGL Example (cont) Here is where the extra "texture" transformation on the vertices is inserted:
private void projectTexture() {
  glMatrixMode(GL_TEXTURE);
  glLoadIdentity();
  glTranslated(0.5, 0.5, 0.5);    // Scale and bias the [-1,1] NDC values
  glScaled(0.5, 0.5, 0.5);        // to the [0,1] range of the texture map
  gluPerspective(15, 1, 5, 7);    // projector "projection" and view matrices
  gluLookAt(lightPosition[0], lightPosition[1], lightPosition[2], 0,0,0, 0,1,0);
  glMatrixMode(GL_MODELVIEW);
}
76
Outline Types of mappings Interpolating texture coordinates Texture Resampling Broader use of textures
77
Texture animation Basic idea: treat texture coordinates like colors – moving a water texture to simulate flow – zooming, rotating, and shearing the image on a surface – crossed trees using alpha – fading in and fading out (blending marble to skin) with multi-texturing
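One hedged way to get the water-flow effect in fixed-function OpenGL (an illustrative sketch, not the course's sample code): transform the texture coordinates each frame on the GL_TEXTURE matrix stack.
#include <GL/gl.h>
#include <cmath>

void animateWaterTexture(float timeSeconds) {
    glMatrixMode(GL_TEXTURE);
    glLoadIdentity();
    glTranslatef(0.05f * timeSeconds, 0.0f, 0.0f);              // slide the texture in s to suggest flow
    glRotatef(2.0f * std::sin(timeSeconds), 0.0f, 0.0f, 1.0f);  // a small wobble (rotation/shear feel)
    glMatrixMode(GL_MODELVIEW);                                 // restore the stack other code expects
}
// Call once per frame before drawing the textured water surface.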
78
The basic idea is that, instead of using a texture to change a color component in the illumination equation, we access a texture to modify the surface normal.
79
Today
Bezier patches; subdivision surfaces (butterfly, Catmull-Clark)
Particle systems
Fractals
Multi-texturing? For material lighting?
Environment maps
Bump maps, displacement maps
Textures for lighting and shadows: lighting with texture, shadow mapping, projector texture mapping?
1D textures -> rain, for example
Material mapping
Mipmapping/ripmapping
80
Bump map Displacement map Environment map
81
Shadow map
82
What if (u,v) falls outside of [0,1]? Repeat, mirror, clamp, border
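A small sketch of what those four behaviors do to a single coordinate (software versions, for intuition; in OpenGL they correspond to GL_REPEAT, GL_MIRRORED_REPEAT, GL_CLAMP, and GL_CLAMP_TO_BORDER):
#include <algorithm>
#include <cmath>

float wrapRepeat(float u) { return u - std::floor(u); }                 // GL_REPEAT

float wrapMirror(float u) {                                             // GL_MIRRORED_REPEAT
    float f = std::fabs(u - 2.0f * std::floor(u * 0.5f) - 1.0f);        // triangle wave with period 2
    return 1.0f - f;
}

float wrapClamp(float u) { return std::min(std::max(u, 0.0f), 1.0f); }  // GL_CLAMP

bool useBorderColor(float u) { return u < 0.0f || u > 1.0f; }           // GL_CLAMP_TO_BORDER: outside
                                                                        // [0,1] the border color is used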
83
Outline Types of projections Interpolating texture coordinates Broader use of textures
84
Environment Maps Instead of using transformed vertices to index the projected texture, we can use transformed surface normals to compute indices into the texture map. These sorts of mappings can be used to simulate reflections and other shading effects. This approach is not completely accurate. It assumes that all reflected rays begin from the same point, and that all objects in the scene are the same distance from that point.
85
Sphere Mapping Basics OpenGL provides special support for a particular form of normal mapping called sphere mapping. It maps the normals of the object to the corresponding normal of a sphere. It uses a texture map of a sphere viewed from infinity to establish the color for the normal.
86
Sphere Mapping Mapping the normal to a point on the sphere. [Figure: normals n1, n2, n3 and view vectors v1, v2, v3 give reflection vectors r1, r2, r3, which index the sphere map spanning (-1,-1) to (1,1).] Recall: r = 2(n·v)n - v
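A hedged sketch of that lookup in code; the formula below follows the slide's r = 2(n·v)n - v convention (GL's own GL_SPHERE_MAP texgen computes a similar quantity in eye space), and all names are illustrative.
#include <cmath>

// Unit normal n and unit vector v from the surface point toward the eye -> sphere-map (s, t).
void sphereMapUV(float nx, float ny, float nz,
                 float vx, float vy, float vz,
                 float& s, float& t) {
    float ndotv = nx * vx + ny * vy + nz * vz;
    float rx = 2.0f * ndotv * nx - vx;   // r = 2(n.v)n - v
    float ry = 2.0f * ndotv * ny - vy;
    float rz = 2.0f * ndotv * nz - vz;
    float m = 2.0f * std::sqrt(rx * rx + ry * ry + (rz + 1.0f) * (rz + 1.0f));
    s = rx / m + 0.5f;                   // maps the reflection disc into [0,1]^2
    t = ry / m + 0.5f;
}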
87
OpenGL Code Example This was a very special-purpose hack in OpenGL; however, we have it to thank for a lot of the flexibility in today’s graphics hardware… this hack was the genesis of programmable vertex shading.
// this gets inserted where the texture is created
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, (int) GL_SPHERE_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, (int) GL_SPHERE_MAP);
// Add this before rendering any primitives
if (texWidth > 0) {
  glEnable(GL_TEXTURE_2D);
  glEnable(GL_TEXTURE_GEN_S);
  glEnable(GL_TEXTURE_GEN_T);
}
88
What’s the Best Map? A sphere map is not the only representation choice for environment maps. There are alternatives, with more uniform sampling properties, but they require different normal-to-texture mapping functions.
89
Texture Mapping Applications Modulation, light maps Bump mapping Displacement mapping Illumination or Environment Mapping Procedural texturing And many more
90
Modulation textures Map texture values to scale factor Wood texture
91
Bump Mapping Texture = change in surface normal! Sphere w/ diffuse texture; swirly bump map; sphere w/ diffuse texture and swirly bump map
92
Displacement Mapping
93
Illumination Maps Quake introduced illumination maps or light maps to capture lighting effects in video games Texture map: Texture map + light map: Light map
94
Cube map
95
Sphere map
96
Shadow map
97
Environment Maps Images from Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments Gene Miller and C. Robert Hoffman SIGGRAPH 1984 “Advanced Computer Graphics Animation” Course Notes Terminator
98
Interpolating Parameters Perspective foreshortening is not getting applied to our interpolated parameters – Parameters should be compressed with distance – Linearly interpolating them in screen-space doesn’t do this Correct Mapping Results – http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
99
Bump Mapping Textures can be used to alter the surface normals of an object. This does not change the actual shape of the surface -- we are only shading it as if it were a different shape! This technique is called bump mapping. The texture map is treated as a single-valued height function. The value of the function is not actually used, just its partial derivatives. The partial derivatives tell how to alter the true surface normal at each point on the surface to make the object appear as if it were deformed by the height function.
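A hedged sketch of that normal perturbation, assuming a per-point tangent frame (T, B, N) and finite-difference partials of the height map; all names here are illustrative.
#include <cmath>

// hL, hR, hD, hU: height samples left/right/below/above the current texel.
void bumpNormal(float hL, float hR, float hD, float hU, float bumpScale,
                const float T[3], const float B[3], const float N[3], float out[3]) {
    float dhdu = (hR - hL) * 0.5f;        // finite-difference partial derivatives of the height map
    float dhdv = (hU - hD) * 0.5f;
    for (int k = 0; k < 3; ++k)           // N' ~ N - s*(dh/du)*T - s*(dh/dv)*B
        out[k] = N[k] - bumpScale * (dhdu * T[k] + dhdv * B[k]);
    float len = std::sqrt(out[0]*out[0] + out[1]*out[1] + out[2]*out[2]);
    for (int k = 0; k < 3; ++k) out[k] /= len;   // renormalize before lighting
}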
100
More Bump Map Examples Since the actual shape of the object does not change, the silhouette edge of the object will not change. Bump Mapping also assumes that the Illumination model is applied at every pixel (as in Phong Shading or ray tracing).
101
One More Bump Map Example
102
Displacement Mapping Texture maps can be used to actually move surface points. This is called displacement mapping. How is this fundamentally different than bump mapping? The geometry must be displaced before visibility is determined. Is this easily done in the graphics pipeline? In a ray-tracer?
103
Three Dimensional or Solid Textures The textures that we have discussed to this point are two-dimensional functions mapped onto two-dimensional surfaces. Another approach is to consider a texture as a function defined over three-dimensional space. Textures of this type are called solid textures. Solid textures are very effective at representing some types of materials such as marble and wood. Generally, solid textures are defined as procedural functions rather than tabulated or sampled functions as used in 2-D. The approach that we will explore is based on An Image Synthesizer, by Ken Perlin, SIGGRAPH '85. The vase to the right is from this paper.
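As a tiny illustration of the idea (a simple procedural wood-ring function of the object-space point, not Perlin's noise-based marble from the paper above):
#include <cmath>

// Returns a value in [0,1]: light/dark rings around the z axis, evaluated directly at (x, y, z).
float woodRings(float x, float y, float z, float ringsPerUnit = 8.0f) {
    float r = std::sqrt(x * x + y * y) + 0.05f * z;           // slight taper along the "grain" axis
    return 0.5f + 0.5f * std::cos(2.0f * 3.14159265f * ringsPerUnit * r);
}
// Use the returned value to blend between a light and a dark wood color;
// no (u, v) assignment is needed because the texture is evaluated at the 3-D point itself.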
104
Multi-Texturing Shaders will often have uses for multiple textures. How do you bind them? How do you specify multiple texture coordinates for a vertex?
105
Multi-Texture: OpenGL
106
Textures in GLSL Textures are just uniforms of type sampler2D Tell GLSL that you want one of these sampler2Ds to be GL_TEXTUREi by setting the corresponding uniform to i
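A minimal sketch of the application-side binding that rule implies (the program and texture handles and the sampler names "dayMap"/"nightMap" are illustrative):
void bindTwoTextures(GLuint prog, GLuint dayTex, GLuint nightTex) {
    glUseProgram(prog);

    glActiveTexture(GL_TEXTURE0);                                 // texture unit 0
    glBindTexture(GL_TEXTURE_2D, dayTex);
    glUniform1i(glGetUniformLocation(prog, "dayMap"), 0);         // sampler2D dayMap -> unit 0

    glActiveTexture(GL_TEXTURE1);                                 // texture unit 1
    glBindTexture(GL_TEXTURE_2D, nightTex);
    glUniform1i(glGetUniformLocation(prog, "nightMap"), 1);       // sampler2D nightMap -> unit 1
}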
107
Night and Day
108
Night and Day Use one texture for day, one for night Use the same texture coordinates for both textures Components: – Vertex program – Fragment program – OpenGL application
109
Night and Day Vertex Program
110
Night and Day Fragment Program
111
Night and Day
112
Night and Day OpenGL Program
113
Render to Texture GL_EXT_framebuffer_object Render intermediate result into a texture, and then render using the texture
114
Render to Texture Creating a framebuffer object Use a framebuffer object
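A hedged sketch of those two steps with GL_EXT_framebuffer_object (names are illustrative; the color texture colorTex must already exist and match the intended render size):
GLuint makeFBO(GLuint colorTex) {
    GLuint fbo;
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, colorTex, 0);        // color output goes into colorTex
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT) {
        // handle the error: the attachment is incomplete or unsupported
    }
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);                  // back to the window framebuffer
    return fbo;
}
// Usage: bind the FBO, draw the intermediate pass, bind 0 again,
// then bind colorTex as an ordinary texture for the final pass.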
115
Texture Map Filtering Naive texture mapping aliases badly Look familiar? int uval = (int) (u * W); int vval = (int) (v * H); int pix = texture.getPixel(uval, vval); Actually, each pixel maps to a region in texture – |PIX| < |TEX| Easy: interpolate (bilinear) between texel values – |PIX| > |TEX| Hard: average the contribution from multiple texels – |PIX| ~ |TEX| Still need interpolation!
116
Mip Maps Keep textures prefiltered at multiple resolutions – For each pixel, linearly interpolate between two closest levels (e.g., trilinear filtering) – Fast, easy for hardware Why “Mip” maps?
117
MIP-map Example No filtering: MIP-map texturing: AAAAAAAGH Where are my glasses?
119
Outline Types of projections Interpolating texture coordinates Broader use of textures
120
Texture Mapping Applications Modulation, light maps Bump mapping Displacement mapping Illumination or Environment Mapping Procedural texturing And many more
121
Modulation textures Map texture values to scale factor Wood texture
122
Bump Mapping Texture = change in surface normal! Sphere w/ diffuse texture; swirly bump map; sphere w/ diffuse texture and swirly bump map
123
Displacement Mapping
124
Illumination Maps Quake introduced illumination maps or light maps to capture lighting effects in video games Texture map: Texture map + light map: Light map
125
Cube map
126
Sphere map
127
Shadow map
128
Environment Maps Images from Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments Gene Miller and C. Robert Hoffman SIGGRAPH 1984 “Advanced Computer Graphics Animation” Course Notes Terminator
129
Arc-length parameterization of a freeform curve. [Figure: a curve in the (x, y) plane; the free parameter t runs over [0, 1] while the arc length s runs over [0, L].]
130
Interpolating Parameters Perspective foreshortening is not getting applied to our interpolated parameters – Parameters should be compressed with distance – Linearly interpolating them in screen-space doesn’t do this Correct Mapping Results – http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
131
Outline Types of projections Interpolating texture coordinates Broader use of textures
132
1st idea: Gouraud interp. of texcoords Scan line
133
Artifacts McMillan’s demo of this is at http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide05.html Another example: http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide06.html What artifacts do you see? Why? Why not in standard Gouraud shading? Hint: the problem is in interpolating parameters
134
Interpolating Parameters The problem turns out to be fundamental to interpolating parameters in screen space – Uniform steps in screen space ≠ uniform steps in world space Texture image
135
Texture Mapping Linear interpolation of texture coordinates Correct interpolation with perspective divide Hill Figure 8.42
136
Interpolating Parameters Perspective foreshortening is not getting applied to our interpolated parameters – Parameters should be compressed with distance – Linearly interpolating them in screen-space doesn’t do this Correct Mapping Results – http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
137
Why don’t we notice? Traditional screen-space Gouraud shading is wrong. However, you usually will not notice because the transition in colors is very smooth (and we don't know what the right color should be anyway; all we care about is a pretty picture). There are some cases where the errors in Gouraud shading become obvious: when switching between different levels-of-detail representations, and at "T" joints.
138
Dealing with Incorrect Interpolation You can reduce the perceived artifacts of non-perspective correct interpolation by subdividing the texture-mapped triangles into smaller triangles (why does this work?). But, fundamentally the screen-space interpolation of projected parameters is inherently flawed.
139
Texture Map Filtering Naive texture mapping aliases badly Look familiar? int uval = (int) (u * W); int vval = (int) (v * H); int pix = texture.getPixel(uval, vval); Actually, each pixel maps to a region in texture – |PIX| < |TEX| Easy: interpolate (bilinear) between texel values – |PIX| > |TEX| Hard: average the contribution from multiple texels – |PIX| ~ |TEX| Still need interpolation!
140
Mip Maps Keep textures prefiltered at multiple resolutions – For each pixel, linearly interpolate between two closest levels (e.g., trilinear filtering) – Fast, easy for hardware Why “Mip” maps?
141
MIP-map Example No filtering: MIP-map texturing: AAAAAAAGH Where are my glasses?
142
Outline Types of projections Interpolating texture coordinates Broader use of textures
143
Texture Mapping Applications Modulation, light maps Bump mapping Displacement mapping Illumination or Environment Mapping Procedural texturing And many more
144
Modulation textures Map texture values to scale factor Wood texture
145
Bump Mapping Texture = change in surface normal! Sphere w/ diffuse texture; swirly bump map; sphere w/ diffuse texture and swirly bump map
146
Displacement Mapping
147
Illumination Maps Quake introduced illumination maps or light maps to capture lighting effects in video games Texture map: Texture map + light map: Light map
148
Environment Maps Images from Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments Gene Miller and C. Robert Hoffman SIGGRAPH 1984 “Advanced Computer Graphics Animation” Course Notes
149
Arc-length parameterization of a freeform curve. [Figure: a curve in the (x, y) plane; the free parameter t runs over [0, 1] while the arc length s runs over [0, L].]
150
Texture Interpolation Problem Notice that uniform steps on the image plane do not correspond to uniform steps along the edge. Without loss of generality, let’s assume that the viewport is located 1 unit away from the center of projection.
151
Slides from Jingyi Yu Linear Interpolation in Screen Space Compare linear interpolation in screen space
152
Linear Interpolation in 3-Space to interpolation in 3-space: Slides from Jingyi Yu
153
How to Make Them Mesh Still need to scan convert in screen space... so we need a mapping from t values to s values. We know that all the points on the 3-space edge project onto our screen-space line. Thus we can set up the following equality: and solve for s in terms of t, giving: Unfortunately, at this point in the pipeline (after projection) we no longer have z1 and z2 lingering around (Why?). However, we do have w1 = 1/z1 and w2 = 1/z2. Slides from Jingyi Yu
154
Interpolating Parameters We can now use this expression for s to interpolate arbitrary parameters, such as texture indices (u, v), over our 3-space triangle. This is accomplished by substituting our solution for s given t into the parameter interpolation. Therefore, if we premultiply all parameters that we wish to interpolate in 3-space by their corresponding w value and add a new plane equation to interpolate the w values themselves, we can interpolate the numerators and denominator in screen space. We then need to perform a divide at each step to map the screen-space interpolants to their corresponding 3-space values. This is a simple modification to the triangle rasterizer that we developed in class. Slides from Jingyi Yu
155
Perspective-Correct Interpolation Skipping a bit of math to make a long story short… – Rather than interpolating u and v directly, interpolate u/z and v/z These do interpolate correctly in screen space Also need to interpolate z and multiply per-pixel – Problem: we don’t know z anymore – Solution: we do know w = 1/z – So… interpolate uw and vw and w, and compute u = uw/w and v = vw/w for each pixel This unfortunately involves a divide per pixel http://graphics.lcs.mit.edu/classes/6.837/F98/Lecture21/Slide14.html
156
To Do: add Leo's slides (derivation, why important)