Advanced Graphics Algorithms Ying Zhu Georgia State University Texture Mapping

The rasterizer stage Now we enter the rasterizer stage: triangle setup, texture mapping, fog, translucency test, depth buffering, antialiasing

Rasterizer stage

Rasterizer stage

Triangle setup Can be considered a phase between the geometry stage and the rasterizer stage, or part of the rasterizer stage Major tasks: Back-face culling (optional) Scan conversion

Scan conversion Also known as scan-line conversion Up to this point we have been dealing with the vertices of triangles For each triangle, the scan-line conversion operation starts with 3 vertices and returns a set of pixels that fill that triangle First draw the three edges of the triangle, then fill the triangle with horizontal lines Thus the fundamental problem of scan conversion is how to draw lines

Scan-line conversion process Input: 3 vertices, each with an (x, y) coordinate, a depth (z) value, a color, and a texture coordinate (s, t) Output: a set of pixels that fill the triangle, each with an (x, y) coordinate, a depth (z) value, a color, and texture coordinates (s, t)

Scan-line conversion process

Line rasterization

Bresenham Algorithm Bresenham's line algorithm for the 1st octant:

Bresenham Method See textbook or http://www.cs.helsinki.fi/group/goa/mallinnus/lines/bresenh.html for a more detailed discussion
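
The slide above points to external references; the following is a minimal sketch of Bresenham's integer line algorithm for the first octant (0 <= slope <= 1, x0 < x1). The setPixel() routine is a hypothetical frame-buffer write used only for illustration, not an OpenGL call.

void bresenhamFirstOctant(int x0, int y0, int x1, int y1)
{
    int dx = x1 - x0;
    int dy = y1 - y0;
    int d  = 2 * dy - dx;          /* decision variable */
    int y  = y0;
    int x;

    for (x = x0; x <= x1; x++) {
        setPixel(x, y);            /* hypothetical frame-buffer write */
        if (d > 0) {               /* line passes above the midpoint: step up */
            y++;
            d += 2 * (dy - dx);
        } else {
            d += 2 * dy;           /* stay on the same row */
        }
    }
}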

Line drawing demo program DDA and Bresenham algorithm demo program (http://www.siggraph.org/education/materials/HyperGraph/scanline/outprims/drawline_java/drawline.html)

Polygon Rasterization

Color interpolation for pixels Bresenham algorithm returns the (x, y) coordinates of the pixels that form a line How to assign colors to these pixels? Color values are interpolated for each pixel Gouraud shading is performed at this point These values are interpolated using a weighted average of the color values of the two end vertices (e.g. obtained from lighting calculation) Interpolate them along the three edges and then along the horizontal lines

Depth value interpolation for pixels Similarly, depth (Z) values are interpolated for each pixel These values are interpolated using a weighted average of the depth (Z) values of the edge's vertices Interpolate them along the three edges and then along the horizontal lines The depth value of a pixel is its distance from the eye (camera) Will be used in depth test

Color & depth interpolation Interpolate color and depth along edges first Next interpolate color and depth along horizontal lines

Texture coordinates interpolation Similar to color and depth values, the texture coordinates (s, t) are interpolated as well Texture coordinates (s, t) are specified for each vertex in your OpenGL program Will be used in texture mapping process Note that normals are generally not interpolated In the end, each pixel has a 2D window coordinate (x, y) a depth (Z) value (will be used in depth buffer test) a color value (obtained from Gouraud shading) a texture coordinate (will be used in texture mapping)
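
As a rough illustration of the interpolation described in the last few slides, here is a sketch (not OpenGL code) that fills one horizontal span, linearly blending color, depth, and texture coordinates between the two edge intersections. The Attribs struct and writePixel() are assumptions made for this example, and perspective-correct interpolation is ignored for clarity.

typedef struct { float r, g, b, z, s, t; } Attribs;

void fillSpan(int y, int xl, int xr, Attribs left, Attribs right)
{
    int x;
    for (x = xl; x <= xr; x++) {
        /* weight w goes from 0 at the left edge to 1 at the right edge */
        float w = (xr == xl) ? 0.0f : (float)(x - xl) / (float)(xr - xl);
        Attribs p;
        p.r = (1 - w) * left.r + w * right.r;   /* Gouraud-shaded color      */
        p.g = (1 - w) * left.g + w * right.g;
        p.b = (1 - w) * left.b + w * right.b;
        p.z = (1 - w) * left.z + w * right.z;   /* depth, used in depth test */
        p.s = (1 - w) * left.s + w * right.s;   /* texture coordinates       */
        p.t = (1 - w) * left.t + w * right.t;
        writePixel(x, y, p);                    /* hypothetical output       */
    }
}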

Texture Mapping Texture mapping concepts Creating textures Loading textures Texture objects Texture parameterization Texture filtering, wrapping, etc. Advanced texture mapping techniques Multi-texturing Bump mapping Environment mapping Light maps

What is texture mapping? Texture mapping is the method of taking a flat 2D image of what an object's surface looks like and applying that flat image to a 3D computer-generated object, much as you would hang wallpaper on a blank wall. Texture mapping brought computer graphics to a new level of realism. It makes a surface look textured even though geometrically it isn't.

Different types of texture mapping Standard texture mapping: uses images to assign or modulate colors for pixels Environment mapping (reflection mapping): uses a picture of the environment as the texture map; allows simulation of highly specular surfaces Bump mapping: actually a per-pixel lighting technique; save a normal for each pixel in the form of a texture image (called a normal map); at runtime, fetch each pixel's normal from the normal map and do the lighting calculation per pixel

Texture Mapping [Figure: a plain geometric model and the same model texture mapped]

Texture mapping

Environment Mapping

Bump Mapping

Where does texture mapping fit into 3D pipeline? Mapping techniques are implemented at the rasterizer stage of the rendering pipeline After scan conversion and before scissor test, alpha test, depth test, etc. Texture mapping is performed per pixel Efficient because surface detail is added without sending more polygons down the geometry pipeline

Where does texture mapping fit into 3D pipeline?

Texture Mapping Steps Create the texture image (create it procedurally or load it from a file) Declare the texture image (OpenGL) Enable texturing (OpenGL) Specify texture parameters (OpenGL): wrapping, filtering Assign texture coordinates to vertices (OpenGL); choosing a proper mapping function is left to the application

Photo Textures There are lots of free textures on the web: http://astronomy.swin.edu.au/~pbourke/texture/ http://www.3dlinks.com/textures_free.cfm

Other Methods Use the frame buffer as the source of a texture: glCopyTexImage2D() copies the current frame buffer contents into a texture image
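
A minimal sketch of that idea, assuming a texture object named texName has already been created and that we want to capture the lower-left 256 x 256 region of the window:

glBindTexture(GL_TEXTURE_2D, texName);
glCopyTexImage2D(GL_TEXTURE_2D,   /* target                                 */
                 0,               /* mipmap level                           */
                 GL_RGB,          /* internal format                        */
                 0, 0,            /* lower-left corner in the frame buffer  */
                 256, 256,        /* width, height                          */
                 0);              /* border                                 */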

Loading Textures OpenGL does not provide any functions to read image files, so you'll have to write your own texture loader: Load an image file (e.g. JPEG) into memory Define a pointer to the image data Pass it to OpenGL using glTexImage2D() Or use an open-source image library such as DevIL: http://openil.sourceforge.net/
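
A minimal loader sketch, under the assumption that the image is stored as a raw, headerless RGB file whose dimensions are known in advance (a real loader would decode JPEG/PNG with a library instead); the file name and sizes below are only examples:

#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>

GLubyte *loadRawRGB(const char *filename, int width, int height)
{
    FILE *fp = fopen(filename, "rb");
    GLubyte *pixels;
    size_t nbytes = (size_t)width * height * 3;   /* 3 bytes per RGB texel */
    if (fp == NULL) return NULL;
    pixels = (GLubyte *) malloc(nbytes);
    if (pixels == NULL || fread(pixels, 1, nbytes, fp) != nbytes) {
        free(pixels);
        pixels = NULL;                            /* read failed */
    }
    fclose(fp);
    return pixels;
}

/* Usage: hand the pixel pointer to glTexImage2D()          */
/*   GLubyte *img = loadRawRGB("wood256.raw", 256, 256);    */
/*   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,    */
/*                GL_RGB, GL_UNSIGNED_BYTE, img);           */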

Enable Texture Mapping glEnable(GL_TEXTURE_2D) OpenGL supports 1D, 2D, and 3D texture maps Most texture mapping uses 2D textures

Declare a texture image In OpenGL, a texture image is defined as an array of texels (texture elements). For example: GLubyte my_texels[512][512][3]; You can define 1D, 2D, and 3D textures 2D textures are the most frequently used In OpenGL, each texture image is declared for a texture target (e.g. GL_TEXTURE_2D) Use glTexImage2D() to declare a texture image This internally loads the texture image from main memory into texture memory

Declare a texture image
glTexImage2D( target, level, components, w, h, border, format, type, texels );
   target: type of texture, e.g. GL_TEXTURE_2D
   level: mipmap level (discussed later); 0 for the base image
   components: internal format, i.e. color components per texel, e.g. GL_RGBA
   w, h: width and height of the texture image in texels (at least 64 x 64 must be supported)
   border: border width, either 0 or 1 (a border texel helps filtering at edges)
   format: format of the texel data, normally GL_RGBA
   type: data type of the texels, e.g. GL_UNSIGNED_BYTE
   texels: pointer to the texel array
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 512, 0, GL_RGB, GL_UNSIGNED_BYTE, my_texels);

Example
#define checkImageWidth 64
#define checkImageHeight 64
static GLubyte checkImage[checkImageHeight][checkImageWidth][4];

void makeCheckImage(void)
{
   int i, j, c;
   for (i = 0; i < checkImageHeight; i++) {
      for (j = 0; j < checkImageWidth; j++) {
         /* alternate 8x8 blocks of black and white (XOR of block parities) */
         c = (((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255;
         checkImage[i][j][0] = (GLubyte) c;
         checkImage[i][j][1] = (GLubyte) c;
         checkImage[i][j][2] = (GLubyte) c;
         checkImage[i][j][3] = (GLubyte) 255;
      }
   }
}
…
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, checkImage);

OpenGL Texture Dimension Restriction In older OpenGL versions, both dimensions of the texture image you pass to OpenGL using glTexImage2D() must be powers of two Width = 2^n, height = 2^m If not, scale the image dimensions to powers of 2 with gluScaleImage() Otherwise glTexImage2D() will fail quietly All implementations are guaranteed to support texture images of at least 64 x 64 Note: this restriction is removed in the OpenGL 2.0 specification

OpenGL Texture Dimension Restriction If the dimensions of the image are not powers of 2: gluScaleImage( format, w_in, h_in, type_in, *data_in, w_out, h_out, type_out, *data_out ); data_in is the source image data_out is the destination image The image is interpolated and filtered during scaling
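
For example, a sketch that rescales a 300 x 200 RGB image (srcPixels, an assumed pointer to the already loaded image) to 256 x 256 before declaring the texture:

GLubyte *dstPixels = (GLubyte *) malloc(256 * 256 * 3);
gluScaleImage(GL_RGB,
              300, 200, GL_UNSIGNED_BYTE, srcPixels,   /* source image      */
              256, 256, GL_UNSIGNED_BYTE, dstPixels);  /* destination image */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, dstPixels);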

Loading Multiple Textures We often want to use multiple textures and switch between them But we do not want to call glTexImage2D() too many times Each glTexImage2D() call loads a texture image from main memory (on the motherboard) to texture memory (on the graphics card) – quite expensive To solve this problem, OpenGL introduces texture objects Each texture object has a name and is bound to one texture image Keep multiple texture objects in texture memory (if memory capacity permits) Can switch between texture objects

Texture objects To use texture objects, take the following steps: Generate texture object names Bind a texture object, then call glTexImage2D() to declare its texture image (one texture object per image) When you want to use a texture image, call glBindTexture() with the texture object name You may call glBindTexture() several times per frame – much faster than calling glTexImage2D() several times, because no image reloading is needed Clean up texture objects with glDeleteTextures()

Texture objects glGenTextures(GLsizei n, GLuint *textureNames) Generate n currently unused texture object names (nonzero integers) glBindTexture(GLenum target, GLuint textureName) Target: GL_TEXTURE_2D, etc. textureName: obtained from glGenTextures()

Texture objects glBindTexture() does one of three things depending on the value of textureName: If textureName has not been bound before, it creates a new texture object, assigns textureName to it, and makes it current; subsequent glTexImage2D() calls attach an image to this object If textureName has been bound before, it makes that texture object the current active texture If textureName is 0, OpenGL stops using texture objects

Example
// in init() function
static GLuint texName;
…
glGenTextures(1, &texName);
glBindTexture(GL_TEXTURE_2D, texName);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight,
             0, GL_RGBA, GL_UNSIGNED_BYTE, checkImage);
// in display() function

Parameterization Determine the texture coordinates for each vertex The process of mapping the texture image onto geometry

Mapping Functions Consider mapping from texture coordinates to a point on a surface Given a texel (s, t) in the texture image, which surface point should this texel be assigned to? How do we map (s, t) to an (x, y, z)? We appear to need three functions: x = X(s,t) y = Y(s,t) z = Z(s,t) This is usually difficult

Backward Mapping We usually reverse the question Given a vertex (x, y, z) on the 3D model, how to find its texture coordinate (s, t)? How to map a given (x, y, z) to a (s, t)? Need to find two functions: s = S(x,y,z) t = T(x,y,z) Such functions are difficult to find in general Much research has been devoted to this subject
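
There is no general solution, but simple shapes do have closed-form mapping functions. As one illustration (an example chosen here, not from the slides), the classic backward mapping for a sphere of radius r centered at the origin converts a surface point to longitude/latitude and then to (s, t) in [0, 1]:

#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

void sphereMap(float x, float y, float z, float r, float *s, float *t)
{
    *s = (float)(atan2(y, x) / (2.0 * M_PI) + 0.5);  /* longitude -> s */
    *t = (float)(asin(z / r) / M_PI + 0.5);          /* latitude  -> t */
}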

Parameterization Automatic texture coordinate generation is still a difficult problem OpenGL provides some texture coordinate generation functions Helpful in some simple cases, but far from a general solution Developers usually have to figure out texture coordinates themselves Can use tools like Maya or 3DS Max to apply textures to objects Optimal texture coordinate assignments are obtained by trial and error Texture coordinates stored in Maya or 3DS files

Specify texture coordinates Call glTexCoord*() to specify texture coordinates at each vertex [Figure: points a, b, c in texture space at (0.2, 0.8), (0.4, 0.2), (0.8, 0.4) mapped to vertices A, B, C in object space]

Typical Code
glBegin(GL_POLYGON);
   glColor3f(r0, g0, b0);
   glNormal3f(u0, v0, w0);
   glTexCoord2f(s0, t0);
   glVertex3f(x0, y0, z0);
   glColor3f(r1, g1, b1);
   glNormal3f(u1, v1, w1);
   glTexCoord2f(s1, t1);
   glVertex3f(x1, y1, z1);
   /* ... remaining vertices ... */
glEnd();

Texture coordinates interpolation Remember that texture coordinates for a pixel are computed by interpolating the texture coordinates of a set of vertices during scan conversion This may cause distortions [Figure: a texture stretched over a trapezoid, showing the effects of bilinear interpolation with a good vs. a poor selection of texture coordinates]

Texture Parameters OpenGL has a variety of parameters that determine how the texture is applied Wrapping parameters determine what happens if s and t are outside the [0, 1] range Filter modes allow us to use area averaging instead of point samples Mipmapping allows us to use textures at multiple resolutions Environment parameters determine how texture mapping interacts with shading

Wrapping Mode Clamping: if s,t > 1 use 1, if s,t < 0 use 0 Wrapping: use s,t modulo 1 (integer part ignored) glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP ) glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT ) [Figure: the same texture applied with GL_CLAMP vs. GL_REPEAT wrapping]

Why texture filtering? Eventually texture images will be mapped to screen pixel regions. A texture image may have more texels than there are pixels in the covered screen region: a single pixel then maps to more than one texel. Or a texture image may have fewer texels than the covered pixels: multiple pixels then map to a single texel.

Why texture filtering? If the texture map is too small compared to the pixel area being mapped, the same texel is mapped to adjacent pixels, causing a blockiness effect. If the texture image has more samples than the pixel area it is applied to, multiple texels can map to the same pixel, and the program has to pick one texel. The choice is algorithm-dependent and can result in artifacts such as texture swimming and pixel-popping.

Aliasing and Anti-aliasing The problem we just discussed is part of a bigger problem called aliasing. Aliasing is a fundamental issue in computer graphics. For example, smooth curves and other lines become jagged because the resolution of the graphics device or file is not high enough to represent a smooth curve. Anti-aliasing is the technology that reduces the aliasing effect.

Aliasing and Anti-aliasing Example [Figure: unfiltered vs. filtered texture mapping]

Texture Filtering Texture filtering is one of the techniques to minimize the aliasing effect caused by an insufficient texture-sampling rate. Different texture filtering methods: Point sampling Bilinear filtering Trilinear MIP-mapping Anisotropic filtering

Magnification The magnification technique is used when multiple pixels can map to a single texel, and it maps a single texel to multiple pixels. This happens when you zoom really close into a texture mapped polygon or due to perspective projection.

Minification The minification algorithm is used in the case where multiple texels can map to a single pixel, and it selects the best-fit texel from among the group of texels that could map to the pixel. This happens when you zoom out or due to perspective foreshortening.

Magnification and Minification More than one texel can cover a pixel (minification) or more than one pixel can cover a texel (magnification) Can use point sampling (nearest texel) or linear filtering (2 x 2 filter) to obtain texture values [Figure: texture vs. polygon footprints for magnification and minification]

Nearest Neighbor Interpolation A kind of point sampling method. For each pixel, grabs the texture sample from the texture map that has u, v coordinates that map nearest to the pixel's coordinates (the pixel center), and applies it to the pixel. Pro: requires the least amount of memory bandwidth in terms of the number of texels that need to be read from texture memory (one per pixel). Con: the result often causes artifacts as we discussed above due to insufficient samples (screen pixels) to describe the texture.

Bi-linear Filtering Bilinear filtering reads the four samples nearest to the pixel center and uses a weighted average of those color values as the final texture color value. The weights are based on the distance from the pixel center to the four texel centers. Pro: blurs out a good deal of the texture artifacts seen with point sampling. Con: it's only a four-tap filter working with a single texture map, so its effectiveness is limited.
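
The hardware does this per pixel; purely to illustrate the weighting, here is a software sketch that samples a W x H RGB texel array at a fractional texel position (u, v), assuming 0 <= u < W, 0 <= v < H and GL_REPEAT-style wrapping at the edges:

void bilinearSample(const GLubyte *texels, int W, int H,
                    float u, float v, GLubyte rgb[3])
{
    int   x0 = (int)u,       y0 = (int)v;
    int   x1 = (x0 + 1) % W, y1 = (y0 + 1) % H;    /* wrap like GL_REPEAT */
    float fx = u - x0,       fy = v - y0;          /* fractional offsets  */
    int   c;
    for (c = 0; c < 3; c++) {
        float t00 = texels[(y0 * W + x0) * 3 + c];
        float t10 = texels[(y0 * W + x1) * 3 + c];
        float t01 = texels[(y1 * W + x0) * 3 + c];
        float t11 = texels[(y1 * W + x1) * 3 + c];
        float top = (1 - fx) * t00 + fx * t10;          /* blend along u */
        float bot = (1 - fx) * t01 + fx * t11;
        rgb[c] = (GLubyte)((1 - fy) * top + fy * bot);  /* blend along v */
    }
}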

Bi-linear Filtering OpenGL provides bi-linear filtering.

Filter Modes Modes determined by glTexParameteri( target, type, mode ) glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); Note that linear filtering requires a border of an extra texel for filtering at edges (border = 1)

OpenGL Example
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight,
             0, GL_RGBA, GL_UNSIGNED_BYTE, checkImage);

MIP-Mapping Performing filtering during texture mapping can be expensive. Averaging covered texels can be very expensive For every pixel, might have to visit O(n) texels Can hurt performance A solution is to use pre-filtering of texture before rendering. Mip-mapping is a pre-filtering technique.

MIP-Mapping Basic idea is to make multiple copies of the original texture, and each successive MIP-map is exactly half the resolution of the previous one.

MIP-Mapping Consider this as a kind of 3D texture, wherein you have the typical two coordinates, (u, v), but now a third coordinate, d, is used to measure which MIP-map (or maps) to select, based on which map resolution will most closely match the pixel area to be mapped. As the d coordinate increases, smaller and smaller MIP-maps are used.

How to select d? The derivation of the d coordinate is a bit complicated and implementation dependent. To put it simply, the MIP-map requiring the smallest amount of texture magnification and minification is selected.

Storing MipMaps One convenient way to store mipmaps is to put them in one big image.

MIP-Mapping Methods Bilinear MIP-Mapping: apply bilinear filtering on the selected MIP-map Trilinear MIP-Mapping: uses a weighted average of two bilinear samples from the two MIP-maps nearest to the pixel Anisotropic filtering: supported by newer graphics cards; a more sophisticated method that takes up to 16 samples; best image quality, but uses lots of texture memory

Mipmapped Textures Mipmapping allows for prefiltered texture maps of decreasing resolutions Lessens interpolation errors for smaller textured objects Declare the mipmap level during texture definition: glTexImage2D( GL_TEXTURE_*D, level, … ) The GLU mipmap builder routines will build all the texture levels from a given image: gluBuild2DMipmaps( … ) If used, there is no need to call glTexImage2D() any more.
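
A minimal sketch of that setup, reusing the checkerboard image from the earlier example and requesting trilinear filtering for minification (magnification never uses mipmaps):

glBindTexture(GL_TEXTURE_2D, texName);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
/* builds and loads the whole mipmap pyramid; replaces the glTexImage2D() call */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, checkImageWidth, checkImageHeight,
                  GL_RGBA, GL_UNSIGNED_BYTE, checkImage);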

Example [Figure: the same texture rendered with point sampling, linear filtering, mipmapped point sampling, and mipmapped linear filtering]

Texture environment parameters Controls how the texture color is combined with the shaded color glTexEnv{fi}[v]( GL_TEXTURE_ENV, pname, param ) pname options: GL_TEXTURE_ENV_MODE GL_TEXTURE_ENV_COLOR param options: GL_MODULATE: multiply the texture color with the shaded color GL_BLEND: blend the shaded color with a constant environment color GL_REPLACE and GL_DECAL: use only the texture color See the glTexEnv*() manual page for details
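
For example (a sketch, assuming a texture is already bound): GL_MODULATE keeps the Gouraud-shaded lighting visible through the texture, while GL_BLEND mixes toward a constant color set with GL_TEXTURE_ENV_COLOR.

/* multiply the texel color with the shaded fragment color */
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

/* or: blend toward a constant environment color (red here, as an example) */
GLfloat envColor[4] = { 1.0f, 0.0f, 0.0f, 1.0f };
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, envColor);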

OpenGL Example
#include <GL/glut.h>

/* Create checkerboard texture */
#define checkImageWidth 64
#define checkImageHeight 64
static GLubyte checkImage[checkImageHeight][checkImageWidth][4];
static GLuint texName;

OpenGL Example
void makeCheckImage(void)
{
   int i, j, c;
   for (i = 0; i < checkImageHeight; i++) {
      for (j = 0; j < checkImageWidth; j++) {
         c = (((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255;
         checkImage[i][j][0] = (GLubyte) c;
         checkImage[i][j][1] = (GLubyte) c;
         checkImage[i][j][2] = (GLubyte) c;
         checkImage[i][j][3] = (GLubyte) 255;
      }
   }
}

OpenGL Example
void init(void)
{
   glClearColor(0.0, 0.0, 0.0, 0.0);
   glShadeModel(GL_FLAT);
   glEnable(GL_DEPTH_TEST);
   makeCheckImage();
   glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
   glGenTextures(1, &texName);
   glBindTexture(GL_TEXTURE_2D, texName);

OpenGL Example
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, checkImageWidth, checkImageHeight,
                0, GL_RGBA, GL_UNSIGNED_BYTE, checkImage);
}

OpenGL Example
void display(void)
{
   glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
   glEnable(GL_TEXTURE_2D);
   glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
   glBindTexture(GL_TEXTURE_2D, texName);
   glBegin(GL_QUADS);
   glTexCoord2f(0.0, 0.0); glVertex3f(-2.0, -1.0, 0.0);
   glTexCoord2f(0.0, 1.0); glVertex3f(-2.0, 1.0, 0.0);
   glTexCoord2f(1.0, 1.0); glVertex3f(0.0, 1.0, 0.0);
   glTexCoord2f(1.0, 0.0); glVertex3f(0.0, -1.0, 0.0);

OpenGL Example
   glTexCoord2f(0.0, 0.0); glVertex3f(1.0, -1.0, 0.0);
   /* ... three more texcoord/vertex pairs complete the second quad ... */
   glEnd();
   glutSwapBuffers();
   glDisable(GL_TEXTURE_2D);
}

OpenGL Example
void reshape(int w, int h)
{
   glViewport(0, 0, (GLsizei) w, (GLsizei) h);
   glMatrixMode(GL_PROJECTION);
   glLoadIdentity();
   gluPerspective(60.0, (GLfloat) w / (GLfloat) h, 1.0, 30.0);
   glMatrixMode(GL_MODELVIEW);
   glLoadIdentity();   /* reset before translating so translations don't accumulate */
   glTranslatef(0.0, 0.0, -3.6);
}

OpenGL Example
void keyboard(unsigned char key, int x, int y)
{
   switch (key) {
      case 27:      /* ESC exits the program */
         exit(0);
         break;
      default:
         break;
   }
}

OpenGL Example
int main(int argc, char** argv)
{
   glutInit(&argc, argv);
   glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
   glutInitWindowSize(250, 250);
   glutInitWindowPosition(100, 100);
   glutCreateWindow(argv[0]);
   init();
   glutDisplayFunc(display);
   glutReshapeFunc(reshape);
   glutKeyboardFunc(keyboard);
   glutMainLoop();
   return 0;
}

Putting them together General steps of texture mapping in OpenGL (not necessarily in this order): Create texture objects Set texture filter parameters Set the texture wrap mode Declare the texture image Bind the texture object Enable texturing Set texture environment parameters Supply texture coordinates for vertices (coordinates can also be generated)

Multi-texturing Apply multiple texture images to one object. OpenGL and D3D allow a single vertex to store two or more sets of texture coordinates. Can be used to create effects such as light maps, bump mapping, etc. Hardware support: parallel pixel pipelines, processing multiple texels per pixel per clock.

Example of Multitexturing Keep one texture of the vest, and one texture of a single bullet hole. With multitexturing, you can load a single instance of the bullet hole and then place it multiple times on the main vest texture.

Light Maps Another lighting trick that uses multitexturing (first used in Quake). Combine a base texture and a light map to create elaborate lighting effects. Avoids doing the actual lighting calculations for all the lights in a scene. Used when lights and objects are fixed in space. [Figure: base texture + light map = lit surface]
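
A sketch of light mapping with two texture units (OpenGL 1.3+ or the ARB_multitexture extension); baseTex and lightTex are assumed texture object names, and only the first vertex of the quad is shown:

glActiveTexture(GL_TEXTURE0);                 /* unit 0: base texture  */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, baseTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

glActiveTexture(GL_TEXTURE1);                 /* unit 1: light map     */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, lightTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  /* base * light map */

glBegin(GL_QUADS);
   glMultiTexCoord2f(GL_TEXTURE0, 0.0f, 0.0f);   /* base texture coordinates */
   glMultiTexCoord2f(GL_TEXTURE1, 0.0f, 0.0f);   /* light map coordinates    */
   glVertex3f(-1.0f, -1.0f, 0.0f);
   /* ... remaining three vertices, each with both coordinate sets ... */
glEnd();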

Environment Mapping Allows the surrounding environment to be reflected on an object without modeling the physics. Map the world surrounding an object onto a cube or sphere; this is called an environment map. Project the cube/sphere onto the object.

Environment Mapping During the shading calculation, use the view reflection vector as an index into the texture map Bounce a ray from the viewer off the object (at point P) Intersect the ray with the environment map (the cube) at point E Get the environment map's color at E and illuminate P as if there were a virtual light source at position E You see an image of the environment reflected on shiny surfaces
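
In fixed-function OpenGL this indexing can be approximated with sphere-map texture coordinate generation; a sketch, where envTex is an assumed texture object holding a spherical photograph of the environment:

glBindTexture(GL_TEXTURE_2D, envTex);
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);  /* derive s from the reflection vector */
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);  /* derive t from the reflection vector */
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glEnable(GL_TEXTURE_2D);
/* draw the reflective object as usual; no glTexCoord*() calls are needed */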

Environment Mapping Example

Bump Mapping How do you make a surface look rough? Option 1: model the surface with many small polygons. Option 2: perturb the normal vectors before the shading calculation.

Bump Mapping The surface doesn't actually change, but shading makes it look that way. A bump map fakes small displacements above or below the true surface. Take advantage of multitexturing: one base texture image + one normal map. For the math behind it all, see section 7.8 of Ed Angel's book.
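
To make the per-pixel lighting idea concrete, here is a CPU-side sketch for illustration only (real bump mapping evaluates this per fragment on the GPU via shaders or combiners). It assumes a W-texel-wide normalMap array in which each texel stores a unit normal encoded as RGB in [0, 255]:

float bumpDiffuse(const GLubyte *normalMap, int W,
                  int s, int t, const float L[3] /* unit light direction */)
{
    const GLubyte *p = &normalMap[(t * W + s) * 3];
    float nx = p[0] / 127.5f - 1.0f;     /* decode the normal from the texel */
    float ny = p[1] / 127.5f - 1.0f;
    float nz = p[2] / 127.5f - 1.0f;
    float ndotl = nx * L[0] + ny * L[1] + nz * L[2];
    return ndotl > 0.0f ? ndotl : 0.0f;  /* clamped Lambertian diffuse term */
}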

Bump Mapping Example [Figure: side and front views of a bump-mapped surface]

Summary Texture mapping: texture images, texture objects, texture coordinates, texture filtering, wrapping, and environments Multi-texturing Texture mapping tricks: light maps, environment mapping, bump mapping

Readings OpenGL Programming Guide: chapter 9 Study OpenGL Tutor: texture.exe from http://www.xmission.com/~nate/tutors.html Code reading: checker.c, mipmap.c, model.c, texbind.c, and wrap.c http://www.opengl.org/resources/code/samples/redbook/