
Texture Mapping

Example Mappings

Mapping Techniques Consider the problem of rendering the sphere in the examples. The geometry is very simple, a sphere, but the color changes rapidly across the surface. With the local shading model seen so far, the only place to specify color is at the vertices, so capturing the color detail would require thousands of polygons for a simple shape. The same goes for an orange: a simple shape with complex surface detail. Solution: mapping techniques use simple geometry modified by a mapping of some type.

Modeling an Orange Consider the problem of modeling an orange (the fruit). Start with an orange-colored sphere: too simple. Replace the sphere with a more complex shape: this still does not capture the surface characteristics (small dimples), and it takes too many polygons to model all the dimples.

Modeling an Orange (2) Take a picture of a real orange, scan it, and "paste" it onto a simple geometric model. This process is known as texture mapping. It still might not be sufficient, because the resulting surface will be smooth; to change the local shape we need bump mapping.

Textures The concept is very simple!

Texture Mapping Texture mapping associates the color of a surface point with a color in an image: the texture. Each point on the sphere gets the color of the texture pixel it maps to. The question to address: which point of the texture do we use for a given point on the surface? We establish a mapping from surface points to image points; different mappings are common for different shapes. We will first look at triangles (polygons).

Basic Mapping The texture lives in a 2D space. Parameterize points in the texture with two coordinates, (s,t); these are just what we would call (x,y) for an image, but we wish to avoid confusion with the world (x,y,z). Define the mapping from (x,y,z) in world space to (s,t) in texture space. To find the color in the texture, take an (x,y,z) point on the surface, map it into texture space, and use the result to look up the color of the texture.

Texture Mapping Coordinate systems involved: parametric coordinates, texture coordinates, world coordinates, and window coordinates.

Mapping Functions The basic problem is how to find the maps. Consider mapping from a point on a surface to texture coordinates: we need functions s = s(x, y, z) and t = t(x, y, z). Such functions are difficult to find in general.

Polygon Texture Interpolation For polygons, specify (s,t) coordinates at the vertices, giving s = s(x, y, z) and t = t(x, y, z) at those points, and linearly interpolate the mapping for the other points in world space. Straight lines in world space go to straight lines in texture space.

Barycentric Coordinates Writing a point as P = c1*A + c2*B + (1 - c1 - c2)*C in terms of the triangle vertices A, B, C can be used for checking whether P is in the triangle. It is also useful when computing the texture coordinates and other linear interpolations (e.g., normals). P is in the triangle if c1 > 0, c2 > 0, and c1 + c2 < 1.
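
A minimal C sketch of this idea, interpolating per-vertex texture coordinates with barycentric weights; the triangle, its (s,t) values, and the helper names are made up for illustration:

#include <stdio.h>

/* Hypothetical 2D point type used only for this sketch. */
typedef struct { float x, y; } Vec2;

/* Compute barycentric weights (c1, c2) of P with respect to triangle (A, B, C),
   so that P = c1*A + c2*B + (1 - c1 - c2)*C. */
static void barycentric(Vec2 P, Vec2 A, Vec2 B, Vec2 C, float *c1, float *c2)
{
    float d = (A.x - C.x) * (B.y - C.y) - (B.x - C.x) * (A.y - C.y);
    *c1 = ((P.x - C.x) * (B.y - C.y) - (B.x - C.x) * (P.y - C.y)) / d;
    *c2 = ((A.x - C.x) * (P.y - C.y) - (P.x - C.x) * (A.y - C.y)) / d;
}

int main(void)
{
    Vec2 A = {0, 0}, B = {4, 0}, C = {0, 4};                       /* triangle in screen space */
    Vec2 tA = {0.2f, 0.8f}, tB = {0.4f, 0.2f}, tC = {0.8f, 0.4f};  /* (s,t) at the vertices    */
    Vec2 P = {1, 1};                                               /* point to be textured     */
    float c1, c2, c3, s, t;

    barycentric(P, A, B, C, &c1, &c2);
    c3 = 1.0f - c1 - c2;
    if (c1 > 0 && c2 > 0 && c1 + c2 < 1) {           /* inside-triangle test       */
        s = c1 * tA.x + c2 * tB.x + c3 * tC.x;       /* interpolate texture coords */
        t = c1 * tA.y + c2 * tB.y + c3 * tC.y;
        printf("P maps to (s, t) = (%.3f, %.3f)\n", s, t);
    }
    return 0;
}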

Set Texture Coordinates for Curved Surfaces Assume a curved parametric surface S = {x(u, v), y(u, v), z(u, v)}, with a ≤ u ≤ b and c ≤ v ≤ d, approximated by a polygonal mesh. It is easy to set appropriate texture coordinates for the mesh vertices by mapping the parameter range linearly: u = a → s = 0 and u = b → s = 1, v = c → t = 0 and v = d → t = 1, i.e. s = (u - a)/(b - a) and t = (v - c)/(d - c).

Cylindrical Surfaces Parametric form: for example, a cylinder of radius r and height h, with 0 ≤ u, v ≤ 1, can be written x = r cos(2πu), y = r sin(2πu), z = v·h. Mapping function: s = u, t = v, which wraps the texture once around the cylinder and once along its height.

Spherical Surfaces Parametric form: for example, a sphere of radius r, with 0 ≤ u, v ≤ 1, can be written x = r cos(2πu) sin(πv), y = r sin(2πu) sin(πv), z = r cos(πv). Linear mapping: s = u, t = v. A triangular mapping of (u, v) to (s, t) can also be used.
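
A small C sketch of these parametric forms under the assumptions above (the radius, height, and function names are illustrative, not from the slides); the texture coordinates are simply (s, t) = (u, v):

#include <math.h>

static const float PI = 3.14159265f;

/* Cylinder of radius r and height h: map (u, v) in [0,1]x[0,1] to a surface point. */
static void cylinder_point(float r, float h, float u, float v,
                           float *x, float *y, float *z)
{
    *x = r * cosf(2.0f * PI * u);
    *y = r * sinf(2.0f * PI * u);
    *z = v * h;
}

/* Sphere of radius r: same idea, with v running from pole to pole. */
static void sphere_point(float r, float u, float v,
                         float *x, float *y, float *z)
{
    *x = r * cosf(2.0f * PI * u) * sinf(PI * v);
    *y = r * sinf(2.0f * PI * u) * sinf(PI * v);
    *z = r * cosf(PI * v);
}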

Texture Mapping and Pipelines A 2D texture is stored in texture memory as an n×m array of texture elements (texels). Rasterization converts primitives into fragments (pixels to be displayed), and texture mapping is applied to fragments at this stage: the color of a fragment is determined by looking it up in the texture. Pipeline: geometric processing → rasterization → fragment processing (where textures are applied) → display.

Textures A two-dimensional texture pattern T(s, t) is stored in texture memory as an n×m array of texture elements (texels). Due to the nature of the rendering process, which works on a pixel-by-pixel basis, we are more interested in the inverse map from screen coordinates to texture coordinates; in particular, we want to map pixel domains (areas) on the screen to areas in texture space.

Computing Pixel Color in Texture Mapping Associate the texture with the polygon, map the pixel onto the polygon and then into the texture map, and use a weighted average of the covered texture area to compute the color.

Basic Strategy of OpenGL Texture Mapping Three steps to applying a texture: 1. specify the texture (read or generate the image, assign it to a texture, enable texturing); 2. specify texture parameters (wrapping, filtering); 3. assign texture coordinates to vertices. The proper mapping function is left to the application.
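
A minimal sketch of the three steps in legacy (fixed-function) OpenGL, assuming a GL context already exists and that width, height, and image describe an RGB image loaded elsewhere; the function names are illustrative:

#include <GL/gl.h>

extern int width, height;      /* size of the image, assumed loaded elsewhere */
extern GLubyte *image;         /* RGB image data, assumed loaded elsewhere    */

static GLuint tex;

/* Steps 1 and 2: specify the texture, enable texturing, set parameters. */
static void setup_texture(void)
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, image);
    glEnable(GL_TEXTURE_2D);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}

/* Step 3: assign texture coordinates to vertices while drawing. */
static void draw_textured_quad(void)
{
    glBegin(GL_QUADS);
      glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
      glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
      glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
      glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();
}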

Texture Example The texture in the original slide is a 256 × 256 image that has been mapped to a rectangular polygon viewed in perspective.

2D Texturing Enable 2D texturing with glEnable(GL_TEXTURE_2D); texture mapping is disabled by default. Texture objects store texture data and keep it readily available for use, and many texture objects can be generated. First generate identifiers for texture objects: GLuint texids[n]; glGenTextures(n, texids), where n is the number of texture object identifiers to generate and texids is an array of unsigned integers to hold them. Then bind a texture object as the current texture with glBindTexture(target, identifier), where target can be GL_TEXTURE_1D, GL_TEXTURE_2D, or GL_TEXTURE_3D and identifier is a texture object identifier. Subsequent texture commands, e.g. glTexImage2D(), refer to the bound texture.

Define Image as a Texture Load an image into a 2D texture map with glTexImage2D(GL_TEXTURE_2D, level, internalFormat, width, height, border, format, type, texels), where level is the mipmap level of the texture (for now 0), internalFormat is the number of color components used for the texture (an integer from 1 to 4; 3 for RGB), width and height give the size of the texture map (traditionally a power of 2), border is with (1) or without (0) a border, format is the format of the texture data (e.g. GL_RGB), type is the data type of the texture data (e.g. GL_BYTE), and texels is a pointer to the actual texture image data.
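
A hedged sketch of generating a small checkerboard image in memory and loading it with glTexImage2D; the array size and helper name are illustrative:

#include <GL/gl.h>

#define TEX_SIZE 64

static GLubyte checker[TEX_SIZE][TEX_SIZE][3];   /* RGB texels */

static void make_checker_texture(void)
{
    int i, j;
    for (i = 0; i < TEX_SIZE; i++) {
        for (j = 0; j < TEX_SIZE; j++) {
            /* 8x8 texel squares, alternating black and white */
            GLubyte c = (((i / 8) + (j / 8)) % 2) ? 255 : 0;
            checker[i][j][0] = c;
            checker[i][j][1] = c;
            checker[i][j][2] = c;
        }
    }
    /* level 0, 3 components (RGB), 64x64, no border */
    glTexImage2D(GL_TEXTURE_2D, 0, 3, TEX_SIZE, TEX_SIZE, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, checker);
}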

1D and 3D Textures A 1D texture can be considered a 2D texture with a height of 1; it is often used for drawing color stripes: glTexImage1D(GL_TEXTURE_1D, level, components, width, border, format, type, texels). A 3D texture can be considered a stack of 2D textures; it is often used in visualization applications, e.g. medical imaging: glTexImage3D(GL_TEXTURE_3D, level, components, width, height, depth, border, format, type, texels).
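
For illustration, a small sketch of a 1D stripe texture (the stripe width and colors are made up):

#include <GL/gl.h>

#define STRIPE_WIDTH 32

static GLubyte stripes[STRIPE_WIDTH][3];

static void make_stripe_texture(void)
{
    int i;
    for (i = 0; i < STRIPE_WIDTH; i++) {
        /* red in the first quarter of the stripe, yellow elsewhere */
        stripes[i][0] = 255;
        stripes[i][1] = (i < STRIPE_WIDTH / 4) ? 0 : 255;
        stripes[i][2] = 0;
    }
    glTexImage1D(GL_TEXTURE_1D, 0, 3, STRIPE_WIDTH, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, stripes);
}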

3D Texturing Examples

Texture Parameters OpenGL has a variety of parameters that determine how a texture is applied. Wrapping parameters determine what happens if s and t are outside the (0, 1) range; filter modes allow us to use area averaging instead of point samples; mipmapping allows us to use textures at multiple resolutions; and environment parameters determine how texture mapping interacts with shading.

Wrapping Mode Clamping: if s or t is greater than 1, use 1; if less than 0, use 0. Wrapping: use s and t modulo 1. For example: glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

Texture Functions You can specify how the texture-map colors are used to modify the pixel colors: glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, mode); mode values: GL_REPLACE: replace the pixel color with the texture color. GL_DECAL: replace the pixel color with the texture color for a GL_RGB texture. GL_BLEND: C = C_f (1 - C_t) + C_c C_t, where C_f is the pixel color, C_t is the texture color, and C_c is some constant color. GL_MODULATE: C = C_f C_t. More in the OpenGL Programming Guide. Example: the texture is applied after lighting, so how do you adjust the texture's brightness? Make the polygon white and light it normally, then use glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); the texture color is multiplied by the surface (fragment) color and appears lit.
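
A brief sketch of that brightness-adjustment idea; it assumes a current GL context, a bound texture, and a configured light, and the material values are illustrative:

#include <GL/gl.h>

static void use_lit_modulated_texture(void)
{
    GLfloat white[] = { 1.0f, 1.0f, 1.0f, 1.0f };

    /* Make the polygon white and light it normally... */
    glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, white);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);

    /* ...then multiply the lit fragment color by the texture color. */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}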

Mapping a Texture In texture mapping, every point on a surface should have a texture coordinate (s, t). We often specify texture coordinates at the polygon vertices and interpolate them across the polygon, with glTexCoord*() specified at each vertex. (The original slide shows a triangle a-b-c in texture space and the corresponding triangle A-B-C in object space, with vertices assigned (s, t) = (0.2, 0.8), (0.4, 0.2), and (0.8, 0.4).)

Typical Code
glBegin(GL_POLYGON);
  glColor3f(r0, g0, b0);    // if no shading is used
  glNormal3f(u0, v0, w0);   // if shading is used
  glTexCoord2f(s0, t0);
  glVertex3f(x0, y0, z0);
  glColor3f(r1, g1, b1);
  glNormal3f(u1, v1, w1);
  glTexCoord2f(s1, t1);
  glVertex3f(x1, y1, z1);
  // ... remaining vertices
glEnd();
Note that we can use vertex arrays to increase efficiency; see the example posted on the web site, and the sketch below.
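
A hedged sketch of the vertex-array alternative mentioned above; the arrays and the quad they describe are made up for illustration:

#include <GL/gl.h>

/* One textured quad described by client-side arrays. */
static const GLfloat verts[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     1.0f,  1.0f, 0.0f,
    -1.0f,  1.0f, 0.0f
};
static const GLfloat texcoords[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    1.0f, 1.0f,
    0.0f, 1.0f
};

static void draw_with_vertex_arrays(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glTexCoordPointer(2, GL_FLOAT, 0, texcoords);

    glDrawArrays(GL_QUADS, 0, 4);   /* all four vertices in one call */

    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}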

Texture Filtering A pixel may be mapped to a small portion of a texel or to a collection of texels from the texture map; how do we determine the color of the pixel? Magnification, when a pixel is mapped to a small portion of a texel: glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, type); where type is GL_NEAREST or GL_LINEAR. Minification, when a pixel is mapped to many texels: glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, type); where type is GL_NEAREST, GL_LINEAR, GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR_MIPMAP_LINEAR, …

Texture Aliasing (Figure in the original slides: a texture map and a rasterized, textured polygon, comparing the result without filtering and with mipmapping.)

Mipmaps Use different resolutions of the texture image for rendering different objects: higher-resolution levels for near objects, lower-resolution levels for far objects. Level 0 is the original texture map; level 1 is half the size in width and height, and so on. Define mipmaps with glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, …), where level = 0, 1, 2, …, or automatically generate them with gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height, format, type, texels);

Mipmap Filters Mipmap minification filters: GL_LINEAR_MIPMAP_NEAREST uses the single mipmap level closest to the polygon resolution, with linear filtering within it; GL_LINEAR_MIPMAP_LINEAR uses linear interpolation between the two mipmap levels closest to the polygon resolution, with GL_LINEAR filtering in each. Example code:
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, 64, 64, GL_RGB, GL_UNSIGNED_BYTE, texImage);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);

Other Texture Stuff Textures must be in fast memory, because they are accessed for every pixel drawn; texture memory is typically limited, and if you exceed it, performance degrades horribly, so a range of functions is available to manage it. Skilled artists can pack textures for different objects into one map. Specifying texture coordinates can be annoying, so there are functions to automate it. Sometimes you want to apply multiple textures to the same point: multitexturing is now supported by some hardware.

Texture Matrix Normally, the texture coordinates given at the vertices are interpolated and used directly to index the texture. The texture matrix applies a homogeneous transform to the texture coordinates before indexing the texture. What use is this?

Animating Texture Loading a texture onto the graphics card is very expensive, but once it is there, the texture matrix can be used to "transform" the texture. For example, changing the translation can select different parts of the texture. If the texture matrix is changed from frame to frame, the texture will appear to move on the object. This is particularly useful for things like flame, swirling vortices, or pulsing entrances, …

Example
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
// Scale the texture matrix
glScalef(scale, scale, scale);
// Rotate the UVs around the center of the texture
glTranslatef(0.5f, 0.5f, 0.0f);
glRotatef(angle, 0.0f, 0.0f, 1.0f);
glTranslatef(-0.5f, -0.5f, 0.0f);
glMatrixMode(GL_MODELVIEW);

Procedural Texture Mapping Instead of looking up an image, pass the texture coordinates to a function that computes the texture value on the fly. RenderMan, Pixar's rendering language, does this; it is available in a limited form with vertex shaders on current-generation hardware. Advantages: near-infinite resolution with small storage cost, and the idea works for many other things. The disadvantage is that it is slow in many cases.
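
As a sketch of the idea, a tiny procedural texture that computes a checkerboard value directly from (s, t) instead of looking it up in an image; the function name and parameters are made up for illustration:

/* Returns 1 for "white" squares and 0 for "black" squares of an
   n-by-n checkerboard over the unit texture square. */
static int checker_value(float s, float t, int n)
{
    int i = (int)(s * n);
    int j = (int)(t * n);
    return (i + j) % 2;
}

/* Usage sketch: color = checker_value(s, t, 8) ? white : black; */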

Other Types of Mapping Environment mapping looks up incoming illumination in a map, simulating reflections from shiny surfaces. Bump mapping computes an offset to the normal vector at each rendered pixel; there is no need to put bumps in the geometry, but the silhouette looks wrong. Displacement mapping adds an offset to the surface at each point, like putting bumps on the geometry but simpler to model. All are available in software renderers such as RenderMan-compliant renderers, and all are becoming available in hardware.

Bump Mapping Textures can be used to alter the surface normal of an object. This does not change the actual shape of the surface -- we are only shading it as if it were a different shape! This technique is called bump mapping. The texture map is treated as a single-valued height function; the value of the function is not actually used, just its partial derivatives. The partial derivatives tell how to alter the true surface normal at each point on the surface to make the object appear as if it were deformed by the height function. Since the actual shape of the object does not change, the silhouette edge of the object will not change. Bump mapping also assumes that the illumination model is applied at every pixel (as in Phong shading or ray tracing). (Figures in the original slides: a sphere with a diffuse texture, a swirly bump map, and the sphere with both the diffuse texture and the bump map.)
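
A hedged sketch of one common way to perturb the normal from a height map, using finite differences for the partial derivatives; the vector type, tangent-frame inputs, and bump-strength factor are assumptions for illustration, not the method of any particular renderer:

#include <math.h>

typedef struct { float x, y, z; } Vec3;   /* hypothetical vector type */

static Vec3 v_add_scaled(Vec3 a, Vec3 b, float k)
{
    Vec3 r = { a.x + k * b.x, a.y + k * b.y, a.z + k * b.z };
    return r;
}

static Vec3 v_normalize(Vec3 a)
{
    float len = sqrtf(a.x * a.x + a.y * a.y + a.z * a.z);
    Vec3 r = { a.x / len, a.y / len, a.z / len };
    return r;
}

/* height(s, t): the bump map, a single-valued height function (assumed supplied). */
extern float height(float s, float t);

/* Perturb the unit normal N at (s, t), given the surface tangent T and
   bitangent B at that point and a bump strength k. */
static Vec3 bump_normal(Vec3 N, Vec3 T, Vec3 B, float s, float t, float k)
{
    const float eps = 1.0f / 256.0f;                                  /* texel-sized step */
    float dhds = (height(s + eps, t) - height(s - eps, t)) / (2.0f * eps);
    float dhdt = (height(s, t + eps) - height(s, t - eps)) / (2.0f * eps);

    /* Tilt the normal against the height gradient; only the derivatives
       of the height function are used, as described above. */
    Vec3 n = v_add_scaled(N, T, -k * dhds);
    n = v_add_scaled(n, B, -k * dhdt);
    return v_normalize(n);
}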

Bump Map Examples (Figures in the original slides: a cylinder with a diffuse texture map, the bump map itself, and the cylinder with both the texture map and the bump map.)

One More Bump Map Example Notice that the shadow boundaries remain unchanged.

Displacement Mapping We use the texture map to actually move the surface point; this is called displacement mapping. How is this fundamentally different from bump mapping? The geometry must be displaced before visibility is determined.

Environment Maps We use the direction of the reflected ray to index a texture map, which lets us simulate reflections. This approach is not completely accurate: it assumes that all reflected rays begin at the same point and that all objects in the scene are the same distance from that point.

Environment Mapping

Environment mapping produces reflections of the environment on shiny objects. Texture is transferred from the environment map onto the object in the direction of the reflected ray. In principle, trace a ray from the eye to a point P on the object surface, determine the reflection ray, and then trace the reflection ray until it hits the surrounding sphere or cube; the color of the environment map at this hit point is placed at point P. The reflected ray is R = 2(N·V)N - V, where N is the surface normal and V is the direction to the viewer; R is used to index the environment map.
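
A small C sketch of that reflection computation; the vector type and helper names are illustrative, and N and V are assumed to be unit vectors:

typedef struct { float x, y, z; } Vec3;   /* hypothetical vector type */

static float v_dot(Vec3 a, Vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* R = 2(N.V)N - V, with N the unit normal and V the unit vector
   from the surface point toward the viewer. */
static Vec3 reflect_dir(Vec3 N, Vec3 V)
{
    float d = 2.0f * v_dot(N, V);
    Vec3 R = { d * N.x - V.x, d * N.y - V.y, d * N.z - V.z };
    return R;   /* used to index the environment map */
}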

Environment Mapping The environment map may take one of several forms. Cubic mapping: the map resides on the 6 faces of a cube. Spherical mapping: the map resides on a sphere surrounding the object. The map should contain a view of the world with the point of interest on the object as the eye. We can't store a separate map for each point, so one map is used with the eye at the center of the object. This introduces distortions in the reflection, but the eye doesn't notice; distortions are minimized for a small object in a large room. The object will not reflect itself. The mapping can be computed at each pixel, or only at the vertices.

Sphere Mapping Implemented in hardware, with a single texture map. Problems: highly non-uniform sampling and a highly non-linear mapping.

Cubic Mapping The map resides on the surfaces of a cube around the object; typically, the faces of the cube are aligned with the coordinate axes. To generate the map: for each face of the cube, render the world from the center of the object with the cube face as the image plane. Rendering can be arbitrarily complex (it's off-line). Alternatively, take 6 photos of a real environment with a camera in the object's position. In practice, take many more photos from the different places the object might be, and warp them to approximate the map for all intermediate points. Remember Terminator 2? A sketch of the rendering approach follows below.
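
A hedged sketch of generating the six faces by rendering, assuming an existing GL context, a square viewport of size FACE_SIZE, and a user-supplied draw_scene() function; the face orientation conventions and all names here are illustrative:

#include <GL/gl.h>
#include <GL/glu.h>

#define FACE_SIZE 256

extern void draw_scene(void);   /* renders the surrounding environment */

/* View directions and up vectors for the +x, -x, +y, -y, +z, -z faces. */
static const float dirs[6][3] = { {1,0,0}, {-1,0,0}, {0,1,0}, {0,-1,0}, {0,0,1}, {0,0,-1} };
static const float ups[6][3]  = { {0,-1,0}, {0,-1,0}, {0,0,1}, {0,0,-1}, {0,-1,0}, {0,-1,0} };

static GLuint cube_faces[6];    /* one texture object per face */

static void build_cube_map(float cx, float cy, float cz)
{
    int i;
    glGenTextures(6, cube_faces);
    glViewport(0, 0, FACE_SIZE, FACE_SIZE);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(90.0, 1.0, 0.1, 100.0);   /* a 90-degree field of view covers one face */
    glMatrixMode(GL_MODELVIEW);

    for (i = 0; i < 6; i++) {
        glLoadIdentity();
        gluLookAt(cx, cy, cz,
                  cx + dirs[i][0], cy + dirs[i][1], cz + dirs[i][2],
                  ups[i][0], ups[i][1], ups[i][2]);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_scene();
        /* Copy the rendered face from the framebuffer into a texture. */
        glBindTexture(GL_TEXTURE_2D, cube_faces[i]);
        glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, FACE_SIZE, FACE_SIZE, 0);
    }
}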

Cubic Map Example

Indexing Cubic Maps Assume you have R, the cube's faces are aligned with the coordinate axes, and each face has texture coordinates in [0,1]×[0,1]. How do you decide which face to use? How do you decide which texture coordinates to use? What is the problem with using cubic maps when texture coordinates are only computed at vertices? (One common answer to the first two questions is sketched below.)
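
A hedged sketch of one standard answer: pick the face by the component of R with the largest magnitude, then divide the remaining two components by it and remap from [-1,1] to [0,1]. The face numbering and orientation conventions below are illustrative choices, not mandated by the slides:

#include <math.h>

/* Given reflection direction (rx, ry, rz), choose a cube face (0..5 for
   +x, -x, +y, -y, +z, -z) and compute (s, t) in [0,1]x[0,1] on that face. */
static int index_cube_map(float rx, float ry, float rz, float *s, float *t)
{
    float ax = fabsf(rx), ay = fabsf(ry), az = fabsf(rz);
    float u, v, m;
    int face;

    if (ax >= ay && ax >= az) {        /* dominant x: +x or -x face */
        face = (rx > 0) ? 0 : 1;
        m = ax;  u = -rz * (rx > 0 ? 1.0f : -1.0f);  v = -ry;
    } else if (ay >= az) {             /* dominant y: +y or -y face */
        face = (ry > 0) ? 2 : 3;
        m = ay;  u = rx;  v = rz * (ry > 0 ? 1.0f : -1.0f);
    } else {                           /* dominant z: +z or -z face */
        face = (rz > 0) ? 4 : 5;
        m = az;  u = rx * (rz > 0 ? 1.0f : -1.0f);  v = -ry;
    }
    *s = 0.5f * (u / m + 1.0f);        /* remap [-1,1] to [0,1] */
    *t = 0.5f * (v / m + 1.0f);
    return face;
}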

OpenGL Implementation We can use automatically generated texture coordinates in OpenGL:
// Build the environment as a texture object
…
// Automatically generate the texture coordinates
glTexGenf(GL_S, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glTexGenf(GL_T, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
// Bind the environment texture
…
// Draw object
…