Recap
Last lecture we looked at local shading models
–Diffuse and Phong specular terms
–Flat and smooth shading
Some things were glossed over
–Light source types and their effects
–Distant viewer assumption
This lecture:
–Clean up the odds and ends
–Texture mapping and other mapping effects
–A little more on the next project

Light Sources
Two aspects of light sources are important for a local shading model:
–Where is the light coming from (the L vector)?
–How much light is coming (the I values)?
Various light source types give different answers to the above questions:
–Point light source: Light from a specific point
–Directional: Light from a specific direction
–Spotlight: Light from a specific point with intensity that depends on the direction
–Area light: Light from a continuum of points (later in the course)

Point and Directional Sources
Point light: L(x) = (p_light - x) / ||p_light - x||
–The L vector depends on where the surface point is located
–Must be normalized - slightly expensive
–OpenGL light at (1,1,1):
GLfloat light_position[] = { 1.0, 1.0, 1.0, 1.0 };
glLightfv(GL_LIGHT0, GL_POSITION, light_position);
Directional light: L(x) = L_light
–The L vector does not change over points in the world
–OpenGL light traveling in direction (1,1,1) (L is in the opposite direction):
GLfloat light_position[] = { 1.0, 1.0, 1.0, 0.0 };
glLightfv(GL_LIGHT0, GL_POSITION, light_position);

Spotlights
Point source, but intensity depends on L:
–Requires a position: the location of the source
–Requires a direction: the center axis of the light
–Requires a cut-off: how broad the beam is
–Requires an exponent: how the light tapers off at the edges of the cone
Intensity scaled by (L·D)^n
glLightfv(GL_LIGHT0, GL_POSITION, light_posn);
glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, light_dir);
glLightf(GL_LIGHT0, GL_SPOT_CUTOFF, 45.0);
glLightf(GL_LIGHT0, GL_SPOT_EXPONENT, 1.0);

Distant Viewer Approximation
Specularities require the viewing direction:
–V(x) = (VRP - x) / ||VRP - x||
–Slightly expensive to compute
Distant viewer approximation uses a global V
–Independent of which point is being lit
–Use the view plane normal vector
–Error depends on the nature of the scene
Explored in the homework

Mapping Techniques
Consider the problem of rendering a soup can
–The geometry is very simple - a cylinder
–But the color changes rapidly, with sharp edges
–With the local shading model so far, the only place to specify color is at the vertices
–To do a soup tin, we would need thousands of polygons for a simple shape
–The same goes for an orange: simple shape but complex normal vectors
Solution: Mapping techniques use simple geometry modified by a mapping of some type

Texture Mapping (Watt 8.1)
The soup tin is easily described by pasting a label on the plain cylinder
Texture mapping associates the color of a point with the color in an image: the texture
–Soup tin: Each point on the cylinder gets the label's color
Question: Which point of the texture do we use for a given point on the surface?
Establish a mapping from surface points to image points
–Different mappings are common for different shapes
–We will, for now, just look at triangles (polygons)

Basic Mapping
The texture lives in a 2D space
–Parameterize points in the texture with 2 coordinates: (s,t)
–These are just what we would call (x,y) if we were talking about an image, but we wish to avoid confusion with the world (x,y,z)
Define the mapping from (x,y,z) in world space to (s,t) in texture space
With polygons:
–Specify (s,t) coordinates at vertices
–Interpolate (s,t) for other points based on given vertices
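
The interpolation step can be sketched in plain C using barycentric weights, which is essentially what the rasterizer does per fragment (the weights are assumed to come from rasterization and to sum to 1):

```c
#include <assert.h>
#include <math.h>

/* Interpolate (s,t) inside a triangle from its three vertex texture
   coordinates st[0..2] and barycentric weights w0, w1, w2. */
static void interp_st(const float st[3][2],
                      float w0, float w1, float w2,
                      float *s, float *t) {
    *s = w0*st[0][0] + w1*st[1][0] + w2*st[2][0];
    *t = w0*st[0][1] + w1*st[1][1] + w2*st[2][1];
}
```

At a vertex, one weight is 1 and the others 0, so the interpolated (s,t) reproduces the specified vertex coordinates exactly.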

Interpolating Coordinates
–(Figure: a triangle with vertices (x1,y1), (x2,y2), (x3,y3) carrying texture coordinates (s1,t1), (s2,t2), (s3,t3), interpolated across the interior)

Basic OpenGL Texturing
Specify texture coordinates for the polygon:
–Use glTexCoord2f(s,t) before each vertex:
glTexCoord2f(0,0); glVertex3f(x,y,z);
Create a texture object and fill it with texture data:
–glGenTextures(num, &indices) to get identifiers for the objects
–glBindTexture(GL_TEXTURE_2D, identifier) to bind the texture
–Following texture commands refer to the bound texture
–glTexParameteri(GL_TEXTURE_2D, …, …) to specify parameters for use when applying the texture
–glTexImage2D(GL_TEXTURE_2D, …) to specify the texture data (the image itself)

Basic OpenGL Texturing (cont)
Enable texturing: glEnable(GL_TEXTURE_2D)
State how the texture will be used:
–glTexEnvf(…)
Texturing is done after lighting
You're ready to go…

Nasty Details
There is a large range of functions for controlling the layout of texture data:
–You must state how the data in your image is arranged
–E.g.: glPixelStorei(GL_UNPACK_ALIGNMENT, 1) tells OpenGL not to skip bytes at the end of a row
–You must state how you want the texture to be put in memory: how many bits per "pixel", which channels, …
For project 3, when you use this stuff, there will be example code, and the Red Book contains examples

Controlling Different Parameters
The "pixels" in the texture map may be interpreted as many different things:
–As colors in RGB or RGBA format
–As grayscale intensity
–As alpha values only
The data can be applied to the polygon in many different ways:
–Replace: Replace the polygon color with the texture color
–Modulate: Multiply the polygon color with the texture color or intensity
–Similar to compositing: Composite texture with base using an operator

Example: Diffuse shading and texture
Say you want to have an object textured and have the texture appear to be diffusely lit
Problem: Texture is applied after lighting, so how do you adjust the texture's brightness?
Solution:
–Make the polygon white and light it normally
–Use glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE)
–Use GL_RGB for the internal format
–Then the texture color is multiplied by the surface (fragment) color, and alpha is taken from the fragment
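
GL_MODULATE in miniature: the final color is a per-channel product of the lit fragment color and the texel color (a sketch, not GL itself; the function name is an assumption):

```c
#include <assert.h>
#include <math.h>

/* out = frag * texel, per RGB channel.  With a white polygon lit
   diffusely, frag carries only the lighting, so the texture ends up
   looking diffusely lit. */
static void modulate(const float frag[3], const float texel[3],
                     float out[3]) {
    for (int i = 0; i < 3; i++)
        out[i] = frag[i] * texel[i];
}
```

This is why the polygon must be white: any base color would tint the texture in the product.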

Textures and Aliasing
Textures are subject to aliasing:
–A polygon point maps into a texture image, essentially sampling the texture at a point
Standard approaches:
–Pre-filtering: Filter the texture down before applying it
–Post-filtering: Take multiple pixels from the texture and filter them before applying to the polygon fragment

Mipmapping (Pre-filtering)
If a textured object is far away, one screen pixel (on an object) may map to many texture pixels
–The problem is: how to combine them
A mipmap is a low resolution version of a texture
–Texture is filtered down as a pre-processing step: gluBuild2DMipmaps(…)
–When the textured object is far away, use the mipmap chosen so that one image pixel maps to at most four mipmap pixels
–A full set of mipmaps requires about one third more storage than the original texture (1/4 + 1/16 + … = 1/3)
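
The pre-filtering step can be sketched as a 2x2 box filter, which is roughly what building each successive mipmap level does (shown for a single-channel texture; gluBuild2DMipmaps handles general formats):

```c
#include <assert.h>
#include <math.h>

/* Build the next mipmap level of a size-by-size grayscale texture by
   averaging each 2x2 block of src into one pixel of dst.
   size is assumed to be a power of two, dst is (size/2)^2 floats. */
static void downsample(const float *src, int size, float *dst) {
    int half = size / 2;
    for (int y = 0; y < half; y++)
        for (int x = 0; x < half; x++)
            dst[y*half + x] = 0.25f * (src[(2*y)  *size + 2*x  ] +
                                       src[(2*y)  *size + 2*x+1] +
                                       src[(2*y+1)*size + 2*x  ] +
                                       src[(2*y+1)*size + 2*x+1]);
}
```

Applying this repeatedly down to 1x1 produces the whole chain; each level is a quarter the size of the last, which is where the one-third storage overhead comes from.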

Post-Filtering
You tell OpenGL what sort of post-filtering to do
When the image pixel is smaller than the texture pixel (magnification):
–glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, type)
–type is GL_LINEAR or GL_NEAREST
When the image pixel is bigger than the texture pixels (minification):
–glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, type) to specify the "minification" filter
–Can choose to:
Take the nearest point in the base texture, GL_NEAREST
Linearly interpolate the nearest 4 pixels in the base texture, GL_LINEAR
Take the nearest mipmap and then take the nearest point or interpolate in that mipmap, GL_NEAREST_MIPMAP_NEAREST or GL_LINEAR_MIPMAP_NEAREST
Interpolate between the two nearest mipmaps using nearest or interpolated points from each, GL_NEAREST_MIPMAP_LINEAR or GL_LINEAR_MIPMAP_LINEAR
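
GL_LINEAR in miniature: bilinearly blend the four texels around the sample point (a simplified sketch for a single-channel texture; the edge-clamping and the mapping of (s,t) to texel centers are simplifying assumptions, and real GL addressing differs slightly):

```c
#include <assert.h>
#include <math.h>

/* Bilinear sample of a w-by-h grayscale texture at (s,t) in [0,1]^2. */
static float sample_linear(const float *tex, int w, int h,
                           float s, float t) {
    float fx = s * (w - 1), fy = t * (h - 1);
    int x0 = (int)fx, y0 = (int)fy;
    int x1 = (x0 + 1 < w) ? x0 + 1 : x0;     /* clamp at the edge */
    int y1 = (y0 + 1 < h) ? y0 + 1 : y0;
    float ax = fx - x0, ay = fy - y0;        /* fractional offsets */
    float top = (1-ax)*tex[y0*w + x0] + ax*tex[y0*w + x1];
    float bot = (1-ax)*tex[y1*w + x0] + ax*tex[y1*w + x1];
    return (1-ay)*top + ay*bot;
}
```

GL_NEAREST would instead just round (fx, fy) to the closest texel, which is cheaper but blockier.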

Boundaries
You can control what happens if a point maps to a texture coordinate outside of the texture image
Repeat: Assume the texture is tiled
–glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
Clamp: The texture coordinates are clamped to valid values, and then used
–glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP)
Can specify a special border color:
–GLfloat border[] = { R, G, B, A };
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);

Other Texture Stuff
Texture must be in fast memory - it is accessed for every pixel drawn
Texture memory is typically limited, so a range of functions are available to manage it
Specifying texture coordinates can be annoying, so there are functions to automate it
Sometimes you want to apply multiple textures to the same point: Multitexturing is now in some hardware

Other Texture Stuff
There is a texture matrix: apply a matrix transformation to texture coordinates before indexing the texture
There are "image processing" operations that can be applied to the pixels coming out of the texture
There are 1D and 3D textures
–Instead of giving 2D texture coordinates, give one or three coordinates
–Mapping works essentially the same
–3D is used in visualization applications, such as visualizing MRI or other medical data
–1D saves memory if the texture is inherently 1D, like stripes
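
The texture-matrix idea can be sketched as an affine transform applied to (s,t) before lookup, e.g. to scale or scroll a texture (a simplified 2x3 layout for illustration; OpenGL actually keeps a full 4x4 matrix on the GL_TEXTURE matrix stack):

```c
#include <assert.h>
#include <math.h>

/* Transform (s,t) by a 2x3 affine matrix m before texture lookup:
   s' = m00*s + m01*t + m02,  t' = m10*s + m11*t + m12. */
static void tex_transform(const float m[2][3], float s, float t,
                          float *so, float *to) {
    *so = m[0][0]*s + m[0][1]*t + m[0][2];
    *to = m[1][0]*s + m[1][1]*t + m[1][2];
}
```

Animating the translation column over time scrolls the texture across the surface without touching the geometry.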

Procedural Texture Mapping
Instead of looking up an image, pass the texture coordinates to a function that computes the texture value on the fly
–RenderMan, the Pixar rendering language, does this
–Also now becoming available in hardware
Advantages:
–Near-infinite resolution with small storage cost
–The idea works for many other things
Has the disadvantage of being slow
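
A minimal procedural texture in the spirit of a RenderMan shader: a checkerboard computed directly from (s,t), with no stored image (the function and parameter names are assumptions for illustration):

```c
#include <assert.h>

/* Checkerboard value (0 or 1) at texture coordinates (s,t) in [0,1),
   with the given number of squares along each axis. */
static int checker(float s, float t, int squares) {
    int i = (int)(s * squares);
    int j = (int)(t * squares);
    return (i + j) % 2;
}
```

Because the value is computed, not sampled, zooming in never reveals texel blocks; the cost is evaluating the function at every shaded point.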

Other Types of Mapping
Environment mapping looks up incoming illumination in a map
–Simulates reflections from shiny surfaces
Bump mapping computes an offset to the normal vector at each rendered pixel
–No need to put bumps in the geometry, but the silhouette looks wrong
Displacement mapping adds an offset to the surface at each point
–Like putting bumps on the geometry, but simpler to model
All are available in RenderMan-compliant software renderers
All of these are becoming available in hardware