CHAPTER 8 Color and Texture Mapping © 2008 Cengage Learning EMEA

LEARNING OBJECTIVES In this chapter you will learn about:
– Color
– Texturing
– Texture filtering
– Mipmapping
– Nearest point interpolation
– Bilinear filtering
– Trilinear filtering
– Anisotropic filtering
– Basic texture mapping
– Bump mapping
– Cube mapping (environmental mapping)

COLOR The representation of color in display devices (using a red, green, and blue component) can be linked directly to the human perception of color. The human brain picks up only three color values at any given moment (as opposed to a complete color distribution). These three values, called tristimulus values, are the result of the color cones of the human visual system.

COLOR Color cones reduce any perceived color range to three distinct values. The significance of this reduction to computer graphics is that any color can be reproduced using three color elements, namely, a red, green, and blue component.

COLOR Computers make use of the red–green–blue (RGB) color model. This is an additive color model where perceived colors are formed by overlapping the primary red, green, and blue colors.

COLOR Another commonly used color model is the cyan–magenta–yellow (CMY) model, also called the subtractive color model. The perceived colors are formed by overlapping the complementary colors cyan, magenta, and yellow.
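The relationship between the two models can be sketched with the standard idealized conversion, in which each subtractive primary is the complement of the corresponding additive primary (a simplification that ignores real ink and display behavior; the function name is illustrative, not from the book):

```python
def rgb_to_cmy(r, g, b):
    """Idealized RGB-to-CMY conversion: each subtractive primary
    is 1 minus the corresponding additive primary (values in [0, 1])."""
    return (1.0 - r, 1.0 - g, 1.0 - b)

# Pure red in RGB needs no cyan ink but full magenta and yellow.
print(rgb_to_cmy(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
```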

COLOR Working in true color (a representation of red–green–blue color values using 24 bits per pixel), we can view each color as a specific point inside a cube in three-dimensional space. This cube is defined by a coordinate system whose axes correspond to the three primary colors, with the intensity of a color represented by the distance from the origin to its location within the cube – the color vector.

COLOR The hexcone model represents the color space as a hexagonal cone, also referred to as a hexcone. This representation leads to a greater level of perceptual linearity.
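The hexcone model is commonly realized as the HSV color space: hue is the angle around the cone's axis, saturation the distance from the axis, and value the height. Python's standard `colorsys` module provides the conversion, which can serve as a quick illustration (not part of the book's code):

```python
import colorsys

# Pure green sits one third of the way around the hue circle,
# fully saturated, at full value.
h, s, v = colorsys.rgb_to_hsv(0.0, 1.0, 0.0)
print(h, s, v)
```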

TEXTURING Texture mapping is an easy way of adding realism to a computer-generated object. The texture (be it a tileable photograph or a complex pattern) is mapped (fitted) to the computer-generated object, either stretched or tiled to encompass the entire object area.

TEXTURING Textures consist of a number of fundamental subunits called texels or texture elements (which can be considered the pixels of a texture). Arrays of texels make up a texture just as arrays of pixels make up an image. Textures can be one-, two-, or three-dimensional in nature (sometimes even four-dimensional), with two-dimensional textures being the most common. A one-dimensional texture is simply an array of texture elements.

TEXTURING Two-dimensional textures are represented using a two-dimensional array, with each texture element addressable via an x- and y-coordinate. These textures are the type we will be working with; even the depth and normal maps used during bump mapping are represented using these two-dimensional bitmap arrays.

TEXTURING Volumetric textures, also called three-dimensional textures, are another interesting texture resource type. These textures, represented as three-dimensional volumes, are useful for describing solid material blocks from which arbitrary objects can be shaped.

TEXTURING Texture mapping is based on the manipulation of individual fragments during the graphics pipeline’s fragment processing stage. The method used to perform the actual texture mapping at application level depends mainly on the level of quality required.

TEXTURING The most common method maps a two-dimensional texture resource onto the surface of an object. This texture mapping process starts out in two-dimensional texture space and moves to three-dimensional object space where the texture is mapped onto the object – a process known as surface parameterization. A projection transformation is then used to move from object space to screen space.

TEXTURING Textures are loaded into system memory as arrays addressed by coordinates, called texture coordinates. These coordinates allow us to address and access the individual texel elements making up the array. Texture coordinates are generally scaled to range over the interval [0, 1].

TEXTURING Two-dimensional textures can be described using the notation T(u, v) with u and v the texture coordinates uniquely defined for each vertex on a given surface. The process of texture mapping is thus concerned with aligning each texel’s texture space coordinates with a vertex on the surface of an object.
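Mapping normalized texture coordinates to integer texel indices is the first step of any texture lookup. A minimal sketch (the function name and clamping policy are illustrative assumptions, not the book's code):

```python
def uv_to_texel(u, v, width, height):
    """Map normalized (u, v) coordinates in [0, 1] to integer texel
    indices; coordinates at exactly 1.0 clamp to the last texel."""
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return x, y

# The centre of a 256x256 texture addresses texel (128, 128).
print(uv_to_texel(0.5, 0.5, 256, 256))  # (128, 128)
```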

Texture Filtering Every pixel of an onscreen image contains an independently controlled color value obtained from the texture. Texture filtering, also called ‘texture smoothing’, controls the way in which pixels are colored by blending the color values of adjacent texture elements.

Texture Filtering Mipmapping
– A mipmap is a series of pre-filtered texture images of varying resolution.
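Mipmap chains are typically generated by repeatedly halving the texture, averaging each 2x2 block of texels into one. A minimal sketch for a square, power-of-two, single-channel texture (a simplifying assumption; real systems handle arbitrary formats and use better filters):

```python
def next_mip_level(tex):
    """Halve a square texture (list of rows of grey values) by
    averaging each 2x2 block -- one step of mipmap generation."""
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

def build_mipmaps(tex):
    """Build the full chain down to a 1x1 level."""
    chain = [tex]
    while len(chain[-1]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain

# A 4x4 texture yields a chain of 4x4, 2x2 and 1x1 levels.
levels = build_mipmaps([[0, 0, 4, 4],
                        [0, 0, 4, 4],
                        [8, 8, 12, 12],
                        [8, 8, 12, 12]])
print([len(l) for l in levels])  # [4, 2, 1]
```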

Texture Filtering Nearest Point Interpolation
– The point matching the center of a texture element is rarely obtained when texture coordinates are mapped to a two-dimensional array of texels.
– Nearest point interpolation approximates this point by using the color value of the texel closest to the sampled point.
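Nearest point interpolation can be sketched as follows for a single-channel texture; texel centres are assumed to lie at (x + 0.5)/width, a common convention (the function name is illustrative):

```python
def sample_nearest(tex, u, v):
    """Return the texel whose centre is closest to (u, v).
    With centres at (x + 0.5)/width, rounding u*width - 0.5
    picks the nearest centre; results are clamped to the edges."""
    h, w = len(tex), len(tex[0])
    x = min(max(round(u * w - 0.5), 0), w - 1)
    y = min(max(round(v * h - 0.5), 0), h - 1)
    return tex[y][x]

tex = [[10, 20],
       [30, 40]]
# (0.2, 0.2) lies closest to the texel centred at (0.25, 0.25).
print(sample_nearest(tex, 0.2, 0.2))  # 10
```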

Texture Filtering Bilinear Filtering
– Bilinear filtering builds on the concept of nearest point interpolation by sampling not just one but four texture elements when texture coordinates are mapped to a two-dimensional array of texels.
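The four sampled texels are blended with weights given by the sample point's fractional distance to each texel centre. A minimal single-channel sketch (edge handling here is simple clamping, one of several possible policies):

```python
def sample_bilinear(tex, u, v):
    """Blend the four texels surrounding (u, v), weighted by the
    fractional distance of the sample point to each texel centre."""
    h, w = len(tex), len(tex[0])
    # Shift so texel centres land on integer coordinates, then clamp.
    x = min(max(u * w - 0.5, 0.0), w - 1)
    y = min(max(v * h - 0.5, 0.0), h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bottom = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

tex = [[0, 10],
       [20, 30]]
# The exact midpoint of the texture averages all four texels.
print(sample_bilinear(tex, 0.5, 0.5))  # 15.0
```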

Texture Filtering Trilinear Filtering
– Bilinear filtering does not perform any interpolation between mipmaps, resulting in noticeable quality changes where the graphics system switches between mipmap levels.
– Trilinear filtering solves this quality issue by extending the previous technique: a bilinear filtering operation is performed on each of the two bordering mipmap images (one from the higher-resolution level and one from the lower-resolution level), and the two results are interpolated.
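The final step is a simple linear interpolation between the two bilinear results, weighted by the fractional level of detail. A sketch, assuming the two bilinear samples have already been computed (the function name and values are illustrative):

```python
def trilinear(sample_hi, sample_lo, frac):
    """Linearly interpolate between bilinear samples taken from the
    two bordering mipmap levels; frac is the fractional part of the
    computed level of detail (0 = higher-resolution level)."""
    return sample_hi * (1 - frac) + sample_lo * frac

# A surface sampled 30% of the way between two mip levels whose
# bilinear results were 12.0 and 20.0:
print(trilinear(12.0, 20.0, 0.3))  # 14.4
```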

Texture Filtering Anisotropic Filtering
– Anisotropy is a distortion visible in the texels of a textured object when the object's surface lies at an oblique angle to the point of view.

Texture Filtering Anisotropic Filtering
– Anisotropic texture filtering deals with this distortion by sampling texture elements using a quadrilateral modified according to the viewing angle.
– A single pixel could encompass more texel elements in one direction, such as along the x-axis, than in another, for instance along the z-axis.
– By using a modifiable quadrilateral for the sampling of texels, we are able to maintain proper perspective and precision when mapping a texture to an object.

Basic Texture Mapping Implementation [see the textbook and source code examples, “TextureMapping(Direct3D)” and “TextureMapping(OpenGL)”, on the book’s website for detailed examples].

Bump Mapping There are a number of techniques that combine lighting calculations with perturbations of a surface's texture normals to create more realistic-looking object surfaces. Bump mapping is one such technique. Bump mapping can be described as a form of texture mapping incorporating light reflection to simulate real-world surfaces where the unevenness of a surface influences the reflection of light. Bump mapping combines per-pixel lighting calculations with the normals defined at each pixel of a surface.

Bump Mapping

Implementing Bump Mapping We can summarize the process of bump mapping as follows:
1 Determine the inverse TBN matrix. This is required because the TBN matrix transforms coordinates from texture space to object space, and we need to convert the light vector from object space to texture space.
2 Calculate the light vector.
3 Transform the light vector from object space to texture space by multiplying it by the inverse TBN matrix.
4 Read the normal vector at the specific pixel.
5 Calculate the dot product between the light vector and the normal vector.
6 Multiply the result from step 5 with the color of the light and that of the surface material (this is the final diffuse light color).
7 Repeat the previous six steps for each and every pixel of the textured surface.
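The per-pixel portion of the summary (reading the normal, forming the dot product, and modulating by the light and material colors) can be sketched in Python rather than the book's Direct3D/OpenGL code. This sketch assumes the light vector has already been transformed into texture space and the normal has been unpacked from the normal map; all function names are illustrative:

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def dot(a, b):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def bump_diffuse(normal, light_dir, light_color, material_color):
    """Per-pixel diffuse term: clamp N.L at zero, then modulate
    the light color by the surface material color."""
    n_dot_l = max(dot(normalize(normal), normalize(light_dir)), 0.0)
    return tuple(n_dot_l * lc * mc
                 for lc, mc in zip(light_color, material_color))

# A flat normal lit head-on receives the full diffuse contribution.
print(bump_diffuse((0, 0, 1), (0, 0, 1), (1.0, 1.0, 1.0), (0.8, 0.5, 0.2)))
```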

Implementing Bump Mapping [see the textbook for a detailed example and discussion].

Cube Mapping Cube mapping, also called environmental mapping or sometimes reflection mapping, allows us to simulate complex reflections by mapping real-time computed texture images to the surface of an object. Each texture image used for environmental mapping stores a ‘snapshot’ image of the environment surrounding the mapped object. These snapshot images are then mapped to a geometric object to simulate the object reflecting its surrounding environment. An environment map can be considered an omnidirectional image.

Cube Mapping

Cube mapping is a type of texturing where six environmental maps are arranged as if they were the faces of a cube. The images are combined in this manner so that the environment can be reflected in an omnidirectional fashion.
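At lookup time, a direction vector selects which of the six faces to sample: the face corresponding to the component of largest magnitude, with its sign. A minimal sketch of that selection step (face labels and tie-breaking order are illustrative conventions):

```python
def cube_face(d):
    """Pick the cube-map face a direction vector (x, y, z) hits:
    the face of the largest-magnitude component, signed."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return '+x' if x > 0 else '-x'
    if ay >= az:
        return '+y' if y > 0 else '-y'
    return '+z' if z > 0 else '-z'

# A reflection vector pointing mostly downward samples the -y face.
print(cube_face((0.2, -0.9, 0.3)))  # '-y'
```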

Implementing Cube Mapping [see the textbook for a detailed example and discussion].