Discrete Techniques
Contents: Buffers; Compositing and blending; Bitmaps and images; Mapping techniques
Discrete techniques I
Overview: Buffers; Compositing and blending
Discrete Techniques Which section of the pipeline? The earlier stages (modelling transformations, lighting, viewing transformations, clipping, projection) deal with geometric models, lighting models, camera models, etc. Discrete techniques are done in screen space, dealing with fragments: buffer operations, per-pixel processing, etc., in the rasterisation and fragment-processing stages that produce the pixels for display.
Why fragments? After a surface is rasterised there is no longer a notion of a polygon: the surface has essentially been 'broken up' or 'discretised' into small pieces, each at most the size of one pixel, called fragments. Conceptually fragments can be smaller than one pixel, i.e. more than one fragment contributes to the colour of a pixel.
Buffers Per-pixel data is stored in buffers Color Buffer (front and back) Depth Buffer Accumulation Buffer Stencil Buffer Others (overlay planes, auxiliary buffers, color indices)
Buffers Buffers are discrete: limited resolution, both spatially and depth-wise. A buffer can be defined as a block of memory with k m x n bit-planes: m x n is the spatial resolution, and k is the depth (bits per pixel), which can range from 1 to a few hundred bits. A pixel refers to all k elements at a particular location.
Buffers The frame buffer Consists of a variety of buffers, collectively known as the frame buffer. The term 'frame' actually refers to the total display area, but often when people refer to the frame buffer they are really talking about the colour buffer. (Figure: the OpenGL frame buffer.)
Buffers The colour buffer Where the RGBA pixel values are stored. Generally two separate colour buffers are needed: one for displaying, the other for rendering; in double buffering these are referred to as the front and back buffers. Double buffering prevents flickering and other undesirable artifacts that would appear if the image currently being displayed were updated in place.
Buffers The depth buffer (z-buffer) Used for visibility determination. Tests can be enabled or disabled; in certain situations the depth-buffer test must be disabled (e.g. when rendering transparent polygons).
Buffers The stencil buffer A general-purpose buffer for doing things not possible with the colour and depth buffers alone, e.g. creating reflective surfaces; essential in many shadow rendering techniques, such as shadow volumes. 'Tag' pixels in one rendering pass to control their update in subsequent rendering passes. Different rendering operations can be specified for each of: stencil test fails; stencil test passes & depth test fails; stencil test passes & depth test passes.
Buffers Stencil buffer for reflection Without stencil buffer / With stencil buffer
Buffers Stencil buffer for reflection (cont.) Basic algorithm Clear buffers Disable depth buffer and draw mirror surface to stencil buffer Enable depth buffer, draw reflected geometry where stencil buffer passes Disable stencil buffer, draw normal scene
Buffers Stencil buffer for reflection (cont.) Another way Clear buffers Draw all non-mirror geometry to frame & depth buffers Draw mirror to stencil buffer, where depth buffer passes Set depth to infinity, where stencil buffer passes Draw reflected geometry to frame & depth buffer, where stencil buffer passes
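The masking step at the heart of both algorithms can be sketched in software. This is a minimal 1-D model, not OpenGL code; the buffer layout and names (`draw_mirror_to_stencil`, `draw_reflection`) are invented for illustration.

```python
# Sketch: stencil-masked drawing, modelling the reflection passes above.
# All buffers are tiny 1-D arrays; a real implementation would use the
# GPU's stencil test instead of an explicit 'if'.

W = 8
color = ["bg"] * W
stencil = [0] * W

def draw_mirror_to_stencil(span):
    # Tag the mirror's pixels in the stencil buffer.
    for x in range(*span):
        stencil[x] = 1

def draw_reflection(span, value):
    # Write reflected geometry only where the stencil test passes.
    for x in range(*span):
        if stencil[x] == 1:
            color[x] = value

draw_mirror_to_stencil((2, 5))        # mirror covers pixels 2..4
draw_reflection((0, W), "reflected")  # reflection clipped to the mirror
print(color)                          # reflection only inside the mirror
```

The key point is that the second draw call covers the whole screen, but the stencil contents from the first pass restrict which pixels it may touch.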
Buffers Shadows The umbra: the area that is completely in shadow. The penumbra: the area that is partly illuminated and partly occluded.
Buffers Shadows (cont.) Hard Shadows Soft Shadows
Buffers Shadow volumes Volume of space in shadow Pyramid with point light as the apex Polygons inside a shadow volume will not be illuminated by that particular light Shadow test similar to clipping
Buffers Shadow volumes with the stencil buffer Scene with shadows Stencil buffer contents • green = stencil value of 0 • red = stencil value of 1 • darker reds = stencil value > 1
Buffers Shadow volumes (cont.) Limitations Introduces a lot of new geometry and computations Can be computationally expensive Only generates hard shadows
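The stencil counting behind the shadow-volume figures above can be sketched for a single pixel. This models the classic z-pass counting scheme; the function name and tuple encoding are made up for the sketch.

```python
# Sketch of z-pass stencil counting for shadow volumes, for one pixel:
# front faces of shadow-volume polygons that pass the depth test
# increment the stencil, back faces decrement it. A non-zero final
# count means the visible surface point is inside a shadow volume.

def in_shadow(surface_depth, volume_faces):
    """volume_faces: list of (depth, is_front_face) for this pixel."""
    stencil = 0
    for depth, is_front in volume_faces:
        if depth < surface_depth:          # face is in front of the surface
            stencil += 1 if is_front else -1
    return stencil != 0

# Surface point inside the volume: only the front face is crossed.
print(in_shadow(5.0, [(2.0, True), (8.0, False)]))   # True
# Surface point behind the volume: front and back cancel out.
print(in_shadow(9.0, [(2.0, True), (8.0, False)]))   # False
```

This matches the figure's colour coding: stencil value 0 (lit) versus non-zero (in shadow), with nested volumes giving counts greater than 1.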
Buffers The accumulation buffer The idea is to accumulate multiple images Common uses Compositing Anti-aliasing Motion blur Depth of field Soft shadows
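The accumulation idea can be sketched as a weighted sum of frames. This is a software stand-in for OpenGL's `glAccum(GL_ACCUM, 1.0/n)` pattern; the scene data is invented.

```python
# Sketch: the accumulation buffer averages several renderings.
# Averaging jittered renderings gives anti-aliasing; averaging frames
# at different times gives motion blur; at different lens positions,
# depth of field; under different light positions, soft shadows.

def accumulate(frames):
    n = len(frames)
    acc = [0.0] * len(frames[0])
    for frame in frames:
        for i, value in enumerate(frame):
            acc[i] += value / n       # like glAccum(GL_ACCUM, 1.0 / n)
    return acc

# Four jittered renderings of an edge partially covering two pixels:
frames = [[1.0, 0.0], [1.0, 0.0], [0.0, 0.0], [1.0, 1.0]]
print(accumulate(frames))  # smoothed edge: [0.75, 0.25]
```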
Compositing and Blending Opacity and transparency Opaque surfaces permit no light to pass through Transparent surfaces permit all light to pass Translucent surfaces pass some light
Compositing and Blending Dealing with translucency in a physically correct manner is difficult, due to the complexity of the internal interactions of light and matter, and the limitations of a pipeline renderer.
Compositing and Blending Alpha blending The alpha channel is the 'A' component in the RGBA colour mode. When blending is enabled, the value of α determines how the RGB values are written into the frame buffer. Objects are blended or composited together because fragments from multiple objects can contribute to the colour of the same pixel. Useful for rendering objects with a translucent component (e.g. water, glass, etc.).
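A minimal sketch of the standard "over" blend, corresponding to OpenGL's `GL_SRC_ALPHA` / `GL_ONE_MINUS_SRC_ALPHA` factors; the colour values are illustrative.

```python
# Sketch: when a fragment with opacity alpha is written, the stored
# colour becomes  alpha * source + (1 - alpha) * destination.

def blend(src_rgb, alpha, dst_rgb):
    return tuple(alpha * s + (1.0 - alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

# 50% translucent blue glass drawn over a white background:
print(blend((0.0, 0.0, 1.0), 0.5, (1.0, 1.0, 1.0)))  # (0.5, 0.5, 1.0)
```

Note that the result depends on what is already in the frame buffer, which is why translucent polygons are normally drawn after the opaque scene, with depth writes disabled.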
Discrete techniques II
Overview Bitmaps and images Mapping techniques Texture mapping Other mapping techniques
Digital images Image data Generally work with images that are arrays of pixels, which can be of a variety of sizes and data types. For example, if working with RGB images, each colour component is usually represented as one byte, so a 512 x 512 image requires 512 x 512 x 3 bytes, which can be allocated statically or dynamically.
Bitmaps and images Both take the form of rectangular arrays of pixels, but there are some differences. A bitmap (note: this is different from the image file format) consists of a single bit of information about each pixel (a map of bits) and is used as a mask to overlay another image. Image data typically includes several pieces of data per pixel (e.g. complete RGBA colour components) and simply overwrites, or is blended with, whatever data is in the frame buffer.
Bitmaps in OpenGL OpenGL treats 1-bit pixels (bitmaps) differently from multi-bit pixels (pixelmaps). A bitmap is a rectangular array of 0s and 1s: a mask that determines whether the corresponding pixel in the frame buffer is drawn with the present raster colour. Most commonly used for drawing characters.
Bitmaps in OpenGL Data is stored in chunks that are multiples of 8 bits, starting at the bottom left. To draw the bitmap, use glBitmap().
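The masking behaviour of a bitmap can be sketched in software. This is not OpenGL's internal algorithm, just an illustration of the "1 = draw with the raster colour, 0 = leave alone" rule; the glyph and names are made up.

```python
# Sketch: a bitmap acts as a mask. Where the bit is 1, the pixel is
# written with the current raster colour; where it is 0, the frame
# buffer is left untouched.

def draw_bitmap(framebuffer, bitmap, colour):
    for y, row in enumerate(bitmap):
        for x, bit in enumerate(row):
            if bit:
                framebuffer[y][x] = colour

fb = [["." for _ in range(4)] for _ in range(3)]
glyph = [[0, 1, 1, 0],      # a tiny 4 x 3 'character'
         [1, 0, 0, 1],
         [1, 1, 1, 1]]
draw_bitmap(fb, glyph, "#")
print(["".join(r) for r in fb])  # ['.##.', '#..#', '####']
```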
Image data The pixel format in the frame buffer can be different from that in processor memory: there are many kinds of frame buffer data and many ways to store pixel information in processor memory. Various data conversions can be performed during reading, writing and copying operations. These two types of memory reside in different places, so pixels need packing and unpacking, and drawing/reading can be slow.
Image data Pixel packing and unpacking Refers to the way in which pixel data is written to and read from processor memory An image stored in memory has between one and four chunks of data, called elements Some elements are integers, others are floating-point values (typically 0.0 – 1.0) Floating-point values usually stored in frame buffer with lower resolution (e.g. 8-bits), exact number of bits depends on hardware Pixel storage mode controlled using glPixelStore*()
Mapping Techniques There are limitations to geometric modelling. Texture mapping uses images to fill the inside of polygons, i.e. 'paints' an image onto polygons. Analogy: sticking wallpaper onto a wall. Increases the visual realism.
Texture mapping A 2D texture and the resulting textured 3D model
Texture mapping Different texture maps can be used for same object
Texture mapping Texture coordinates are referred to as (s, t)
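A sketch of how (s, t) coordinates in [0, 1] are mapped to texel indices. The clamping behaviour and function name are assumptions for illustration (a nearest-texel lookup with no wrapping).

```python
# Sketch: mapping texture coordinates (s, t) in [0, 1] to a texel in a
# width x height texture, using nearest-texel lookup.

def texel_at(texture, s, t):
    height = len(texture)
    width = len(texture[0])
    x = min(int(s * width), width - 1)    # clamp s = 1.0 to the last texel
    y = min(int(t * height), height - 1)
    return texture[y][x]

tex = [["a", "b"],
       ["c", "d"]]
print(texel_at(tex, 0.0, 0.0))   # a
print(texel_at(tex, 0.99, 0.0))  # b
print(texel_at(tex, 0.0, 0.99))  # c
```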
Texture mapping Aliasing in textures: the under-sampling of a signal. Looks worst when the viewpoint is moving.
Texture mapping Texture sampling Aliasing artifacts emerge as a result of discrete sampling. Point sampling: use the value of the closest texel. Linear filtering: use a weighted average of the neighbourhood; more work.
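The two sampling schemes can be sketched on a 1-D texture (the 2-D case, bilinear filtering, averages four texels the same way). Function names and the sample positions are illustrative.

```python
# Sketch: point sampling vs linear filtering of a 1-D texture, for a
# sample position s given in texel units.

def point_sample(texels, s):
    # Use the value of the closest texel.
    return texels[min(int(round(s)), len(texels) - 1)]

def linear_sample(texels, s):
    # Weighted average of the two neighbouring texels.
    i = int(s)
    f = s - i
    j = min(i + 1, len(texels) - 1)
    return (1.0 - f) * texels[i] + f * texels[j]

texels = [0.0, 1.0, 0.0, 1.0]
print(point_sample(texels, 1.3))    # 1.0  (nearest texel only)
print(linear_sample(texels, 1.25))  # 0.75 (weighted average)
```

Point sampling is cheap but jumps abruptly between texels as s moves, which is exactly the aliasing described above; linear filtering trades extra work for a smooth transition.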
Texture mapping Size of pixel we are trying to colour may be larger or smaller than one texel
Texture mapping Mipmaps Create multiple resolutions of an image: given a 256 x 256 image, create 128 x 128, 64 x 64, 32 x 32, 16 x 16, 8 x 8, 4 x 4, 2 x 2 and 1 x 1 versions. Which level is sampled depends on the polygon's size on screen.
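Building the chain can be sketched as repeated 2x2 averaging. This is the usual box-filter construction; the function names and the tiny greyscale image are invented for the example.

```python
# Sketch: build a mipmap chain by averaging 2x2 blocks, halving the
# resolution each time until a 1x1 level remains.

def next_level(img):
    n = len(img) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

def build_mipmaps(img):
    levels = [img]
    while len(levels[-1]) > 1:
        levels.append(next_level(levels[-1]))
    return levels

base = [[0.0, 1.0, 0.0, 1.0],      # 4x4 checkerboard
        [1.0, 0.0, 1.0, 0.0],
        [0.0, 1.0, 0.0, 1.0],
        [1.0, 0.0, 1.0, 0.0]]
levels = build_mipmaps(base)       # 4x4 -> 2x2 -> 1x1
print(len(levels), levels[-1])     # 3 [[0.5]]
```

The checkerboard averaging to uniform grey also shows why a distant, minified texture should be sampled from a coarse level rather than point-sampled from the full-resolution image.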
Texture mapping Mipmaps can be used in conjunction with texture filtering methods Nearest neighbour only Mipmaps & linear interpolation
Mapping techniques Point sampling Linear filtering Mipmapped linear filtering Mipmapped point sampling
Drawbacks of texture filtering Filters work by averaging the values of pixels in some way. Higher-order filters are computationally expensive, and they produce a blurring effect which might be undesirable.
Mapping techniques Billboarding Texture map a 2D image onto a 2D polygon Rotate it so polygon’s normal always faces the viewer Creates illusion that 2D image is a 3D object
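The rotation step can be sketched by constructing the billboard's axes from the camera position. This models a cylindrical billboard (rotating about the vertical axis only); the function name and vector layout are assumptions.

```python
import math

# Sketch: orient a billboard so its normal faces the viewer. The normal
# is the horizontal direction from the billboard to the camera; the
# right vector is up x normal, so the quad stays vertical.

def billboard_axes(billboard_pos, camera_pos):
    dx = camera_pos[0] - billboard_pos[0]
    dz = camera_pos[2] - billboard_pos[2]
    length = math.hypot(dx, dz)
    normal = (dx / length, 0.0, dz / length)   # points at the camera
    right = (normal[2], 0.0, -normal[0])       # up x normal
    return normal, right

normal, right = billboard_axes((0.0, 0.0, 0.0), (3.0, 5.0, 4.0))
print(normal)  # (0.6, 0.0, 0.8)
print(right)   # (0.8, 0.0, -0.6)
```

Recomputing these axes every frame is what sustains the illusion that the flat image is a 3D object.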
Mapping techniques Bump mapping Textures are used to perturb the surface normal. The actual geometry of the surface does not change; it is just shaded as if it were a different shape.
Mapping techniques Bump mapping The outline still looks smooth, since the underlying geometry is unchanged.
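The perturbation can be sketched in one dimension: the shading normal is tilted by the gradient of a height map while the surface itself stays flat. The central-difference formulation and names are illustrative, not a specific API.

```python
# Sketch: bump mapping perturbs the normal of a flat surface using the
# gradient of a height map; the geometry itself never moves.

def bumped_normals(heights):
    """1-D height map on a flat surface whose true normal is (0, 1)."""
    normals = []
    for i in range(1, len(heights) - 1):
        slope = (heights[i + 1] - heights[i - 1]) / 2.0  # central difference
        n = (-slope, 1.0)                                # tilt against slope
        length = (n[0] ** 2 + n[1] ** 2) ** 0.5
        normals.append((n[0] / length, n[1] / length))
    return normals

flat = bumped_normals([0.0, 0.0, 0.0, 0.0])
bumpy = bumped_normals([0.0, 0.0, 1.0, 1.0])
print(flat)   # stays (0.0, 1.0) where the height map is constant
print(bumpy)  # tilts where the height map rises
```

Lighting these perturbed normals produces the illusion of bumps, but the silhouette, drawn from the real geometry, stays smooth.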
Mapping techniques Environment mapping A quick but inaccurate way of generating effects like reflections (a.k.a. reflection mapping). An image of the surrounding environment is used to project reflections onto the object's surface. Common methods: cube mapping and spherical mapping.
Mapping techniques Cube mapping Popular because easily constructed: place the camera in the centre of a box and render the 6 different views (Top, Bottom, Left, Right, Front, Back); the surrounding scene is mapped onto the surfaces of a cube.
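At lookup time, the face is selected by the largest-magnitude component of the reflection direction, then the remaining two components index into that face. A sketch of the face-selection step, using the face names from the slide above:

```python
# Sketch: pick the cube-map face for a reflection direction by its
# largest-magnitude component (the convention behind OpenGL's
# GL_TEXTURE_CUBE_MAP_POSITIVE_X ... NEGATIVE_Z faces).

def cube_face(direction):
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "right" if x > 0 else "left"
    if ay >= ax and ay >= az:
        return "top" if y > 0 else "bottom"
    return "front" if z > 0 else "back"

print(cube_face((0.9, 0.2, -0.1)))   # right
print(cube_face((0.1, -0.8, 0.3)))   # bottom
print(cube_face((0.2, 0.3, -0.9)))   # back
```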
Mapping techniques Multi-texturing Paint multiple textures onto the same surface.
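The most common way to combine texture stages is modulation, a per-channel multiply. A sketch with made-up texel values:

```python
# Sketch: multi-texturing by modulation - the fragment colour is the
# per-channel product of the texels sampled from each texture stage.

def modulate(*texels):
    r = g = b = 1.0
    for (tr, tg, tb) in texels:
        r, g, b = r * tr, g * tg, b * tb
    return (r, g, b)

brick = (1.0, 0.5, 0.25)   # base texture
dirt = (0.5, 0.5, 0.5)     # second stage darkens the surface
print(modulate(brick, dirt))  # (0.5, 0.25, 0.125)
```

Light mapping (below) uses exactly this combine, with the second texture holding pre-computed illumination.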
Mapping techniques Light mapping Illumination maps / light maps were introduced in the game Quake to represent lighting effects
Mapping techniques Light mapping Allows lighting to be pre-calculated and stored as a 2D texture map.
Mapping techniques Light mapping Base texture x light map = lit surface
Mapping techniques Displacement mapping Uses a texture map to actually move points on the surface. The geometry is displaced before visibility is determined, so shadows can be generated.
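The displacement step can be sketched directly: unlike bump mapping, each vertex really moves along its normal by the height sampled from the map. Names and data are illustrative.

```python
# Sketch: displacement mapping offsets each vertex along its normal by
# the height sampled from a displacement map, before visibility is
# determined - so the moved geometry casts correct shadows.

def displace(vertices, normals, heights, scale=1.0):
    return [(vx + scale * h * nx, vy + scale * h * ny, vz + scale * h * nz)
            for (vx, vy, vz), (nx, ny, nz), h
            in zip(vertices, normals, heights)]

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
norms = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]   # flat patch facing up
heights = [0.25, 0.5]                        # sampled from the map
print(displace(verts, norms, heights))
# [(0.0, 0.25, 0.0), (1.0, 0.5, 0.0)]
```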
Effects using Image Processing
Questions? Thanks