Maths & Technologies for Games Advanced Graphics: Scene Post-Processing CO3303 Week 11-12.

Contents
- Back Buffer Usage - Recap
- Full-Screen Post-Processing
- Types of Post-Process
- Area Post-Processing
[DirectX Sample Documentation for details and examples]

The Front & Back Buffers
- The visible viewport is sometimes called the front buffer
- But a second, off-screen back buffer is the usual render target
  - We render to this off-screen bitmap, so the rendering process is not seen
- After frame rendering, the back buffer is presented (copied) to the front buffer
  - The copy is a very fast operation and will not be seen as easily as the rendering
- This is a form of double-buffering
  - A general technique of using a second buffer to avoid affecting the one that is in use

Swap Methods / Chains
- Methods to get the back buffer content to the front buffer:
  - Simple copy, back buffer discarded - this is the usual method
  - Swap the two buffers - useful if we want to keep the last frame
- Can have more than one back buffer
  - With two back buffers (plus the front buffer), this is triple-buffering
  - Improved concurrency with the GPU (the GPU can finish a frame before it is due to be shown)
  - Multiple back buffers must use the swap method
  - Called a swap chain

VSync or Not
- The copy / swap to the front buffer is a fast operation
  - But not instant - it can be seen
- Can perform it during the monitor's vertical sync
  - VSync is a short period between monitor refreshes; any drawing in this time won't be seen
  - But if you do this, FPS will be tied to the monitor refresh rate
  - E.g. with a monitor at 60Hz, FPS will be 60, 30, 20, 15 etc.
- Alternatively, can copy to the front buffer immediately
  - FPS will reflect maximum performance
  - But will see tearing: horizontal bands between the previous and current frame
  - Most obvious when moving sideways
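The quantised FPS values above follow from waiting for the next refresh: with vsync on, the frame time is effectively rounded up to a whole number of refresh intervals. A minimal sketch of the arithmetic (function name is my own):

```python
import math

def vsync_fps(refresh_hz, render_time_ms):
    """FPS when waiting for vertical sync: the frame is shown on the next
    refresh after rendering finishes, so frame time rounds up to a whole
    number of refresh intervals."""
    interval_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(render_time_ms / interval_ms))
    return refresh_hz / intervals

# On a 60Hz monitor only 60, 30, 20, 15... FPS are possible:
# a 10ms frame -> 60 FPS, a 17ms frame -> 30 FPS, a 40ms frame -> 20 FPS
```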

Alternative Render Targets
- Not necessary to render to a back buffer
- We can render to a texture or to a specially created render target
- We saw render-to-texture in the Graphics module:
  - Used it to render a mirror, and a camera viewpoint for shadows
  - Such a texture needs to be specially created
- For even more specialist needs we can create explicit render targets, or even render to multiple targets
  - Will see this when we do deferred rendering
- DirectX 10 / 11 are very flexible in the use of render targets

Scene Post-Processing
- Assume we render the entire scene to an intermediate texture
- We can then copy it to the back buffer to be presented to the viewport
- But we can also perform additional image processing during this copy
  - The copy process is effectively another rendering pass
  - So we can alter the look of the entire scene using a pixel shader
  - Use of a shader means much flexibility
- This is full-screen post-processing

Multiple Passes / Render Targets
- Can post-process in multiple passes:
  - Render scene to a texture
  - Render this texture in two ways into two more textures
  - Combine the textures into the back buffer, maybe using blending (additive etc.)
- The textures used do not all have to be the same size
  - Scaling down into a texture, then back up to the back buffer, is a common method to blur
- Can make complex sequences of post-processing for advanced effects
  - See Bloom / HDR later

Post-Processing Techniques
Wide range of possible techniques:
- Colour filters
- Blurs: motion blur, depth of field, bloom
- Distortions: e.g. heat haze, shock waves
- Feedback: motion blur (again)
- Film effects: grain, old film look, some lens effects
- Game effects: night scope, predator-vision
- Generic 2D image filters: contrast, levels, edge detection etc.
- Image analysis: calculate luminance, visibility coverage etc.

Colour Filters
- A simple use of post-processing is a colour filter
- E.g. a black & white filter:
  - Render scene to a texture
  - Render texture to back buffer using a special shader
  - The shader converts each texture colour to greyscale
- Could use a simple or complex technique:
  - E.g. average R,G,B, or a full luminance calculation
- Other possible colour filters:
  - Adding a colour tint
  - Sepia (like an old photo)
  - Gradient colour filter, animated gradients
  - Complete colour remapping: e.g. a heat scope
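The per-pixel work of the black & white shader can be sketched in plain Python. Both variants from the slide are shown; the Rec. 601 luminance weights (0.299, 0.587, 0.114) are a common choice but are my assumption here, not specified in the lecture:

```python
def grayscale_average(r, g, b):
    """Simple technique: average the three channels equally."""
    v = (r + g + b) / 3.0
    return (v, v, v)

def grayscale_luminance(r, g, b):
    """Full luminance calculation: weight channels by perceived brightness
    (Rec. 601 weights - green contributes most, blue least)."""
    v = 0.299 * r + 0.587 * g + 0.114 * b
    return (v, v, v)

# Pure green looks mid-bright to the eye, but the naive average treats it
# the same as pure red or pure blue - the luminance version does not.
```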

Colour Filters - Detail
Rendering a colour filter:
- Render the entire scene normally to an intermediate texture
- Set up a pixel shader to sample pixels from a texture and tint them
- Use this shader to copy & tint the intermediate texture to a full-screen quad on the back buffer
  - Quad is defined in 2D viewport space
  - No vertex processing required since vertices are already in 2D
  - Ensure texels match pixels: careful with UVs, no texture filtering

Distortions
- A powerful class of techniques with a wide range of applications
- In general:
  - Render scene to a texture
  - Render texture onto the back buffer, but distorted
  - Create distortion using geometry or shader work
- Can distort the entire viewport
  - E.g. looking through a sniper scope
- Or only small areas
  - Overlay small distortions to create heat haze, shock waves etc.

Distortions
E.g. distortion of an area:
- Render scene to a texture
- Render texture onto the back buffer normally (simple copy)
- Render a portion of the texture into the back buffer with distortion
  - Typically alter UVs in the shader
  - Or can use geometry as illustrated
    - Useful if the area animates (e.g. water droplets on screen)
  - Can use a look-up texture to assist with the UV adjustment (see lab)
- Note the extra rendering pass compared to the colour filter
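Altering UVs in the shader amounts to sampling the scene texture at a perturbed coordinate. A minimal sketch, with a sine ripple standing in for a distortion look-up texture (the helper names, ripple shape and parameters are illustrative, not from the lecture):

```python
import math

def sample(texture, u, v):
    """Nearest-texel sample from a 2D list of pixel values, clamping UVs."""
    h, w = len(texture), len(texture[0])
    x = min(w - 1, max(0, int(u * w)))
    y = min(h - 1, max(0, int(v * h)))
    return texture[y][x]

def distort(texture, width, height, amplitude=0.05, frequency=8.0):
    """Copy the scene texture, offsetting each pixel's U by a sine of V -
    the same idea as adding a shader-computed offset to the UVs."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            u, v = x / width, y / height
            du = amplitude * math.sin(frequency * math.pi * v)
            row.append(sample(texture, u + du, v))
        out.append(row)
    return out
```

With `amplitude=0.0` the pass degenerates to the simple copy described above; a real implementation would read `du` from a scrolling look-up texture instead of computing a sine.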

Blurs
- Blurring a scene has several variants:
  - Standard blurs (pixel averaging, Gaussian blur etc.)
  - Depth of field - blur is stronger further from the focal distance
  - Bloom - bleeding of bright areas
  - Motion blur - blur of moving objects
- Implemented in various ways:
  - E.g. rendering via a smaller texture
  - Pixel shader to average a few local pixels
  - Sometimes the horizontal and vertical blurs are done as separate passes
- Note: motion blur can also be simulated using feedback
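The two-pass approach works because box and Gaussian filters are separable: blurring horizontally then vertically gives the same result as a full 2D filter at much lower cost (2r+1 taps twice instead of (2r+1)^2 taps once). A sketch with a simple box average, clamping at the edges:

```python
def blur_1d(row, radius=1):
    """Average each sample with its neighbours within `radius`, clamping at edges."""
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n - 1, i + radius)
        window = row[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

def blur_2d(image, radius=1):
    """Separable blur: a horizontal pass over rows, then a vertical pass
    over the columns of that intermediate result."""
    horiz = [blur_1d(row, radius) for row in image]
    cols = [blur_1d([horiz[y][x] for y in range(len(horiz))], radius)
            for x in range(len(horiz[0]))]
    # transpose the blurred columns back into rows
    return [[cols[x][y] for x in range(len(cols))] for y in range(len(horiz))]
```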

Depth of Field
- Depth of field describes the blurring of objects that are nearer or further than our focal distance
  - Focal distance is the distance to the focus object
  - The distance at which light rays converge to a sharp image
  - Eyes / cameras adjust focal distance by manipulating the lens
- Rendering true depth of field is complex
- Example simple solution using post-processing:
  - Render the scene to a texture, also storing depth values in a separate texture/render target or in the alpha channel
  - Blur the scene into another texture
  - Render a blend of the sharp and blurred textures into the back buffer, weighted by the original depth values
  - Pixels close to the focal distance take the sharp texture; give more weight to the blurred texture for pixels further from the focal distance
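The depth-based blend in the last step can be sketched per pixel: a weight of 0 at the focal distance selects the sharp texture, rising to 1 (fully blurred) as depth diverges. The linear falloff and the `focal_range` parameter are my own illustrative choices:

```python
def dof_blend(sharp, blurred, depth, focal_distance, focal_range):
    """Linearly blend a sharp and a blurred pixel value by distance from
    the focal plane: weight 0 at the focal distance, clamped to 1 once the
    pixel is `focal_range` or more away."""
    weight = min(1.0, abs(depth - focal_distance) / focal_range)
    return sharp + weight * (blurred - sharp)
```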

Bloom / HDR
- Monitors cannot show the full dynamic range that we can see
  - A typical monitor displays 0-255 brightness levels
  - Brighter colours are clipped; dark colours are compressed into few values
- Can use a high dynamic range (HDR) instead
  - Use floating point colours
  - But the monitor can't display the result
  - Use tone mapping to convert the HDR result to the monitor range
- Use post-processing to give the impression of a high dynamic range
  - E.g. bloom: the bleeding of bright light
  - Simply a blur of the bright parts
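Tone mapping compresses unbounded HDR intensities into the monitor's 0-1 range. A minimal sketch using the Reinhard operator, a standard choice (the lecture does not specify which operator to use):

```python
def reinhard_tonemap(hdr_value):
    """Reinhard tone mapping: x / (1 + x). Maps [0, infinity) into [0, 1),
    leaving dark values almost unchanged while compressing bright ones so
    they are no longer clipped."""
    return hdr_value / (1.0 + hdr_value)

# Dark values pass through nearly unchanged; very bright values approach
# but never reach white.
```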

Bloom / HDR
Bloom is part of a multi-pass HDR rendering process:
- Render the HDR scene to a floating-point texture
- Copy the scene to the back buffer using tone mapping - use measured luminance to convert the HDR range to the LDR range
- Copy to another, smaller texture (1/4 size) for efficiency
- Filter out all but the bright areas
- Blur the pixels to give the final bloom
- Render the bloom over the back buffer using additive blending
- Each stage is a render pass; some stages can take more than one pass (blur is often 2-pass)
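The bright-pass filter and the additive combine can be sketched per pixel. The threshold of 1.0 (i.e. anything brighter than the displayable maximum blooms) is an illustrative choice:

```python
def bright_pass(value, threshold=1.0):
    """Keep only the portion of the colour above the threshold;
    everything dimmer contributes nothing to the bloom."""
    return max(0.0, value - threshold)

def additive_blend(scene, bloom, limit=1.0):
    """Additive blending of the blurred bloom over the tone-mapped scene,
    clamped to the displayable range."""
    return min(limit, scene + bloom)

# A pixel at intensity 1.5 contributes 0.5 of bloom; a pixel at 0.8
# contributes nothing.
```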

Simple Motion Blur
- Can use the last frame as an input (a texture) when rendering the current one
  - A form of feedback
- Easiest method is to blend the new frame with the previous back buffer
  - Must ensure we are swapping the front and back buffers
  - Must not clear the back-buffer pixels
- Or retain the intermediate texture used for post-processing from the last frame
- Used to create motion blur effects
- Can be combined with distortion or other post-processing
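The feedback blend behaves like an exponential moving average: each new frame is blended over the retained previous result, so older frames fade out geometrically as a motion trail. The blend factor is illustrative:

```python
def feedback_blur(frames, blend=0.7):
    """Blend each new frame over the retained previous result:
    result = blend * new + (1 - blend) * previous.
    Older frames decay geometrically, leaving a fading trail."""
    result = frames[0]
    for frame in frames[1:]:
        result = blend * frame + (1.0 - blend) * result
    return result

# A bright object that then disappears leaves a fading ghost:
# one frame after it vanishes, 30% of it remains; two frames after, 9%.
```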

Area/Polygon Post-Processing
- Most effects described so far affect the entire viewport
- Sometimes we want to post-process only a portion, e.g.:
  - A rectangle, circle or other area
  - A specific 3D polygon
- The distortion example was a case of the first, using a rectangle
- The second case occurs with unique model materials
  - E.g. water, rippled glass etc.

Area Post-Processing
- Often we wish to post-process the area around a known point in the world
  - For example, heat distortion due to an engine exhaust
- Convert the world point to a screen point
  - Covered this as part of picking
- Get the dimensions of the processed area
  - A similar world-to-screen conversion, using distance
- Proceed as in the distortion example
  - Usually render to a quad, just not full-screen
- May need to take account of depth
  - Unlike full-screen effects, area effects may be hidden behind other objects in the scene; ensure the quad has proper depth
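The world-to-screen conversion reuses the picking maths: transform the world point by the combined view-projection matrix, perform the perspective divide, then map the resulting -1..1 coordinates to viewport pixels. A sketch with a hand-rolled 4x4 multiply (row-vector convention and function name are my assumptions):

```python
def world_to_screen(point, view_proj, viewport_w, viewport_h):
    """Project a 3D world point to 2D viewport pixels. `view_proj` is a
    4x4 matrix given as a list of rows; the point is treated as a row
    vector (x, y, z, 1). Returns None for points behind the camera."""
    x, y, z = point
    v = (x, y, z, 1.0)
    out = [sum(v[i] * view_proj[i][c] for i in range(4)) for c in range(4)]
    if out[3] <= 0.0:
        return None  # behind the camera
    ndc_x = out[0] / out[3]  # perspective divide into -1..1 range
    ndc_y = out[1] / out[3]
    # map NDC to pixels; y is flipped because screen y grows downward
    sx = (ndc_x * 0.5 + 0.5) * viewport_w
    sy = (1.0 - (ndc_y * 0.5 + 0.5)) * viewport_h
    return (sx, sy)
```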

Polygon Post-Processing
- May wish to post-process an actual model polygon
  - E.g. rippled glass in a door
- Render the intermediate texture as normal
- Not copying to a quad this time; instead need to calculate the target 2D polygon
  - Need the UVs of the polygon in the intermediate texture
  - And the 2D position of the polygon in the back buffer
  - Use special shaders or picking methods
  - Subtle detail - requires close understanding of the exact process
- Render with post-processing in either case

Reusing/Chaining Render Targets
- Reuse textures when performing several post-processing passes:
  - Render scene to texture 1
  - Next pass copies the scene with a full-screen or area effect into texture 2
  - Next pass copies back to texture 1
  - Repeat - only two textures required
  - Final pass copies into the back buffer
- Can build a flexible (scripted) system for generic post-processing
- More efficient to combine passes into a single pass where possible
  - But lose flexibility
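This two-texture reuse pattern is often called "ping-pong" rendering. A sketch, modelling each texture as a value and each pass as a function applied from source to destination (names and structure are my own):

```python
def run_post_process_chain(scene, passes):
    """Apply a list of post-processing passes using only two 'textures',
    swapping the source and destination roles after every pass."""
    textures = [scene, None]  # texture 1 holds the rendered scene
    src, dst = 0, 1
    for process in passes:
        textures[dst] = process(textures[src])
        src, dst = dst, src  # swap roles for the next pass
    # the most recently written texture is copied to the back buffer
    return textures[src]

# Three chained full-screen effects, two textures total:
# run_post_process_chain(2, [lambda t: t + 1, lambda t: t * 10, lambda t: t - 5])
# -> ((2 + 1) * 10) - 5 = 25
```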