Computer Graphics The Rendering Pipeline - Review CO2409 Computer Graphics Week 15

Lecture Contents
1. The Rendering Pipeline
2. Input-Assembler Stage
   – Vertex Data & Primitive Data
3. Vertex Shader Stage
   – Matrix Transformations
   – Lighting
4. Rasterizer Stage
5. Pixel Shader Stage
   – Textures
6. Output-Merger Stage

The Rendering Pipeline
We have looked at the rendering pipeline of DirectX
We have seen all the key stages now
– But Geometry Shaders and Stream-Output are beyond the scope of the module
Today we recap each stage
– Before moving on to some much more advanced material over the next few weeks

Input-Assembler Stage
The key input data are the vertex coordinates of our geometry
– (X,Y,Z) position for each vertex
Other parts of the pipeline also need additional per-vertex data:
– Vertex normals if using lighting to find the colour at each vertex
– Vertex colours if not using lighting
– Texture coordinates / UVs if we use textures
   – Multiple sets if we blend multiple textures
The Input-Assembler stage collects together the vertices and indices used in the geometry

Vertex Data Formats
So we have seen that the input vertex data is more than just a list of vertex positions
Additional data is required depending on how we intend to render the model
– With textures, lighting etc.
We use custom vertex data formats for flexibility
– The syntax differs between C++ and HLSL, but it is important to realise that both refer to the same data: the original geometry ready for the pipeline

// C++ structure
struct BasicVertex
{
    D3DXVECTOR3 Pos;
    D3DXVECTOR3 Normal;
    D3DXVECTOR2 UV;
};

// Matching HLSL structure
struct VS_INPUT
{
    float3 Pos    : POSITION;
    float3 Normal : NORMAL;
    float2 UV     : TEXCOORD0;
};
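The two structures are tied together with an input layout that tells Direct3D how each C++ element maps to an HLSL semantic. A minimal sketch, assuming Direct3D 10 and a compiled vertex shader blob (device and vsBlob are illustrative names):

D3D10_INPUT_ELEMENT_DESC layout[] =
{
    // Semantic, index, format, slot, byte offset, class, instance step
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D10_INPUT_PER_VERTEX_DATA, 0 },
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 24, D3D10_INPUT_PER_VERTEX_DATA, 0 },
};

ID3D10InputLayout* vertexLayout = NULL;
device->CreateInputLayout( layout, 3, vsBlob->GetBufferPointer(),
                           vsBlob->GetBufferSize(), &vertexLayout );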

Vertex Buffers
A vertex buffer is just an array of vertices
– Defining the set of vertices in our geometry
– Each vertex in a custom format as above
The buffer is managed by DirectX
– We must ask DirectX to create and destroy them…
– …and to access the data inside
The array is usually created in main memory then copied to video card memory
– We need it on the video card for performance reasons
First saw vertex buffers in the week 5 lab
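Creating one looks something like this. A minimal sketch, assuming a Direct3D 10 device and the BasicVertex format above (the triangle data is illustrative):

// A single triangle in main memory
BasicVertex vertices[3] =
{
    { D3DXVECTOR3(-1.0f, 0.0f, 0.0f), D3DXVECTOR3(0.0f, 0.0f, -1.0f), D3DXVECTOR2(0.0f, 1.0f) },
    { D3DXVECTOR3( 0.0f, 1.0f, 0.0f), D3DXVECTOR3(0.0f, 0.0f, -1.0f), D3DXVECTOR2(0.5f, 0.0f) },
    { D3DXVECTOR3( 1.0f, 0.0f, 0.0f), D3DXVECTOR3(0.0f, 0.0f, -1.0f), D3DXVECTOR2(1.0f, 1.0f) },
};

D3D10_BUFFER_DESC bufferDesc;
bufferDesc.ByteWidth      = sizeof( vertices );       // Total size in bytes
bufferDesc.Usage          = D3D10_USAGE_DEFAULT;      // GPU access only - lives in video memory
bufferDesc.BindFlags      = D3D10_BIND_VERTEX_BUFFER;
bufferDesc.CPUAccessFlags = 0;
bufferDesc.MiscFlags      = 0;

D3D10_SUBRESOURCE_DATA initData;
initData.pSysMem = vertices;   // Initial data to copy from main memory

ID3D10Buffer* vertexBuffer = NULL;
device->CreateBuffer( &bufferDesc, &initData, &vertexBuffer );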

Primitives / Index Buffers
The vertex data just defines points
– We usually want to define triangles in the geometry
– Or other primitives, e.g. lines
The vertex data alone can define the primitives
– Each triplet of vertices defining a triangle
Or we can use an index buffer
– A simple array of integers that index the vertex buffer
– A triplet of indices for each triangle
– No duplication of vertices
– Saw index buffers in week 9
(Diagram: an index buffer referencing entries in a vertex buffer)
We have also seen triangle strips
– Reuse previous vertices, and work with or without an index buffer
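For example, a square can be stored as four vertices plus six indices, rather than six full vertices. A minimal sketch, again assuming Direct3D 10:

// Two triangles sharing vertices 1 and 2: (0,1,2) and (2,1,3)
DWORD indices[6] = { 0, 1, 2,   2, 1, 3 };

D3D10_BUFFER_DESC bufferDesc;
bufferDesc.ByteWidth      = sizeof( indices );
bufferDesc.Usage          = D3D10_USAGE_DEFAULT;
bufferDesc.BindFlags      = D3D10_BIND_INDEX_BUFFER;  // An index buffer this time
bufferDesc.CPUAccessFlags = 0;
bufferDesc.MiscFlags      = 0;

D3D10_SUBRESOURCE_DATA initData;
initData.pSysMem = indices;

ID3D10Buffer* indexBuffer = NULL;
device->CreateBuffer( &bufferDesc, &initData, &indexBuffer );
device->IASetIndexBuffer( indexBuffer, DXGI_FORMAT_R32_UINT, 0 );  // 32-bit indices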

Vertex Shader Stage
The vertex / index data defines the 3D geometry
– The next step is to convert this data into 2D polygons
We use vertex shaders for this vertex processing
The key operation is a sequence of matrix transforms
– Transforming vertex coordinates from 3D model space to 2D viewport space
The vertex shader can also calculate lighting (using the vertex normals), perform deformation etc.
– Although we will shift lighting to the pixel shader next week
Some vertex data (e.g. UVs) is not used at this stage and is simply passed through to later stages
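A vertex shader performing just these transforms might look like the sketch below (HLSL, using the VS_INPUT format above; the matrix variable names are illustrative and would be set from C++):

float4x4 WorldMatrix;  // Model space -> world space
float4x4 ViewMatrix;   // World space -> camera space
float4x4 ProjMatrix;   // Camera space -> projected 2D space

struct VS_OUTPUT
{
    float4 ProjPos : SV_POSITION;  // 2D position for the rasterizer
    float2 UV      : TEXCOORD0;    // Passed through for the pixel shader
};

VS_OUTPUT BasicTransform( VS_INPUT input )
{
    VS_OUTPUT output;

    // Transform the vertex through each space in turn
    float4 worldPos  = mul( float4(input.Pos, 1.0f), WorldMatrix );
    float4 cameraPos = mul( worldPos, ViewMatrix );
    output.ProjPos   = mul( cameraPos, ProjMatrix );

    // UVs are not used at this stage - just pass them on
    output.UV = input.UV;
    return output;
}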

From Model to World Space
The original geometry is stored in model space
– A local space with a convenient origin and orientation
Each model has a world matrix
– That transforms the model geometry from model space to world space
This matrix defines the position and orientation of the model
– It has a special form containing the local axes of the model
– These axes are often extracted from the matrix
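For a row-major DirectX-style matrix, the local axes and position sit in the rows (a sketch of the layout, not code):

| Xx  Xy  Xz  0 |   <- model's local X axis in world space
| Yx  Yy  Yz  0 |   <- local Y axis
| Zx  Zy  Zz  0 |   <- local Z axis
| Px  Py  Pz  1 |   <- model's position in the world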

From World to Camera Space
We view the models through a camera
– Which is part of the scene
A camera has a view matrix
– Transforms the world space geometry into camera space
Camera space defines the world as seen by the camera
– The camera looks down its Z axis
The view matrix is the inverse of a normal world matrix
– We often store a world matrix for the camera and just invert it as a final step
– This allows us to treat a camera like a model
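In C++ the inversion is a single D3DX call. A minimal sketch (variable names are illustrative):

D3DXMATRIX cameraWorldMatrix;  // Position / orient the camera like any other model
D3DXMATRIX viewMatrix;

// The view matrix is the inverse of the camera's world matrix
D3DXMatrixInverse( &viewMatrix, NULL, &cameraWorldMatrix );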

Camera Space to Viewport
Each camera has a second matrix, the projection matrix
– Defining how the camera space geometry is projected into 2D
– It defines the camera's field of view
– And the distance to the viewport, often called the near clip distance
The final step is to scale the projected 2D geometry into viewport pixel coordinates
– Performed internally by DirectX
We covered this material in weeks 6-9
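D3DX provides a helper to build a standard perspective projection matrix. A sketch with illustrative values:

D3DXMATRIX projMatrix;
D3DXMatrixPerspectiveFovLH( &projMatrix,
                            D3DX_PI / 4.0f,    // Vertical field of view (45 degrees)
                            1280.0f / 720.0f,  // Viewport aspect ratio
                            0.1f,              // Near clip distance
                            1000.0f );         // Far clip distance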

Lighting
We looked at the mathematical lighting models used to illuminate 3D geometry (week 11)
We showed how lighting can be calculated while vertices are being transformed
– In the same vertex shader
The calculations need a normal for each vertex
– Which also needs to be transformed into world space
– Because the lights are in the world
Several effects are combined for the final vertex colour:
– Ambient, diffuse, specular
We showed how to implement their formulae
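For example, the ambient and diffuse parts might be added to the vertex shader sketch above like this (HLSL; the light variables are illustrative):

float3 LightPos;       // World space light position
float3 LightColour;
float3 AmbientColour;

// ... after transforming the position into world space ...
float3 worldNormal = normalize( mul( float4(input.Normal, 0.0f), WorldMatrix ).xyz );
float3 lightDir    = normalize( LightPos - worldPos.xyz );

// Diffuse level depends on the angle between the normal and the light direction
float3 diffuse      = LightColour * max( dot( worldNormal, lightDir ), 0.0f );
float3 vertexColour = AmbientColour + diffuse;   // Specular would be added similarly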

Rasterizer Stage
This stage processes 2D polygons before rendering:
– Off-screen polygons are discarded (culled)
– Partially off-screen polygons are clipped
– Back-facing polygons are culled if required (determined by the clockwise / anti-clockwise order of the viewport vertices)
These steps are controlled by setting states
– Saw states in the week 14 blending lab
This stage also scans through the pixels of the polygons
– Called rasterising / rasterizing
– Calls the pixel shader for each pixel encountered
Data output from earlier stages is interpolated for each pixel
– 2D coordinates, colours, UVs etc.
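Culling, for instance, is selected through a rasterizer state. A minimal sketch, assuming Direct3D 10:

D3D10_RASTERIZER_DESC rasterDesc;
ZeroMemory( &rasterDesc, sizeof(rasterDesc) );
rasterDesc.FillMode              = D3D10_FILL_SOLID;
rasterDesc.CullMode              = D3D10_CULL_BACK;  // Discard back-facing triangles
rasterDesc.FrontCounterClockwise = FALSE;            // Clockwise viewport order = front face

ID3D10RasterizerState* rasterState = NULL;
device->CreateRasterizerState( &rasterDesc, &rasterState );
device->RSSetState( rasterState );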

Rasterisation
The Rasterization Stage scans the 2D triangle geometry to determine all the pixels within
The Pixel Shader stage is called for each pixel found
The data passed to the Pixel Shader is interpolated (blended) from the output data of the three vertices of the triangle
This way each pixel gets slightly different data and effects are smoothed across the triangle surface

Pixel Shader Stage
Next each pixel is worked on to get a final colour
– The work is done in a pixel shader
Texture coordinates (UVs) map textures onto polygons
– UV data will have been passed through from the previous stages
– Textures are provided from the C++ code via shader variables
Textures can be filtered (texture sampling) to improve their look
– Textures covered in week 12
Texture colours are combined with lighting or polygon colours to produce the final pixel colours
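A typical pixel shader of this kind, sketched in HLSL (Direct3D 10 style; DiffuseMap and the other names are illustrative, and the input structure assumes the vertex shader also passes down a lit colour):

Texture2D    DiffuseMap;       // Set from C++ via a shader variable
SamplerState TrilinearFilter;  // Filtering mode, also set from C++

struct PS_INPUT
{
    float4 ProjPos       : SV_POSITION;
    float4 DiffuseColour : COLOR0;      // Lit colour from the vertex shader (hypothetical)
    float2 UV            : TEXCOORD0;
};

float4 TextureTint( PS_INPUT input ) : SV_TARGET
{
    // Sample the texture at this pixel's interpolated UV coordinate
    float4 textureColour = DiffuseMap.Sample( TrilinearFilter, input.UV );

    // Combine with the interpolated lighting colour from the vertex shader
    return textureColour * input.DiffuseColour;
}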

Output-Merger Stage
The final step in the rendering pipeline is the rendering of the pixels to the viewport
– We don't always copy the pixels directly, because the object we're rendering may be partly transparent
This involves blending the final polygon colour with the existing viewport pixel colours
– Saw this in the week 14 lab
– Similar to sprite blending
The depth buffer values are also tested / written at this stage
– We saw the depth buffer in the week 8 lab
– We will see it in more detail shortly
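Standard alpha blending is again configured through a state. A minimal sketch, assuming Direct3D 10:

D3D10_BLEND_DESC blendDesc;
ZeroMemory( &blendDesc, sizeof(blendDesc) );
blendDesc.BlendEnable[0]           = TRUE;
blendDesc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;      // New pixel weighted by its alpha
blendDesc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;  // Existing pixel weighted by 1 - alpha
blendDesc.BlendOp                  = D3D10_BLEND_OP_ADD;
blendDesc.SrcBlendAlpha            = D3D10_BLEND_ONE;
blendDesc.DestBlendAlpha           = D3D10_BLEND_ZERO;
blendDesc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
blendDesc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

ID3D10BlendState* blendState = NULL;
device->CreateBlendState( &blendDesc, &blendState );

float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
device->OMSetBlendState( blendState, blendFactor, 0xFFFFFFFF );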