GRAPHICS PIPELINE & SHADERS
SET09115 Intro to Graphics Programming
Breakdown
- Background
- Working with Data
- Graphics Pipeline
- Pipeline Stages
- Programmable Shaders
Recommended Reading
“Real-Time Rendering”, Third Edition. Akenine-Möller et al. Chapters 1-3.
Background
Review
So far you have mainly been working with basic geometry:
- Creating triangles from three points
- Creating quads from four points
- Using these shapes to build more complex models
Geometry can have other values attached: colour, texture coordinates, etc.
What is a Pipeline?
A pipeline is just a series of processes that occur in order, e.g. Read File → Process Data → Print Results. A pipeline can have any number of processes within it. The point is that the output of one process is fed into the next, like an assembly line (see the sketch below).
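A minimal sketch of this idea in C; the three stage functions mirror the Read File → Process Data → Print Results example above and are purely illustrative:

    #include <ctype.h>
    #include <stdio.h>

    /* Hypothetical three-stage pipeline: the output of each stage is
       the input of the next, like the stages of the graphics pipeline. */
    static int read_line(char *buf, size_t len) {
        return fgets(buf, (int)len, stdin) != NULL;   /* stage 1: read */
    }
    static void process(char *buf) {
        for (char *p = buf; *p; ++p)                  /* stage 2: process */
            *p = (char)toupper((unsigned char)*p);
    }
    static void print(const char *buf) {
        fputs(buf, stdout);                           /* stage 3: print */
    }

    int main(void) {
        char line[256];
        while (read_line(line, sizeof line)) {  /* data flows stage to stage */
            process(line);
            print(line);
        }
        return 0;
    }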
Parallelizing a Pipeline
Pipelines are very easy to parallelize:
- Pipelining: each process works independently. Whenever one process has data ready, it forwards it to the next; the sending process can carry on processing new data.
- Internal parallelism: each process can itself work on different parts of the incoming data, so a process speeds up with the number of internal workers.
Throughput of Data
Data throughput in a pipeline is high compared to traditional sequential processing, thanks to data parallelism and streaming. Overall throughput is determined by the slowest process: other processes may be fast, but they end up waiting on the bottleneck.
[Diagram: the graphics pipeline]
Recap: pipelines, the graphics pipeline. Questions?
Working with Data
Vertex Data
Data sent to the graphics card is called vertex data:
- Position of vertex
- Colour of vertex
- Texture coordinates
- Normals
This is what you use to define your objects. Note that in immediate mode the current colour must be set before the vertex is emitted, so glColor3f comes first:

    // Vertex 1
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    // Vertex 2
    glColor3f(1.0f, 0.0f, 1.0f);
    glVertex3f(1.0f, 1.0f, 0.0f);
    // etc.
Vertex Buffers
To stream vertex information efficiently, we use vertex buffers: arrays of vertices. Buffers exist for vertices, normals, colours, etc.

    GLfloat vertices[] = {
        1.0f, 0.0f, 0.0f,
        1.0f, 1.0f, 0.0f,
        0.0f, 1.0f, 0.0f
    };
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glDrawArrays(GL_TRIANGLES, 0, 3);
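The example above uses a client-side vertex array. A sketch of the same draw using a server-side vertex buffer object (VBO), which uploads the data to GPU memory once; the variable names here are illustrative:

    GLuint vbo;
    glGenBuffers(1, &vbo);                      // create a buffer object
    glBindBuffer(GL_ARRAY_BUFFER, vbo);         // make it the active array buffer
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
                 vertices, GL_STATIC_DRAW);     // upload the data once
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0); // source from the bound VBO
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glBindBuffer(GL_ARRAY_BUFFER, 0);           // unbind when done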
Index Data
Working with buffers allows us to exploit index data to further improve efficiency. Indices let us reuse the same vertex data from a buffer at different points, rather than duplicating it.
Index Buffers
Index data is also stored in a buffer. The buffer contains index numbers referring to positions of vertices in the vertex buffer, and rendering reuses those vertices.

    // Create the eight vertices
    GLfloat vertices[] = { .. };
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    // Create index buffer
    GLubyte indices[] = {
        0, 3, 1,
        1, 3, 2,
        // etc.
    };
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_BYTE, indices);
Recap: data for the pipeline, using data on the pipeline. Questions?
Graphics Pipeline
History
- The first work on 3D rendering occurred in the 1960s.
- Most common 3D rendering techniques evolved through the 1960s and 1970s.
- The 1980s saw the introduction of graphics hardware; the Commodore Amiga is often cited as the first commercial computer with dedicated graphics hardware.
- The 1990s saw the introduction of consumer graphics cards and graphics APIs such as OpenGL and DirectX.
Fixed-Function Pipeline
Initially graphics cards were not programmable: they exposed a fixed-function pipeline. Programmers could turn states on and off and supply some values to those states; this is what you have done so far with OpenGL. The Nintendo Wii still has a fixed-function pipeline. Although this advanced rendering, it did not provide much flexibility.
Programmable Pipeline
To add flexibility, GPUs were given programmable stages: shaders.
- The first shaders were used in the 1980s, in Pixar’s RenderMan.
- In 2001, nVidia introduced the first GPU with a programmable vertex shading stage.
- Today, nearly all GPUs support shading: vertex, geometry, and pixel / fragment shaders.
Application → Geometry → Rasterizer
The three main stages of the graphics pipeline are:
- Application (CPU, via OpenGL or DirectX): feeds geometry data (vertex data) to the geometry stage.
- Geometry (GPU): works on the vertices and polygons, through to screen mapping.
- Rasterizer (GPU): sets pixel colours for the screen.
Recap: evolution of graphics pipelines, the general stages. Questions?
Pipeline Stages
Application Stage
The programmer has complete control here, using a graphics API to send data to the GPU. This is where we have been working up until now. The application may also do culling to improve performance.
Geometry Stage
The GPU transforms the input from the application stage into screen-mapped data. It has five internal stages:
- Model & View Transform
- Vertex Shading
- Projection
- Clipping
- Screen Mapping
(The geometry shader also sits in this stage; we will discuss it in a later lecture.)
Model and View Transform
A model initially exists within its own model coordinate space, which allows movement, rotation, etc. The model transformation can be done on the GPU; more on this throughout the module. The view transformation moves the model so it exists in the camera’s coordinate space: the camera sits at the origin and the model is moved to suit. A short sketch follows.
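A minimal legacy-OpenGL sketch of setting up the view and model transforms; the camera position and model offsets are made-up values for illustration:

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    // View transform: place the camera at (0, 0, 5), looking at the origin
    gluLookAt(0.0, 0.0, 5.0,   // eye position
              0.0, 0.0, 0.0,   // look-at point
              0.0, 1.0, 0.0);  // up vector
    // Model transform: move and rotate the model within the scene
    glTranslatef(1.0f, 0.0f, 0.0f);
    glRotatef(45.0f, 0.0f, 1.0f, 0.0f);
    // ... emit geometry here ...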
Vertex Shading
The first programmable stage. Vertex data is used to perform per-vertex lighting calculations. Because lighting is only evaluated at the vertices and interpolated across each face, you can often see the triangles in the result.
Projection
The 3D data is now projected using the projection matrix: the view volume (frustum) is transformed into a unit cube from (-1, -1, -1) to (1, 1, 1). There are two common projection methods (sketched below):
- Orthographic (essentially flat): parallel lines remain parallel.
- Perspective: distance from the camera has an effect.
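A sketch of setting up each projection in legacy OpenGL; the field of view, aspect ratio, and near/far plane values are illustrative:

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    // Perspective projection: 60 degree vertical field of view
    gluPerspective(60.0, 800.0 / 600.0, 0.1, 100.0);

    // ... or an orthographic projection instead:
    // glOrtho(-1.0, 1.0, -1.0, 1.0, 0.1, 100.0); // left, right, bottom, top, near, far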
Clipping
We only ever try to draw what we can see, and culling has already removed much for us. After the model, view and projection transforms, primitives that fall completely outside the view volume are removed. Primitives that are only partially visible are clipped: new vertices are introduced where the primitive crosses the boundary.
Screen Mapping
The final process in the geometry stage. Only visible primitives need to be mapped to the screen. This is relatively simple: the transformed (x, y) coordinates are scaled and offset to give screen coordinates, as sketched below.
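In OpenGL, the target of the screen mapping is set with the viewport; the window size here is illustrative:

    // Map the unit cube's x and y range onto an 800x600 window
    glViewport(0, 0, 800, 600);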
Rasterizer Stage
This is where we start assigning colours to pixels on screen. It has four stages:
- Triangle setup
- Triangle traversal
- Pixel shading
- Merging
Triangle Setup and Traversal
These stages are fixed within the graphics pipeline. The goal is to determine which triangles cover which pixels, including the depth of the triangle at each pixel. This data is used later in the merging stage.
Pixel Shading
A programmable stage allowing manipulation of pixel colours. Lighting here is evaluated per pixel, giving a smoother effect than per-vertex lighting. Texturing is also done here.
Merging
Determines the final colour of a pixel. Data from the pixel shader, colour buffers, depth buffer, etc. is used to determine the final colour of a screen pixel. Other operations (such as blending and depth testing) can be performed here based on pipeline state.
High Level Stages
For our purposes, we can think of the graphics pipeline as two stages. First, the programmable shader stage:
- Vertex shader: transforms vertices and performs some lighting calculations.
- Geometry shader: works on whole primitives.
High Level Stages
Second, the merging / raster operations stage:
- Colours individual pixels.
- The pixel shader can be used here: per-pixel lighting, texturing, etc.
Understanding the two-stage view is important, but stay aware of the internal operations within each.
Recap: application stage, geometry stage, rasterization stage. Questions?
Programmable Shaders
As mentioned, three stages in the graphics pipeline are programmable:
- Vertex shader: manipulates individual vertices.
- Geometry shader: works with primitives (triangles, lines, etc.).
- Pixel shader: works with pixels.
Vertex Shader
Lets us work with the incoming vertex data: position, colour, normals. We can also calculate values related to these vertices, such as light colour and world position.
Geometry Shader
Lets us work with individual pieces of geometry: lines, triangles, quads. New geometry can be introduced at this stage.
Stream Output
It is possible to capture the output of the geometry stage at this point if required, without rendering to the screen. This is done when working with geometry data you wish to manipulate and then use in another rendering pass; particle effects (e.g. explosions) are performed this way. A sketch follows.
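In OpenGL this is exposed as transform feedback; a minimal sketch, assuming a shader program object, a buffer particleBuf, and a shader output variable outPosition already exist (all three names are hypothetical):

    // Capture the shader's output positions into particleBuf instead of rasterizing
    const GLchar *varyings[] = { "outPosition" };
    glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
    glLinkProgram(program);                    // relink after declaring the varyings

    glEnable(GL_RASTERIZER_DISCARD);           // no rendering to the screen
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, particleBuf);
    glBeginTransformFeedback(GL_POINTS);
    glDrawArrays(GL_POINTS, 0, particleCount); // run the pipeline, capture the output
    glEndTransformFeedback();
    glDisable(GL_RASTERIZER_DISCARD);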
Pixel Shader
Lets us determine colours of individual pixels: per-pixel lighting, texturing, normal (bump) mapping, etc.
Render to Texture
It is also possible to output at this stage to a texture instead of the screen (see the sketch below). This is useful for post-processing techniques and has numerous uses: motion blur, depth of field, shadowing, etc. Benie and Artur’s lighting effect last year used this technique.
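A minimal sketch of render to texture using an OpenGL framebuffer object; the texture tex is assumed to have been created and sized already:

    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    // Attach the texture as the colour target
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        // ... draw the scene here; output lands in tex, not on screen ...
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);  // back to the default framebuffer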
Example Shader
Vertex shader:

    varying vec3 normal, lightDir;
    void main() {
        lightDir = normalize(vec3(gl_LightSource[0].position));
        normal = normalize(gl_NormalMatrix * gl_Normal);
        gl_Position = ftransform();
    }

Pixel shader:

    varying vec3 normal, lightDir;
    void main() {
        float intensity;
        vec3 n;
        vec4 color;
        n = normalize(normal);
        intensity = max(dot(lightDir, n), 0.0);
        if (intensity > 0.98)
            color = vec4(0.8, 0.8, 0.8, 1.0);
        else if (intensity > 0.5)
            color = vec4(0.4, 0.4, 0.8, 1.0);
        // etc.
        gl_FragColor = color;
    }
Effect Files
Shader programs are stored in effect files. DirectX uses a single-file approach; OpenGL allows shaders to be split between files and then linked together. OpenGL and DirectX compile these files at runtime, so do it during the loading process. Values can then be set through API calls, as sketched below.
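A minimal sketch of compiling and linking a GLSL shader pair at load time; the source strings vsSrc and fsSrc are assumed to have been read from the shader files already, and the uniform name is hypothetical:

    // Compile the vertex and fragment shaders
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vsSrc, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fsSrc, NULL);
    glCompileShader(fs);

    // Link them into a single program and make it active
    GLuint program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);
    glUseProgram(program);

    // Set a value from the application (hypothetical uniform name)
    GLint loc = glGetUniformLocation(program, "lightColour");
    glUniform3f(loc, 1.0f, 1.0f, 1.0f);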
Languages
A number of shader languages exist:
- High Level Shader Language (HLSL): DirectX’s language.
- GL Shading Language (GLSL): OpenGL’s language.
- C for graphics (Cg): nVidia’s language. Supported via libraries in both DirectX and OpenGL; can be slower due to the higher-level integration.
Questions?
Summary
We’ve taken a brief trip down the graphics pipeline: application, geometry, rasterization. We’ve also looked at the programmable stages: vertex, geometry, and pixel shaders. We will be using these ideas as we go forward through the module.