Week 3 - Monday CS361.

Last time
What did we talk about last time?
Graphics rendering pipeline
Rasterizer stage

Questions?

Project 1

Assignment 1

MonoGame Drawing Primitives

The BasicEffect class
An effect says how things should be rendered on the screen. We can specify this in detail using shader programs. The BasicEffect class gives you the ability to do effects without creating a shader program: less flexibility, but quick and easy. The BasicEffect class has properties for:
World transform
View transform
Projection transform
Texture to be applied
Lighting
Fog
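A minimal sketch of setting those properties from a MonoGame Game class, assuming world, view, and projection matrices and a Texture2D named texture already exist:

BasicEffect effect = new BasicEffect(GraphicsDevice);
effect.World = world;                      // object-to-world transform
effect.View = view;                        // world-to-camera transform
effect.Projection = projection;            // camera-to-clip transform
effect.TextureEnabled = true;              // apply a texture
effect.Texture = texture;
effect.EnableDefaultLighting();            // three directional lights plus ambient
effect.FogEnabled = true;                  // simple linear fog
effect.FogColor = Color.Gray.ToVector3();
effect.FogStart = 10f;
effect.FogEnd = 100f;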

Vertices
Vertices can be stored in many different formats, depending on the data you want to keep. Position is pretty standard. Normals are optional. Color is optional. We will commonly use the VertexPositionColor type to hold vertices with color.
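For comparison, a small illustrative sketch of two of MonoGame's built-in vertex types; the values are placeholders:

// Position and color only
VertexPositionColor colored =
    new VertexPositionColor(new Vector3(0, 1, 0), Color.Red);

// Position, normal, and texture coordinate, for lit, textured geometry
VertexPositionNormalTexture litTextured =
    new VertexPositionNormalTexture(
        new Vector3(0, 1, 0),       // position
        Vector3.UnitZ,              // normal
        new Vector2(0.5f, 0f));     // texture coordinate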

Vertex buffers
The GPU holds vertices in a buffer that can be indexed into. Because it is special-purpose hardware, it has to be accessed in special ways. It seems cumbersome, but we will often create an array of vertices, create an appropriately sized vertex buffer, and then store the vertices into the buffer.

VertexPositionColor[] vertices = new VertexPositionColor[3]
{
    new VertexPositionColor(new Vector3(0, 1, 0), Color.Red),
    new VertexPositionColor(new Vector3(+0.5f, 0, 0), Color.Green),
    new VertexPositionColor(new Vector3(-0.5f, 0, 0), Color.Blue)
};

vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionColor),
    3, BufferUsage.WriteOnly);
vertexBuffer.SetData<VertexPositionColor>(vertices);

Drawing a vertex buffer
In order to draw a vertex buffer, you have to:
Set the basic effect to have the appropriate transformations
Set the vertex buffer on the device as the current one being drawn
Loop over the passes in the basic effect, applying them
Draw the appropriate kind of primitives

effect.World = world;
effect.View = view;
effect.Projection = projection;
effect.VertexColorEnabled = true;

GraphicsDevice.SetVertexBuffer(vertexBuffer);

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);
}

Index buffer
Sometimes a mesh will repeat many vertices. Instead of repeating those vertices, we can give a list of indices into the vertex list.

short[] indices = new short[3] { 0, 1, 2 };

indexBuffer = new IndexBuffer(GraphicsDevice, typeof(short), 3,
    BufferUsage.WriteOnly);
indexBuffer.SetData<short>(indices);

Drawing with an index buffer
Once you have the index buffer, drawing with it is very similar to drawing without it. You simply have to set it on the device and then call DrawIndexedPrimitives() instead of DrawPrimitives() on the device.

basicEffect.World = world;
basicEffect.View = view;
basicEffect.Projection = projection;

GraphicsDevice.SetVertexBuffer(vertexBuffer);
GraphicsDevice.Indices = indexBuffer;

foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 1);
}

Drawing an icosahedron
An icosahedron has 20 faces, but it only has 12 vertices. By using an index buffer, we can use only 12 vertices and 60 indices. Check out the XNA tutorial on RB Whitaker's site for the data: http://rbwhitaker.wikidot.com/index-and-vertex-buffers There are some minor changes needed to make the code work.
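A rough sketch of the buffer setup, with the actual vertex positions and index values omitted (they come from the tutorial linked above):

VertexPositionColor[] vertices = new VertexPositionColor[12]; // one per corner
short[] indices = new short[60];                              // 3 indices x 20 triangles
// ... fill vertices and indices with the data from the tutorial ...

vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionColor),
    vertices.Length, BufferUsage.WriteOnly);
vertexBuffer.SetData<VertexPositionColor>(vertices);

indexBuffer = new IndexBuffer(GraphicsDevice, typeof(short), indices.Length,
    BufferUsage.WriteOnly);
indexBuffer.SetData<short>(indices);

// Then draw 20 triangles as a triangle list:
// GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 20);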

Lists and strips
It is very common to define primitives in terms of lists and strips.
A list gives all the vertex indices for each of the shapes drawn:
2n indices to draw n lines
3n indices to draw n triangles
A strip gives only the needed information to draw a series of connected primitives:
n + 1 indices to draw a connected series of n lines
n + 2 indices to draw a connected series of n triangles
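To make the counts concrete, here is a small illustrative sketch (the four vertices of a quad are assumed to already be in a vertex buffer) indexing the same two triangles both ways:

// Triangle list: 3n = 6 indices for n = 2 triangles
short[] listIndices = { 0, 1, 2,   2, 1, 3 };
// GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 2);

// Triangle strip: n + 2 = 4 indices for the same 2 triangles
short[] stripIndices = { 0, 1, 2, 3 };
// GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleStrip, 0, 0, 2);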

Student Lecture: Programmable Shading

GPU

GPU
GPU stands for graphics processing unit. The term was coined by NVIDIA in 1999 to differentiate the GeForce 256 from chips that did not have hardware vertex processing. Dedicated 3D hardware was just becoming the norm, and many enthusiasts used an add-on board (such as the Voodoo2) in addition to their normal 2D graphics card.

More pipes!
Modern GPUs are generally responsible for the geometry and rasterization stages of the overall rendering pipeline. The following shows the color-coded functional stages inside those stages: red is fully programmable, purple is configurable, and blue is not programmable at all.
Vertex Shader → Geometry Shader → Clipping → Screen Mapping → Triangle Setup → Triangle Traversal → Pixel Shader → Merger

Programmable Shading

Programmable Shaders
You can do all kinds of interesting things with programmable shading, but the technology is still evolving. Modern shader stages such as Shader Model 4.0 and 5.0 (DirectX 10 and 11) use a common-shader core. Strange as it may seem, this means that vertex, pixel, and geometry shading use the same language.

Shading languages
They are generally C-like. There aren't that many:
HLSL: High Level Shading Language, developed by Microsoft and used for Shader Model 1.0 through 5.0
Cg: C for Graphics, developed by NVIDIA and essentially the same as HLSL
GLSL: OpenGL Shading Language, developed for OpenGL and sharing some similarities with the other two
These languages were developed so that you don't have to write assembly to program your graphics card. There are even drag-and-drop applications like NVIDIA's Mental Mill.

Virtual machines
To maximize compatibility across many different graphics cards, shader languages are thought of as targeting a virtual machine with certain capabilities. This VM is assumed to have 4-way SIMD (single-instruction multiple-data) parallelism. Vectors of 4 things are very common in graphics:
Positions: xyzw
Colors: rgba
The vectors are commonly of float values. Swizzling and masking (duplicating or ignoring) vector values are supported (kind of like bitwise operations).
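A short HLSL-flavored fragment (the variable names are made up) illustrating swizzling and write masking:

float4 color = float4(0.2, 0.4, 0.6, 1.0);
float3 rgb  = color.rgb;       // select the first three components
float4 bgra = color.bgra;      // reorder (swizzle) the components
float4 gray = color.rrra;      // duplicate one component
color.xy    = float2(0, 1);    // write mask: only x and y change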

Programming model
A programmable shader stage has two types of inputs:
Uniform inputs that stay constant during a draw call, held in constant registers or constant buffers
Varying inputs, which are different for each vertex or pixel
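A minimal HLSL sketch of the distinction (the names are illustrative):

// Uniform input: the same for every vertex in a draw call,
// set from the application through an effect parameter
float4x4 World;

// Varying input: different for each vertex the shader processes
struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Color    : COLOR0;
};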

Language style
Fast operations: scalar and vector multiplications, additions, and combinations. Well supported (and still relatively fast): reciprocal, square root, trig functions, exponentiation, and log. Standard operators apply: + and *. Other operations come through intrinsic functions that do not require headers or libraries: atan(), dot(), log(). Flow control is done through "normal" if, switch, while, and for (but long loops are unusual).
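An illustrative HLSL fragment (the names and the number of lights are made up) showing intrinsics and ordinary flow control:

static const int NumLights = 4;
float3 LightDirections[NumLights];   // set by the application

float4 AccumulateDiffuse(float3 normal, float4 baseColor)
{
    float total = 0;
    for (int i = 0; i < NumLights; i++)   // "normal" for loop; long loops are rare
    {
        // Intrinsics need no headers or libraries
        total += saturate(dot(normal, LightDirections[i]));
    }
    return baseColor * total;
}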

Where the idea comes from
In 1984, Cook came up with the idea of shade trees, a series of operations used to color a pixel. This example shows the shader language equivalent of a shade tree.

Shaders
There are three shaders you can program:
Vertex shader: useful, but boring; mostly about doing transforms and getting normals
Geometry shader: optional; allows you to create vertices from nowhere in hardware
Pixel shader: where all the color data gets decided on, and also where we'll focus

Example of real shader code
The following, taken from RB Whitaker's Wiki, shows a shader for ambient lighting. We start with the declarations:

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

struct VertexShaderInput
{
    float4 Position : POSITION0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
};

Example of real shader code continued

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return AmbientColor * AmbientIntensity;
}

technique Ambient
{
    pass Pass1
    {
        VertexShader = compile vs_4_0_level_9_1 VertexShaderFunction();
        PixelShader = compile ps_4_0_level_9_1 PixelShaderFunction();
    }
}

The result, applied to a helicopter model.

More advanced shader code
The following, taken from RB Whitaker's Wiki, shows a shader for diffuse lighting:

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float4x4 WorldInverseTranspose;

float3 DiffuseLightDirection = float3(1, 0, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 1.0;

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

More advanced shader code continued

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float4 normal = mul(input.Normal, WorldInverseTranspose);
    float lightIntensity = dot(normal, DiffuseLightDirection);
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return saturate(input.Color + AmbientColor * AmbientIntensity);
}

technique Diffuse
{
    pass Pass1
    {
        VertexShader = compile vs_4_0_level_9_1 VertexShaderFunction();
        PixelShader = compile ps_4_0_level_9_1 PixelShaderFunction();
    }
}

The result, applied to a helicopter model.

Upcoming

Next time…
GPU architecture
Vertex shading
Geometry shading
Pixel shading

Reminders
Keep reading Chapter 3
Keep working on Assignment 1, due this Friday by 11:59pm
Keep working on Project 1, due next Friday, February 10, by 11:59pm