CSCE 552 Spring 2011 Graphics By Jijun Tang.


Mesh

Texture Example

Volume Partitioning – Portals (diagram: nodes connected by portals, traversed outward from the eye; once a portal is invisible and no more visible nodes or portals remain to check, the scene is rendered; legend marks nodes and portals as visible, invisible, or not tested)

BSP

Quadtree

Octree

PVS (diagram: rooms A–E; for the room containing the viewpoint, the potentially visible set is PVS = B, A, D)

Strips, Lists, Fans (diagram: numbered vertices arranged as a triangle strip, a triangle list, a triangle fan, a line list, and a line strip)

Vertex Index Array
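The primitive types and the vertex index array can be illustrated with a minimal Python sketch (function names are illustrative, not from the lecture): an indexed list consumes three indices per triangle, while a strip reuses the previous two indices for each new triangle.

```python
# Sketch: expanding an indexed triangle list and a triangle strip into triangles.
def list_to_triangles(indices):
    # Every three indices form one independent triangle.
    return [tuple(indices[i:i + 3]) for i in range(0, len(indices), 3)]

def strip_to_triangles(indices):
    # Each new index forms a triangle with the previous two; every other
    # triangle swaps two vertices to keep a consistent winding order.
    tris = []
    for i in range(len(indices) - 2):
        a, b, c = indices[i], indices[i + 1], indices[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

print(list_to_triangles([0, 1, 2, 0, 2, 3]))  # [(0, 1, 2), (0, 2, 3)]
print(strip_to_triangles([0, 1, 2, 3]))       # [(0, 1, 2), (2, 1, 3)]
```

The index array is what lets both triangles share vertices 0 and 2: the vertex data is stored once and referenced many times, saving memory and vertex-shader work.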

Fundamentals Frame and Back Buffer Visibility and Depth Buffer Triangles Vertices Coordinate Spaces Textures Shaders Materials

Two Buffers

Visibility and Depth Buffer: the depth buffer holds the depth of each pixel in the scene and is the same size as the back buffer. It stores a depth or "Z" value (hence it is often called the "Z buffer"). Each incoming pixel tests its depth against the existing value: if greater, the new pixel is further away than the existing pixel and therefore hidden by it – rejected. Otherwise it is in front, and therefore visible, so it overwrites the value in the depth buffer and the color in the back buffer. The depth value has no useful units; by convention, nearer means a lower value, with a non-linear mapping from world space.

Object Depth Is Difficult

Basic Depth Buffer Algorithm
for each object in scene, in any order
    for each pixel in the object being rendered
        if the pixel has already been rendered, then
            if the new object pixel is closer than the old one, then
                replace the displayed pixel with the new one
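The loop above can be sketched in a few lines of Python (the object/pixel representation is made up for illustration: each object is a list of (x, y, z, color) pixels, and lower z means nearer, per the convention above).

```python
# Minimal depth-buffer sketch: the nearer pixel wins regardless of draw order.
def render(objects, width, height, far=1.0):
    depth = [[far] * width for _ in range(height)]   # depth buffer, cleared to far
    color = [[None] * width for _ in range(height)]  # back buffer
    for obj in objects:                  # objects in any order
        for x, y, z, c in obj:           # pixels covered by the object
            if z < depth[y][x]:          # nearer than the stored value?
                depth[y][x] = z          # overwrite depth buffer value
                color[y][x] = c          # and back-buffer color
    return color

# Two overlapping pixels: the nearer red pixel (z=0.2) wins either way.
tri_near = [(0, 0, 0.2, "red")]
tri_far = [(0, 0, 0.8, "blue")]
print(render([tri_near, tri_far], 1, 1))  # [['red']]
print(render([tri_far, tri_near], 1, 1))  # [['red']]
```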

Coordinate Spaces
World space: arbitrary global game space
Object space: child of world space; origin at entity's position and orientation; vertex positions and normals stored in this space
Camera space: camera's version of "object" space
Screen (image) space: clip space vertices projected to screen space; actual pixels, ready for rendering
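A deliberately simplified Python sketch of the space chain, using translations only (real pipelines use 4×4 matrices that also encode rotation and projection; the position values are made-up examples):

```python
# Space-chain sketch: object -> world -> camera, translations only.
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

entity_position = (10.0, 0.0, 5.0)    # object-space origin, in world space
camera_position = (0.0, 0.0, -5.0)    # camera origin, in world space

vertex_object = (1.0, 2.0, 0.0)                     # as stored with the mesh
vertex_world = add(vertex_object, entity_position)  # object -> world
vertex_camera = sub(vertex_world, camera_position)  # world -> camera
print(vertex_world)    # (11.0, 2.0, 5.0)
print(vertex_camera)   # (11.0, 2.0, 10.0)
```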

Relationship

Frustum The viewing frustum is the region of space in the modeled world that may appear on the screen Frustum culling is the process of removing objects that lie completely outside the viewing frustum from the rendering process
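One common way to frustum-cull is to test an object's bounding sphere against the frustum planes; a minimal Python sketch (the plane convention is an assumption: each plane is stored as (normal, d) with the normal pointing into the frustum, so a point p is inside when dot(n, p) + d >= 0):

```python
# Frustum-culling sketch: a bounding sphere against a set of frustum planes.
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def sphere_outside_frustum(center, radius, planes):
    # The sphere is completely outside if it lies fully behind any one plane.
    return any(dot(n, center) + d < -radius for n, d in planes)

# Toy "frustum" of just a near plane (z >= 1) and a far plane (z <= 100).
planes = [((0, 0, 1), -1), ((0, 0, -1), 100)]
print(sphere_outside_frustum((0, 0, 0.5), 0.1, planes))   # True: in front of the near plane
print(sphere_outside_frustum((0, 0, 50), 1.0, planes))    # False: inside
```

A real frustum has all six planes (four screen edges plus near and far), but the test per plane is identical.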

Frustum and Frustum Culling

Clip Space Distorted version of camera space Edges of screen make four side planes Near and far planes Needed to control precision of depth buffer Total of six clipping planes Distorted to make a cube in 4D clip space Makes clipping hardware simpler

Triangles will be clipped Clip Space Triangles will be clipped Clip space frustum Camera space visible frustum Eye

Shaders A program run at each vertex or pixel Generates pixel colors or vertex positions Relatively small programs Usually tens or hundreds of instructions Explicit parallelism No direct communication between shaders “Extreme SIMD” programming model Hardware capabilities evolving rapidly

Example

Lighting Components Lighting Environment Multiple Lights Diffuse Material Lighting Normal Maps Pre-computed Radiance Transfer Specular Material Lighting Environment Maps

Lighting and Approaches Processes to determine the amount and direction of light incident on a surface how that light is absorbed, reemitted, and reflected which of those outgoing light rays eventually reach the eye Approaches: Forward tracing: trace every photon from light source Backward tracing: trace a photon backward from the eye Middle-out: compromise and trace the important rays

Components Lighting is in three stages: What light shines on the surface? How does the material interact with light? What part of the result is visible to eye? Real-time rendering merges last two Occurs in vertex and/or pixel shader Many algorithms can be in either

Lighting Environment Answers first question: What light shines on the surface? Standard model is infinitely small lights, each with a position, intensity, and color. Physical model uses inverse square rule: brightness = light brightness / distance²

Practical Ways But this gives huge range of brightnesses Monitors have limited range In practice it looks terrible Most people use inverse distance brightness = light brightness / distance Add min distance to stop over-brightening Except where you want over-brightening! Add max distance to cull lights Reject very dim lights for performance
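The physical rule and the practical fix can be compared in a short Python sketch (function names and clamp values are illustrative):

```python
# Attenuation sketches: physical inverse-square vs the clamped inverse-distance
# model described above (min distance stops over-brightening; max distance culls).
def inverse_square(intensity, distance):
    return intensity / distance ** 2

def practical(intensity, distance, min_dist=1.0, max_dist=50.0):
    if distance > max_dist:              # too dim to matter: cull the light
        return 0.0
    return intensity / max(distance, min_dist)

print(inverse_square(100.0, 0.1))    # ~10000: blows out the display range
print(practical(100.0, 0.1))         # 100.0: clamped by the min distance
print(practical(100.0, 60.0))        # 0.0: beyond max distance, culled
```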

Multiple Lights Environments have tens or hundreds of lights; too slow to consider every one at every pixel, so approximate the less significant ones. Ambient light: attributable to the reflection of light off of surfaces in the environment; a single color added to all lighting; washes contrast out of the scene; acceptable for overcast daylight scenes

Ambient Lights

Hemisphere lighting Three major lights: Sky is light blue Ground is dark green or brown Dot-product normal with “up vector” Blend between the two colors Good for brighter outdoor daylight scenes
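The dot-product blend can be sketched in Python (color and vector values are made-up examples):

```python
# Hemisphere-lighting sketch: blend a sky color and a ground color by how much
# the surface normal points along the up vector, remapped from [-1, 1] to [0, 1].
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def hemisphere(normal, sky, ground, up=(0.0, 1.0, 0.0)):
    t = 0.5 * (dot(normal, up) + 1.0)    # 1 = facing sky, 0 = facing ground
    return tuple(g + t * (s - g) for s, g in zip(sky, ground))

sky = (0.6, 0.7, 1.0)      # light blue
ground = (0.2, 0.3, 0.1)   # dark green or brown
print(hemisphere((0.0, 1.0, 0.0), sky, ground))    # facing straight up: sky color
print(hemisphere((0.0, -1.0, 0.0), sky, ground))   # facing straight down: ground color
```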

Example

Other Multiple Lights Cube map of irradiance Stores incoming light from each direction Look up value that normal points at Can represent any lighting environment Spherical harmonic irradiance Store irradiance cube map in frequency space 10 color values gives at most 6% error Calculation instead of cube-map lookup Mainly for diffuse lighting

Cube Map

Lightmaps Usually store result of lighting calculation But can just store irradiance Lighting still done at each pixel Allows lower-frequency light maps Still high-frequency normal maps Still view-dependent specular

Lightmap Example

Diffuse Material Lighting Light is absorbed and re-emitted Re-emitted in all directions equally So it does not matter where the eye is Same amount of light hits the pupil “Lambert” diffuse model is common Brightness is dot-product between surface normal and incident light vector
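The Lambert model is one line of math; a Python sketch (assuming both vectors are already unit length):

```python
# Lambert diffuse sketch: brightness is the dot product of the unit surface
# normal and the unit vector toward the light, clamped at zero.
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def lambert(normal, to_light):
    return max(0.0, dot(normal, to_light))   # surfaces facing away get no light

print(lambert((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))   # 1.0: light head-on
print(lambert((0.0, 1.0, 0.0), (1.0, 0.0, 0.0)))   # 0.0: grazing
print(lambert((0.0, 1.0, 0.0), (0.0, -1.0, 0.0)))  # 0.0: facing away, clamped
```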

Example

Normal Maps Surface normal vector stored in vertices Changes slowly so that surfaces look smooth Real surfaces are rough Lots of variation in surface normal Would require lots more vertices Normal maps store normal in a texture Look up normal at each pixel Perform lighting calculation in pixel shader

Normal Mapping Example

Specular Material Lighting Light bounces off surface How much light bounced into the eye? Other light did not hit eye – so not visible! Common model is “Blinn” lighting Surface made of “microfacets” They have random orientation With some type of distribution

Example

Specular Material Lighting (2) Light comes from incident light vector …reflects off microfacet …into eye Eye and light vectors fixed for scene So we know microfacet normal required Called “half vector” half vector = (incident + eye)/2 How many have that normal?

Specular Material Lighting (3)
half = (light + eye) / 2
alignment = max(0, dot(half, normal))
brightness = alignment^smoothness
(diagram: normal, half vector, light vector, and eye vector at the surface)

Specular Material Lighting (4) Microfacets distributed around surface normal according to “smoothness” value Dot-product of half-vector and normal Then raise to power of “smoothness” Gives bright spot Where normal=half vector Tails off quicker when material is smoother
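The Blinn formula above as a Python sketch; note that this sketch normalizes the summed vector rather than just halving it, since the average of two unit vectors is generally not unit length (vector values are illustrative):

```python
import math

# Blinn specular sketch: half vector between light and eye directions, dotted
# with the normal, raised to a smoothness power for a tight bright spot.
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def blinn(normal, to_light, to_eye, smoothness):
    half = normalize(tuple(l + e for l, e in zip(to_light, to_eye)))
    alignment = max(0.0, dot(half, normal))
    return alignment ** smoothness   # higher smoothness -> tighter highlight

# Mirror setup: the half vector equals the normal, so the highlight is at full strength.
print(blinn((0.0, 1.0, 0.0), (1.0, 1.0, 0.0), (-1.0, 1.0, 0.0), 32))  # 1.0
```

Raising a misaligned (less than 1) alignment to a higher smoothness power makes the result tail off faster, which is exactly the "tails off quicker when material is smoother" behavior described above.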

Environment Maps Blinn used for slightly rough materials Only models bright lights Light from normal objects is ignored Smooth surfaces can reflect everything No microfacets for smooth surfaces Only care about one source of light The one that reflects to hit the eye

Example

Environment Maps - 2 Put environment picture in cube map Reflect eye vector in surface normal Look up result in cube map Can take normal from normal map Bumpy chrome
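The reflection step is the standard vector identity r = v − 2(v·n)n; a Python sketch (the reflected vector would then index into the cube map; the normal is assumed unit length):

```python
# Reflection sketch: reflect the view vector about the surface normal.
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def reflect(v, n):  # n must be unit length
    d = dot(v, n)
    return tuple(vi - 2.0 * d * ni for vi, ni in zip(v, n))

# Looking straight down at an upward-facing surface: the reflection points back up.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))   # (0.0, 1.0, 0.0)
```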

Environment Maps - 3 Environment map can be static Generic sky + hills + ground Often hard to notice that it’s not correct Very cheap, very effective Or render every frame with real scene Render to cube map sides Selection of scene centre can be tricky Expensive to render scene six times

Update Procedure
public void Update()
{
    Profiler.Start();
    Viewer.RealTime = TotalRealSeconds;
    var elapsedTime = new ElapsedTime();
    elapsedTime.RealSeconds = (float)(TotalRealSeconds - LastTotalRealSeconds);
    elapsedTime.ClockSeconds = Viewer.Simulator.GetElapsedClockSeconds(elapsedTime.RealSeconds);
    LastTotalRealSeconds = TotalRealSeconds;
    try
    {
        Viewer.RenderProcess.ComputeFPS(elapsedTime.RealSeconds);
        Viewer.Simulator.Update(elapsedTime.ClockSeconds);
        Viewer.HandleUserInput(elapsedTime);
        Viewer.HandleMouseMovement();
        UserInput.Handled();
        CurrentFrame.Clear();
        Viewer.PrepareFrame(CurrentFrame, elapsedTime);
    }
    finally
    {
        // Update the loader - it should only copy volatile data (.w file)
        if (Viewer.RealTime - Viewer.LoaderProcess.LastUpdateRealTime > LoaderProcess.UpdatePeriod)
            Viewer.LoaderProcess.StartUpdate();
    }
}

Viewer PrepareFrame
public void PrepareFrame(RenderFrame frame, ElapsedTime elapsedTime)
{
    // Update camera first...
    Camera.Update(elapsedTime);
    // We're now ready to prepare frame for the camera.
    Camera.PrepareFrame(frame, elapsedTime);
    frame.PrepareFrame(elapsedTime);
    SkyDrawer.PrepareFrame(frame, elapsedTime);
    TerrainDrawer.PrepareFrame(frame, elapsedTime);
    SceneryDrawer.PrepareFrame(frame, elapsedTime);
    TrainDrawer.PrepareFrame(frame, elapsedTime);
    RoadCarHandler.PrepareFrame(frame, elapsedTime);
    InfoDisplay.PrepareFrame(frame, elapsedTime);
}

Render Thread (Draw/Update)
protected override void Draw(GameTime gameTime)
{
    if (gameTime.ElapsedRealTime.TotalSeconds > 0.001)
        FrameUpdate(gameTime.TotalRealTime.TotalSeconds);
    if (LoaderSlow)
        Thread.Sleep(10);
    CurrentFrame.Draw(GraphicsDevice);
}

private void FrameUpdate(double totalRealSeconds)
{
    // Wait for updater to finish.
    Viewer.UpdaterProcess.WaitTillFinished();
    // Time to read the keyboard - must be done in XNA Game thread.
    UserInput.Update(Viewer);
    // Swap frames and start the next update (non-threaded updater does the whole update).
    SwapFrames(ref CurrentFrame, ref NextFrame);
    Viewer.UpdaterProcess.StartUpdate(NextFrame, totalRealSeconds);
}

// CurrentFrame.Draw
public void Draw(GraphicsDevice graphicsDevice)
{
    Materials.UpdateShaders(RenderProcess, graphicsDevice);
    DrawSimple(graphicsDevice, logging);
    RenderProcess.Viewer.WindowManager.Draw(graphicsDevice);
}

Hardware Rendering Pipe Input Assembly Vertex Shading Primitive Assembly, Cull, Clip Project, Rasterize Pixel Shading Z, Stencil, Framebuffer Blend Shader Characteristics Shader Languages

Hardware Rendering Pipe Current outline of rendering pipeline Can only be very general Hardware moves at rapid pace Hardware varies significantly in details Functional view only Not representative of performance Many stages move in actual hardware

Input Assembly State changes handled: textures, shaders, blend modes. Streams of input data read: vertex buffers, index buffers, constant data. Combined into primitives

Vertex Shading Vertex data fed to vertex shader Also misc. states and constant data Program run until completion One vertex in, one vertex out Shader cannot see multiple vertices Shader cannot see triangle structure Output stored in vertex cache Output position must be in clip space

Primitive Assembly, Cull, Clip Vertices read from cache Combined to form triangles Cull triangles Frustum cull Back face (clockwise ordering of vertices) Clipping performed on non-culled tris Produces tris that do not go off-screen
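The back-face test can be sketched with the signed area of the screen-space triangle; here negative area (clockwise in a y-up coordinate system) is treated as back-facing, matching the slide's convention (note this is an assumption: with y pointing down, as in many screen-space conventions, the sign flips):

```python
# Back-face cull sketch: the sign of the projected triangle's area gives its winding.
def signed_area(a, b, c):
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))

def is_back_facing(a, b, c):
    return signed_area(a, b, c) < 0   # clockwise winding -> back-facing

print(is_back_facing((0, 0), (1, 0), (0, 1)))   # False: counter-clockwise
print(is_back_facing((0, 0), (0, 1), (1, 0)))   # True: clockwise
```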

Project, Rasterize Vertices projected to screen space Actual pixel coordinates Triangle is rasterized Finds the pixels it actually affects Finds the depth values for those pixels Finds the interpolated attribute data Texture coordinates Anything else held in vertices Feeds results to pixel shader
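The attribute interpolation step is commonly done with barycentric weights; a Python sketch (real hardware additionally does perspective-correct interpolation by dividing attributes by w, which is omitted here):

```python
# Sketch: barycentric weights interpolate per-vertex attributes (depth, UVs)
# at a point inside a screen-space triangle.
def barycentric(p, a, b, c):
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (v[0] - o[0]) * (u[1] - o[1])
    area = cross(a, b, c)
    wa = cross(p, b, c) / area
    wb = cross(p, c, a) / area
    wc = cross(p, a, b) / area
    return wa, wb, wc

# Depth values at the three corners, interpolated at the centroid.
a, b, c = (0.0, 0.0), (3.0, 0.0), (0.0, 3.0)
wa, wb, wc = barycentric((1.0, 1.0), a, b, c)
depth = wa * 0.1 + wb * 0.4 + wc * 0.7
print(round(depth, 3))   # 0.4
```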

Pixel Shading Program run once for each pixel Given interpolated vertex data Can read textures Outputs resulting pixel color May optionally output new depth value May kill pixel Prevents it being rendered

Z, Stencil, Framebuffer Blend Z and stencil tests performed Pixel may be killed by tests If not, new Z and stencil values written If no framebuffer blend Write new pixel color to backbuffer Otherwise, blend existing value with new
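The framebuffer blend stage can be sketched with the common "source over" alpha blend (one of many blend modes; the colors here are made-up examples):

```python
# Blend sketch: combine the incoming pixel color with the existing
# back-buffer value, weighted by the source alpha.
def blend(src, dst, src_alpha):
    return tuple(src_alpha * s + (1.0 - src_alpha) * d for s, d in zip(src, dst))

existing = (0.0, 0.0, 1.0)   # blue already in the back buffer
incoming = (1.0, 0.0, 0.0)   # red, half transparent
print(blend(incoming, existing, 0.5))   # (0.5, 0.0, 0.5)
```

With src_alpha = 1.0 this degenerates to the no-blend case above: the new color simply overwrites the back buffer.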

Shader Characteristics Shaders rely on massive parallelism Breaking parallelism breaks speed Can be thousands of times slower Shaders may be executed in any order So restrictions placed on what shader can do Write to exactly one place No persistent data No communication with other shaders

Shader Languages Many different shader capabilities Early languages looked like assembly Different assembly for each shader version Now have C-like compilers Hides a lot of implementation details Works with multiple versions of hardware Still same fundamental restrictions Don’t break parallelism! Expected to keep evolving rapidly

Conclusions Traverse scene nodes: reject or ignore invisible nodes; draw objects in visible nodes. Vertices transformed to screen space using vertex shader programs: deform the mesh according to animation, make triangles from the vertices, rasterize them into pixels

Conclusions (2) Lighting done by combination Some part vertex shader Some part pixel shader Results in new color for each pixel Reject pixels that are invisible Write or blend to backbuffer