Computer Graphics: Programming, Problem Solving, and Visual Communication Steve Cunningham California State University Stanislaus and Grinnell College PowerPoint Instructor’s Resource

The Rendering Pipeline How the OpenGL system creates the image from your modeling

A Different Perspective on Creating Images Up to this point we have focused on the content of images, without much thought about how they are created. Rendering is the process of taking the descriptions you have provided and setting the appropriate pixels to the appropriate colors to create the actual image.

The Rendering Pipeline Rendering is accomplished by starting with your modeling and applying a sequence of operations that become more and more detailed: a pipeline. This chapter is really about the steps in that pipeline: what they are and how they are done.

The Rendering Pipeline (2) The basic steps in the pipeline, shown in the slide's diagram, take your modeling through world space, eye space, and screen space, and then rasterize the result into fragments and pixels. You will recognize many of the pieces from earlier chapters.

The Rendering Pipeline (3) The steps can happen in different ways: –Software-only –On a graphics card. OpenGL only specifies the process, not the implementation.

Two Parts to the Pipeline The first part of the pipeline is the Geometry Pipeline. This works on the geometry you define and takes it to screen space. The second part of the pipeline is the Rendering Pipeline. This takes the basic geometry in screen space and actually sets all pixels.

Model Space The actual definition of your graphics objects happens in model space, when you define –The vertices of your object - glVertex(…) –The way these vertices are grouped - glBegin(…) … glEnd()
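As a concrete illustration (not taken from the text), here is a minimal immediate-mode sketch of a single triangle defined in model space; the coordinates and colors are arbitrary.

/* Minimal sketch: three model-space vertices grouped into one triangle. */
#include <GL/gl.h>

void drawTriangle(void)
{
    glBegin(GL_TRIANGLES);              /* grouping starts here           */
        glColor3f(1.0f, 0.0f, 0.0f);    /* per-vertex color (arbitrary)   */
        glVertex3f(-1.0f, -1.0f, 0.0f); /* vertex in model-space units    */
        glColor3f(0.0f, 1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
        glColor3f(0.0f, 0.0f, 1.0f);
        glVertex3f( 0.0f,  1.0f, 0.0f);
    glEnd();                            /* grouping ends here             */
}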

Model Space to World Space The vertices you defined are transformed through the modeling transformation that is currently active, and the results are vertices in world space. Grouping information is passed along. A light position is also affected if it is defined within your modeling.
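A short sketch of how the currently active modeling transformation might be set before the geometry is issued; drawTriangle is the illustrative routine from the previous sketch, and the translation and rotation values are arbitrary.

#include <GL/gl.h>

void drawTriangle(void);   /* from the previous sketch */

void placeTriangleInWorld(void)
{
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
        glTranslatef(2.0f, 0.0f, 0.0f);      /* move the model to the right      */
        glRotatef(45.0f, 0.0f, 0.0f, 1.0f);  /* rotate it about the z axis       */
        drawTriangle();                      /* vertices transformed as issued   */
    glPopMatrix();                           /* restore the previous transform   */
}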

World Space to 3D Eye Space The viewing transformation is applied to all points in world space in order to transform them into 3D eye space. In OpenGL, the modeling and viewing transformations are combined into the modelview transformation, and this is what is really applied. Grouping is passed along.
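A sketch of how the viewing part of the modelview transformation might be set at the start of a display function, assuming GLU is available; the eye position and look-at point are arbitrary.

#include <GL/gl.h>
#include <GL/glu.h>

void setViewing(void)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,    /* eye position in world space */
              0.0, 0.0, 0.0,    /* point the eye looks toward  */
              0.0, 1.0, 0.0);   /* up direction                */
    /* modeling transformations and geometry are issued after this,
       so OpenGL applies one combined modelview transformation     */
}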

3D Eye Space to Screen Space This is performed by the projection transformation. Much more than geometry is done, however! –The glColor statement or lighting model gives the point a color –The z-value in eye space is used to compute a depth –Clipping on the view volume is performed so only visible geometry is preserved –Grouping is passed along
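A sketch of setting the projection transformation that defines the view volume used for clipping; the field of view and clipping planes are arbitrary, and the width and height parameters stand for the window size.

#include <GL/gl.h>
#include <GL/glu.h>

void setProjection(int width, int height)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0,                            /* vertical field of view, degrees */
                   (double)width / (double)height,  /* aspect ratio                    */
                   1.0, 100.0);                     /* near and far clipping planes    */
    glMatrixMode(GL_MODELVIEW);                     /* switch back before geometry     */
}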

A Point in Screen Space A point in screen space corresponds to a pixel, but it also has a number of properties that are needed for rendering –Position –Depth –Color - RGB[A] –Texture coordinates –Normal vector
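One way to picture such a point is as a record of those properties; the struct below is purely illustrative, and its name and fields are not part of OpenGL.

/* Illustrative record of a screen-space point and its rendering properties. */
typedef struct {
    int   x, y;            /* pixel position in screen space        */
    float depth;           /* depth value used for hidden surfaces  */
    float rgba[4];         /* color: red, green, blue, alpha        */
    float texCoord[2];     /* texture coordinates (s, t)            */
    float normal[3];       /* normal vector for lighting            */
} ScreenVertex;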

Rendering To begin the rendering process, we have “pixels with properties” for each vertex of the geometry. The first step is to proceed from vertices to edges by computing the pixels that bound the graphics objects. The edges are determined by the grouping you defined as part of your modeling.

Computing Edges Edges are computed by interpolation –Geometric interpolation, such as the Bresenham algorithm, is used to compute the coordinates of each pixel in the edge –There are rules about edge computation that keep any pixel from being included in two different edges and that exclude horizontal edges
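For reference, a standard integer Bresenham line sketch in C; setEdgePixel is a hypothetical callback that records each pixel belonging to the edge, and the edge-sharing rules mentioned above are not implemented here.

#include <stdlib.h>

/* Step from (x0, y0) to (x1, y1), reporting each pixel on the line. */
void bresenhamEdge(int x0, int y0, int x1, int y1,
                   void (*setEdgePixel)(int x, int y))
{
    int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
    int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
    int err = dx + dy;                       /* integer error term        */

    for (;;) {
        setEdgePixel(x0, y0);                /* this pixel is on the edge */
        if (x0 == x1 && y0 == y1) break;     /* reached the far endpoint  */
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
    }
}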

Computing Edges (2) The interpolation is deeper than pixels –The geometry interpolation is extended to calculate the color, depth, texture coordinates, and normal for each edge pixel –If the projection uses perspective, perspective correction also needs to be applied when interpolating depth and texture coordinates
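A sketch of perspective-correct interpolation of a single attribute along an edge, assuming the clip-space w value is available at each endpoint; without the division by w, the attribute would be interpolated incorrectly under a perspective projection.

/* Interpolate one attribute (for example a texture coordinate) at
   parameter t in [0, 1] between two endpoints, dividing by w so the
   result is correct under perspective. */
float interpolatePerspective(float t,
                             float attr0, float w0,   /* value and w at one end   */
                             float attr1, float w1)   /* value and w at the other */
{
    float numerator   = (1.0f - t) * (attr0 / w0) + t * (attr1 / w1);
    float denominator = (1.0f - t) * (1.0f  / w0) + t * (1.0f  / w1);
    return numerator / denominator;
}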

Result of Edge Computation The result of the edge computation is a set of edges for each graphical object. Because OpenGL only works with convex objects, and because of the rules about including pixels in edges, for any horizontal line of pixels there are either zero or two edges that meet this line.

Fragments If there are exactly two edges that meet a horizontal line of pixels, we need to determine the color of all pixels between the two edge pixels on the line. This set of pixels is called a fragment.

Fragments (2) To determine the color of each pixel,
–Interpolate from left to right on the line
–For each pixel,
 Calculate the depth
 Calculate the color (interpolate the color or use the texture)
 If depth testing, check the depth in the depth buffer
 If masking, check against the mask
 If the pixel passes the depth and mask tests
 –Perform any blending needed
 –Write the new color and depth to the color and depth buffers
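The loop above might be sketched in C roughly as follows; the buffer sizes are arbitrary, the interpolation is linear rather than perspective-correct, and masking and blending are left out to keep the sketch short.

#define WIDTH  640
#define HEIGHT 480

static float depthBuffer[HEIGHT][WIDTH];        /* illustrative depth buffer */
static float colorBuffer[HEIGHT][WIDTH][3];     /* illustrative RGB buffer   */

/* Fill the fragment on scan line y between the two edge pixels. */
void fillSpan(int y, int xLeft, int xRight,
              float zLeft, float zRight,
              const float rgbLeft[3], const float rgbRight[3])
{
    for (int x = xLeft; x <= xRight; ++x) {
        float t = (xRight == xLeft) ? 0.0f
                  : (float)(x - xLeft) / (float)(xRight - xLeft);
        float depth = (1.0f - t) * zLeft + t * zRight;   /* calculate the depth */

        if (depth < depthBuffer[y][x]) {                 /* depth test passes   */
            for (int c = 0; c < 3; ++c)                  /* calculate the color */
                colorBuffer[y][x][c] = (1.0f - t) * rgbLeft[c] + t * rgbRight[c];
            depthBuffer[y][x] = depth;                   /* update depth buffer */
        }
    }
}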

Some OpenGL Details The overall OpenGL system model

Some OpenGL Details (2) Processing for texture maps

Some OpenGL Details (3) Detail of fragment processing

Programmable Shaders The OpenGL system model shown here uses a fixed-function pipeline where all the operations are already defined. This is being expanded to a pipeline that lets you define some of the functions yourself by adding programs, called shaders. These shaders can be applied at a few specific places in the pipeline.
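As a sketch of how such programs reach the pipeline (assuming an OpenGL 2.0-capable system with the shader entry points available, for example through an extension loader), the source text of each shader is compiled and linked into a program object; error checking is omitted here.

#include <GL/gl.h>

/* Sketch: compile a vertex and a fragment shader from source strings and
   make the resulting program current.  No error checking is shown. */
GLuint installShaders(const char *vertexSource, const char *fragmentSource)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertexSource, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragmentSource, NULL);
    glCompileShader(fs);

    GLuint program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    glUseProgram(program);   /* these shaders now replace the fixed functions */
    return program;
}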

Three Programmable Stages

Geometry Shaders Geometry shaders work as the primitives (vertices plus groupings) are defined. They will allow you to extend the original geometry with additional vertices or groups.

Vertex Shaders Vertex shaders let you manipulate the individual vertices before they are passed to the rasterization stages
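As an illustration only, a minimal vertex shader in the older GLSL style, written here as a C string so it could be handed to the loader sketched earlier; it simply transforms each vertex and passes its color through.

/* Minimal GLSL vertex shader, stored as a C string (illustrative only). */
static const char *passThroughVertexShader =
    "void main(void)                                                \n"
    "{                                                              \n"
    "    gl_FrontColor = gl_Color;                                  \n"
    "    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;  \n"
    "}                                                              \n";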

Fragment Shaders Fragment shaders let you manipulate individual pixels, applying new algorithms to color them or to select which ones are drawn
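A matching minimal fragment shader, again illustrative only: it takes the interpolated color and darkens it slightly, just to mark where a per-pixel algorithm would go.

/* Minimal GLSL fragment shader, stored as a C string (illustrative only). */
static const char *darkenFragmentShader =
    "void main(void)                                           \n"
    "{                                                         \n"
    "    gl_FragColor = vec4(0.8 * gl_Color.rgb, gl_Color.a);  \n"
    "}                                                         \n";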

Shaders Are New… … and are beyond the scope of a beginning graphics course at this time. If you have a shader-capable graphics card and a shader-capable OpenGL system, it will be interesting for you to experiment with them once your OpenGL skills are solid. We suggest that you use Mike Bailey’s glman system as a learning tool.