CS 418: EXAM 2 REVIEW. TAs: Ryan Freedman, Sushma Kini, Chi Zhou


LIST of TOPICS
- Vertex Shader
- Image and Color Spaces
- Rasterization
- Texture Mapping and Coordinates
- Visibility

CMY Subtractive Color
- Cyan, Magenta, Yellow
- Color model used for pigments and reflective materials (ink, paint)
- Grade-school color rules say Blue + Yellow = Green? In CMY it is really Cyan + Yellow = Green
- Also CMYK (K = blacK)
  - C + M + Y = Brown? In theory C + M + Y = Black, but in practice it comes out gray, which is why a separate black (K) ink is added
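A minimal sketch (not from the slides) of the complement-style RGB to CMY/CMYK conversion, assuming color components in [0, 1]:

```python
# RGB -> CMY uses the simple complement model C = 1 - R, M = 1 - G, Y = 1 - B.
def rgb_to_cmy(r, g, b):
    return (1 - r, 1 - g, 1 - b)

# RGB -> CMYK pulls the common gray component of C, M, Y into the K channel.
def rgb_to_cmyk(r, g, b):
    c, m, y = rgb_to_cmy(r, g, b)
    k = min(c, m, y)
    if k == 1.0:                    # pure black: avoid dividing by zero
        return (0.0, 0.0, 0.0, 1.0)
    return ((c - k) / (1 - k), (m - k) / (1 - k), (y - k) / (1 - k), k)

print(rgb_to_cmy(0.0, 0.0, 1.0))    # blue -> (1.0, 1.0, 0.0)
print(rgb_to_cmyk(0.0, 0.0, 1.0))   # blue -> (1.0, 1.0, 0.0, 0.0)
```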

Selecting Colors

HSV = Hue, Saturation, Value (Alvy Ray Smith, 1978)
- Hue in [0, 360] is the angle around the color wheel: 0° = red, 60° = yellow, 120° = green, 180° = cyan, 240° = blue, 300° = magenta
- Saturation in [0, 1] is the distance from gray: S = (maxRGB - minRGB) / maxRGB
- Value in [0, 1] is the distance from black: V = maxRGB

Computing hue from RGB, with Δ = maxRGB - minRGB:
- if maxRGB = R: H = (G - B) / Δ
- if maxRGB = G: H = 2 + (B - R) / Δ
- if maxRGB = B: H = 4 + (R - G) / Δ
- H = (60 * H) mod 360

HSL = Hue, Saturation, Lightness
- Double cone, with saturation in the middle

(Diagrams: Eric Pierce)
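A minimal sketch (not from the slides) of the RGB to HSV conversion the slide describes, assuming R, G, B are in [0, 1]:

```python
# RGB -> HSV following the slide's formulas.
def rgb_to_hsv(r, g, b):
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx                                   # Value = distance from black
    s = 0.0 if mx == 0 else (mx - mn) / mx   # Saturation = distance from gray
    if mx == mn:
        return (0.0, s, v)                   # gray: hue is undefined, report 0
    d = mx - mn                              # delta = maxRGB - minRGB
    if mx == r:
        h = (g - b) / d
    elif mx == g:
        h = 2 + (b - r) / d
    else:
        h = 4 + (r - g) / d
    return ((60 * h) % 360, s, v)

print(rgb_to_hsv(0.0, 0.0, 1.0))   # blue -> (240.0, 1.0, 1.0)
```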

Exercise: For each row's named color, enter its corresponding coordinates in each of the column's color spaces and compute its luminance in the last column.

  Color Name | (R, G, B) | (C, M, Y) | (H, S, V) | Y (luminance)
  Blue       | (0, 0, 1) |     ?     |     ?     |      ?

Solution (filled in step by step on the original slides):

  Color Name | (R, G, B) | (C, M, Y) | (H, S, V)   | Y (luminance)
  Blue       | (0, 0, 1) | (1, 1, 0) | (240, 1, 1) | 0.11
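A small sketch that reproduces the Blue row, reusing the hypothetical rgb_to_cmy and rgb_to_hsv helpers sketched above; luminance uses the Rec. 601 weights (Y = 0.299 R + 0.587 G + 0.114 B), which match the 0.11 on the answer slide:

```python
# Luminance with Rec. 601 weights (an assumption; the course may use other weights).
def luminance(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

blue = (0.0, 0.0, 1.0)
print(rgb_to_cmy(*blue))            # (1.0, 1.0, 0.0)
print(rgb_to_hsv(*blue))            # (240.0, 1.0, 1.0)
print(round(luminance(*blue), 2))   # 0.11
```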

Rasterization converts
- lines and triangles
- with floating-point vertices
- in viewport (screen) coordinates
into
- pixels
- with integer coordinates
- in viewport (screen) coordinates
Pixels are centered at grid vertices, not grid cells.

Line Rasterization
- How do we rasterize a line from (0,0) to (4,3)?
- Pixels (0,0) and (4,3) are easy
- One pixel for each integer x-coordinate
- Each pixel's y-coordinate is the one closest to the line
- If the line is equally distant from two pixels, pick one arbitrarily but consistently
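A minimal sketch (not from the slides) of this nearest-pixel rule for a line from the origin with slope between 0 and 1; the midpoint algorithm below computes the same pixels without floating-point arithmetic:

```python
import math

# Naive rasterizer: one pixel per integer x, choosing the y closest to the line.
# Ties (such as x = 2 here) are broken toward the lower pixel, which matches the
# f(M) <= 0 -> E convention used in the midpoint exercise later on.
def rasterize_naive(x1, y1):
    m = y1 / x1
    return [(x, math.ceil(m * x - 0.5)) for x in range(x1 + 1)]

print(rasterize_naive(4, 3))   # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3)]
```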

Midpoint Algorithm
- Which pixel should be plotted next? East (E) or Northeast (NE)?
- Line equation: y = mx + b, where m = (y1 - y0) / (x1 - x0) and b = y0 - m*x0
- Decision function: f(x, y) = mx + b - y
  (f > 0 below the line, f < 0 above it; test the sign of f at the midpoint M between the E and NE candidate pixels)

Midpoint Increments
- For current pixel P = (x, y), the midpoint to test is M = (x+1, y+½), with f(M) = f(P) + m - ½
- If E is chosen, the next midpoint is M_E = (x+2, y+½):
    f(M_E) = m(x+2) + b - (y+½) = f(P) + 2m - ½ = f(M) + m
- If NE is chosen, the next midpoint is M_NE = (x+2, y+1½):
    f(M_NE) = m(x+2) + b - (y+1½) = f(P) + 2m - 1½ = f(M) + m - 1
- If the line starts at the origin, the first midpoint gives f(1, ½) = m + b - ½ = m - ½
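A minimal sketch (not from the slides) of the midpoint algorithm built from these increments, for a line from (0, 0) to (x1, y1) with slope between 0 and 1; the decision value is scaled by 2*x1 so it stays an integer:

```python
# Midpoint line rasterization. f is the slide's decision function scaled by 2*x1:
# the sign is unchanged, but all increments become integers.
def rasterize_midpoint(x1, y1):
    pixels = [(0, 0)]
    x, y = 0, 0
    f = 2 * y1 - x1                 # 2*x1 * f(1, 1/2) = 2*x1 * (m - 1/2)
    while x < x1:
        if f > 0:                   # midpoint below the line: step NE
            y += 1
            f += 2 * (y1 - x1)      # f(M) + m - 1, scaled by 2*x1
        else:                       # f(M) <= 0: midpoint on/above the line, step E
            f += 2 * y1             # f(M) + m, scaled by 2*x1
        x += 1
        pixels.append((x, y))
    return pixels

print(rasterize_midpoint(4, 3))       # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3)]
print(rasterize_midpoint(8, 2)[:2])   # [(0, 0), (1, 0)] -- first step of the exercise below
```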

Exercise: For a line that extends from (0, 0) to (8, 2):
1. Define a function f(x, y) that is negative above the line and positive below the line.
2. Determine f(x, y) at the first midpoint.
3. Determine the point to rasterize (E or NE).
4. Determine the location of the pixel drawn.

Solution:
1. f(x, y) = mx + b - y = (x/4) + 0 - y = (x/4) - y
2. The first midpoint is (1, ½), so f(1, ½) = ¼ - ½ = -¼
3. Because f(M) <= 0, the midpoint is above the line, so rasterize E.
4. E of the current pixel P = (Px, Py) is (Px + 1, Py), so the pixel drawn is (0 + 1, 0) = (1, 0).

The Over Operator
- How do we indicate which parts of the front picture are clear and which are opaque?
- Use an alpha channel to indicate opacity [Smith]
- Over operator [Porter & Duff, SIGGRAPH '84]
- A over B:
    C_AoverB = α_A C_A + (1 - α_A) α_B C_B
    α_AoverB = α_A + (1 - α_A) α_B
- Note that α_A C_A is what appears in the color equations, so store α_A C_A instead of C_A
- A over B with premultiplied alpha:
    C_AoverB = C_A + (1 - α_A) C_B
    α_AoverB = α_A + (1 - α_A) α_B
- A premultiplied pixel is stored as C = (αR, αG, αB, α)
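A minimal sketch (not from the slides) of both forms of the over operator; note that, as the slide's formula is written, compositing straight (non-premultiplied) colors produces a premultiplied result color:

```python
# "A over B" with straight colors: c_a, c_b are (r, g, b) tuples in [0, 1].
# The returned color is the premultiplied color of the composite.
def over(c_a, alpha_a, c_b, alpha_b):
    c = tuple(alpha_a * ca + (1 - alpha_a) * alpha_b * cb for ca, cb in zip(c_a, c_b))
    alpha = alpha_a + (1 - alpha_a) * alpha_b
    return c, alpha

# "A over B" with premultiplied colors (stored as (alpha*R, alpha*G, alpha*B)).
def over_premultiplied(c_a, alpha_a, c_b, alpha_b):
    c = tuple(ca + (1 - alpha_a) * cb for ca, cb in zip(c_a, c_b))
    alpha = alpha_a + (1 - alpha_a) * alpha_b
    return c, alpha

# 50% opaque red over fully opaque blue gives the same result either way:
print(over((1, 0, 0), 0.5, (0, 0, 1), 1.0))                  # ((0.5, 0.0, 0.5), 1.0)
print(over_premultiplied((0.5, 0, 0), 0.5, (0, 0, 1), 1.0))  # ((0.5, 0.0, 0.5), 1.0)
```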

Z-Buffer

Algorithm:
  For each rasterized fragment (x, y):
    if z > zbuffer(x, y):
      framebuffer(x, y) = fragment color
      zbuffer(x, y) = z

Question: For fragment (1, 1), if zbuffer(1, 1) contains the value -3, will a point with z value -2 be displayed?
Answer: Yes, because -2 > -3 (with this convention, a larger z value is closer to the eye).
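A minimal sketch (not from the slides) of the z-buffer test applied to a list of fragments, assuming larger z means closer:

```python
# Z-buffer visibility: keep the fragment with the largest z at each pixel.
# Buffers are dicts keyed by (x, y); the z-buffer starts at minus infinity.
def zbuffer_render(fragments, width, height):
    zbuffer = {(x, y): float("-inf") for x in range(width) for y in range(height)}
    framebuffer = {}
    for (x, y), z, color in fragments:
        if z > zbuffer[(x, y)]:
            zbuffer[(x, y)] = z
            framebuffer[(x, y)] = color
    return framebuffer

# The slide's question: depth -3 already stored at (1, 1), new fragment at z = -2.
frags = [((1, 1), -3, "far point"), ((1, 1), -2, "near point")]
print(zbuffer_render(frags, 2, 2)[(1, 1)])   # "near point" -> the z = -2 point is displayed
```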

Normalized View Volume
- Coordinate pipeline: Model Coords -> World Coords -> Viewing Coords -> Clip Coords -> Screen Coords
- In viewing (eye) coordinates, the view frustum has near-plane corners (left, top, -near) and (right, bottom, -near) and extends back toward (0, 0, -far)
- glFrustum(left, right, bottom, top, near, far) maps this frustum to the normalized view volume: a cube reaching from -1 to 1 in x, y, and z (corner shown at (-1, 1, 1))
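A minimal sketch (not from the slides) of the matrix glFrustum builds, following the standard OpenGL definition of the perspective frustum matrix:

```python
# Perspective (frustum) matrix for glFrustum(l, r, b, t, n, f): maps the eye-space
# frustum to clip space, which becomes the [-1, 1]^3 cube after the homogeneous divide.
def frustum(l, r, b, t, n, f):
    return [
        [2*n/(r-l), 0.0,        (r+l)/(r-l),   0.0],
        [0.0,       2*n/(t-b),  (t+b)/(t-b),   0.0],
        [0.0,       0.0,       -(f+n)/(f-n),  -2*f*n/(f-n)],
        [0.0,       0.0,       -1.0,           0.0],
    ]
```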

Question: Consider a scene. For the eye's view, let V be its viewing matrix, P its perspective distortion matrix, and W its window-to-viewport matrix. Let p = [x, y, z, 1]^T represent the pixel position (x, y) with depth value z. (Assume that multiplication by P automatically performs a homogeneous divide afterward, so that the homogeneous coordinate w = 1 always for the result of all matrix-vector products.)

(a) Let the transformation matrix A map the point p back to world coordinates, such that q = Ap is a homogeneous point in world coordinates that would be rendered onto pixel (x, y) with depth value z. What is the formula for matrix A?

Answer: The forward pipeline applies V, then P, then W (p = W P V q), so the inverse applies the individual inverses in reverse order:
  A = V^-1 P^-1 W^-1
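A small sketch (assuming numpy, and ignoring the homogeneous-divide caveat by treating V, P, and W as plain 4x4 matrices) showing how the inverse is composed:

```python
import numpy as np

# Hypothetical placeholder matrices for illustration; in the exam question they are
# the viewing (V), perspective distortion (P), and window-to-viewport (W) matrices.
V = np.eye(4)
P = np.eye(4)
W = np.eye(4)

# Unprojection: invert the forward order p = W @ P @ V @ q.
A = np.linalg.inv(V) @ np.linalg.inv(P) @ np.linalg.inv(W)

p = np.array([320.0, 240.0, 0.5, 1.0])   # pixel (x, y) with depth value z
q = A @ p                                # world-space point that maps to p
```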

Normalized Perspective Distortion
(Figure: the same mapping as above, taking the view frustum with near-plane corners (left, top, -near) and (right, bottom, -near) and far point (0, 0, -far) to the normalized cube with corner (-1, 1, 1).)

Question: Find the normalized perspective distortion transformation matrix from the data given below.
(Figure: near-plane corners at (2, 1, -1) and (-2, -1, -1), far plane through (0, 0, -3); that is, left = -2, right = 2, bottom = -1, top = 1, near = 1, far = 3.)

Answer:
(The original slide presents the resulting matrix as a figure; a reconstruction is sketched below.)
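A sketch of the answer, reusing the frustum() helper sketched under the normalized view volume slide and assuming the standard glFrustum mapping to the [-1, 1] cube (the course's own matrix may differ in convention):

```python
# Frustum from the figure: left = -2, right = 2, bottom = -1, top = 1, near = 1, far = 3.
M = frustum(-2, 2, -1, 1, 1, 3)
for row in M:
    print(row)
# [0.5, 0.0, 0.0, 0.0]
# [0.0, 1.0, 0.0, 0.0]
# [0.0, 0.0, -2.0, -3.0]
# [0.0, 0.0, -1.0, 0.0]
```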