Week 8 - Friday

 What did we talk about last time?  Radiometry  Photometry  Colorimetry  Lighting with shader code  Ambient  Directional (diffuse and specular)  Point

 Adding a specular component to the diffuse shader requires incorporating the view vector  It will be included in the shader file and be set as a parameter in the C# code

 The camera location is added to the declarations  As are specular color, specular intensity, and shininess parameters

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;
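 On the C# side, these parameters are filled in before drawing. A minimal sketch, assuming the effect has already been loaded and that cameraPosition holds the camera's world position (the variable names and chosen values here are illustrative; the parameter names match the declarations above):

// Hypothetical sketch: setting the specular-related effect parameters from C#
effect.Parameters["Camera"].SetValue(cameraPosition);                 // float3 camera position
effect.Parameters["Shininess"].SetValue(20.0f);                       // specular exponent
effect.Parameters["SpecularColor"].SetValue(new Vector4(1, 1, 1, 1));
effect.Parameters["SpecularIntensity"].SetValue(0.5f);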

 The output adds a normal so that the reflection vector can be computed in the pixel shader  A world position lets us compute the view vector to the camera

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};

 The same computations as the diffuse shader, but we store the normal and the transformed world position in the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    output.Normal = normal;
    return output;
}

 Here we finally have a real computation because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector  The technique is the same

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);

    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity * SpecularColor *
        max(pow(dotProduct, Shininess), 0) * length(input.Color);

    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}

 Point lights model omni lights at a specific position  They generally attenuate (get dimmer) with distance and have a maximum range  The classic DirectX lighting model uses constant, linear, and quadratic attenuation terms  In a shader you can implement whatever attenuation falloff you want (a sketch follows below)  Point lights are more computationally expensive than directional lights because a light vector has to be computed for every pixel  It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
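 A minimal HLSL sketch of that three-term attenuation, for illustration only; the shader below uses a simpler radius-based falloff, and these coefficient values are arbitrary:

// Hypothetical sketch: classic constant/linear/quadratic point light attenuation
float ComputeAttenuation(float distanceToLight)
{
    float constantTerm  = 1.0f;   // keeps the result from exceeding the base intensity
    float linearTerm    = 0.05f;  // dimming proportional to distance
    float quadraticTerm = 0.01f;  // dimming proportional to distance squared
    return 1.0f / (constantTerm + linearTerm * distanceToLight
                   + quadraticTerm * distanceToLight * distanceToLight);
}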

 We add a light position and a light radius to the declarations

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;

float LightRadius = 50;

float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;
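 On the C# side the new values are supplied like the other parameters; a minimal sketch (the position and radius values are illustrative):

// Hypothetical sketch: setting the point light parameters from C#
effect.Parameters["LightPosition"].SetValue(new Vector3(10.0f, 20.0f, 10.0f)); // world-space light position
effect.Parameters["LightRadius"].SetValue(50.0f);                              // distance at which the light fades out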

 We no longer need a color in the output  We do need to be able to compute the vector from the fragment to the camera  So we keep the world position of the fragment

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};

 We compute the normal and the world position

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}

 Lots of junk in here: we compute the distance-based attenuation, then the diffuse term, then the specular term

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;

    // Attenuation: full intensity at the light, fading to zero at LightRadius
    float attenuation = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after taking the length

    float3 view = normalize(Camera - (float3)input.WorldPosition);

    float diffuseFactor = dot(normal, lightDirection);
    float4 diffuse = saturate(DiffuseColor * DiffuseIntensity * diffuseFactor) * attenuation;

    float3 reflect = normalize(2 * diffuseFactor * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = attenuation * (8 + Shininess) / (8 * PI) * SpecularIntensity * SpecularColor *
        max(pow(dotProduct, Shininess), 0) * length(diffuse);

    return saturate(diffuse + AmbientColor * AmbientIntensity + specular);
}

 The bidirectional reflectance distribution function (BRDF) describes the ratio of outgoing radiance to incoming irradiance  This function changes based on:  Wavelength  Angle of light to surface  Angle of viewer from surface  For point or directional lights, we do not need differentials and can write the BRDF as f(l, v) = Lo(v) / (EL · cos θi), where Lo(v) is the radiance leaving the surface toward the viewer, EL is the irradiance arriving from the light, and θi is the angle between the light direction and the surface normal

 We've been talking about lighting models  Lambertian, specular, etc.  A BRDF is an attempt to model the physics slightly better  A big difference is that different wavelengths are absorbed and reflected differently by different materials  Rendering models in real time with (more) accurate BRDFs is still an open research problem

 The example renderings on the slide also use global lighting (shadows and reflections)

 The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction)  We can see the similarity to the lighting equation from Chapter 5, now with a BRDF: Lo(v) = Σk f(lk, v) ⊗ ELk · cos θik, summed over the n light sources, where ⊗ is componentwise multiplication

 If the subsurface scattering effects are great, the size of the pixel may matter  Then, a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed  Or if the surface characteristics change in different areas, you need a spatially varying BRDF  And so on…

 Helmholtz reciprocity:  f(l,v) = f(v,l)  Conservation of energy:  Outgoing energy cannot be greater than incoming energy  The simplest BRDF is Lambertian shading  We assume that energy is scattered equally in all directions  Integrating over the hemisphere gives a factor of π  Dividing by π gives us exactly what we saw before: f(l, v) = cdiff / π
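 For reference, the π factor comes from integrating the cosine term over the hemisphere (standard radiometry, not shown in the transcript):

∫Ω cos θi dωi = ∫0→2π ∫0→π/2 cos θ sin θ dθ dφ = 2π · ½ = π

so a constant BRDF larger than cdiff / π (with cdiff ≤ 1) would reflect more energy than arrives.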

 We'll start with our specular shader for directional light and add textures to it

 The texture for the ship is shown on the slide (image not reproduced in this transcript)

 We add a texture variable called ModelTexture  We also add a SamplerState structure that specifies how to filter and address the texture

Texture2D ModelTexture;

SamplerState ModelTextureSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};

 We add a texture coordinate to the input and the output of the vertex shader

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
    float2 Texture : TEXCOORD;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD;
};

 Almost nothing changes here except that we copy the input texture coordinate into the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}

 We have to pull the color from the texture and set its alpha to 1  Then scale the components of the color by the texture color

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);

    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity * SpecularColor *
        max(pow(dotProduct, Shininess), 0) * length(input.Color);

    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;

    return saturate(textureColor * input.Color + AmbientColor * AmbientIntensity + specular);
}

 To use a texture, we naturally have to load a texture  We have to set the texture, but as a resource, not as a value

shipTexture = Content.Load<Texture2D>("ShipTexture");

effect.Parameters["ModelTexture"].SetResource(shipTexture);

 It's easiest to do bump mapping in SharpDX using a normal map  Of course, a normal map is hard to create by hand  What's more common is to create a height map and then use a tool for creating a normal map from it  xNormal is a free utility to do this 

 The conversion from a grayscale height map to a normal map (illustrated on the slide) takes the height differences between neighboring pixels and turns them into a surface normal stored as an RGB color; a rough sketch of the idea follows
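 A minimal sketch of what such a conversion tool does, assuming the height map has already been read into a 2D float array with values in [0, 1] (the class and method names, the strength parameter, and the array layout are assumptions, not part of any particular tool):

using System;

static class NormalMapSketch
{
    // Hypothetical sketch: central differences over a height map produce a
    // tangent-space normal per pixel, packed from [-1, 1] into [0, 1] RGB.
    public static float[,,] HeightToNormals(float[,] height, float strength)
    {
        int w = height.GetLength(0), h = height.GetLength(1);
        float[,,] normals = new float[w, h, 3];
        for (int x = 0; x < w; x++)
        {
            for (int y = 0; y < h; y++)
            {
                // Clamp at the borders so every pixel has neighbors
                float left  = height[Math.Max(x - 1, 0), y];
                float right = height[Math.Min(x + 1, w - 1), y];
                float down  = height[x, Math.Max(y - 1, 0)];
                float up    = height[x, Math.Min(y + 1, h - 1)];

                // Slopes in x and y, scaled by a user-chosen strength
                float dx = (left - right) * strength;
                float dy = (down - up) * strength;

                // The unnormalized normal is (dx, dy, 1); normalize it
                float len = (float)Math.Sqrt(dx * dx + dy * dy + 1.0f);

                // Pack each component from [-1, 1] into [0, 1] for storage as a color
                normals[x, y, 0] = (dx / len) * 0.5f + 0.5f;
                normals[x, y, 1] = (dy / len) * 0.5f + 0.5f;
                normals[x, y, 2] = (1.0f / len) * 0.5f + 0.5f;
            }
        }
        return normals;
    }
}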

 We have a normal to a surface, but there are also tangent directions  We call these the tangent and the binormal  Apparently serious mathematicians think it should be called the bitangent  The binormal is tangent to the surface and orthogonal to the other tangent  We distort the normal with weighted sums of the tangent and binormal (the weights are what is stored in our normal map); a shader sketch follows below
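 A minimal HLSL sketch of that perturbation (this is not the course shader; the NormalMap texture, its sampler, and the assumption that the tangent and binormal arrive as vertex attributes are all hypothetical here):

// Hypothetical sketch: perturb the interpolated surface normal with a
// tangent-space normal map sample.
Texture2D NormalMap;

SamplerState NormalMapSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Wrap;
    AddressV = Wrap;
};

float3 BumpedNormal(float3 normal, float3 tangent, float3 binormal, float2 uv)
{
    // Unpack the stored value from [0, 1] back to [-1, 1]
    float3 bump = NormalMap.Sample(NormalMapSampler, uv).rgb * 2.0f - 1.0f;

    // Weighted sum of the surface basis vectors (the TBN frame)
    float3 bumped = bump.x * normalize(tangent)
                  + bump.y * normalize(binormal)
                  + bump.z * normalize(normal);
    return normalize(bumped);
}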

 Choosing BRDFs  Implementing BRDFs  Image-based approaches to sampling BRDFs

 Finish reading Chapter 7  Finish Project 2  Due tonight by midnight