Volume Rendering (3) Hardware Texture Mapping Methods.

Texture Mapping
A 2D image applied to a 2D polygon yields a texture-mapped polygon.

Texture Mapping (2)
Each texel is assigned 2D coordinates in the unit square from (0,0) to (1,1). Assign texture coordinates to each polygon vertex to establish the mapping; for example, corner coordinates (0,0), (0.5,0), (0,0.5), (0.5,0.5) map the lower-left quarter of the texture onto the polygon.
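As an illustration (the helper name below is ours, not from the slides), per-vertex texture coordinates determine the coordinate at any interior point of the polygon by bilinear interpolation:

```python
# Minimal sketch (hypothetical names): per-vertex texture coordinates
# establish the mapping from polygon space to texture space.

def interpolate_texcoord(uv00, uv10, uv01, uv11, a, b):
    """Bilinearly interpolate the texture coordinate at parametric
    position (a, b) inside a quad whose corners carry uv00..uv11."""
    u = (1-a)*(1-b)*uv00[0] + a*(1-b)*uv10[0] + (1-a)*b*uv01[0] + a*b*uv11[0]
    v = (1-a)*(1-b)*uv00[1] + a*(1-b)*uv10[1] + (1-a)*b*uv01[1] + a*b*uv11[1]
    return (u, v)

# Map the lower-left quarter of the texture onto a quad:
corners = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]
center = interpolate_texcoord(*corners, 0.5, 0.5)   # -> (0.25, 0.25)
```

The graphics hardware performs exactly this interpolation per fragment during rasterization.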

Texture Mapping for Volume Rendering
Remember ray casting …

Texture-based volume rendering
Render each xz slice in the volume as a texture-mapped polygon. The texture contains RGBA (color and opacity). The polygons are drawn from back to front.

Texture-based volume rendering
Algorithm (using 2D texture mapping hardware):
Turn off the z-buffer; enable blending.
For each slice, from back to front:
- Load the 2D slice of data into texture memory
- Create a polygon corresponding to the slice
- Assign texture coordinates to the four corners of the polygon
- Render and composite the polygon (use OpenGL alpha blending)
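The compositing that the blending hardware performs can be sketched in plain Python (the function name is ours): each slice is blended over the framebuffer with the standard "over" operator, which is what glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) computes per fragment.

```python
# Sketch of back-to-front alpha-blended compositing.
# Slices are (r, g, b, a) tuples ordered back to front.

def composite_back_to_front(slices):
    r = g = b = 0.0                     # framebuffer starts black
    for sr, sg, sb, sa in slices:       # back to front
        r = sa * sr + (1.0 - sa) * r    # "over" operator per channel
        g = sa * sg + (1.0 - sa) * g
        b = sa * sb + (1.0 - sa) * b
    return (r, g, b)

# A fully opaque front slice hides everything behind it:
print(composite_back_to_front([(1, 0, 0, 0.5), (0, 1, 0, 1.0)]))  # -> (0.0, 1.0, 0.0)
```

Drawing back to front is what lets the hardware accumulate this product incrementally without sorting fragments.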

Some Considerations… (2)
What if we change the viewing position? That is fine: we just change the eye position (or rotate the polygons and re-render), until …

Some Considerations… (3)
Until … you are not going to see anything this way, because the view direction is now parallel to the slice planes. What do we do?

Some Considerations… (4)
What do we do? Change the orientation of the slicing planes. Now the slice polygons are parallel to the xz plane in object space, so we need to reorganize the input textures. Reorganizing the textures on the fly is too time-consuming, so we prepare these texture sets beforehand.

Some Considerations… (5)
When do we need to change the slicing planes? When the major component of the view vector changes from z to y.

Some Considerations… (6)
Major component of the view vector? Normalize the view vector (x,y,z) and take the component with the largest magnitude:
If x: the slicing planes are parallel to the yz plane
If y: the slicing planes are parallel to the xz plane
If z: the slicing planes are parallel to the xy plane
This is called the (object-space) axis-aligned method. Therefore, we need to keep three copies of the data around at all times!
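The selection rule above is a few lines of code (the helper name is ours):

```python
# Pick the slicing-plane orientation from the major component
# of the (normalized) view vector.

def slicing_plane(view):
    vx, vy, vz = (abs(c) for c in view)
    if vx >= vy and vx >= vz:
        return "yz"   # view mostly along x -> slice parallel to yz plane
    if vy >= vz:
        return "xz"   # view mostly along y -> slice parallel to xz plane
    return "xy"       # view mostly along z -> slice parallel to xy plane

print(slicing_plane((0.1, 0.2, 0.97)))  # -> xy
```

The absolute values matter: looking down -z must select the same slice stack as looking down +z.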

Problem
The object-space axis-aligned method can create an artifact: the popping effect. There is a sudden change of the slicing direction (and thus the sampling rate) when the view vector transitions from one major direction to another, and the resulting change in image intensity can be quite visible.

Solution
Use image-space axis-aligned slicing planes: the slicing planes are always parallel to the view plane.

Image-Space Axis-Aligned
Arbitrary slicing through the volume and 3D texture mapping capabilities are needed.
Arbitrary slicing can be computed in software in real time; this is basically polygon-volume clipping.
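The polygon-volume clipping step amounts to intersecting each slicing plane with the edges of the volume's bounding box. A sketch, assuming a unit cube and our own function name:

```python
# Intersect the slicing plane n.p = d with the 12 edges of the
# unit volume box; each straddled edge contributes one vertex.

import itertools

def slice_polygon_points(normal, d):
    """Return the points where the plane cuts the unit cube's edges."""
    corners = list(itertools.product((0.0, 1.0), repeat=3))
    edges = [(a, b) for a in corners for b in corners
             if sum(x != y for x, y in zip(a, b)) == 1 and a < b]
    pts = []
    for a, b in edges:
        da = sum(n*c for n, c in zip(normal, a)) - d
        db = sum(n*c for n, c in zip(normal, b)) - d
        if da * db < 0:                          # edge straddles the plane
            t = da / (da - db)
            pts.append(tuple(x + t*(y - x) for x, y in zip(a, b)))
    return pts
```

The returned points are unordered; before rendering, they must still be sorted around their centroid to form a convex slice polygon (between three and six vertices, depending on the plane).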

Image-Space Axis-Aligned (2)
Texture mapping onto the arbitrary slices requires 3D (solid) texture mapping. The input texture is a stack of slices (a volume); depending on the position of the polygon, the appropriate textures are mapped onto it.

Solid (3D) Texture Mapping
Now the input texture space is 3D: texture coordinates go from (r,s) to (r,s,t), spanning the unit cube from (0,0,0) to (1,1,1). Each vertex i of a slice polygon is assigned a 3D texture coordinate (ri,si,ti).
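What the 3D texture unit does per fragment is a trilinear interpolation of the volume at (r,s,t). A sketch with our own helper names, assuming a volume stored as nested lists indexed [z][y][x]:

```python
# Trilinearly interpolate a volume at normalized coordinates (r, s, t),
# mimicking hardware 3D texture sampling (GL_LINEAR).

def sample_3d(volume, r, s, t):
    """volume[z][y][x]; (r, s, t) in [0, 1]^3 map to (x, y, z)."""
    nx, ny, nz = len(volume[0][0]), len(volume[0]), len(volume)
    x, y, z = r * (nx - 1), s * (ny - 1), t * (nz - 1)
    x0, y0, z0 = int(x), int(y), int(z)
    x1, y1, z1 = min(x0 + 1, nx - 1), min(y0 + 1, ny - 1), min(z0 + 1, nz - 1)
    fx, fy, fz = x - x0, y - y0, z - z0

    def lerp(a, b, f):
        return a + f * (b - a)

    # Interpolate along x on the four surrounding edges, then y, then z.
    c00 = lerp(volume[z0][y0][x0], volume[z0][y0][x1], fx)
    c10 = lerp(volume[z0][y1][x0], volume[z0][y1][x1], fx)
    c01 = lerp(volume[z1][y0][x0], volume[z1][y0][x1], fx)
    c11 = lerp(volume[z1][y1][x0], volume[z1][y1][x1], fx)
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)
```

This is exactly the sampling the 2D-texture approach cannot do across slices, which is why it needs three pre-built axis-aligned slice stacks instead.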

Pros and Cons
Advantages:
- Higher quality
- No popping effect
Disadvantages:
- The slicing planes must be recomputed for every view angle
- 3D texture mapping is not supported by most graphics hardware except high-end models (although people are working on it right now; the ATI Radeon card, at $250, is starting to support it)
Neither the 2D nor the 3D hardware texture mapping method can compute shading on the fly; the input textures have to be pre-shaded.