1
Image-Based Rendering using Hardware Accelerated Dynamic Textures
Keith Yerex, Dana Cobzas, Martin Jagersand
2
Motivation
Rendering:
Traditional geometry-based techniques: detailed 3D model + texture; hard to achieve photorealism.
Image-based models: non-geometric model built from images; practically hard to apply.
3
Challenges
Hard to generate detailed 3D models.
Texturing from images requires very precise alignment with the model.
Rendering arbitrary views with IBR requires a dense sampling of the plenoptic function.
IBR techniques don't deal with dynamic scenes.
4
Overview
Training: each input image I_1 ... I_t is decomposed into a warped texture, a sum over a texture basis weighted by texture coefficients y_1 ... y_t, together with motion parameters (R_1, a_1, b_1) ... (R_t, a_t, b_t) and structure P.
Model: structure P + texture basis.
New view: a new pose (R, a, b) together with corresponding texture coefficients produces the rendered view.
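A compact sketch of the rendering model above (illustrative only; the names `mean`, `B`, `y`, and `warp` are my own, and how the texture coefficients are chosen for a new pose is outside this sketch):

```python
import numpy as np

def render(mean, B, y, warp):
    """New view = geometric warp of (mean texture + blended basis textures).
    mean: H x W mean texture, B: k x H x W texture basis, y: k coefficients.
    `warp` is a function that maps the texture to the new pose (R, a, b),
    e.g. by reprojecting the structure P and texture-mapping the triangles."""
    texture = mean + np.tensordot(y, B, axes=1)  # dynamic texture for this view
    return warp(texture)
```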
5
Structure from motion
Input: structure from a set of corresponding points tracked in a set of images.
Assumptions: static scene.
Camera model: perspective (projective), weak perspective, orthographic.
Estimated model: projective, affine, metric (Euclidean).
6
Structure from motion
Tracked features → structure-from-motion algorithm → poses and structure.
7
SFM algorithms
Few images, perspective camera, precise calibration: epipolar constraint, trilinear tensor.
Long motion sequences, affine or perspective structure: factorization methods.
8
Metric structure
Weak perspective camera.
Extension of the Tomasi-Kanade factorization algorithm:
Extract the affine structure.
Find the relation between the affine structure and the camera coordinate frame.
Transform the structure into a metric frame (unit pixel size).
9
Weak perspective projection
N points, normalized with respect to the centroid.
Rank theorem: the registered measurement matrix has rank 3.
Factorization into motion and structure.
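A minimal numpy sketch of the factorization step (not the authors' code; the measurement-matrix layout is an assumption stated in the docstring):

```python
import numpy as np

def factorize(W):
    """Rank-3 factorization of a 2F x N measurement matrix W,
    rows ordered per frame as [u-row of frame 1, v-row of frame 1, ...].
    Returns affine motion M_hat (2F x 3) and affine structure S_hat (3 x N)."""
    # Register: subtract each row's mean, i.e. the per-frame point centroid.
    W_reg = W - W.mean(axis=1, keepdims=True)

    # Rank theorem: the registered measurement matrix is (ideally) rank 3.
    U, s, Vt = np.linalg.svd(W_reg, full_matrices=False)
    U3, s3, Vt3 = U[:, :3], s[:3], Vt[:3, :]

    # Split the singular values between the two factors.
    M_hat = U3 * np.sqrt(s3)              # affine camera / motion matrix
    S_hat = np.sqrt(s3)[:, None] * Vt3    # affine structure
    return M_hat, S_hat
```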
10
Metric constraints
Extract motion parameters:
Eliminate scale.
Compute the direction of the camera axis: k = i × j.
Parameterize the rotation with Euler angles.
Reprojection: model (structure P) + pose x = (r, s, a, b).
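A sketch of one way to apply the metric constraints for a weak perspective camera (my own illustration, not the authors' code; it assumes the interleaved row ordering used above): find a 3×3 matrix Q so that in M_hat·Q each frame's rows i_f, j_f have equal norm and are orthogonal, then read off the per-frame scale and the camera axes, with k = i × j.

```python
import numpy as np

def metric_upgrade(M_hat, S_hat):
    """Upgrade an affine factorization to metric (weak perspective).
    M_hat is 2F x 3 with rows ordered [i_1, j_1, i_2, j_2, ...]."""
    F = M_hat.shape[0] // 2

    def row(a, b):
        # a^T L b written linearly in the 6 unknowns of symmetric L
        return np.array([a[0]*b[0],
                         a[0]*b[1] + a[1]*b[0],
                         a[0]*b[2] + a[2]*b[0],
                         a[1]*b[1],
                         a[1]*b[2] + a[2]*b[1],
                         a[2]*b[2]])

    A, rhs = [], []
    for f in range(F):
        i, j = M_hat[2*f], M_hat[2*f + 1]
        A.append(row(i, i) - row(j, j)); rhs.append(0.0)   # equal scale per frame
        A.append(row(i, j));             rhs.append(0.0)   # orthogonal image axes
    A.append(row(M_hat[0], M_hat[0]));   rhs.append(1.0)   # fix the overall scale
    l = np.linalg.lstsq(np.array(A), np.array(rhs), rcond=None)[0]

    L = np.array([[l[0], l[1], l[2]],
                  [l[1], l[3], l[4]],
                  [l[2], l[4], l[5]]])
    w, V = np.linalg.eigh(L)
    Q = V @ np.diag(np.sqrt(np.clip(w, 1e-12, None)))      # L = Q Q^T

    M = M_hat @ Q                      # metric motion
    S = np.linalg.inv(Q) @ S_hat       # metric structure
    poses = []
    for f in range(F):
        i, j = M[2*f], M[2*f + 1]
        s = (np.linalg.norm(i) + np.linalg.norm(j)) / 2    # weak-perspective scale
        i, j = i / np.linalg.norm(i), j / np.linalg.norm(j)
        k = np.cross(i, j)                                 # camera axis direction
        poses.append((s, np.stack([i, j, k])))             # approx. rotation (re-orthonormalize if needed)
    return M, S, poses
```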
11
Dynamic Textures
Purpose: model image intensity variations due to:
1. Small geometric errors due to tracking
2. Non-planarity of the real surface
3. Non-rigidity of the real object
4. Pose-varying lighting effects
Non-geometric: a mixing of a spatial basis.
12
Spatial Basis Intro
1. A moving sine wave can be modeled with a spatially fixed basis of 2 basis vectors.
2. Small image motion can be modeled with a spatially fixed basis of 6 basis vectors.
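A small numpy check of the first point (illustrative, not from the slides): every translate of a sine wave lies exactly in the span of two fixed basis vectors, since sin(x + d) = cos(d)·sin(x) + sin(d)·cos(x).

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
B = np.stack([np.sin(x), np.cos(x)])       # 2 spatially fixed basis vectors

d = 0.37                                   # arbitrary shift of the wave
moved = np.sin(x + d)                      # the "moving" signal

# Best least-squares blending coefficients for the fixed basis
y, *_ = np.linalg.lstsq(B.T, moved, rcond=None)
print(np.allclose(y, [np.cos(d), np.sin(d)]))   # True
print(np.allclose(B.T @ y, moved))              # exact reconstruction
```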
13
Image Variability
Formally, consider the residual variation in an image stabilization problem.
Optic-flow-type constraint: to first order, the residual intensity change is the image gradient times the residual motion, ΔI(x) ≈ ∇I(x)·Δx.
14
Structural Image Variability
Affine warp function.
Corresponding image variability: derivatives of the warped image with respect to the warp parameters.
Discretized for images.
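A sketch of the discretized variability basis for a 6-parameter affine warp (my own illustration; `I` is a hypothetical grayscale image array): the derivative images I_x, I_y modulated by 1, x, and y give six spatial basis vectors.

```python
import numpy as np

def affine_variability_basis(I):
    """6 basis images spanning the first-order intensity change of image I
    under a small affine warp (two translations plus four linear terms)."""
    Iy, Ix = np.gradient(I.astype(float))        # image derivatives (rows, cols)
    h, w = I.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)

    basis = np.stack([Ix,        # translation in x
                      Iy,        # translation in y
                      x * Ix,    # d/d a11
                      y * Ix,    # d/d a12
                      x * Iy,    # d/d a21
                      y * Iy])   # d/d a22
    return basis.reshape(6, -1)  # one flattened basis vector per row
```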
15
Composite Image Variability
Similarly, one can show that the composite image variability can be modeled as a sum of basis terms: structure, depth, non-planarity, lighting, and residual error.
16
Example: lighting variation
17
Statistical Image Variability
In practice, image variability is hard to compute from one image.
Instead, we use PCA to estimate image variability from a large sequence of images.
This yields a transformed basis.
Can estimate a linear model J.
In practice: Delaunay triangulation and a bi-linear model.
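A minimal PCA sketch (assuming a hypothetical `frames` array of stabilized, warped-to-reference texture images of shape F × H × W), estimating the basis statistically rather than from derivatives:

```python
import numpy as np

def texture_basis(frames, k=6):
    """PCA texture basis from a sequence of stabilized texture images.
    frames: F x H x W array.  Returns the mean texture, k basis images,
    and the per-frame blending coefficients."""
    F, H, W = frames.shape
    X = frames.reshape(F, -1).astype(float)

    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)

    B = Vt[:k]                    # k spatial basis vectors (rows)
    Y = U[:, :k] * s[:k]          # coefficients: (X - mean) ≈ Y @ B
    return mean.reshape(H, W), B.reshape(k, H, W), Y
```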
18
Image variability comparison
Derivatives from one picture vs. statistically estimated variability.
19
Implementation
Matlab for geometric modeling and prototyping.
mexVision for tracking (30 Hz frame rate).
Hardware-accelerated OpenGL for rendering (2.8 Hz in software, 18 Hz on a GeForce 2).
pthreads and PVM for parallel processing.
20
Hardware rendering
Unsigned basis: graphics-hardware textures are unsigned 8-bit, so each signed basis image is scaled and offset into the 8-bit range, and the scaling is compensated for when the textures are blended.
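A sketch of one way to pack the signed basis into unsigned 8-bit textures (the function names and the exact per-basis scale/offset convention are my assumptions, not necessarily the authors'): each basis image is mapped into [0, 255], and the scale and offset are undone when the blended textures are combined.

```python
import numpy as np

def to_uint8(B):
    """Map each signed basis image in B (k x H x W) to unsigned 8-bit,
    returning the textures plus per-basis scale/offset to undo the mapping."""
    lo = B.min(axis=(1, 2), keepdims=True)
    hi = B.max(axis=(1, 2), keepdims=True)
    scale = np.maximum(hi - lo, 1e-12) / 255.0
    tex = np.round((B - lo) / scale).astype(np.uint8)    # uploadable textures
    return tex, scale, lo

def blend(mean, tex, scale, lo, y):
    """Reconstruct mean + sum_i y_i * B_i from the 8-bit textures,
    i.e. what the hardware blending passes accumulate."""
    B = tex.astype(float) * scale + lo                   # dequantize
    return mean + np.tensordot(y, B, axes=1)
```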
21
OpenGL
22
Example Renderings
23
Kinematic arm
24
Geometric errors: static / dynamic
25
Geometric errors: static / dynamic
26
Geometric errors: static texturing vs. dynamic texturing
27
Pixel error
                   Vertical jitter   Horizontal jitter
Static texture          1.15              0.98
Dynamic texture         0.52              0.71
28
Conclusions
Coarse geometry is tractable to estimate.
Errors from small geometric misalignments are compensated for using dynamic textures.
The system runs on a consumer PC with a web cam and a consumer graphics card.
Applications:
Inserting characters/objects into games.
Video phone.