Appearance modeling: textures and IBR Class 17

3D photography course schedule
Aug. 24, 26: Introduction / (no course)
Aug. 31, Sep. 2: (no course)
Sep. 7, 9: (no course)
Sep. 14, 16: Projective Geometry / Camera Model and Calibration (assignment 1)
Sep. 21, 23: Camera Calib. and SVM / Feature matching (assignment 2)
Sep. 28, 30: Feature tracking / Epipolar geometry (assignment 3)
Oct. 5, 7: Computing F / Triangulation and MVG
Oct. 12, 14: (university day) / (fall break)
Oct. 19, 21: Stereo / Active ranging
Oct. 26, 28: Structure from motion / SfM and Self-calibration
Nov. 2, 4: Shape-from-silhouettes / Space carving
Nov. 9, 11: 3D modeling / Appearance Modeling
Nov. 12: papers (2-3pm, SN115)
Nov. 16, 18: (VMV'04)
Nov. 23, 25: papers & discussion / (Thanksgiving)
Nov. 30, Dec. 2: papers & discussion / papers and discussion
Dec. 3: papers (2-3pm, SN115)
Dec. 7 (?): Project presentations

Papers
Li: Exact Voxel Occupancy with Graph Cuts
Sudipta: Stereo without epipolar lines
Chris: A graph cut based adaptive structured light approach for real-time range acquisition
Nathan: Space-time faces
Brian: Depth-from-focus …
Chad: Interactive Modeling from Dense Color and Sparse Depth
Seon Joo: Outdoor calibration of active cameras
Jason: Spectral partitioning
Sriram: Linear multi-view reconstruction
Christine: 3D photography using dual …

Projects
Chris: Wide-area display reconstruction
Nathan: Structured light
Brian: Depth-from-focus/defocus
Li: Visual hulls with occlusions
Chad: Laser scanner for 3D environments
Seon Joo: Collaborative 3D tracking
Jason: SfM for long sequences
Sudipta: Combining exact silhouettes and photoconsistency
Sriram: Panoramic camera self-calibration
Christine: Desktop lamp scanner

Volumetric 3D integration: multiple depth images are fused by volumetric integration.

Appearance Modeling
Texturing: single image, multiple images.
Image-based rendering: (unstructured) lightfield rendering, surface lightfields.

Texture-mapping a 3D model: the relative pose between the camera and the 3D model needs to be estimated.

Texture Mapping: conventional texture mapping with texture coordinates, or projective texture mapping.

Texture Map Synthesis I (conventional texture mapping with texture coordinates)
Create a triangular texture patch for each 3D triangle.
The texture patch is a weighted average of the image patches from multiple photographs.
Pixels that are close to the image boundaries or viewed at a grazing angle receive smaller weights (see the sketch below).
(Figure: photograph, texture map, 3D triangle)
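As an illustration of this weighting, here is a minimal Python sketch; the helper names `view_weight` and `blend_patches` and the exact fall-off terms are assumptions of this example, not the original system.

```python
import numpy as np

def view_weight(tri_normal, dir_to_camera, pixel_uv, image_size, border=20.0):
    """Blending weight of one photograph for a triangle patch (illustrative).

    Combines a grazing-angle term (cosine between the triangle normal and
    the direction towards the camera) with a term that falls off near the
    image boundary, as described on the slide.
    """
    # Head-on views get weight ~1, grazing views get weight ~0.
    cos_theta = max(0.0, float(np.dot(tri_normal, dir_to_camera)))
    # Distance (pixels) to the nearest image border, saturated at `border`.
    u, v = pixel_uv
    w, h = image_size
    d = min(u, v, w - 1 - u, h - 1 - v)
    boundary = float(np.clip(d / border, 0.0, 1.0))
    return cos_theta * boundary

def blend_patches(patches, weights):
    """Weighted average of per-view image patches resampled to one shape."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / max(weights.sum(), 1e-8)
    return np.tensordot(weights, np.asarray(patches, dtype=float), axes=1)

# Tiny example with two hypothetical views of the same patch.
patches = [np.zeros((4, 4, 3)), np.ones((4, 4, 3))]
weights = [view_weight([0, 0, 1.0], [0, 0, 1.0], (50, 60), (640, 480)),
           view_weight([0, 0, 1.0], [0.7, 0, 0.7], (5, 60), (640, 480))]
print(blend_patches(patches, weights)[0, 0])
```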

Texture Map Synthesis II (packing)
Allocate space for the texture patches inside the texture maps; this is a generalization of memory allocation to 2D.
Quantize each patch edge length to a power of 2.
Sort the texture patches into decreasing order and use a First-Fit strategy to allocate space (see the sketch below).
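A toy illustration of this packing step; the shelf-based layout and the `pack_first_fit` helper are assumptions of this sketch, not the original allocator.

```python
import numpy as np

def next_pow2(x):
    """Smallest power of two >= x."""
    return 1 << (int(np.ceil(x)) - 1).bit_length()

def pack_first_fit(patch_edges, map_width=1024):
    """Toy first-fit packer for triangular texture patches.

    Each patch is represented by the quantized edge length of its bounding
    square; patches are sorted by decreasing size and placed on the first
    shelf (row of the texture map) with enough free width, opening a new
    shelf when none fits.  A real packer would pair two triangles per square
    and spill into additional texture maps when one map is full.
    """
    sizes = sorted(((next_pow2(e), i) for i, e in enumerate(patch_edges)),
                   reverse=True)
    shelves = []        # each shelf: [shelf_height, used_width, y_offset]
    placements = {}     # patch index -> (x, y, quantized_edge)
    y_cursor = 0
    for edge, idx in sizes:
        for shelf in shelves:
            if shelf[0] >= edge and map_width - shelf[1] >= edge:
                placements[idx] = (shelf[1], shelf[2], edge)
                shelf[1] += edge
                break
        else:
            shelves.append([edge, edge, y_cursor])
            placements[idx] = (0, y_cursor, edge)
            y_cursor += edge
    return placements

# Example: pack patches with edge lengths of 70, 33, 120 and 20 pixels.
print(pack_first_fit([70, 33, 120, 20]))
```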

A Texture Map Packed with Triangular Texture Patches

Appearance Modeling: texture atlas.

Dealing with auto-exposure: photometric alignment of textures (or HDR textures) (Kim and Pollefeys, CVPR'04).

Image as texture
Pipeline: depth image → triangle mesh; texture image → textured 3D wireframe model.
Affine vs. projective texture mapping (see later).

Lightfield literature
Plenoptic function (Adelson & Bergen '91; McMillan & Bishop, Siggraph '95)
Lightfield (plane) and Lumigraph (some geometry) (Levoy & Hanrahan, Siggraph '96; Gortler et al., Siggraph '96)
Unstructured lightfield (some (view-dependent) geometry) (Koch et al., ICCV '99; Heigl et al., DAGM '99; Buehler et al., Siggraph '01)
Surface lightfields (full geometry) (Wood et al., Siggraph '00; Chen et al., Siggraph '02)
Plenoptic sampling (trade-off between geometry and images) (Chai et al., Siggraph '00)

Lightfield rendering
Approximate light rays by interpolating from the closest light rays in the lightfield.
The projection of the viewpoint surface into the virtual camera determines which views to take light rays from.
The transfer from the images to the virtual view over the focal surface determines which pixels to use.
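For a regularly sampled two-plane lightfield, the ray lookup can be sketched as follows; the plane positions, the `lightfield` array layout and the quadrilinear interpolation are assumptions of this example, and unstructured rendering replaces the regular grid with the closest captured views.

```python
import numpy as np

def ray_to_st_uv(origin, direction, z_view=0.0, z_focal=1.0):
    """Two-plane parameterization of a ray: intersect it with the viewpoint
    plane (z = z_view) and the focal plane (z = z_focal).  Plane positions
    are assumptions of this example; the direction must not be parallel to
    the planes."""
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    s, t = (o + d * (z_view - o[2]) / d[2])[:2]
    u, v = (o + d * (z_focal - o[2]) / d[2])[:2]
    return s, t, u, v

def render_ray(lightfield, s, t, u, v):
    """Quadrilinear interpolation in a regularly sampled lightfield
    L[s, t, u, v] with unit grid spacing (hypothetical storage layout;
    real systems store images plus camera poses)."""
    coords = [s, t, u, v]
    base = [int(np.floor(c)) for c in coords]
    frac = [c - b for c, b in zip(coords, base)]
    out = 0.0
    for corner in range(16):            # the 2^4 neighbouring samples
        idx, weight = [], 1.0
        for axis in range(4):
            bit = (corner >> axis) & 1
            idx.append(base[axis] + bit)
            weight *= frac[axis] if bit else (1.0 - frac[axis])
        out += weight * lightfield[tuple(idx)]
    return out

# Toy example on an 8^4 random lightfield.
L = np.random.default_rng(0).random((8, 8, 8, 8))
print(render_ray(L, *ray_to_st_uv([2.2, 3.1, -1.0], [0.4, 0.1, 1.0])))
```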

Unstructured lightfield rendering
For every pixel, combine the best rays from the closest original viewpoints (Koch et al., ICCV '99; Heigl et al., DAGM '99).
(Figure: original viewpoints, focal surface, novel view; demo)

Example: desk sequence, 186 images recorded with a hand-held camera.

Example: desk sequence, structure and motion (190 images, 7000 points) and depth images.

Example: Desk lightfield with a planar focal surface (shadow artefacts).

View-dependent geometry approximation
(Figure: original viewpoints with depth maps, object surface, view-dependent surface approximation, novel view)

View-dependent geometry approximation: the geometry is adapted to the rendering viewpoint.

Geometry subdivision
Note: subdivision is only necessary when the depth value deviates significantly from the previous approximation.
(Figure: original viewpoints with depth maps, object surface, view-dependent surface approximation, novel view)

Scalable geometric approximation: viewpoint geometry without subdivision and with 1, 2 and 4 subdivisions of the viewpoint surface.

Example: Desk lightfield: planar focal surface vs. view-dependent geometry approximation (2 subdivisions).

Hardware-accelerated rendering
Use a blending operation similar to Gouraud shading.
Use projective textures (see the sketch below).
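A sketch of what projective texturing computes: project the 3D point with the photograph's camera matrix and divide by the homogeneous coordinate. The `projective_texcoords` helper and the example camera are assumptions of this illustration; on the GPU the same divide happens per fragment via texture matrices or shaders.

```python
import numpy as np

def projective_texcoords(P, X):
    """Project 3D points into a source photograph used as a projective texture.

    P : 3x4 camera projection matrix of the photograph (assumed known from
        calibration); X : (N, 3) array of 3D vertex positions.
    Returns (N, 2) texture coordinates in pixel units.
    """
    Xh = np.hstack([X, np.ones((len(X), 1))])      # homogeneous coordinates
    x = (P @ Xh.T).T                               # (N, 3) homogeneous pixels
    return x[:, :2] / x[:, 2:3]                    # perspective divide

# Example with a hypothetical camera: identity rotation, unit focal length.
P = np.hstack([np.eye(3), np.zeros((3, 1))])
X = np.array([[0.1, 0.2, 2.0], [0.0, 0.0, 4.0]])
print(projective_texcoords(P, X))
```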

Demo

Extrapolation (Buehler et al., Siggraph '01)
Add a mesh to cover the whole image and compute non-binary blending weights.
(Figure: rendered image and blending field, courtesy Leonard McMillan)

Surface Lightfields
The surface light field (SLF) function f(r, s, θ, φ) gives the radiance leaving the surface location (r, s) in the viewing direction (θ, φ).
References: Chen et al., Siggraph 2002, "Light Field Mapping: Efficient Representation and Hardware Rendering of Surface Light Fields"; R. Grzeszczuk, "Presentation on Light Field Mapping", SIGGRAPH 2002 Course Notes for the course "Image-based Modeling".

Surface Lightfields
Partition the SLF across small surface primitives P_i and approximate the SLF of each P_i individually as a sum of products
  f(r, s, θ, φ) ≈ Σ_{k=1..K} g_k(r, s) h_k(θ, φ)
where the surface maps g_k and the view maps h_k are stored as 2D texture maps (the light field maps); a sketch of evaluating this factorization follows.
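A minimal sketch of evaluating the factored representation, assuming the surface and view maps are plain 2D arrays and ignoring the texture filtering a GPU would apply; `evaluate_slf` is a name invented for this example.

```python
import numpy as np

def evaluate_slf(surface_maps, view_maps, rs, theta_phi):
    """Evaluate f(r, s, theta, phi) ~= sum_k g_k(r, s) * h_k(theta, phi)
    for one surface primitive.

    surface_maps[k] and view_maps[k] are small 2D arrays sampled on regular
    (r, s) and (theta, phi) grids in [0, 1]^2; nearest-neighbour lookup is
    used instead of hardware texture filtering.
    """
    def lookup(tex, coord01):
        h, w = tex.shape[:2]
        i = min(int(coord01[0] * (h - 1) + 0.5), h - 1)
        j = min(int(coord01[1] * (w - 1) + 0.5), w - 1)
        return tex[i, j]

    return sum(lookup(g, rs) * lookup(h, theta_phi)
               for g, h in zip(surface_maps, view_maps))

# Example with K = 2 random 8x8 maps.
rng = np.random.default_rng(0)
g_maps = [rng.random((8, 8)) for _ in range(2)]
h_maps = [rng.random((8, 8)) for _ in range(2)]
print(evaluate_slf(g_maps, h_maps, (0.3, 0.7), (0.5, 0.5)))
```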

Light Field Mapping pipeline: data acquisition, partitioning, resampling, approximation, compression, rendering.

Light Field Mapping: data acquisition

Images captured by a hand-held camera.
Geometry scanned with structured lighting.
Images registered to the geometry.

Light Field Mapping: partitioning

Partition the light field data across small surface primitives.
The individual parts must add up to the original SLF.
Ensure continuous approximations across neighbouring surface elements.

Triangle-centered partitioning (splitting the light field between individual triangles) leads to discontinuities across triangle boundaries.

Vertex-centered partitioning
Partition the surface light field data around every vertex by weighting the SLF with the vertex's hat function (its barycentric weight over the triangle ring, sketched below); since the hat functions sum to 1, the vertex light fields add up to the original SLF.
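The hat function of a vertex is its barycentric weight inside each incident triangle, as in this small sketch; the 2D setup and the `barycentric_weights` helper are illustrative only.

```python
import numpy as np

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c).

    The hat function of a vertex over its triangle ring is its barycentric
    coordinate in each incident triangle: 1 at the vertex, 0 on the opposite
    edge.  Minimal 2D sketch.
    """
    a, b, c, p = (np.asarray(x, float) for x in (a, b, c, p))
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    w_b = (d11 * d20 - d01 * d21) / denom
    w_c = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - w_b - w_c, w_b, w_c])   # weights of a, b, c

# The three hat weights of a triangle sum to 1, so the three vertex light
# fields add back up to the original SLF sample at that surface point.
print(barycentric_weights((0.2, 0.2), (0, 0), (1, 0), (0, 1)))
```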

(Figure: vertex-centered partitioning of the triangle ring around a vertex)

Vertex light field
Define a local reference frame at the vertex and reparameterize each vertex light field's viewing directions to this local coordinate system.
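One possible construction of such a local frame and the reparameterization of a viewing direction; this is a sketch only, and the exact frame choice in the original system may differ.

```python
import numpy as np

def local_frame(normal):
    """Build an orthonormal frame (t1, t2, n) around a vertex normal."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t1 = np.cross(helper, n)
    t1 /= np.linalg.norm(t1)
    t2 = np.cross(n, t1)
    return t1, t2, n

def to_local_direction(view_dir, frame):
    """Express a global viewing direction in the vertex's local frame,
    e.g. before binning it into the (theta, phi) domain of the view map."""
    t1, t2, n = frame
    d = np.asarray(view_dir, float)
    d = d / np.linalg.norm(d)
    return np.array([d @ t1, d @ t2, d @ n])

print(to_local_direction([0, 0, 1], local_frame([0, 1, 1])))
```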

Light Field Mapping: resampling

Goal: generate the vertex light field function.
A visibility computation determines the unoccluded views for each triangle ring.
Two steps: 1) normalization of texture size, 2) resampling of viewing directions.

(Figure: the raw vertex light field arranged as a matrix in which each column holds the texture data of a different view, from the 1st to the C_i-th view)

Step 1: normalization of texture size. Every texture patch is resampled (by bilinear interpolation) to the same shape and size.

Step 2: resampling of viewing directions. The original viewing directions are projected into the view-direction domain of the vertex.

The projected view directions are Delaunay-triangulated and interpolated onto a uniform grid of views.

The result is a regularly resampled matrix with M texel rows and N uniformly sampled view columns per vertex (sketched below).
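The view-direction resampling can be sketched with SciPy, whose `LinearNDInterpolator` performs exactly this kind of Delaunay-based linear interpolation; the array shapes, the uniform grid and the fill value below are assumptions of this example.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def resample_views(view_dirs_2d, texel_columns, grid_res=8):
    """Resample irregularly captured views onto a uniform grid of viewing
    directions, as in step 2 above.

    view_dirs_2d  : (C, 2) projected viewing directions of the C original views.
    texel_columns : (M, C) matrix; column c holds the M texels seen from view c.
    Returns an (M, grid_res*grid_res) matrix on a uniform view grid.
    """
    # LinearNDInterpolator builds a Delaunay triangulation of the scattered
    # view directions and interpolates each texel linearly inside it.
    interp = LinearNDInterpolator(view_dirs_2d, texel_columns.T, fill_value=0.0)
    u = np.linspace(view_dirs_2d[:, 0].min(), view_dirs_2d[:, 0].max(), grid_res)
    v = np.linspace(view_dirs_2d[:, 1].min(), view_dirs_2d[:, 1].max(), grid_res)
    uu, vv = np.meshgrid(u, v)
    resampled = interp(np.column_stack([uu.ravel(), vv.ravel()]))  # (grid^2, M)
    return resampled.T                                             # (M, grid^2)

# Tiny example: 10 random views of a 16-texel patch.
rng = np.random.default_rng(1)
dirs = rng.random((10, 2))
data = rng.random((16, 10))
print(resample_views(dirs, data).shape)   # (16, 64)
```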

Light Field Mapping: approximation

Decomposition & Approximation
Rearrange the 4-dimensional vertex light field F into an M x N matrix, decompose F by matrix factorization, and truncate the sum after K terms (K << N); each retained term contributes one surface map and one view map.
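A sketch of this factorization using a truncated SVD, i.e. the PCA-style option mentioned on the next slide; the function name and the assignment of factors to surface and view maps are illustrative.

```python
import numpy as np

def factor_light_field(F, K):
    """Rank-K approximation of the rearranged vertex light field matrix F.

    Returns K pairs of vectors whose outer products sum to the
    approximation; one factor of each pair is reshaped into a surface map,
    the other into a view map.
    """
    U, s, Vt = np.linalg.svd(F, full_matrices=False)
    surface_like = U[:, :K] * s[:K]     # singular values folded into one factor
    view_like = Vt[:K, :]
    return surface_like, view_like

# Example: approximate a random 64 x 36 "light field matrix" with K = 3 terms.
F = np.random.default_rng(2).random((64, 36))
A, B = factor_light_field(F, K=3)
F_approx = A @ B
print(np.linalg.norm(F - F_approx) / np.linalg.norm(F))   # relative error
```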

Split the surface maps of the triangle ring into surface maps for the individual triangles, giving 3 surface maps (one per vertex) for each approximation term of each triangle.

The approximation is the sum of the 1st through Kth terms; each term uses 3 surface maps + 3 view maps per triangle.

Approximation methods
PCA (principal component analysis): progressive, factors with arbitrary sign.
NMF (non-negative matrix factorization): parts-based representation, non-negative factors, easier and faster rendering.

Light Field Mapping: compression

Tiled surface maps and tiled view maps: the light field maps are redundant, enabling very high compression ratios (10000:1).

Light Field Mapping: rendering

The rendered result is the sum of the 1st through Kth approximation terms; each term uses 3 surface maps + 3 view maps.

Surface map lookup: view-independent.
View map lookup: establish the vertex coordinate system and project the viewing vector onto the view map hemisphere (see the sketch below).
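A sketch of the view-map lookup, assuming a simple orthographic hemisphere parameterization of a viewing direction already expressed in the vertex's local frame; the parameterization and the `view_map_coords` name are assumptions of this example.

```python
import numpy as np

def view_map_coords(view_dir_local, view_map_size=32):
    """Texture coordinates into a vertex's view map for a local viewing
    direction (z along the vertex normal).

    Orthographic hemisphere mapping: the (x, y) components of the unit
    direction are remapped from [-1, 1] to [0, view_map_size).
    """
    d = np.asarray(view_dir_local, float)
    d = d / np.linalg.norm(d)
    if d[2] < 0:                      # below the surface: clamp to the horizon
        d[2] = 0.0
        norm_xy = np.linalg.norm(d[:2])
        if norm_xy > 0:
            d[:2] /= norm_xy
    u = (d[0] * 0.5 + 0.5) * (view_map_size - 1)
    v = (d[1] * 0.5 + 0.5) * (view_map_size - 1)
    return u, v

print(view_map_coords([0.3, -0.2, 0.9]))
```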

Results: Bust, Star, Turtle, Buddha and Horse models.