Surface Reconstruction with MLS Tobias Martin CS7960, Spring 2006, Feb 23

Literature
– An Adaptive MLS Surface for Reconstruction with Guarantees, T. K. Dey and J. Sun
– A Sampling Theorem for MLS Surfaces, Peer-Timo Bremer and John C. Hart

An Adaptive MLS Surface for Reconstruction with Guarantees Tamal K. Dey and Jian Sun

Implicit MLS Surfaces The zero level set of a function I(x) defines S ⇒ Normals are important, because points are projected along the normals.

Motivation Original smooth, closed surface S. Given conditions: – Sampling Density – Normal Estimates – Noise ⇒ Design an implicit function whose zero set recovers S.

Motivation So far, only a uniform sampling condition. Restriction: the red arc requires 10^4 samples because of the small feature.

Motivation Establish an adaptive sampling condition, similar to that of Amenta and Bern. ⇒ Incorporate the local feature size into the sampling condition. The red arc then requires only 6 samples.

Sampling Condition Recent surface reconstruction algorithms assume noise-free samples. ⇒ The notion of an ε-sample has to be modified. P is a noisy (ε, κ)-sample if
– the distance from every z ∈ S to its closest point in P is < ε·lfs(z),
– the distance from every p ∈ P to z = proj(p) is < ε²·lfs(z),
– the normal of every p ∈ P makes an angle < ε with the surface normal at its projected point,
– the number of sample points inside B(x, ε·lfs(proj(x))) is less than a small constant κ.
⇒ Note that it is difficult to check whether a sample P is a noisy (ε, κ)-sample.
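
On a sphere of radius R the conditions become checkable, since lfs ≡ R everywhere and proj(p) = R·p/‖p‖. A minimal sketch (my illustration, not code from the paper; scipy assumed), with dense test points standing in for "every z ∈ S":

```python
# Sketch: checking the noisy (eps, kappa)-sample conditions on a sphere,
# where lfs(z) = R for all z and proj(p) = R * p / |p|.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
R, eps, kappa = 1.0, 0.1, 40

def sphere_points(n):
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Noisy sample: on-sphere points perturbed radially by less than eps^2 * R.
P = sphere_points(5000) * (R + rng.uniform(-0.5, 0.5, (5000, 1)) * eps**2 * R)
normals = P / np.linalg.norm(P, axis=1, keepdims=True)   # exact normals (toy data)

tree = cKDTree(P)
Z = sphere_points(20000) * R          # dense stand-ins for "every z in S"

cond1 = np.all(tree.query(Z)[0] < eps * R)                          # density
cond2 = np.all(np.abs(np.linalg.norm(P, axis=1) - R) < eps**2 * R)  # noise
surf_n = P / np.linalg.norm(P, axis=1, keepdims=True)               # normals at proj(p)
ang = np.arccos(np.clip(np.sum(normals * surf_n, axis=1), -1.0, 1.0))
cond3 = np.all(ang < eps)                                           # normal angles
counts = np.array([len(tree.query_ball_point(p, eps * R)) for p in P])
cond4 = np.all(counts < kappa)                                      # sparsity
print(cond1, cond2, cond3, cond4)  # a random sampling may still violate 1 or 4
```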

Effect of nearby samples Points within a small neighborhood are predictably distributed within a small slab:

Adaptive MLS The MLS function at a point x should be determined primarily by nearby sample points. ⇒ Choose the weights such that sample points outside a neighborhood have little effect. ⇒ Use Gaussian functions. ⇒ Their width controls the influence of the samples. Make the width dependent on lfs ⇒ define the width as a fraction of lfs.

Adaptive MLS Choice of weighting function: θ_p(x) = e^(−‖x−p‖² / (ρ·lfs(p))²) ⇒ samples p from regions with large lfs get a wide Gaussian, so many sample points around p contribute to the value at x.

Adaptive MLS Choice of weighting function: θ_p(x) = e^(−‖x−p‖² / ‖x−p‖²) = e^(−1) ⇒ Sample point p has a constant weight (e^(−1)). ⇒ The influence of p does not decrease with distance.

Adaptive MLS Compromise: take a fraction of the geometric mean of ‖x−p‖ and lfs(proj(x)) as the width of the Gaussian, i.e. θ_p(x) = e^(−‖x−p‖ / (ρ²·lfs(proj(x)))). ⇒ The weight decreases as p moves away from x. ⇒ Near small features the weight of distant samples stays small, i.e. small features do not require more samples.
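
A small numeric comparison of the three choices (my reconstruction of the partly garbled slide formulas; ρ and the lfs values are assumed inputs):

```python
# Sketch (reconstructed formulas, not verbatim from the paper): weight of a
# sample p at distance d from x under the three width choices.
import numpy as np

rho = 0.5                       # width fraction (assumption)
lfs_p = 1.5                     # lfs at a sample p in a flat region (assumption)
lfs_x = 0.2                     # lfs near proj(x), a small feature (assumption)
d = np.linspace(0.01, 2.0, 5)   # distances ||x - p||

w_lfs  = np.exp(-d**2 / (rho * lfs_p)**2)       # width = fraction of lfs(p)
w_self = np.exp(-d**2 / d**2)                   # width = ||x - p||: constant e^-1
w_geo  = np.exp(-d**2 / (rho**2 * d * lfs_x))   # geometric-mean compromise

for row in zip(d, w_lfs, w_self, w_geo):
    print("d=%.2f  lfs-width=%.2e  self-width=%.3f  compromise=%.2e" % row)
```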

Contribution of distant samples Show that the effect of distant sample points can be bounded. ⇒ Rely on nearby features. How to show that?

Contribution of distant samples [Figure: definition of the radius r of the neighborhood ball around x.]

Contribution of distant samples Once there is a bound on the number of samples in B_(r/2), Lemma 3 is used to show an upper bound on their influence.

Contribution of distant samples Using Lemma 3 they prove the main theorem:

Contribution of distant samples The space outside B(x, r·lfs(proj(x))) can be decomposed into an infinite number of shells. ⇒ The influence of all points outside equals the sum of the influences of the points in the shells, each of which was bounded by Lemma 3. ⇒ Therefore, the contribution of the points outside B(x, r·lfs(proj(x))) can be bounded.
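
The shell argument can be mimicked numerically: if shell k contains at most N_k samples and each sample's weight is at most the Gaussian value at the shell's inner radius, the total outside influence is a convergent series. A toy sketch with an assumed packing bound N_k ≤ c·(k+1)²:

```python
# Toy sketch of the shell-decomposition bound.  Assumptions: Gaussian width h,
# shell thickness w, packing bound N_k <= c*(k+1)^2 (not the paper's constants).
import numpy as np

h, w, c = 1.0, 1.0, 32
k = np.arange(1, 200)                                     # shells outside the inner ball
per_shell = c * (k + 1)**2 * np.exp(-(k * w)**2 / h**2)   # N_k * max weight in shell k

print("bound on total outside influence:", per_shell.sum())
print("first shells:", per_shell[:4])
# The terms decay super-exponentially, so the series converges and the bound
# is dominated by the innermost shell.
```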

Algorithm

Normal Estimation Delaunay-based, i.e. calculate the Delaunay triangulation of the input points. Big Delaunay ball: the radius is greater than a constant times the nearest-neighbor distance. The vectors from the incident sample points to the center of a big ball B(c, r) approximate the normals.
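
A sketch of this estimator in Python, assuming scipy's Delaunay triangulation; the "big" threshold factor and the choice to keep the biggest incident ball are my assumptions, and the resulting normals are unoriented:

```python
# Sketch of "big Delaunay ball" normal estimation (my reading of the slides).
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def circumsphere(t):
    """Circumcenter c and radius r of a tetrahedron t (4x3 array)."""
    A = 2.0 * (t[1:] - t[0])                     # 3x3 linear system for c
    b = np.sum(t[1:]**2, axis=1) - np.sum(t[0]**2)
    c = np.linalg.solve(A, b)
    return c, np.linalg.norm(c - t[0])

def delaunay_ball_normals(P, big_factor=3.0):
    tri = Delaunay(P)
    nn = cKDTree(P).query(P, k=2)[0][:, 1]       # nearest-neighbor distances
    best_r = np.zeros(len(P))
    normals = np.zeros_like(P)
    for simplex in tri.simplices:
        try:
            c, r = circumsphere(P[simplex])
        except np.linalg.LinAlgError:
            continue                             # skip degenerate tetrahedra
        for i in simplex:
            # "big" ball: radius well above the local sample spacing;
            # keep the biggest incident ball per sample point.
            if r > big_factor * nn[i] and r > best_r[i]:
                best_r[i] = r
                v = c - P[i]                     # vector to the ball center
                normals[i] = v / np.linalg.norm(v)
    return normals, best_r > 0                   # flag: got a normal estimate

# Usage sketch: samples of the unit sphere.
rng = np.random.default_rng(0)
P = rng.normal(size=(1500, 3))
P /= np.linalg.norm(P, axis=1, keepdims=True)
N, ok = delaunay_ball_normals(P)
print(ok.sum(), "of", len(P), "points received a normal estimate")
```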

Normal Estimation

Feature Detection In the noiseless case: ⇒ take the poles in the Voronoi cells; the shortest distance approximates lfs(p). This does not work in the case of noise. Every medial axis point is covered by a big Delaunay ball (observation by Dey and Goswami). Take the biggest Delaunay ball in both (normal) directions of a sample point p. ⇒ The centers act as poles (L). lfs(proj(x)) is approximated by d(p, L), where p is the closest point to x in P.

Projection Newton iteration: move p to p′, i.e.: iterate until d(p, p′) becomes smaller than a certain threshold. To calculate I(x) and its gradient, only use the points in a ball with radius 5× the width of the Gaussian weighting function. ⇒ Points outside have little effect on the function. The Newton projection has a large domain of convergence.
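
A minimal sketch of such a projection, assuming an implicit function of the form I(x) = n(x)·(x − a(x)) with a fixed-width Gaussian (the adaptive lfs-dependent width and the exact Newton step of the paper are simplified away):

```python
# Sketch: project a point onto an implicit MLS surface (simplified weights).
import numpy as np
from scipy.spatial import cKDTree

def mls_normal_and_average(x, P, tree, h):
    idx = tree.query_ball_point(x, 5.0 * h)       # ignore points beyond 5 widths
    Q = P[idx]
    w = np.exp(-np.sum((Q - x)**2, axis=1) / h**2)
    a = (w[:, None] * Q).sum(0) / w.sum()         # weighted average point
    C = (w[:, None, None] * np.einsum('ij,ik->ijk', Q - a, Q - a)).sum(0)
    evals, evecs = np.linalg.eigh(C)
    return evecs[:, 0], a                         # smallest-eigenvalue direction

def project(x, P, tree, h, tol=1e-8, max_iter=50):
    for _ in range(max_iter):
        n, a = mls_normal_and_average(x, P, tree, h)
        step = np.dot(n, x - a) * n               # I(x) * n: move along the normal
        x = x - step
        if np.linalg.norm(step) < tol:            # stop when p' is close to p
            break
    return x

# Usage: noisy samples of the unit sphere; project an off-surface point.
rng = np.random.default_rng(1)
P = rng.normal(size=(4000, 3))
P /= np.linalg.norm(P, axis=1, keepdims=True)
P += 0.005 * rng.normal(size=P.shape)
tree = cKDTree(P)
print(project(np.array([0.8, 0.3, 0.1]), P, tree, h=0.15))  # ~ a radius-1 point
```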

Reconstruction Finally, use a reconstruction algorithm, e.g. Cocone, to calculate a mesh.

AMLS vs PMLS Background on “Projection MLS”: – The surface is the set of stationary points of a projection function f. – An energy function e measures the quality of the fit of a plane to the point set at r. – The local minimum of e nearest to r is f(r).

AMLS vs PMLS The zero set of the energy function defines the surface. But there are two other layers of zero-level sets where the energy function reaches a local maximum.

AMLS vs PMLS When the distance between the layers gets small, computations on the PMLS surface become difficult. ⇒ small “marching step” in ray tracing ⇒ small cube size in the polygonizer

AMLS vs PMLS Furthermore, the projection procedure in PMLS is not trivial. ⇒ non-linear optimization ⇒ …and finding a starting value is difficult if the two maximum layers are close. If there is noise, the maxima layers can interfere. ⇒ holes and disconnectedness

AMLS vs PMLS

A Sampling Theorem for MLS Surfaces Peer-Timo Bremer and John C. Hart

Motivation We saw a lot of work about point set surfaces (PSS), but not much is known about the resulting mathematical properties. It is assumed that the surface construction is well defined within a neighborhood.

Motivation MLS filters the samples and projects them onto a local tangent plane. MLS is robust but difficult to analyze. Current algorithms may actually miss the surface. ⇒ The MLS surface is defined as the stationary points of a (dynamic) projection. ⇒ This projection is “dynamic” and can be undefined. ⇒ The robustness of MLS is determined by the projection. ⇒ We need a sampling condition which guarantees that the projection is defined everywhere.

Motivation What is necessary to have a faithful projection? ⇒ Correct normals. We need a condition which guarantees that, given a surface and a sampling, the normals are well defined.

Sampling Condition - Outline Given a sample, show:
– The normal vector needed by MLS does not vanish.
– Given conditions on the surface and the sampling, the normal is well defined.

PCA - Review We are given a set of points in R³ which may or may not lie on a surface. The 3×3 covariance matrix at a point q is cov(q) = Σ_{p∈P} (p − q)(p − q)^T. cov(q) can be factored into U^T L U, where L = diag(λ_max, λ_mid, λ_min) and the rows of U are the corresponding eigenvectors v_max, v_mid, v_min.

PCA - Review v_max, v_mid, v_min define a local coordinate frame:
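
A compact numpy sketch of this review on hypothetical data: build cov(q), factor it, and read off the frame (the flat-slab point cloud is an assumption for illustration).

```python
# PCA review sketch: covariance at q, eigen-factorization, local frame.
import numpy as np

rng = np.random.default_rng(2)
P = rng.normal(size=(500, 3)) * np.array([3.0, 2.0, 0.05])  # points near a plane
q = P.mean(axis=0)

cov = (P - q).T @ (P - q)                 # 3x3 covariance matrix at q
lam, U = np.linalg.eigh(cov)              # ascending: lam[0] = lambda_min
v_min, v_mid, v_max = U[:, 0], U[:, 1], U[:, 2]

# cov factors as U L U^T; v_min is the candidate surface normal.
assert np.allclose(U @ np.diag(lam) @ U.T, cov)
print("normal ~", v_min)                  # ~ +-(0, 0, 1) for this slab
```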

MLS Surface Weighted average: a(x) = Σ_{p∈P} θ_p(x)·p / Σ_{p∈P} θ_p(x), with e.g. a Gaussian weight θ_p(x). Weighted 3×3 covariance matrix: cov(x) = Σ_{p∈P} θ_p(x)·(p − a(x))(p − a(x))^T.

MLS surface From cov(x) we know: the eigenvector of the unique smallest eigenvalue defines the normal n(x). Using that, the zero set of I(x) = n(x)·(x − a(x)) defines the MLS surface. ⇒ n(x) has to be well defined!

New Sampling Condition Previous sampling condition: “S is well sampled by P if normal directions are defined inside a neighborhood of P.” (Adamson and Alexa) This is not good, because S need not stay inside that neighborhood. ⇒ S could be “well sampled” and yet an undefined normal direction can result. New sampling condition: “S is well sampled by P if normal directions are defined inside a neighborhood of S.”

Uniqueness of smallest Eigenvalue You can prove a sampling to be well defined by ensuring that cov(q) has a unique smallest eigenvalue over a neighborhood of S. This is proved in terms of the weighted variance of the samples P. Directional variance: var_v(q) = Σ_{p∈P} θ_p(q)·((p − q)·v)² ⇒ the variance in a specific direction v.

Uniqueness of smallest Eigenvalue Combining this with cov(q), we get var_v(q) = v^T cov(q) v, and decomposing it into eigenvalues and eigenvectors: var_v(q) = λ_max·(v·v_max)² + λ_mid·(v·v_mid)² + λ_min·(v·v_min)².
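
A quick numeric check of this identity (hypothetical data; fixed-width Gaussian weights assumed):

```python
# Verify: var_v(q) = sum_p w_p ((p-q).v)^2 = v^T cov(q) v = sum_i lam_i (v.v_i)^2
import numpy as np

rng = np.random.default_rng(3)
P = rng.normal(size=(300, 3))
q = np.zeros(3)
w = np.exp(-np.sum((P - q)**2, axis=1))         # Gaussian weights, width 1

cov = (w[:, None] * (P - q)).T @ (P - q)        # weighted covariance at q
lam, V = np.linalg.eigh(cov)

v = rng.normal(size=3); v /= np.linalg.norm(v)  # arbitrary unit direction
var_direct = np.sum(w * ((P - q) @ v)**2)
var_eig = np.sum(lam * (V.T @ v)**2)
assert np.allclose(var_direct, v @ cov @ v) and np.allclose(var_direct, var_eig)
print(var_direct)
```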

Uniqueness of smallest Eigenvalue Using this, it can be shown that if λ_min is not unique, the normal direction n minimizing var_n(q) isn't unique either (Lemma 2). This leads to:

Sampling Theorem To prove the sampling condition, show that for every point q there is a normal direction whose weighted variance is less than that of any perpendicular direction. ⇒ Derive an upper bound on the weighted variance in the normal direction n. ⇒ Derive a lower bound in an arbitrary tangent direction x. ⇒ Determine sampling conditions such that max(var_n(q)) < min(var_x(q)).
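
The claimed inequality can be observed numerically on a toy configuration (my illustration, not the paper's proof): for noisy samples of a plane, the weighted variance in the normal direction stays far below that of any tangent direction.

```python
# Illustration: for noisy samples of the plane z = 0, var_n(q) << var_x(q)
# for tangent directions x (fixed-width Gaussian weights; toy setup).
import numpy as np

rng = np.random.default_rng(4)
n_pts = 2000
xy = rng.uniform(-2, 2, size=(n_pts, 2))
z = 0.01 * rng.normal(size=n_pts)               # small noise normal to the plane
P = np.column_stack([xy, z])
q = np.zeros(3)

w = np.exp(-np.sum((P - q)**2, axis=1) / 0.5**2)

def var_dir(v):
    return np.sum(w * ((P - q) @ v)**2)

var_n = var_dir(np.array([0.0, 0.0, 1.0]))      # normal direction
tangents = [np.array([np.cos(t), np.sin(t), 0.0]) for t in np.linspace(0, np.pi, 8)]
print("var_n =", var_n, " min tangent var =", min(var_dir(t) for t in tangents))
```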

Sampling Theorem To show this, partition the data points into those that increase var_n more versus those that increase var_x more. Construct two planes through q which separate the points such that:

Sampling Theorem Points below those planes increase var_n(q) more than var_x(q). The region below the planes defines an hourglass shape.

Sampling Theorem Furthermore, q is at most a height w, a fixed fraction of lfs(proj(q)), above the surface.

Sampling Theorem Project the hourglass onto the lower medial ball (B−). Find a limit on the maximal possible effect of these “risky” samples. Use the risky area (red) to overestimate the number of samples and their contribution to var_n.

Sampling Theorem Define an upper bound on the samples in the risky area and use it to compute an upper bound on var_n. Define a lower bound on the samples outside the risky area to compute a lower bound on var_x. This leads to the main theorem: