Discrete geometry, Lecture 2
Numerical geometry of non-rigid shapes, Stanford University, Winter 2009
© Alexander & Michael Bronstein, tosca.cs.technion.ac.il/book

"The world is continuous, but the mind is discrete." (David Mumford)

Discretization: from the continuous world to the discrete world.
Surface → sampling.
Metric → discrete metric (matrix of distances).
Topology → discrete topology (connectivity).

How good is a sampling?

Sampling density. How do we quantify the density of a sampling? A subset S of X is an r-covering of X if the metric balls B_r(s), s in S, cover X. Alternatively: d(x, S) ≤ r for all x in X, where d(x, S) = min over s in S of d(x, s) is the point-to-set distance.

Sampling efficiency. Are all the points necessary? An r-covering may be unnecessarily dense (it may even fail to be a discrete set). To quantify how well the samples are separated: S is r'-separated if d(s, s') ≥ r' for all distinct s, s' in S. For r' > 0, an r'-separated set is finite if X is compact. (Figure: an r-separated set that is also an r-covering.)
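These two quantities can be computed numerically. The following is a minimal sketch, assuming a Python setting in which the Euclidean metric on the unit square stands in for the surface metric; the grid, the sample set, and all function names are illustrative, not from the lecture:

```python
import numpy as np

def covering_radius(X, S):
    """Smallest r for which S is an r-covering of X:
    the maximum over x in X of the point-to-set distance d(x, S)."""
    d = np.linalg.norm(X[:, None, :] - S[None, :, :], axis=2)  # |X| x |S|
    return d.min(axis=1).max()

def separation(S):
    """Largest r' for which S is r'-separated: the minimum pairwise distance."""
    d = np.linalg.norm(S[:, None, :] - S[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # ignore zero self-distances
    return d.min()

# A dense grid on the unit square, sub-sampled by four cell centers.
xx, yy = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
X = np.stack([xx.ravel(), yy.ravel()], axis=1)
S = np.array([[0.25, 0.25], [0.25, 0.75], [0.75, 0.25], [0.75, 0.75]])
print(covering_radius(X, S))  # ≈ 0.3536: the corners are the farthest points
print(separation(S))          # 0.5
```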

Farthest point sampling. A good sampling has to be dense and efficient at the same time: find an r-separated r-covering of X. This is achieved using farthest point sampling. We defer the discussion of how to select the radius r and how to compute the distances.

Farthest point sampling

Farthest point sampling. Start with some x1 in X, S = {x1}. Repeat: determine the sampling radius r = max over x in X of d(x, S); if r is small enough, stop; otherwise, find the point x* farthest from S and add x* to S.
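The loop above can be sketched in code. A hypothetical Python version on a random planar point set, with Euclidean distances standing in for geodesic ones and a fixed sample count n replacing the radius-based stopping rule; all names are illustrative:

```python
import numpy as np

def farthest_point_sampling(X, n, seed_index=0):
    """Greedy FPS on an N x d point array X: repeatedly add the point
    farthest from the current sample set S. Returns the sample indices
    and the sampling radius after each insertion."""
    d_to_S = np.linalg.norm(X - X[seed_index], axis=1)  # d(x, S) for S = {x1}
    samples, radii = [seed_index], []
    for _ in range(n - 1):
        k = int(d_to_S.argmax())          # farthest point from S
        radii.append(d_to_S[k])           # sampling radius before adding it
        samples.append(k)
        d_to_S = np.minimum(d_to_S, np.linalg.norm(X - X[k], axis=1))
    radii.append(d_to_S.max())            # final sampling radius
    return samples, radii

rng = np.random.default_rng(0)
X = rng.random((500, 2))
samples, radii = farthest_point_sampling(X, 16)
print(all(r1 >= r2 for r1, r2 in zip(radii, radii[1:])))  # True: radii shrink
```

Each new sample only shrinks the point-to-set distances, which is why the printed monotonicity check holds (this is exactly the lemma stated later in the lecture).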

Farthest point sampling. Outcome: an r-separated r-covering of X. Produces a sampling with progressively increasing density. It is a greedy algorithm: previously added points remain in S, so there might be another r-separated r-covering containing fewer points. In practice, it is used to sub-sample a densely sampled shape. The straightforward time complexity is O(Nn), where N is the number of points in the dense sampling and n is the number of points in S. Using efficient data structures, it can be reduced to O(N log N).

Sampling as representation. Sampling represents a whole region of X by a single point s. The region of points of X closer to s than to any other sample is the Voronoi region of s (a.k.a. Dirichlet or Voronoi-Dirichlet region, Thiessen polytope or polygon, Wigner-Seitz zone, domain of action). To avoid degenerate cases, assume the points of S to be in general position: no three points lie on the same geodesic (Euclidean case: no three collinear points), and no four points lie on the boundary of the same metric ball (Euclidean case: no four cocircular points).

Voronoi decomposition. A point x can belong to one of the following: a Voronoi region (x is closer to the sample si than to any other sample), a Voronoi edge (x is equidistant from two samples si and sj), or a Voronoi vertex (x is equidistant from three samples si, sj, sk).
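Discretely, this classification amounts to counting how many samples (nearly) attain a point's nearest distance. A Euclidean Python sketch; the sample set, the tolerance, and the function name are illustrative assumptions:

```python
import numpy as np

def voronoi_classify(x, S, tol=1e-9):
    """Count the samples attaining (up to tol) the nearest distance from x:
    1 -> interior of a Voronoi region, 2 -> Voronoi edge, 3+ -> Voronoi vertex."""
    d = np.linalg.norm(S - x, axis=1)
    return int((d <= d.min() + tol).sum())

S = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(voronoi_classify(np.array([0.1, 0.1]), S))  # 1: inside the region of (0, 0)
print(voronoi_classify(np.array([0.5, 0.0]), S))  # 2: equidistant from two samples
print(voronoi_classify(np.array([0.5, 0.5]), S))  # 3: a Voronoi vertex (the circumcenter)
```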

Voronoi decomposition

Voronoi decomposition. Voronoi regions are disjoint, and their closures cover the entire X. Cutting X along the Voronoi edges produces a collection of tiles Vi. In the Euclidean case, the tiles are convex polygons; hence, the tiles are topological disks (homeomorphic to a disk).

Voronoi tessellation. A tessellation of X (a.k.a. cell complex) is a finite collection of disjoint open topological disks whose closures cover the entire X. In the Euclidean case, the Voronoi decomposition is always a tessellation. In the general case, Voronoi regions might not be topological disks. A valid tessellation is obtained if the sampling is sufficiently dense.

Non-Euclidean Voronoi tessellations. The convexity radius at a point x is the largest r for which the closed ball B_r(x) is convex in X, i.e., minimal geodesics between every pair of points of B_r(x) lie in B_r(x). The convexity radius of X is the infimum of the convexity radii over all x in X. Theorem (Leibon & Letscher, 2000): an r-separated r-covering of X, with r sufficiently small with respect to the convexity radius of X, is guaranteed to produce a valid Voronoi tessellation. This gives sufficient sampling density conditions.

Sufficient sampling density conditions. (Figures: an invalid tessellation vs. a valid tessellation.)

Voronoi tessellations in Nature

MATLAB® intermezzo: farthest point sampling and Voronoi decomposition.

Representation error. The Voronoi decomposition replaces each point x of X with the closest sample, i.e., a mapping copying each x into its nearest sample s. To quantify the representation error introduced by this mapping, let x be picked at random with uniform distribution on X; the representation error is ε(S) = E[d²(x, S)], the expected squared distance to the nearest sample.
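The error ε(S) can be estimated by Monte Carlo integration. A sketch assuming X is the unit square with the uniform distribution (a flat stand-in for a surface); the function name and draw count are illustrative:

```python
import numpy as np

def representation_error(S, n_draws=200_000, seed=0):
    """Monte Carlo estimate of eps(S) = E[d^2(x, S)] for x uniform
    on the unit square."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_draws, 2))
    d = np.linalg.norm(x[:, None, :] - S[None, :, :], axis=2).min(axis=1)
    return float((d ** 2).mean())

# A single sample at the center: the exact error is 1/6 (by direct integration).
S1 = np.array([[0.5, 0.5]])
print(representation_error(S1))  # ≈ 0.1667
```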

Optimal sampling. In the Euclidean case, ε(S) is the mean squared error. Optimal sampling: given a fixed sampling size n, minimize the error ε(S); alternatively, given a fixed representation error ε, minimize the sampling size n.

Centroidal Voronoi tessellation. In a sampling minimizing ε(S), each sample si has to be the intrinsic centroid of its Voronoi region Vi: in the Euclidean case, the center of mass of Vi; in the general case, the intrinsic centroid of Vi. A centroidal Voronoi tessellation (CVT) is a Voronoi tessellation generated by S in which each si is the intrinsic centroid of Vi.

Lloyd-Max algorithm. Start with some sampling S (e.g., produced by FPS). Repeat: construct the Voronoi decomposition; for each region Vi, compute the intrinsic centroid ci; update si ← ci. In the limit n → ∞, the tessellation approaches the hexagonal honeycomb, the densest possible tessellation. The Lloyd-Max algorithm is known under many other names: vector quantization, k-means, etc.
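A minimal Euclidean sketch of the iteration (i.e., plain k-means), with the Euclidean mean standing in for the intrinsic centroid; the initialization and all names are illustrative assumptions:

```python
import numpy as np

def lloyd(X, S, iters=20):
    """Lloyd-Max iteration: assign points to nearest centers (Voronoi step),
    then move each center to the mean of its region (centroid step)."""
    S = S.copy()
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None, :] - S[None, :, :], axis=2).argmin(axis=1)
        for i in range(len(S)):
            members = X[labels == i]
            if len(members):             # an empty region keeps its old center
                S[i] = members.mean(axis=0)
    return S

def mse(X, S):
    """Discrete representation error: mean squared distance to nearest center."""
    return float((np.linalg.norm(X[:, None, :] - S[None, :, :], axis=2).min(axis=1) ** 2).mean())

rng = np.random.default_rng(1)
X = rng.random((1000, 2))
S0 = X[:8].copy()                        # crude initialization (FPS would do better)
S = lloyd(X, S0)
print(mse(X, S) <= mse(X, S0))           # True: neither step can increase the error
```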

Sampling as clustering. Partition the space into clusters with centers chosen to minimize some cost function, e.g. the maximum cluster radius or the average cluster radius. In the discrete setting, both problems are NP-hard. The Lloyd-Max algorithm, a.k.a. k-means, is a heuristic, sometimes minimizing the average cluster radius (when it converges to the global minimum, which is not guaranteed).

Farthest point sampling encore. Start with some x1, S1 = {x1}. For k = 2, 3, ...: find the farthest point xk = argmax over x in X of d(x, S_{k-1}); compute the sampling radius rk = d(xk, S_{k-1}). Lemma: the sampling radii are non-increasing, r_{k+1} ≤ rk.

Proof. For any x in X, d(x, Sk) ≤ d(x, S_{k-1}), since S_{k-1} is a subset of Sk and adding points to a set can only decrease the point-to-set distance.

Proof (cont.) Since r_{k+1} = max over x in X of d(x, Sk), we have r_{k+1} ≤ max over x in X of d(x, S_{k-1}) = rk.

Almost optimal sampling. Theorem (Hochbaum & Shmoys, 1985): let S be the result of the FPS algorithm with n points and S* an optimal sampling of the same size; then r(S) ≤ 2 r(S*). In other words, FPS is worse than the optimal sampling by at most a factor of 2.
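The factor-2 bound can be checked by brute force on a tiny Euclidean instance, comparing the FPS sampling radius against the best radius over all subsets of the same size (discrete k-center). The toy data and all names are illustrative assumptions:

```python
import numpy as np
from itertools import combinations

def covering_radius(X, S):
    """Sampling radius: max over x in X of d(x, S)."""
    return np.linalg.norm(X[:, None, :] - S[None, :, :], axis=2).min(axis=1).max()

def fps_indices(X, n):
    """Indices of n farthest-point samples of X, seeded at index 0."""
    d, idx = np.linalg.norm(X - X[0], axis=1), [0]
    for _ in range(n - 1):
        k = int(d.argmax())
        idx.append(k)
        d = np.minimum(d, np.linalg.norm(X - X[k], axis=1))
    return idx

rng = np.random.default_rng(2)
X = rng.random((12, 2))
n = 3
r_fps = covering_radius(X, X[fps_indices(X, n)])
r_opt = min(covering_radius(X, X[list(c)]) for c in combinations(range(len(X)), n))
print(r_opt <= r_fps <= 2 * r_opt)  # True, as the theorem guarantees
```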

Idea of the proof. Let V*1, ..., V*n denote the optimal clusters, with centers s*1, ..., s*n. Distinguish between two cases: either one of the clusters contains two or more of the FPS points x1, ..., xn, or each cluster contains exactly one of them.

Proof (first case). Assume one of the clusters, V*j, contains two or more of the points x1, ..., xn, e.g. xi and xk. By the triangle inequality, d(xi, xk) ≤ d(xi, s*j) + d(s*j, xk) ≤ 2 r(S*). On the other hand, the FPS points are r(S)-separated, so d(xi, xk) ≥ r(S). Hence r(S) ≤ 2 r(S*).

Proof (second case). Assume each of the clusters contains exactly one of the points x1, ..., xn, e.g. xi in V*j. Then, for any point x in V*j, the triangle inequality gives d(x, xi) ≤ d(x, s*j) + d(s*j, xi) ≤ 2 r(S*).

Proof (second case, cont.) We have d(x, xi) ≤ 2 r(S*) for every j and every point x in V*j; hence d(x, S) ≤ 2 r(S*) for every x in X. In particular, for the point x attaining the sampling radius, r(S) = d(x, S) ≤ 2 r(S*).