
Optimal invariant metrics for shape retrieval


1 Optimal invariant metrics for shape retrieval
Michael Bronstein, Department of Computer Science, Technion – Israel Institute of Technology

2

3 Text search vs. content-based search
3D warehouse example: a text search for "person" relies on tags ("man, person, human"). Text search works for tagged shapes; shapes without metadata require content-based search.

4 Outline Shape → feature descriptor → geometric words → bag of words

5 Invariance
Invariance classes: rigid, scale, inelastic, topology.
Descriptors compared: local geodesic distance histogram, Gaussian curvature, heat kernel signature (HKS), scale-invariant HKS (SI-HKS). Wang, B 2010
[Table: which invariance classes each descriptor satisfies.]

6 Heat kernels
The heat equation governs heat propagation on a surface. The initial condition is the heat distribution at time t = 0; the solution u(x, t) is the heat distribution at time t. The heat kernel k_t(x, y) is the fundamental solution of the heat equation with a point heat source at y (the heat value at point x after time t).
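In standard notation (not on the slide itself, but consistent with it), the heat equation and its fundamental solution can be written via the Laplace–Beltrami eigenvalues λ_i and eigenfunctions φ_i of the surface X:

```latex
% Heat equation on surface X (\Delta_X = Laplace-Beltrami operator):
\left( \Delta_X + \frac{\partial}{\partial t} \right) u(x, t) = 0,
\qquad u(x, 0) = u_0(x).

% Heat kernel = fundamental solution (point heat source at y), with
% spectral expansion in the Laplace-Beltrami eigenpairs (\lambda_i, \phi_i):
u(x, t) = \int_X k_t(x, y)\, u_0(y)\, da(y),
\qquad
k_t(x, y) = \sum_{i \ge 0} e^{-\lambda_i t}\, \phi_i(x)\, \phi_i(y).
```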

7 Heat kernel signature
The heat kernel diagonal k_t(x, x) can be interpreted as the probability of a Brownian motion to return to the same point x after time t (it represents the "stability" of the point). Sampling k_t(x, x) over several times (scales) gives a multiscale local shape descriptor. Sun, Ovsjanikov & Guibas SGP 2009
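As a concrete illustration (not taken from the slides), here is a minimal sketch of computing the HKS from a precomputed Laplace–Beltrami eigendecomposition; the eigensolver itself and the choice of time samples are assumed given.

```python
import numpy as np

def heat_kernel_signature(evals, evecs, times):
    """HKS: k_t(x, x) = sum_i exp(-lambda_i * t) * phi_i(x)**2.

    evals: (K,) Laplace-Beltrami eigenvalues
    evecs: (N, K) corresponding eigenfunctions sampled at the N vertices
    times: (T,) diffusion times (the scales of the descriptor)
    returns: (N, T) array, one multiscale descriptor per vertex
    """
    decay = np.exp(-np.outer(evals, times))   # exp(-lambda_i * t), shape (K, T)
    return (evecs ** 2) @ decay               # sum over eigenpairs -> (N, T)

# Example usage with synthetic eigendata (stand-ins for a real mesh):
# evals, evecs = np.linspace(0, 10, 100), np.random.randn(5000, 100)
# desc = heat_kernel_signature(evals, evecs, np.logspace(-2, 1, 16))
```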

8 Heat kernel signatures represented in RGB space
Sun, Ovsjanikov & Guibas SGP 2009; Ovsjanikov, BB & Guibas NORDIA 2009

9 Scale invariance
Original shape X: HKS = k_t(x, x). Shape scaled by β: the HKS becomes β⁻² k_{β⁻²t}(x, x), i.e. the descriptor changes both in amplitude and in time.
Not scale invariant! B, Kokkinos CVPR 2010

10 Scale-invariant heat kernel signature
Scaling = shift and multiplicative constant in the HKS, once the HKS is sampled in a logarithmic scale-space (t = α^τ). To undo the scaling: take log and the derivative d/dτ (removes the multiplicative constant), then the Fourier transform magnitude (removes the shift); frequencies ω = 2πk/T.
[Plots: HKS in log scale-space and the Fourier transform magnitude of its log-derivative.]
B, Kokkinos CVPR 2010
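A sketch of the SI-HKS pipeline described above; the logarithmic base alpha, the sampling grid tau, and the number of retained Fourier frequencies are illustrative choices, not values from the paper.

```python
import numpy as np

def scale_invariant_hks(evals, evecs, tau, alpha=2.0, num_freq=6):
    """SI-HKS sketch: log scale-space -> log + d/dtau -> |FFT|.

    tau: (T,) samples of the logarithmic time variable, t = alpha**tau.
    Scaling the shape becomes a shift plus a multiplicative constant in tau;
    the log-derivative removes the constant, the FFT magnitude removes the shift.
    """
    times = alpha ** tau                                   # logarithmic scale-space
    hks = (evecs ** 2) @ np.exp(-np.outer(evals, times))   # HKS, shape (N, T)
    d_log = np.diff(np.log(hks), axis=1)                   # undo multiplicative constant
    spectrum = np.abs(np.fft.fft(d_log, axis=1))           # undo shift
    return spectrum[:, 1:num_freq + 1]                     # keep a few low frequencies
```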

11 Scale invariance
[Figure: comparison of the Heat Kernel Signature vs. the scale-invariant HKS.] B, Kokkinos CVPR 2010

12 Scale invariance
[Figure: further comparison of the Heat Kernel Signature vs. the scale-invariant HKS.] B, Kokkinos CVPR 2010

13 Modeling vs. learning Wang, B 2010

14 Learning invariance A training set T of shape pairs, split into positives P and negatives N.

15 Similarity learning
Learn a similarity such that, with high probability, positive pairs are classified as positive and negative pairs as negative; the errors are false positives (a negative pair judged similar) and false negatives (a positive pair judged dissimilar).
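Written out (a generic formulation consistent with the slide, with d a learned distance, R a threshold, and P, N the positive and negative pair sets):

```latex
d\big(\xi(x), \xi(y)\big) \le R \quad \text{with high probability for } (x, y) \in \mathcal{P}
  \qquad \text{(otherwise: false negative)},
\\
d\big(\xi(x), \xi(y)\big) > R \quad \text{with high probability for } (x, y) \in \mathcal{N}
  \qquad \text{(otherwise: false positive)}.
```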

16 Similarity-preserving hashing
Hamming metric d_H(ξ(x), ξ(y)) = # of distinct bits. Collision ξ(x) = ξ(y): with high probability for positive pairs, with low probability for negative pairs. Gionis, Indyk, Motwani 1999; Shakhnarovich 2005
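A toy illustration of the Hamming metric and the collision property; the random-projection hash below is only a placeholder for the learned embedding (the actual methods learn the projections, e.g. by the boosting on the next slides).

```python
import numpy as np

def hamming_distance(a, b):
    """d_H(a, b) = number of distinct bits between two binary codes."""
    return int(np.count_nonzero(a != b))

def random_projection_hash(X, num_bits=96, seed=0):
    """Placeholder LSH-style hash: signs of random projections, mapped to {0, 1}."""
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((X.shape[1], num_bits))
    return (X @ P > 0).astype(np.uint8)

# Goal of a similarity-preserving hash: positive (similar) pairs collide
# (d_H small) with high probability; negative pairs with low probability.
```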

17 Boosting
[Figure: training samples mapped by a 1D embedding to -1/+1.] At each iteration, construct a 1D embedding ξ_i taking values in {-1, +1}; the similarity is approximated by s(x, y) ≈ Σ_i α_i ξ_i(x) ξ_i(y). Downweight pairs the embedding classifies correctly; upweight pairs it misclassifies. BBK 2010; BB, Ovsjanikov, Guibas 2010; Shakhnarovich 2005

18 Boosting
[Figure: training pairs labeled -1/+1 by the current 1D embedding.] The next iteration constructs a further 1D embedding on the reweighted pairs; the similarity is again approximated by Σ_i α_i ξ_i(x) ξ_i(y), downweighting correctly classified pairs and upweighting misclassified ones. BBK 2010; BB, Ovsjanikov, Guibas 2010; Shakhnarovich 2005
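A schematic AdaBoost-style round over labeled pairs, in the spirit of Shakhnarovich's similarity-sensitive coding; the weak 1D embedding is drawn at random here, whereas a real implementation searches for the projection and threshold minimizing the weighted error.

```python
import numpy as np

def boosting_round(X1, X2, s, w, rng):
    """One round of similarity-sensitive boosting (schematic).

    X1, X2: (M, D) descriptors of the two members of each training pair
    s:      (M,) pair labels, +1 for positives and -1 for negatives
    w:      (M,) current pair weights (summing to 1)
    """
    # Weak 1D embedding xi(x) = sign(p.x + b) in {-1, +1} (random stand-in).
    p, b = rng.standard_normal(X1.shape[1]), rng.standard_normal()
    agree = np.sign(X1 @ p + b) * np.sign(X2 @ p + b)    # xi(x) * xi(y)
    err = float(np.sum(w[agree != s]))                   # weighted error
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))  # weak-learner weight
    w = w * np.exp(-alpha * s * agree)   # downweight correct, upweight incorrect pairs
    return p, b, alpha, w / w.sum()

# The learned similarity is approximated by s(x, y) ~ sum_i alpha_i * xi_i(x) * xi_i(y).
```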

19 SHREC 2010 dataset

20 SHREC 2010 dataset
Total dataset size: 1K shapes (715 queries); positives: 10K; negatives: 100K. BB et al., 3DOR 2010

21 ShapeGoogle with HKS descriptor
BB et al., 3DOR 2010

22 ShapeGoogle with SI-HKS descriptor
BB et al., 3DOR 2010

23 Similarity-sensitive hashing (96 bits)
BB et al., 3DOR 2010

24 WaldHash
Construct the embedding by maximizing the separation between positive and negative pairs, as in boosting: downweight correctly classified pairs, upweight misclassified ones. Early decision: remove pairs whose accumulated response is already confidently positive or negative, and sample new pairs into the training set. B², Ovsjanikov, Guibas 2010
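A heavily simplified sketch of the early-decision step; the actual WaldHash rule is based on Wald's sequential test, whereas a plain margin threshold is used here only as a stand-in.

```python
import numpy as np

def early_decision(H, pairs, weights, threshold, sample_pair):
    """Schematic early-decision step in WaldHash-style training.

    H: (M,) accumulated responses sum_i alpha_i * xi_i(x) * xi_i(y).
    Pairs whose response already exceeds the confidence threshold are
    decided early and removed; new pairs are sampled in to replace them.
    """
    keep = np.abs(H) < threshold                          # undecided pairs stay
    kept_pairs = [p for p, k in zip(pairs, keep) if k]
    kept_w = weights[keep]
    new_pairs = [sample_pair() for _ in range(len(pairs) - len(kept_pairs))]
    new_w = np.full(len(new_pairs), kept_w.mean() if kept_w.size else 1.0)
    w = np.concatenate([kept_w, new_w])
    return kept_pairs + new_pairs, w / w.sum()
```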

25 [Figure: results plot; "30%" annotation.] B², Ovsjanikov, Guibas 2010

26 Cross-modal similarity
Objects belonging to different modalities (e.g. modality 1: triangular meshes, modality 2: point clouds) usually have different dimensionality and structure and are generated by different processes; their spaces are incommensurable. Comparing such data is like comparing apples to oranges. How do we compare apples to oranges? BB, Michel, Paragios CVPR 2010

27 Cross-modality embedding
The key idea of our paper is to embed incommensurable data into a common metric space, in such a way that, with high probability, positive pairs are mapped to nearby points while negative pairs are mapped to far-away points in the embedding space. BB, Michel, Paragios CVPR 2010

28 Cross-modality hashing
The same idea in the Hamming space: modality-specific hash functions map the incommensurable data to binary codes in a common space, so that a collision occurs with high probability for positive pairs and with low probability for negative pairs. BB, Michel, Paragios CVPR 2010
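A sketch of the structure of cross-modality hashing: two modality-specific linear-threshold maps into a common n-bit Hamming space. The projections P1, P2 and offsets b1, b2 would be learned (e.g. by a cross-modal variant of the boosting above); here they are simply given as inputs.

```python
import numpy as np

def make_crossmodal_hashes(P1, b1, P2, b2):
    """Two modality-specific maps xi, eta into a common Hamming space {0,1}^n.

    P1: (D1, n) projection for modality 1; P2: (D2, n) projection for modality 2.
    """
    xi = lambda x: (np.atleast_2d(x) @ P1 + b1 > 0).astype(np.uint8)
    eta = lambda y: (np.atleast_2d(y) @ P2 + b2 > 0).astype(np.uint8)
    return xi, eta

# The maps are trained so that xi(x) == eta(y) (a collision) holds with high
# probability for cross-modal positive pairs and with low probability for negatives.
```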

29 Cross-representation 3D shape retrieval
Database of 1052 shapes. In the first example application of our generic approach, we tried to retrieve three-dimensional shapes with the query and the database represented using different descriptors: an 8×8-dimensional bag of expressions on one side and a 32-dimensional bag of words on the other. BB, Michel, Paragios CVPR 2010

30 Retrieval performance
[Plot: mean average precision vs. number of bits.] Our cross-modality metric outperforms Euclidean distances applied to each modality independently; it is only slightly inferior to the optimal uni-modal metrics. BB, Michel, Paragios CVPR 2010
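For reference, mean average precision (the metric plotted here) can be computed as follows; a minimal sketch, not tied to any particular retrieval code.

```python
import numpy as np

def average_precision(relevant_ids, ranked_ids):
    """AP for one query: mean of precision@k over the ranks k where a hit occurs."""
    relevant, hits, precisions = set(relevant_ids), 0, []
    for k, item in enumerate(ranked_ids, start=1):
        if item in relevant:
            hits += 1
            precisions.append(hits / k)
    return float(np.mean(precisions)) if precisions else 0.0

def mean_average_precision(queries):
    """mAP over a list of (relevant_ids, ranked_ids) tuples, one per query."""
    return float(np.mean([average_precision(r, ranked) for r, ranked in queries]))
```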

