
1 Approximate Nearest Subspace Search with Applications to Pattern Recognition
Ronen Basri, Tal Hassner (Weizmann Institute); Lihi Zelnik-Manor (Caltech)

2 Subspaces in Computer Vision
Illumination: faces (Basri & Jacobs, PAMI'03). Viewpoint: objects (Nayar et al., IUW'96). Motion, dynamic textures (Zelnik-Manor & Irani, PAMI'06).

3 Nearest Subspace Search
Query Which is the Nearest Subspace?

4 Sequential Search
Database: n subspaces in d dimensions, each subspace of dimension k. Sequential search: O(ndk). Too slow! Is there a sublinear solution?
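The O(ndk) baseline can be sketched directly: for a linear subspace with orthonormal basis Z, dist²(q, S) = ||q||² - ||Zᵀq||². A minimal sketch (function names and toy sizes are my own, for illustration):

```python
import numpy as np

def nearest_subspace_sequential(query, bases):
    """Return (index, squared distance) of the subspace nearest to `query`.

    `bases` is a list of d x k matrices with orthonormal columns, each
    spanning one linear subspace. For orthonormal Z,
    dist^2(q, S) = ||q||^2 - ||Z^T q||^2.
    Cost: O(ndk) for n subspaces -- the baseline the slide calls too slow.
    """
    q_norm2 = query @ query
    best, best_d2 = -1, np.inf
    for i, Z in enumerate(bases):
        proj = Z.T @ query              # k projection coefficients, O(dk)
        d2 = q_norm2 - proj @ proj      # squared distance to subspace i
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best, best_d2

# toy usage: 3 random 2-dimensional subspaces in R^5
rng = np.random.default_rng(0)
bases = [np.linalg.qr(rng.standard_normal((5, 2)))[0] for _ in range(3)]
q = rng.standard_normal(5)
idx, d2 = nearest_subspace_sequential(q, bases)
```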

5 A Related Problem: Nearest Neighbor Search
Database: n points in d dimensions. Sequential search: O(nd). Here there is a sublinear solution!

6 Approximate NN
Return a neighbor within (1+ε)r when the true nearest neighbor is at distance r. Tools: tree search (KD-trees), Locality Sensitive Hashing. Preprocessing: O(dn); query: logarithmic. Fast!
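One of the two point-ANN tools named on this slide, Locality Sensitive Hashing, can be sketched with random hyperplanes (a standard LSH family for angular distance; all names and sizes below are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, n_bits = 8, 2000, 12

points = rng.standard_normal((n, d))

# Random-hyperplane LSH: each point hashes to the sign pattern of n_bits
# random projections; nearby directions tend to land in the same bucket.
R = rng.standard_normal((n_bits, d))

def lsh_key(x):
    return tuple((R @ x > 0).astype(int))

buckets = {}
for i, p in enumerate(points):
    buckets.setdefault(lsh_key(p), []).append(i)

def approx_nn(q):
    # scan only the query's bucket; fall back to a full scan if it is empty
    cand = list(buckets.get(lsh_key(q), range(n)))
    d2 = ((points[cand] - q) ** 2).sum(axis=1)
    j = int(np.argmin(d2))
    return cand[j], float(np.sqrt(d2[j]))

q = rng.standard_normal(d)
idx, dist = approx_nn(q)
```

Because only one bucket is scanned, the answer is approximate: it can be no closer than the true nearest neighbor, which is the price paid for sublinear query time.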

7 Is it possible to speed up Nearest Subspace Search?
Existing point-based methods (tree search, LSH) cannot be applied directly.

8 Our Suggested Approach
Reduction to points; works for both linear and affine subspaces. [Plot: run time vs. database size, sequential vs. our method]

9 Problem Definition
Find mappings u = f(S) for each database subspace and v = g(q) for the query such that: the two mappings are independent of each other; the point distance ||u - v|| is monotonic in the subspace distance; in fact, ||u - v||² is a linear function of the original squared distance. Then apply standard point ANN to u, v.
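One concrete instantiation of such a mapping (my own normalization for illustration; the slides do not pin down the exact formula): send a subspace with orthonormal basis Z to u = vec(ZZᵀ) and the query q to v = vec(qqᵀ). Then ||u - v||² = 2·dist²(q, S) + (k + ||q||⁴ - 2||q||²), a monotone linear function of the true squared distance whose additive constant depends only on k and the query, never on the database item:

```python
import numpy as np

rng = np.random.default_rng(2)
d, k, n = 6, 2, 50

# database: n linear subspaces, each a d x k orthonormal basis
bases = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
q = rng.standard_normal(d)

def map_subspace(Z):
    return (Z @ Z.T).ravel()       # u = vec(Z Z^T), a point in d^2 dims

def map_query(q):
    return np.outer(q, q).ravel()  # v = vec(q q^T), also in d^2 dims

U = np.array([map_subspace(Z) for Z in bases])
v = map_query(q)

point_d2 = ((U - v) ** 2).sum(axis=1)  # squared distances after mapping
true_d2 = np.array([q @ q - (Z.T @ q) @ (Z.T @ q) for Z in bases])
const = k + (q @ q) ** 2 - 2 * (q @ q)  # depends on the query only
```

The identity follows from ||u||² = tr(ZZᵀZZᵀ) = k, ||v||² = ||q||⁴, and u·v = ||Zᵀq||², so the nearest mapped point is exactly the nearest subspace.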

10 Finding a Reduction
Are we lucky? The constants of a naive reduction depend on the query.

11 Basic Reduction
Want: minimize the additive constant relative to the scale (the ratio β/α).

12 Geometry of Basic Reduction
The mapped query lies on a cone; the mapped database points lie on a sphere and on a hyperplane.
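This geometry can be checked numerically under the vec-of-projection style mapping u = vec(ZZᵀ), v = vec(qqᵀ) (my illustrative normalization): every database image satisfies ||u||² = k (a sphere) and ⟨u, vec(I)⟩ = tr(ZZᵀ) = k (a hyperplane), while every query image makes a fixed angle 1/√d with vec(I), i.e. lies on one cone:

```python
import numpy as np

rng = np.random.default_rng(7)
d, k = 6, 2

Z = np.linalg.qr(rng.standard_normal((d, k)))[0]  # one database subspace
q = rng.standard_normal(d)                        # one query

u = (Z @ Z.T).ravel()     # mapped database subspace
v = np.outer(q, q).ravel()  # mapped query
e = np.eye(d).ravel()     # vec(I), the cone axis

sphere = u @ u        # = k for every database subspace (sphere)
hyperplane = u @ e    # = tr(Z Z^T) = k for every subspace (hyperplane)
cos_angle = (v @ e) / (np.linalg.norm(v) * np.linalg.norm(e))
# cos_angle = ||q||^2 / (||q||^2 * sqrt(d)) = 1/sqrt(d) for every query
```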

13 Improving the Reduction

14 Final Reduction
(α, β = constants)

15 Can We Do Better?
If β = 0 we are left with only the trivial mapping, so the additive constant is inherent.

16 Final Mapping Geometry

17 ANS Complexities
Preprocessing: O(nkd²) (linear in n). Query: O(d²) + T_ANN(n, d²) (logarithmic in n).

18 Dimensionality May Be Large
The embedding lives in d² dimensions, and a small ε may be needed. Current solution: apply random projections (Johnson-Lindenstrauss lemma), repeat several times, and select the nearest of the candidates.
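The random-projection fix can be sketched with a plain Gaussian matrix (sizes are illustrative; D here is a stand-in for the d² embedding dimension). By the Johnson-Lindenstrauss lemma, m = O(log n / ε²) dimensions preserve pairwise distances up to a (1 ± ε) factor with high probability:

```python
import numpy as np

rng = np.random.default_rng(3)
n, D, m = 500, 900, 60  # n points in D dims, projected down to m dims

X = rng.standard_normal((n, D))   # embedded database points
q = rng.standard_normal(D)        # embedded query

# Gaussian random projection, scaled by 1/sqrt(m) so that squared
# norms are preserved in expectation.
P = rng.standard_normal((m, D)) / np.sqrt(m)
Xp, qp = X @ P.T, P @ q

orig = np.sqrt(((X - q) ** 2).sum(axis=1))   # distances before projection
proj = np.sqrt(((Xp - qp) ** 2).sum(axis=1))  # distances after projection
ratio = proj / orig                           # should concentrate near 1
```

Repeating with several independent projections and keeping the nearest result, as the slide suggests, reduces the chance that any single projection distorts the true nearest neighbor.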

19 Synthetic Data
[Plots: run time vs. database size (d=60, k=4) and run time vs. dimension (n=5000, k=4), sequential vs. our method]

20 Face Recognition (YaleB)
Database: 64 illuminations per subject, modeled by k=9 dimensional subspaces. Query: a face under a new illumination.
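A hedged sketch of how such per-subject illumination subspaces are commonly built (synthetic stand-in data, not YaleB itself): stack one subject's images as columns and keep the top k = 9 left singular vectors:

```python
import numpy as np

rng = np.random.default_rng(4)
n_pixels, n_illum, k = 256, 64, 9  # toy sizes; YaleB has 64 illuminations

# synthetic stand-in for one subject: images near a k-dim subspace + noise
signal_basis = np.linalg.qr(rng.standard_normal((n_pixels, k)))[0]
images = (signal_basis @ rng.standard_normal((k, n_illum))
          + 0.01 * rng.standard_normal((n_pixels, n_illum)))

# the subject's illumination subspace = top-k left singular vectors
U, s, _ = np.linalg.svd(images, full_matrices=False)
Z = U[:, :k]                       # d x k orthonormal basis

# a "new illumination": an unseen combination of the underlying signal
q = signal_basis @ rng.standard_normal(k) \
    + 0.01 * rng.standard_normal(n_pixels)
residual = q @ q - (Z.T @ q) @ (Z.T @ q)  # squared distance to subspace
rel_residual = residual / (q @ q)         # small if the model fits
```

At query time, the new illumination is matched against all subjects' subspaces, which is exactly the nearest-subspace search this talk accelerates.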

21 Face Recognition Result
[Figure: true NS vs. approximate NS results; wrong-match and wrong-person cases marked]

22 Retiling with Patches
[Figure: wanted image, query, patch database, approximated image]

23 Retiling with Subspaces
[Figure: wanted image, query, subspace database, approximated image]

24 Patches + ANN ~0.6 sec

25 Subspaces + ANS ~1.2 sec

26 Patches + ANN ~0.6 sec

27 Subspaces + ANS ~1.2 sec

28 Summary
Fast, approximate nearest subspace search via reduction to point ANN, with useful applications in computer vision. Disadvantages: embedding in d² dimensions; inherent additive constant β. Open questions: other methods? Additional applications? A lot more to be done…

29 THANK YOU

