Approximate Nearest Subspace Search with Applications to Pattern Recognition
Ronen Basri, Tal Hassner, Lihi Zelnik-Manor
Weizmann Institute / Caltech


Subspaces in Computer Vision: illumination of faces, object appearance, viewpoint, motion, dynamic textures, and more. [Zelnik-Manor & Irani, PAMI'06; Basri & Jacobs, PAMI'03; Nayar et al., IUW'96]

Nearest Subspace Search: given a query point, which database subspace is nearest?

Sequential Search: for a database of n subspaces of dimension k in d dimensions, a sequential scan takes O(ndk). Too slow! Is there a sublinear solution?
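The O(ndk) sequential scan can be sketched as follows. This is a minimal numpy illustration on a synthetic database of random subspaces; names and parameters are my own, not from the talk:

```python
import numpy as np

def dist2_to_subspace(q, B):
    """Squared distance from q to the span of the orthonormal columns of B:
    ||q||^2 - ||B^T q||^2 (costs O(dk) per subspace)."""
    return float(q @ q - np.sum((B.T @ q) ** 2))

def sequential_nss(q, subspaces):
    """Scan all n subspaces: O(ndk) total."""
    dists = [dist2_to_subspace(q, B) for B in subspaces]
    i = int(np.argmin(dists))
    return i, dists[i]

rng = np.random.default_rng(0)
d, k, n = 60, 4, 100
# Synthetic database: n random k-dimensional subspaces of R^d,
# each represented by an orthonormal basis from a QR factorization.
subspaces = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
# Query lying near subspace 7.
q = subspaces[7] @ rng.standard_normal(k) + 1e-3 * rng.standard_normal(d)
idx, dist2 = sequential_nss(q, subspaces)
print(idx, dist2)
```

The scan correctly recovers the planted subspace, but its cost grows linearly with the database size, which motivates the sublinear methods below.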

A Related Problem: Nearest Neighbor Search. For a database of n points in d dimensions, sequential search takes O(nd), but here a sublinear solution exists!

Approximate NN: return a neighbor within (1+ε)r of the true nearest distance r. Tree search (KD-trees) and Locality Sensitive Hashing are fast: logarithmic query time, O(dn) preprocessing.
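As one concrete point-ANN scheme, here is a toy random-hyperplane LSH sketch (cosine-style hashing; KD-trees would be an alternative). The data, seed, and parameters are synthetic illustrations, not the talk's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, n_bits = 32, 5000, 16

points = rng.standard_normal((n, d))
# Random-hyperplane LSH: each point hashes to a 16-bit sign pattern.
planes = rng.standard_normal((n_bits, d))

def lsh_key(x):
    return tuple((planes @ x > 0).astype(int))

buckets = {}
for i, p in enumerate(points):
    buckets.setdefault(lsh_key(p), []).append(i)

def approx_nn(q):
    """Scan only the query's bucket (fall back to a full scan if empty)."""
    cand = list(buckets.get(lsh_key(q), range(n)))
    d2 = ((points[cand] - q) ** 2).sum(axis=1)
    return cand[int(np.argmin(d2))]

q = points[42] + 1e-6 * rng.standard_normal(d)  # near-duplicate query
print(approx_nn(q))
```

Nearby points tend to share hash bits, so each query inspects only a small bucket rather than all n points; production systems use multiple hash tables to boost recall.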

Is it possible to speed up Nearest Subspace Search? Existing point-based methods (tree search, LSH) cannot be applied directly.

Our Suggested Approach: reduction to points. Works for both linear and affine spaces. (Plot: run time vs. database size, sequential vs. ours.)

Problem Definition: find mappings u (of subspaces) and v (of queries) such that the point distance between u and v is a linear, monotonic function of the original subspace distance, with the two mappings computed independently; then apply standard point ANN to u and v.

Finding a Reduction: do the constants depend on the query? Feeling lucky? We are lucky!

Basic Reduction. Want: minimize the ratio of the additive constant to the multiplicative one.

Geometry of the Basic Reduction: mapped database subspaces lie on a sphere and on a hyperplane; mapped queries lie on a cone.
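The spirit of such a subspace-to-point reduction can be illustrated with the identity dist²(q, S) = ⟨N, qqᵀ⟩, where N is the projector onto the subspace's orthogonal complement. The sketch below (an illustration of the idea, not necessarily the paper's exact mapping) sends each subspace to the negated, vectorized projector and the query to vec(qqᵀ); the mapped squared distance is then an affine, monotonic function of the true squared distance:

```python
import numpy as np

rng = np.random.default_rng(2)
d, k, n = 20, 4, 50

def map_subspace(B):
    """Map a subspace (orthonormal basis B, d x k) to a point in R^(d^2):
    minus the vectorized projector onto the orthogonal complement."""
    N = np.eye(d) - B @ B.T
    return -N.ravel()

def map_query(q):
    """Map a query point to the vectorized rank-1 matrix q q^T."""
    return np.outer(q, q).ravel()

subspaces = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
q = rng.standard_normal(d)

true_d2 = np.array([q @ q - np.sum((B.T @ q) ** 2) for B in subspaces])
u = np.stack([map_subspace(B) for B in subspaces])
v = map_query(q)
mapped_d2 = ((u - v) ** 2).sum(axis=1)

# Mapped squared distance is affine in the true squared distance:
# ||u - v||^2 = 2*dist^2 + (d - k) + ||q||^4.
const = (d - k) + (q @ q) ** 2
print(np.allclose(mapped_d2, 2 * true_d2 + const))  # True
```

Because the additive term is the same for every database subspace (all have dimension k) and fixed once the query is given, the nearest mapped point is exactly the nearest subspace, so standard point ANN applies.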

Improving the Reduction

Final Reduction: the same mapping with the remaining free quantities fixed to constants.

Can We Do Better? If the additive constant were zero, the mapping would be trivial; the additive constant is inherent.

Final Mapping Geometry

ANS Complexities. Preprocessing: O(nkd²), linear in n. Query: O(d²) + T_ANN(n, d²), logarithmic in n.

Dimensionality May Be Large: the embedding lives in d² dimensions, which might force a small ε. Current solution: use random projections (Johnson-Lindenstrauss lemma), repeat several times, and select the nearest result.
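The random-projection step can be sketched as follows: a Gaussian matrix scaled by 1/√m approximately preserves pairwise Euclidean distances when projecting from d² down to a few hundred dimensions (Johnson-Lindenstrauss). The dimensions and data here are synthetic illustrations:

```python
import numpy as np

rng = np.random.default_rng(3)
D, m, n = 3600, 300, 200  # ambient dim (e.g. d^2 = 60^2), target dim, points

X = rng.standard_normal((n, D))
# Johnson-Lindenstrauss-style projection: a Gaussian matrix scaled by
# 1/sqrt(m) approximately preserves pairwise Euclidean distances.
P = rng.standard_normal((m, D)) / np.sqrt(m)
Y = X @ P.T

ratio = np.linalg.norm(Y[0] - Y[1]) / np.linalg.norm(X[0] - X[1])
print(ratio)  # close to 1
```

Repeating with several independent projections and keeping the nearest candidate, as the slide suggests, reduces the chance that any one projection distorts the true nearest neighbor.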

Synthetic Data. Varying database size (d=60, k=4) and varying dimension (n=5000, k=4). (Plots: run time vs. database size and vs. dimension, sequential vs. ours.)

Face Recognition (YaleB). Database: 64 illuminations per face, represented by k=9-dimensional subspaces. Query: a face under a new illumination.
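The subspace-per-person idea can be sketched on synthetic data (a stand-in for YaleB; on real data one would fit each person's k=9 subspace to their 64 gallery images via SVD, exactly as the loop below does). All names and parameters here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
d, k, n_people, n_illums = 100, 9, 10, 64

# Synthetic stand-in for YaleB: each person's images lie near a
# k=9 dimensional illumination subspace.
bases = []
for _ in range(n_people):
    B = np.linalg.qr(rng.standard_normal((d, k)))[0]
    gallery = (B @ rng.standard_normal((k, n_illums))).T  # 64 "images"
    # Fit a k-dimensional subspace to the gallery with an SVD.
    U = np.linalg.svd(gallery.T, full_matrices=False)[0][:, :k]
    bases.append(U)

def classify(q):
    """Nearest-subspace classifier: smallest residual wins."""
    d2 = [q @ q - np.sum((U.T @ q) ** 2) for U in bases]
    return int(np.argmin(d2))

# Query: person 3 under a new "illumination", plus mild noise.
q = bases[3] @ rng.standard_normal(k) + 0.05 * rng.standard_normal(d)
print(classify(q))
```

A new illumination is just a new set of coefficients within the person's subspace, so the residual to the correct subspace stays small while residuals to other people's subspaces stay large.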

Face Recognition Result. (Images: true NS vs. approximate NS matches; a wrong match and a wrong person are marked.)

Retiling with Patches. (Images: patch database, query, wanted image, approximated image.)

Retiling with Subspaces. (Images: subspace database, query, wanted image, approximated image.)

Patches + ANN: ~0.6 sec

Subspaces + ANS: ~1.2 sec

Patches + ANN: ~0.6 sec

Subspaces + ANS: ~1.2 sec

Summary: fast, approximate nearest subspace search via reduction to point ANN, with useful applications in computer vision. Disadvantages: embedding in d² dimensions; an inherent additive constant. Other methods? Additional applications? A lot more to be done…

THANK YOU