Fast Nearest Neighbor Search in the Hamming Space

Presentation transcript:

Fast Nearest Neighbor Search in the Hamming Space
OS3: Multimedia Machine Learning
Zhansheng Jiang, Lingxi Xie, Xiaotie Deng, Weiwei Xu, and Jingdong Wang

Nearest Neighbor Search
Nearest Neighbor (NN) search: an optimization problem of finding, for a given query point, the closest point(s) in a set of reference data points.
Application: large-scale similar-image search.
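To make the problem concrete, here is a minimal brute-force sketch (not the paper's method): exact NN search over binary codes by a linear scan, computing Hamming distance with XOR and popcount. All names and the toy codes are illustrative.

```python
# Minimal brute-force baseline for Hamming-space NN search (illustrative only):
# XOR the codes, count the differing bits, and scan every reference code.

def hamming_distance(a: int, b: int) -> int:
    """Hamming distance between two binary codes stored as Python ints."""
    return bin(a ^ b).count("1")

def linear_scan_nn(query: int, reference: list) -> int:
    """Index of the reference code closest to the query (exact, O(n) scan)."""
    return min(range(len(reference)),
               key=lambda i: hamming_distance(query, reference[i]))

codes = [0b10110010, 0b01110001, 0b11010111]   # toy 8-bit reference codes
print(linear_scan_nn(0b10110011, codes))       # -> 0 (one differing bit)
```

The linear scan is exact but its cost grows with the dataset size, which is what the bridge-vector and graph structure below is designed to avoid.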

Nearest Neighbor Search in the Hamming Space
- Multi-Index Hashing
- FLANN
- Fast Neighborhood Graph Search Using Cartesian Concatenation

Data Structure
Bridge vectors
- Similar to the concept of cluster centers in Product Quantization
- Computed by k-means clustering for binary vectors
- Minimize the average distance of each reference vector to its nearest bridge vector
Augmented neighborhood graph
- Bridge graph: connects bridge vectors to their nearest reference vectors
- Neighborhood graph: connects each reference vector to its nearest reference vectors
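A minimal container sketch of this structure, assuming the description above; the field names are mine, not the paper's.

```python
# Illustrative container for the data structure described on this slide.
# Field names are assumptions for the sketch, not the paper's terminology.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class AugmentedNeighborhoodGraph:
    reference_vectors: np.ndarray   # (n, d) binary codes of the dataset
    bridge_vectors: np.ndarray      # (m, d) binary "cluster center" codes
    # Neighborhood graph: reference id -> ids of its nearest reference vectors.
    neighborhood_graph: dict = field(default_factory=dict)
    # Bridge graph: bridge id -> ids of its nearest reference vectors.
    bridge_graph: dict = field(default_factory=dict)
```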

Compute Bridge Vectors
Defined as k-means clustering for binary vectors:
- Initialize centers randomly with subvectors of the reference vectors
- Assignment step: find the nearest center for the subvector of each reference vector
- Update step: optimize the center of each cluster (by voting in each dimension)
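A hedged sketch of this clustering step, following the bullets above: assignment by Hamming distance and a per-dimension majority-vote update so the centers stay binary. The paper applies this per subvector; the iteration count and names here are illustrative.

```python
# Sketch of k-means over binary (sub)vectors: assignment by Hamming distance,
# update by per-dimension majority voting so each center remains a binary code.
import numpy as np

def binary_kmeans(X: np.ndarray, k: int, iters: int = 20, seed: int = 0):
    """X: (n, d) array of 0/1 values. Returns a (k, d) array of binary centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assignment step: nearest center under Hamming distance.
        dists = (X[:, None, :] != centers[None, :, :]).sum(axis=2)   # (n, k)
        labels = dists.argmin(axis=1)
        # Update step: set each bit of a center by majority vote in its cluster.
        for c in range(k):
            members = X[labels == c]
            if len(members):
                centers[c] = (members.mean(axis=0) >= 0.5).astype(X.dtype)
    return centers
```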

Construct the Augmented Neighborhood Graph
Bridge graph
- Find the t nearest bridge vectors for each reference vector (by the Multi-Sequence algorithm)
- Keep the b nearest reference vectors for each bridge vector
Neighborhood graph
- Find the k nearest reference vectors for each reference vector (approximate NN search method for large-scale datasets)
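Below is a simplified construction sketch. The paper uses the Multi-Sequence algorithm to find the nearest bridge vectors efficiently over subvector centers; this sketch brute-forces both graphs purely to show the resulting adjacency structure, with parameter names t, b, k taken from the slide and everything else assumed.

```python
# Simplified (brute-force) construction of the two graphs described above.
# The real system would use the Multi-Sequence algorithm for the bridge side.
import numpy as np

def pairwise_hamming(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Pairwise Hamming distances between rows of A (n, d) and B (m, d)."""
    return (A[:, None, :] != B[None, :, :]).sum(axis=2)

def build_augmented_graph(refs, bridges, k=5, t=3, b=10):
    # Neighborhood graph: the k nearest reference vectors of each reference vector.
    d_rr = pairwise_hamming(refs, refs)
    np.fill_diagonal(d_rr, d_rr.max() + 1)            # exclude self-edges
    neighborhood_graph = {i: list(row.argsort()[:k]) for i, row in enumerate(d_rr)}

    # Bridge graph: attach each reference vector to its t nearest bridge vectors,
    # then keep only the b nearest reference vectors for each bridge vector.
    d_rb = pairwise_hamming(refs, bridges)
    candidates = {j: [] for j in range(len(bridges))}
    for i, row in enumerate(d_rb):
        for j in row.argsort()[:t]:
            candidates[j].append((row[j], i))
    bridge_graph = {j: [i for _, i in sorted(c)[:b]] for j, c in candidates.items()}
    return neighborhood_graph, bridge_graph
```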

[Figure: example graph for the search walkthrough. Nodes A–I, X, Y, Z; query vector: white, bridge vectors: red, reference vectors: yellow.]

Push X. PQ = {X}, 5NN = {}

Pop X and push A, B, C, E, F, and Y. PQ = {E, C, F, B, Y, A}, 5NN = {}

Pop E and push D. PQ = {C, D, F, B, Y, A}, 5NN = {E}

Pop C. PQ = {D, F, B, Y, A}, 5NN = {E, C}

Pop D and push G. PQ = {F, G, B, Y, A}, 5NN = {E, C, D}

Pop F. PQ = {G, B, Y, A}, 5NN = {E, C, D, F}

Pop G and push H. PQ = {B, H, Y, A}, 5NN = {E, C, D, F, G}
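The walkthrough above corresponds to a best-first graph search with a priority queue keyed by Hamming distance to the query: each popped vertex is added to the k-NN list if it is a reference vector (X, popped first, behaves like a bridge vertex and is not added), and its unvisited neighbors in the augmented graph are pushed. A hedged sketch, with illustrative names and a simple start from a single vertex such as the nearest bridge vector:

```python
# Sketch of the best-first search illustrated by the walkthrough: a priority
# queue ordered by distance to the query; reference vectors that are popped
# enter the result list, and every popped vertex expands its graph neighbors.
import heapq

def graph_search(query, vectors, adjacency, is_reference, start, K=5):
    """vectors: id -> binary code (int); adjacency: id -> list of neighbor ids."""
    dist = lambda a, b: bin(a ^ b).count("1")          # Hamming distance
    pq = [(dist(query, vectors[start]), start)]        # e.g. PQ = {X}
    visited = {start}
    knn = []                                           # e.g. 5NN = {}
    while pq and len(knn) < K:
        _, v = heapq.heappop(pq)                       # pop the closest vertex
        if is_reference(v):
            knn.append(v)                              # only reference vectors count
        for u in adjacency[v]:                         # push unvisited neighbors
            if u not in visited:
                visited.add(u)
                heapq.heappush(pq, (dist(query, vectors[u]), u))
    return knn
```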
