1 VLSH: Voronoi-based Locality Sensitive Hashing
Sung-eui Yoon
Authors: Lin Loi, Jae-Pil Heo, Junghwan Lee, and Sung-Eui Yoon
KAIST
http://sglab.kaist.ac.kr/VLSH/
2 Main Goals
● Provide efficient nearest neighbor search for various motion planners such as PRM and RRT
● Work with high-dimensional data sets
● Support a diverse set of distance metrics
3 Nearest Neighbor Search in Motion Planning
1. Generate collision-free samples
2. For each sample, find near neighbors
4 K-Nearest Neighbor Search
● Example with K = 6
(figure: query point q and the surrounding data points)
5 Approximate K-Nearest Neighbor Search (Ak-NNS)
● Allow an approximation factor ε > 1 on the distance d to the exact nearest neighbor
(figure: around the query q, the exact radius d and the enlarged radius εd accepted by Ak-NNS)
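Spelled out, the approximation condition above can be written as follows. This is a standard formulation consistent with the slide, not a formula taken from the talk itself; p* denotes the exact nearest neighbor of the query q in the sample set S, and for k > 1 the bound is applied to each returned neighbor:

```latex
\[
  \text{return } p \in S \ \text{with}\quad
  d(q, p) \;\le\; \varepsilon \cdot d(q, p^{*}),
  \qquad p^{*} = \arg\min_{x \in S} d(q, x),\quad \varepsilon > 1 .
\]
```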
6 Main Contributions
7 Previous Work
● Spatial subdivision data structures (e.g., kd-trees)
● Suffer from the curse of dimensionality [Weber et al. 1998]
● Culling techniques using the triangle inequality [Chavez et al. 01]
● Geometric Near-neighbor Access Tree (GNAT) [Brin, 95]
● Used widely in OOPSMP and OMPL
● GPU-based acceleration [Pan et al., 10]
8 Previous Work on LSH and Embedding
● Locality Sensitive Hashing (LSH)
● Fast algorithms for high-dimensional NNS problems [Datar and Indyk 2004]
● Supports only a limited set of motion planning distance metrics (e.g., Euler angles) [Pan et al. 2010]
● Embedding
● Well-studied topic [Indyk et al., 04]
● Embed motion planning spaces into the Euclidean space [Plaku and Kavraki 2006]
9 Background on LSH
● Randomly generate a projection vector
● Project points onto the vector
● Bin the projected points into segments of width w, i.e., the quantization factor
● All data points in a bin share the same hash code
(figure: projected points binned with quantization factor w)
10 Background on LSH
● Multiple projections: hash functions g₁, g₂, g₃
(figure: data points and a query point; the nearest neighbor of the query is retrieved from the matching buckets of g₁, g₂, and g₃)
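As a concrete illustration of the two slides above, here is a minimal C++ sketch of one LSH hash function g: several random projections, each quantized into bins of width w, concatenated into a single hash code. All names (LshTable, num_projections, and so on) are illustrative and not taken from the authors' implementation; a full index would build several such tables (the g₁, g₂, g₃ of the slide) with independent random projections and collect candidates from each.

```cpp
// Minimal sketch of one LSH hash function g: random projections quantized
// with bin width w and concatenated into a hash code. Illustrative only.
#include <cmath>
#include <cstddef>
#include <random>
#include <string>
#include <unordered_map>
#include <vector>

struct LshTable {
    std::vector<std::vector<double>> projections;  // one random direction per projection
    std::vector<double> offsets;                   // random offsets b in [0, w)
    double w;                                      // quantization factor (bin width)
    std::unordered_map<std::string, std::vector<size_t>> buckets;  // hash code -> point indices

    LshTable(size_t dim, size_t num_projections, double w_, std::mt19937& rng)
        : projections(num_projections, std::vector<double>(dim)),
          offsets(num_projections), w(w_) {
        std::normal_distribution<double> gauss(0.0, 1.0);
        std::uniform_real_distribution<double> uniform(0.0, w_);
        for (auto& a : projections)
            for (double& coord : a) coord = gauss(rng);
        for (double& b : offsets) b = uniform(rng);
    }

    // g(v): project v onto every random direction, quantize each projection
    // into a bin of width w, and concatenate the bin indices.
    std::string hash(const std::vector<double>& v) const {
        std::string code;
        for (size_t j = 0; j < projections.size(); ++j) {
            double dot = 0.0;
            for (size_t d = 0; d < v.size(); ++d) dot += projections[j][d] * v[d];
            code += std::to_string(static_cast<long>(std::floor((dot + offsets[j]) / w))) + ":";
        }
        return code;
    }

    void insert(const std::vector<double>& v, size_t index) { buckets[hash(v)].push_back(index); }

    // All points in the same bucket share v's hash code and become candidates.
    std::vector<size_t> query(const std::vector<double>& v) const {
        auto it = buckets.find(hash(v));
        return it == buckets.end() ? std::vector<size_t>{} : it->second;
    }
};
```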
11 Issues of LSH
12 An Example
● The number of data points in each bin can vary a lot!
(figure: bins of width w along two projections a₀ and a₁, with very different point counts per bin)
13 VLSH: Voronoi-based Locality Sensitive Hashing
● Consists of two steps:
1. Embedding to the Euclidean space
2. Invoking a localized LSH
14 Phase 1: Embedding
● Pick pivot points from the data set
● Compute the distances from a sample v to all pivot points in the MP space to define its embedded point: v′ = (d(p₀, v), d(p₁, v), d(p₂, v))
● Use the L₂ metric as the distance between embedded points
● Supports arbitrary motion planning metrics with low distortion [Bourgain 85]
(figure: pivot points p₀, p₁, p₂ and a sample point v with distances d(p₀, v), d(p₁, v), d(p₂, v))
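A minimal sketch of this embedding step, assuming the motion-planning metric is supplied as a callable; Config, DistanceFn, and embed are hypothetical names, not the authors' API. The pivots themselves can be embedded the same way, and note that the i-th coordinate of an embedded point is exactly its MP-space distance to pivot i.

```cpp
// Embed each sample v as v' = (d(p0, v), d(p1, v), ..., d(pm-1, v)),
// where d() is the (arbitrary) motion-planning metric; the L2 metric is
// then used between embedded points.
#include <cstddef>
#include <functional>
#include <vector>

using Config = std::vector<double>;  // one sample in the motion-planning space
using DistanceFn = std::function<double(const Config&, const Config&)>;

std::vector<std::vector<double>> embed(const std::vector<Config>& samples,
                                       const std::vector<Config>& pivots,
                                       const DistanceFn& d) {
    std::vector<std::vector<double>> embedded(samples.size());
    for (size_t i = 0; i < samples.size(); ++i) {
        embedded[i].reserve(pivots.size());
        for (const Config& p : pivots)
            embedded[i].push_back(d(p, samples[i]));  // one coordinate per pivot
    }
    return embedded;
}
```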
15 Phase 2: Invoking a Local LSH
● During the embedding process, each point can be associated with its closest pivot
● The pivot points implicitly construct Voronoi regions
(figure: pivot points, data points, and the induced Voronoi diagram)
16 Implicit Voronoi Region of a Pivot
● Construct a localized LSH for the points contained in each pivot's Voronoi region
● Use a localized quantization factor
● Explicit construction of the Voronoi regions is not necessary
● Simply assign each point to its closest pivot (see the sketch below)
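A minimal sketch of that assignment step, assuming the points have already been embedded as above. The choice of the localized quantization factor here (the mean distance of a region's members to its pivot) is purely an illustrative assumption, not the factor used in the paper.

```cpp
// Assign every embedded point to its closest pivot (the implicit Voronoi
// region) and derive a per-region quantization factor for its local LSH.
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

static double l2(const std::vector<double>& a, const std::vector<double>& b) {
    double s = 0.0;
    for (size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(s);
}

struct VoronoiRegion {
    std::vector<size_t> members;  // indices of points whose closest pivot is this one
    double local_w = 0.0;         // localized quantization factor (assumed: mean pivot distance)
};

std::vector<VoronoiRegion> assignToPivots(
        const std::vector<std::vector<double>>& embedded_points,
        const std::vector<std::vector<double>>& embedded_pivots) {
    std::vector<VoronoiRegion> regions(embedded_pivots.size());
    for (size_t i = 0; i < embedded_points.size(); ++i) {
        size_t best = 0;
        double best_d = std::numeric_limits<double>::max();
        for (size_t j = 0; j < embedded_pivots.size(); ++j) {
            double dij = l2(embedded_points[i], embedded_pivots[j]);
            if (dij < best_d) { best_d = dij; best = j; }
        }
        regions[best].members.push_back(i);
        regions[best].local_w += best_d;  // accumulated, averaged below
    }
    for (VoronoiRegion& r : regions)
        if (!r.members.empty()) r.local_w /= static_cast<double>(r.members.size());
    // A localized LSH (e.g., the table sketched earlier) is then built per
    // region over its members, using r.local_w as the quantization factor.
    return regions;
}
```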
17 Expanding Voronoi Regions
● Considering only the points within each pivot's Voronoi region results in disconnected graphs
● Points near Voronoi region boundaries are not connected
18 Expanding Voronoi Regions
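The deck does not spell out the expansion rule, so the sketch below is only one plausible reading: a point is additionally inserted into a neighboring pivot's region (and hence its localized LSH) whenever its distance to that pivot is within a factor (1 + δ) of the distance to its own closest pivot, so that samples near a boundary appear on both sides. Both δ and the rule itself are assumptions for illustration.

```cpp
// Hypothetical region expansion: a point joins every region whose pivot is
// within (1 + delta) times the distance to its closest pivot.
#include <algorithm>
#include <cstddef>
#include <vector>

// dist_to_pivots[i][j] = L2 distance from embedded point i to embedded pivot j.
std::vector<std::vector<size_t>> expandRegions(
        const std::vector<std::vector<double>>& dist_to_pivots, double delta) {
    std::vector<std::vector<size_t>> members;
    if (dist_to_pivots.empty()) return members;
    members.resize(dist_to_pivots.front().size());
    if (members.empty()) return members;
    for (size_t i = 0; i < dist_to_pivots.size(); ++i) {
        double closest = dist_to_pivots[i][0];
        for (double d : dist_to_pivots[i]) closest = std::min(closest, d);
        for (size_t j = 0; j < members.size(); ++j)
            if (dist_to_pivots[i][j] <= (1.0 + delta) * closest)
                members[j].push_back(i);  // point i is duplicated into region j
    }
    return members;
}
```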
19 Results w/ and w/o Expansion
(figure: Ak-NNS w/o expansion produces disconnected graphs; Ak-NNS w/ expansion produces well-connected graphs)
20 Query-Time Algorithm
● Compute the embedded point from a query point
● Find the closest pivot point and use its localized LSH
● Return candidate nearest neighbors located in hash-buckets of the localized LSH
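A minimal sketch of these three query steps, reusing the hypothetical names from the earlier sketches. The localized LSH lookup is abstracted behind a callback, and the final ranking of the candidates by the true motion-planning distance is an assumed post-processing step rather than something stated on the slide.

```cpp
// Query: embed the query, pick its closest pivot, gather candidates from
// that pivot's localized LSH, and (assumed) rank them by the true metric.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <vector>

using Config = std::vector<double>;
using DistanceFn = std::function<double(const Config&, const Config&)>;
// Returns candidate sample indices from the given pivot's localized LSH.
using LocalLshQueryFn =
    std::function<std::vector<size_t>(size_t pivot, const std::vector<double>& embedded_query)>;

std::vector<size_t> queryAkNN(const Config& q, size_t k,
                              const std::vector<Config>& samples,
                              const std::vector<Config>& pivots,
                              const DistanceFn& d,
                              const LocalLshQueryFn& localLshQuery) {
    // 1. Embed the query: its distances to all pivots in the MP space.
    std::vector<double> q_embedded;
    q_embedded.reserve(pivots.size());
    for (const Config& p : pivots) q_embedded.push_back(d(p, q));

    // 2. The closest pivot is simply the smallest embedded coordinate.
    size_t closest = 0;
    for (size_t j = 1; j < q_embedded.size(); ++j)
        if (q_embedded[j] < q_embedded[closest]) closest = j;

    // 3. Candidates come from the hash buckets of that pivot's localized LSH;
    //    keep the k best under the true motion-planning distance.
    std::vector<size_t> candidates = localLshQuery(closest, q_embedded);
    std::sort(candidates.begin(), candidates.end(),
              [&](size_t a, size_t b) { return d(q, samples[a]) < d(q, samples[b]); });
    if (candidates.size() > k) candidates.resize(k);
    return candidates;
}
```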
21 Test Configurations
● Intel i7 3.3 GHz CPU with a C++ implementation
● Compare our method against LSH and GNAT (OOPSMP)
● Test them with samples from a PRM planner
● 15-nearest neighbor search, i.e., k = 15
● 10 pivots for our method
22 Benchmarks
● Wiper: 6 dimensions (1 robot for the wiper)
● Bug trap:
● 24 dimensions: 4 rod robots
● 36 dimensions: 6 rod robots
23 Wiper: Performance Evaluation
● VLSH vs. GNAT (Em): 3.7x faster
● VLSH vs. LSH (Em): 2.6x faster
24 Wiper: Quality Evaluation
● Measure the fractional distance error (fde) [Plaku and Kavraki 06]
● Measures the difference between the computed approximate results and the ground truth
● Lower values indicate more accurate results
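One common form of this measure, consistent with the description above though the exact formula used in the talk is not shown here, averages the relative distance error over the k returned neighbors (aᵢ: i-th approximate neighbor, nᵢ: i-th exact nearest neighbor):

```latex
\[
  \mathrm{fde}(q) \;=\; \frac{1}{k} \sum_{i=1}^{k}
  \frac{d(q, a_i) - d(q, n_i)}{d(q, n_i)} ,
\]
```

so fde = 0 for exact results, and larger values indicate larger approximation error.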
25 Results: Bug trap, 24 dim.
● VLSH vs. GNAT (Em): 3.2x faster
● VLSH vs. LSH (Em): up to 1.6x faster
26 Results: Bug trap, 36 dim.
● VLSH vs. GNAT (Em): 3.6x faster
● VLSH vs. LSH (Em): up to 1.4x faster
27 Conclusions
● Fast approximate nearest neighbor search algorithm for high-dimensional motion planning problems
● Achieves up to 3.7x faster running times than prior approaches
● Supports arbitrary distance metrics and considers data distributions for higher accuracy
28 Limitations and Future Work
● Memory overhead
● Duplicate points in our method
● Overall it is not significant, since it takes only tens of MB in the tested cases
● Support RRTs that dynamically generate data points
● Support GPUs for higher performance
29 Acknowledgements
● Anonymous reviewers
● Our funding agency
Project webpage: http://sglab.kaist.ac.kr/VLSH/