Metric Embeddings with Relaxed Guarantees
FOCS 2005, Oct 23, 2005
Alex Slivkins (Cornell University) and Ofer Neiman (Hebrew University)
Joint work with Ittai Abraham, Yair Bartal, Hubert Chan, Kedar Dhamdhere, Anupam Gupta and Jon Kleinberg
Estimating Internet latencies
Latency (round-trip time) is a notion of distance in the Internet; the distance matrix defined by latencies is almost a metric.
Goal: estimate latencies for most node pairs with low load on nodes, using a linear or near-linear number of distance measurements.
Long line of papers in Systems: Guyton+'95, Francis+'01, Ng+'02, Pias+'03, Ng+'03, Lim+'03, Tang+'03, Lehman+'04, Dabek+'04, Costa+'04.
Setting: a large overlay network in the Internet (P2P network, file-sharing system, online computer game).
Applications: nearest neighbors (servers, file replicas, gamers), diverse node sets (fault-tolerant storage), low-stretch P2P routing.
Estimate latencies via embedding
Global Network Positioning (GNP) [Ng+Zhang'02]:
select a small number of nodes as "beacons"
users measure latencies to the beacons
embed into a low-dimensional Euclidean space: embed the beacons first, then embed the non-beacons one by one
The magic: 90% of node pairs are embedded with relative error < 0.5 (900 random nodes, 15 beacons, 7 dimensions).
Lots of follow-up work: NPS [Ng+ '03], Vivaldi [Cox+ '04], Lighthouses [Pias+ '03], BigBang [Tankel+ '03], ICS [Lim+ '03], Virtual Landmarks [Tang+ '03], PIC [Costa+ '04], PALM [Lehman+ '04].
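To make the beacon-based step concrete, here is a minimal sketch of placing one non-beacon node, assuming the beacon coordinates have already been computed; the relative-error objective, the optimizer choice, and the names place_node / beacon_coords are illustrative assumptions, not the exact GNP procedure.

```python
import numpy as np
from scipy.optimize import minimize

def place_node(latencies, beacon_coords):
    """Place one non-beacon node: pick coordinates whose Euclidean distances to the
    already-embedded beacons best match the node's measured latencies.
    latencies: 1-D array of RTTs to the beacons (same order as beacon_coords rows).
    beacon_coords: (num_beacons, dim) array of beacon coordinates."""
    def relative_error(x):
        d = np.linalg.norm(beacon_coords - x, axis=1)
        return np.sum(((d - latencies) / latencies) ** 2)
    x0 = beacon_coords.mean(axis=0)  # start the search from the beacon centroid
    return minimize(relative_error, x0).x
```

In the setup from the slide, this would be run once per node against 15 beacons embedded in 7 dimensions.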
How can we explain the magic?
How can we explain GNP theoretically? Assume that the latencies form a metric.
Embedding a metric into Euclidean space is a well-studied problem: e.g., any metric can be embedded with distortion O(log n). But all prior work assumes full access to the distance matrix.
One cannot estimate all distances with a small number of beacons.
So we allow ε-slack: guarantees for all but an ε-fraction of node pairs.
Recall: the empirical results of GNP have ε-slack, too.
This is a new angle for theoretical work on metric embeddings.
Beacon-based embeddings with slack
Thm [Kleinberg, Slivkins, Wexler FOCS'04]: for any doubling metric and any ε > 0, there is an embedding into l_p (p ≥ 1) with distortion O(log 1/ε) and ε-slack, using only Õ(1/ε) randomly selected beacons.
Can we embed any metric with ε-slack and distortion f(ε)? For any f(ε), even a non-beacon-based embedding is non-trivial.
This paper: yes, we extend the result of [KSW'04] to all metrics:
we achieve f(ε) = O(log 1/ε) and use only Õ(1/ε) beacons (same as before)
target dimension O(log^2 1/ε)
matching lower bound on distortion (even without beacons)
Plan
Introduction
Proof sketch
Extension
Then Ofer Neiman takes over and talks about:
Lower bound theorem
Open questions
Proof Sketch (1/3)
Beacon-based framework: randomly select a small subset S of nodes as "beacons"; the coordinates of a node are a function of its distances to the beacons.
Define a block of coordinates as follows: for each i, j = 1, 2, ..., Θ(log 1/ε), the (i,j)-th coordinate of node u is the distance from u to S(i,j) = {a random subset of S of size 2^i}.
This is similar to the embedding in [Bourgain'85], which used n (the number of nodes) instead of 1/ε.
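A minimal sketch of this coordinate block, assuming each node knows its measured distances to the beacon set S; the subset sizes follow the slide, while the exact number of (i, j) pairs and the helper names choose_subsets / bourgain_block are assumptions.

```python
import math
import random

def choose_subsets(beacons, eps, rng=random.Random(0)):
    """Pick the shared random beacon subsets S(i, j) of size 2^i,
    for i, j = 1..Theta(log 1/eps); the same subsets are reused for every node."""
    k = max(1, math.ceil(math.log2(1.0 / eps)))
    return [rng.sample(beacons, min(2 ** i, len(beacons)))
            for i in range(1, k + 1) for j in range(1, k + 1)]

def bourgain_block(dist_to_beacons, subsets):
    """Coordinates of one node: its distance to each subset S(i, j), realized as the
    minimum measured distance to any beacon in that subset.
    dist_to_beacons: dict mapping beacon id -> distance from this node."""
    return [min(dist_to_beacons[b] for b in S) for S in subsets]
```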
Proof Sketch (2/3)
Def: r(v) = the minimum radius of a ball around v that contains εn nodes.
An edge (u,v) is
long if d(u,v) ≥ 4 · min[r(u), r(v)],
short if d(u,v) < min[r(u), r(v)],
medium-length otherwise.
[Figure: a long edge (u,v), much longer than the radius r(v) of the ball around v containing εn nodes, and a short edge lying inside such a ball.]
Proof Sketch (2/3), continued
Def: r(v) = the minimum radius of a ball around v that contains εn nodes; an edge (u,v) is long if d(u,v) ≥ 4 · min[r(u), r(v)], short if d(u,v) < min[r(u), r(v)], and medium-length otherwise.
The Bourgain-style embedding handles long edges: long edges are shrunk by at most O(log 1/ε), and no edge is expanded.
Short edges: each node has fewer than εn of them, so they can be ignored under ε-slack.
What to do about medium-length edges, where d(u,v) = Θ(r(u) + r(v))?
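The case split is easy to state directly; a small sketch, using the thresholds exactly as written on the slide:

```python
def classify_edge(d_uv, r_u, r_v):
    """Classify an edge (u, v) by comparing its length to r(u) and r(v), where r(x)
    is the smallest radius of a ball around x that contains eps*n nodes."""
    m = min(r_u, r_v)
    if d_uv >= 4 * m:
        return "long"    # handled by the Bourgain-style block
    if d_uv < m:
        return "short"   # few such edges; charged to the eps-slack
    return "medium"      # here d(u, v) = Theta(r(u) + r(v)); needs the extra block
```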
Proof Sketch (3/3)
Medium-length edges: d(u,v) = Θ(r(u) + r(v)).
Add another block of coordinates such that for any (u,v), the embedded (u,v)-distance in this block is Θ(r(u) + r(v)).
[Figure: the coordinates of u and v split into the Bourgain block followed by the new block, whose entries are still to be determined.]
Proof Sketch (3/3), continued
Medium-length edges: d(u,v) = Θ(r(u) + r(v)).
Add another block of coordinates such that for any (u,v), the embedded (u,v)-distance in this block is Θ(r(u) + r(v)).
How? Each coordinate of u is ±r(u), with the sign chosen at random.
Beacon-based solution: estimate r(u) from u's distances to the beacons.
[Figure: the new block of u is (+r(u), +r(u), -r(u), ...) and the new block of v is (+r(v), +r(v), -r(v), ...), appended after the Bourgain block.]
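A minimal sketch of the extra block, assuming r(u) is already known (the beacon-based estimation of r(u) is omitted); the number of coordinates is left as a parameter rather than the value used in the paper, and the signs must be drawn independently for each node.

```python
import random

def radius_block(r_u, num_coords):
    """Extra block for node u: each coordinate is +/- r(u) with an independent random sign.
    For two nodes u and v the signs disagree in about half of the coordinates, so the
    distance between their blocks is Theta(r(u) + r(v)) once num_coords is more than a few."""
    return [random.choice((-1.0, 1.0)) * r_u for _ in range(num_coords)]
```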
Plan
Introduction
Proof sketch
Extension
Then Ofer Neiman takes over and talks about:
Lower bound theorem
Open questions
Extension: one embedding for all ε
Thm: for decomposable metrics, there is an embedding into l_p (p ≥ 1) such that for any ε > 0, the distortion is O((log 1/ε)^{1/p}) on all but an ε-fraction of edges.
Graceful degradation: one embedding works for every ε!
Target dimension as small as O(log^2 n).
This extends a result from [KSW'04] on growth-constrained metrics.
Decomposable metrics include doubling metrics and the shortest-path metrics of graphs excluding a fixed minor.
Extension: one embedding for all ε
Thm: for decomposable metrics, there is an embedding into l_p (p ≥ 1) such that for any ε > 0, the distortion is O((log 1/ε)^{1/p}) on all but an ε-fraction of edges.
Graceful degradation: one embedding works for every ε!
Target dimension as small as O(log^2 n).
This extends a result from [KSW'04] on growth-constrained metrics.
Can we extend this to all metrics?
Partial result: an embedding into l_1 with distortion O(log 1/ε), but with target dimension as large as O(n^2).
Plan
Introduction
Proof sketch
Extension
Then Ofer Neiman takes over and talks about:
Lower bound theorem
Open questions
Lower Bound Results
Given a family H of metric spaces with a lower bound of D(n) on the distortion of a regular (no-slack) embedding of its n-point members into some family X, we show a corresponding lower bound, in terms of D and ε, for ε-slack embeddings of H into X.
Lower Bounds
General idea:
Take a metric H ∈ H which is "hard" for a regular embedding into some family X.
Replace every point in H by a set of points, creating H'.
H' contains many isomorphic copies of H.
In any ε-slack embedding into X, at least one copy will incur high distortion.
Metric Composition
Choose a small δ and, for any x ∈ H, create a metric C_x; distances within C_x are bounded by δ.
[Figure: H' consists of clusters C_u = {u_1, ..., u_k} and C_v = {v_1, ..., v_k}; points in different clusters are at distance d_uv, the corresponding distance in H.]
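A minimal sketch of the composition, assuming H is given as a symmetric distance dictionary over a list of points; for simplicity all intra-cluster distances are set to the single value delta (the slide only requires that they be bounded by δ), and the function name compose is an assumption.

```python
def compose(d_H, points, cluster_size, delta):
    """Metric composition H -> H': replace each point x of H by a cluster C_x of
    cluster_size copies (x, 0), ..., (x, cluster_size - 1). Copies inside one cluster
    are at distance delta; copies in different clusters inherit the distance d_H[x, y].
    d_H: dict giving d_H[x, y] for every ordered pair of distinct points."""
    d = {}
    for x in points:
        for y in points:
            for i in range(cluster_size):
                for j in range(cluster_size):
                    u, v = (x, i), (y, j)
                    if u == v:
                        d[u, v] = 0.0
                    elif x == y:
                        d[u, v] = delta
                    else:
                        d[u, v] = d_H[x, y]
    return d
```

For the lower bound, the cluster size is chosen large enough that an ε-slack embedding of H' cannot avoid every copy of H.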
Lower Bound
Let f be any ε-slack embedding, ignoring the set of edges E; by definition |E| < εn^2/2.
Assume that distortion(f) = R.
Finding a copy
T: the set of vertices that intersect fewer than a fixed threshold of edges in E.
For each x ∈ H, pick v_x ∈ C_x ∩ T.
For each pair (v_x, v_y), find t ∈ C_y such that (v_x, t), (v_y, t) ∉ E.
[Figure: the representatives v_x ∈ C_x and v_y ∈ C_y, and the witness t ∈ C_y.]
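A sketch of this selection step, with the slide's (unstated) threshold left as a parameter; the helper names find_representatives and witness, and the representation of E as a set of frozensets, are assumptions.

```python
def find_representatives(clusters, E, threshold):
    """Pick one representative v_x per cluster C_x, avoiding vertices that touch
    threshold or more ignored edges. clusters: dict x -> list of vertices of C_x.
    E: set of ignored edges, each a frozenset of two vertices."""
    touched = {}
    for e in E:
        for v in e:
            touched[v] = touched.get(v, 0) + 1
    T = {v for C in clusters.values() for v in C if touched.get(v, 0) < threshold}
    return {x: next(v for v in C if v in T) for x, C in clusters.items()}

def witness(v_x, v_y, C_y, E):
    """Find t in C_y such that neither (v_x, t) nor (v_y, t) is ignored, so the embedded
    distance between v_x and v_y can be controlled through t by the triangle inequality."""
    return next(t for t in C_y
                if frozenset((v_x, t)) not in E and frozenset((v_y, t)) not in E)
```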
Main corollaries
A distortion lower bound for ε-slack embedding into l_p.
A distortion lower bound for ε-slack embedding into trees.
A distortion lower bound for ε-slack embedding of doubling or l_1 metrics into l_2.
A distortion lower bound for randomized ε-slack embedding into a distribution of trees.
Follow-up work
Gracefully degrading distortion into l_p, with dimension O(log n); this implies constant average distortion!
A distortion bound for ε-slack embedding into a tree metric.
Open problems
A gracefully degrading embedding into a tree (with what distortion?).
Algorithmic applications ...
Finding more gracefully degrading embeddings (l_1 → l_2 ??).
Volume-respecting slack embeddings.
Extra: general embedding theorem
Thm: consider any family of finite metrics closed under taking subsets, and suppose it embeds into an l_p space (p ≥ 1) with distortion D(n). Then for any ε > 0 there is an embedding with distortion D(Õ(1/ε)) and ε-slack, via a beacon-based algorithm using Õ(1/ε) beacons.
There is a similar theorem about lower bounds.
Some applications:
decomposable metrics into l_p: distortion O((log 1/ε)^{1/p})
negative-type metrics (e.g. l_1) into l_2: distortion Õ((log 1/ε)^{1/2})
Extra: distortion with ε-slack
Why we cannot estimate all distances with a small number of beacons:
partition the nodes into k equal-size clusters, with k > #beacons; distance 0 within each cluster, 1 between clusters
in clusters without beacons, the distance to every beacon is 1
so there is no way to tell which node lies in which cluster
We therefore allow guarantees for all but an ε-fraction of node pairs.
Recall: the empirical results of GNP have ε-slack, too.
This is a new angle for theoretical work on metric embeddings.
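A tiny sketch of this bad instance, under the assumption that node v is assigned to cluster v mod k; it illustrates why beacon measurements alone cannot separate the beacon-free clusters.

```python
def cluster_metric(n, k):
    """The bad example: n nodes (0..n-1) in k equal clusters, distance 0 inside a cluster
    and 1 across clusters. Any node in a cluster containing no beacon is at distance 1
    from every beacon, so its measurements do not reveal which cluster it lies in."""
    def dist(u, v):
        return 0.0 if u % k == v % k else 1.0
    return dist
```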