Communication Steps for Parallel Query Processing
Paraschos Koutris, Paul Beame, Dan Suciu (University of Washington), PODS 2013
Motivation
- Understand the complexity of parallel query processing on big data
- Focus on shared-nothing architectures; MapReduce is one example
- Dominating parameters of computation:
  - Communication cost
  - Number of communication rounds
Computation Models
- The MapReduce model [Afrati et al., 2012]: tradeoff between reducer size (input size of a reducer) and replication rate (in how many reducers a tuple is sent)
- The MUD (Massive, Unordered, Distributed) model [Feldman et al., 2010]
- The MRC model [Karloff et al., 2010]: MapReduce computation + load balancing
The MPC Model
- N: total input size (in bits)
- p: number of servers (servers have unlimited computational power)
- Computation proceeds in synchronous rounds, each consisting of:
  - Local computation
  - Global communication
[Figure: an input of N bits distributed across servers 1 through p, computing in rounds 1, 2, ...]
MPC Parameters
- Each server receives in total a bounded number of bits: O(N/p × p^ε), where 0 ≤ ε < 1
- Complexity parameters:
  - Number of computation rounds r
  - Space exponent ε (governs data replication)

Question: what are the space exponent/round tradeoffs for query processing in the MPC model?
Our Results
- ONE ROUND:
  - Lower bounds on the space exponent for any (randomized) algorithm that computes a Conjunctive Query
  - The lower bound holds for a class of inputs (matching databases), for which we show tight upper bounds
- MULTIPLE ROUNDS:
  - Almost tight space exponent/round tradeoffs for tree-like Conjunctive Queries under a weaker communication model
Outline
1. Warm-up: The Triangle Query
2. One Communication Round
3. Multiple Communication Rounds
Conjunctive Queries
We mainly study full Conjunctive Queries without self-joins:
Q(x, y, z, w, v) = R(x,y,z), S(x,w,v), T(v,z)
The hypergraph of the query Q:
- Variables as vertices
- Atoms as hyperedges
[Figure: the hypergraph of Q, with vertices x, y, z, v, w and hyperedges R, S, T]
The Triangle Query (1)
Find all triangles: Q(x,y,z) = R(x,y), S(y,z), T(z,x)

2-round algorithm:
- ROUND 1 [R hash-join S]: send R(a, b) and S(b, c) to server h(b); join locally to produce U(a, b, c) = R(a, b) ⋈ S(b, c)
- ROUND 2 [U hash-join T]: send U(a, b, c) and T(c, a) to server h(c); join locally to produce Q(a, b, c) = U(a, b, c) ⋈ T(c, a)
- Replication ε = 0
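The two rounds above can be simulated in a single process. A minimal Python sketch; the hash function h, the server count p, and the sample relations are illustrative assumptions, not part of the slides:

```python
from collections import defaultdict

def two_round_triangles(R, S, T, p):
    """Simulate the 2-round MPC algorithm: hash-join R with S, then with T."""
    h = lambda v: hash(v) % p  # illustrative hash onto p servers

    # Round 1: send R(a,b) and S(b,c) to server h(b); join locally.
    servers = defaultdict(lambda: {"R": [], "S": []})
    for (a, b) in R:
        servers[h(b)]["R"].append((a, b))
    for (b, c) in S:
        servers[h(b)]["S"].append((b, c))
    U = [(a, b, c) for srv in servers.values()
         for (a, b) in srv["R"] for (b2, c) in srv["S"] if b == b2]

    # Round 2: send U(a,b,c) and T(c,a) to server h(c); join locally.
    servers2 = defaultdict(lambda: {"U": [], "T": []})
    for (a, b, c) in U:
        servers2[h(c)]["U"].append((a, b, c))
    for (c, a) in T:
        servers2[h(c)]["T"].append((c, a))
    return sorted({(a, b, c) for srv in servers2.values()
                   for (a, b, c) in srv["U"]
                   for (c2, a2) in srv["T"] if c == c2 and a == a2})

# One triangle (1,2,3) plus non-matching tuples:
R = [(1, 2), (1, 4)]
S = [(2, 3), (4, 5)]
T = [(3, 1), (6, 7)]
print(two_round_triangles(R, S, T, p=4))  # [(1, 2, 3)]
```

Each tuple is sent to exactly one server per round, matching replication ε = 0.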
The Triangle Query (2)
1-round algorithm [Ganguly '92, Afrati '10, Suri '11]:
- The p servers form a cube: [p^(1/3)] × [p^(1/3)] × [p^(1/3)]
- Send each tuple to the servers:
  - R(a, b) → (h1(a), h2(b), -)
  - S(b, c) → (-, h2(b), h3(c))
  - T(c, a) → (h1(a), -, h3(c))
- Each tuple is replicated p^(1/3) times, and the triangle (a, b, c) is found at server (h1(a), h2(b), h3(c))
- Replication ε = 1/3
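The cube routing can also be sketched concretely. In this illustrative Python sketch, p_side stands for p^(1/3), and the hash functions are stand-ins:

```python
from collections import defaultdict

def one_round_triangles(R, S, T, p_side):
    """HyperCube routing on a p_side x p_side x p_side grid of servers."""
    h1 = h2 = h3 = lambda v: hash(v) % p_side  # illustrative hashes

    cells = defaultdict(lambda: {"R": [], "S": [], "T": []})
    for (a, b) in R:            # replicate along the free z-coordinate
        for k in range(p_side):
            cells[(h1(a), h2(b), k)]["R"].append((a, b))
    for (b, c) in S:            # replicate along the free x-coordinate
        for i in range(p_side):
            cells[(i, h2(b), h3(c))]["S"].append((b, c))
    for (c, a) in T:            # replicate along the free y-coordinate
        for j in range(p_side):
            cells[(h1(a), j, h3(c))]["T"].append((c, a))

    # Every triangle (a,b,c) meets at cell (h1(a), h2(b), h3(c)).
    out = set()
    for cell in cells.values():
        for (a, b) in cell["R"]:
            for (b2, c) in cell["S"]:
                for (c2, a2) in cell["T"]:
                    if b == b2 and c == c2 and a == a2:
                        out.add((a, b, c))
    return sorted(out)

print(one_round_triangles([(1, 2)], [(2, 3)], [(3, 1)], p_side=2))  # [(1, 2, 3)]
```

Each tuple is copied p_side = p^(1/3) times, matching the stated replication ε = 1/3.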
Lower Bound for Triangles (1)
Theorem: no (randomized) algorithm can compute triangles in one round with space exponent ε < 1/3.

Proof setup: let R, S, T be random permutations over [n]^2, so the expected number of triangles is 1.
- Each relation contains N = Θ(n log n) bits of information
- Any server knows a 1/p fraction of the input: N/p bits
Lemma: for any deterministic algorithm with ε = 0, the p servers report in expectation O(1/p^(1/2)) tuples.
Lower Bound for Triangles (2)
- Let a_xy = Pr[server knows tuple R(x,y)], with a_xy ≤ 1/n and Σ_{x,y} a_xy = O(n/p)
- Similarly b_yz for S(y,z) and c_zx for T(z,x)
- Friedgut's inequality:
  Σ_{x,y,z} a_xy b_yz c_zx ≤ (Σ_{x,y} a_xy²)^(1/2) (Σ_{y,z} b_yz²)^(1/2) (Σ_{z,x} c_zx²)^(1/2)
- Hence one server knows O(1/p^(3/2)) triangles in expectation
- Summing over all p servers: O(1/p^(1/2)) known output tuples
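The step from Friedgut's inequality to the stated bound can be spelled out using only the constraints above; a sketch of the calculation:

```latex
\sum_{x,y} a_{xy}^2 \;\le\; \Big(\max_{x,y} a_{xy}\Big) \sum_{x,y} a_{xy}
  \;\le\; \frac{1}{n} \cdot O\!\Big(\frac{n}{p}\Big)
  \;=\; O\!\Big(\frac{1}{p}\Big),
\qquad \text{and similarly for } b_{yz}, c_{zx}.

\sum_{x,y,z} a_{xy} b_{yz} c_{zx}
  \;\le\; \Big(\sum_{x,y} a_{xy}^2\Big)^{1/2}
          \Big(\sum_{y,z} b_{yz}^2\Big)^{1/2}
          \Big(\sum_{z,x} c_{zx}^2\Big)^{1/2}
  \;=\; O\big(p^{-3/2}\big).
```

Summing over the p servers then gives p · O(p^(-3/2)) = O(p^(-1/2)) known output tuples, as claimed.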
Outline
1. Warm-up: The Triangle Query
2. One Communication Round
3. Multiple Communication Rounds
Matching Databases
- Every relation R(A1, ..., Ak) contains exactly n tuples
- Every attribute Ai contains each value in {1, ..., n} exactly once
- A matching database has no skew
[Figure: an example relation R(X, Y, Z) over values 1 through n, where each value appears exactly once in each column]
Fractional Vertex Cover
- Vertex cover number τ: minimum number of variables that cover every hyperedge
- Fractional vertex cover number τ*: minimum total weight placed on the variables such that each hyperedge receives weight at least 1

Example: Q(x, y, z, w, v) = R(x,y,z), S(x,w,v), T(v,z)
- Vertex cover: τ = 2
- Fractional vertex cover: τ* = 3/2, with weight 1/2 on x, z, v and weight 0 on y, w
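Both numbers for the example query can be checked mechanically. A small Python sketch (the hyperedge encoding of Q is taken from the slide; the brute-force search is an illustrative aid, not from the talk):

```python
from itertools import combinations
from fractions import Fraction

# Hyperedges of Q(x,y,z,w,v) = R(x,y,z), S(x,w,v), T(v,z)
edges = [{"x", "y", "z"}, {"x", "w", "v"}, {"v", "z"}]
vars_ = sorted({v for e in edges for v in e})

# Integral vertex cover: smallest subset of variables hitting every hyperedge.
tau = next(k for k in range(1, len(vars_) + 1)
           if any(all(e & set(c) for e in edges)
                  for c in combinations(vars_, k)))
print("tau =", tau)  # tau = 2

# Feasibility of the fractional cover from the slide: weight 1/2 on x, z, v.
w = {v: Fraction(1, 2) if v in {"x", "z", "v"} else Fraction(0)
     for v in vars_}
assert all(sum(w[v] for v in e) >= 1 for e in edges)
print("tau* <=", sum(w.values()))  # tau* <= 3/2
```

This only certifies τ* ≤ 3/2; the matching lower bound comes from an LP-duality argument (a fractional matching of the same value).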
Lower Bounds
Theorem: any randomized algorithm in the MPC model will fail to compute a Conjunctive Query Q with:
- 1 round
- ε < 1 − 1/τ*(Q)
- input a matching database
Upper Bounds
Theorem: the HYPERCUBE (randomized) algorithm can compute any Conjunctive Query Q with:
- 1 round
- ε ≥ 1 − 1/τ*(Q)
- input a matching database (no skew)
- exponentially small probability of failure (in the input size N)
HyperCube Algorithm
Given Q(x1, ..., xk) = S1(...), ..., Sl(...):
1. Compute τ* and a minimum fractional cover v1, v2, ..., vk
2. Assign to each variable xi a share exponent e(i) = vi / τ*
3. Assign the p servers to the points of a k-dimensional hypercube: [p] = [p^e(1)] × ... × [p^e(k)]
4. Hash each tuple to the appropriate subcube

Example: Q(x,y,z,w,v) = R(x,y,z), S(x,w,v), T(v,z)
- τ* = 3/2, with vx = vz = vv = 1/2 and vy = vw = 0
- Share exponents: e(x) = e(z) = e(v) = 1/3, e(y) = e(w) = 0
- [p] = [p^(1/3)] × [p^0] × [p^(1/3)] × [p^0] × [p^(1/3)]
- e.g. S(a, b, c) is sent to the servers (h_x(a), 1, -, 1, h_v(c)), replicated along the free z-coordinate
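Step 2, the share-exponent computation, is a one-liner once a fractional cover is known. A minimal Python sketch, using the cover from the example above:

```python
from fractions import Fraction

def share_exponents(cover):
    """Share exponents e(i) = v_i / tau* for a fractional vertex cover.

    cover: dict mapping each variable to its cover weight v_i.
    """
    tau_star = sum(cover.values())
    return {v: w / tau_star for v, w in cover.items()}

# Cover for Q(x,y,z,w,v) = R(x,y,z), S(x,w,v), T(v,z) from the slide:
cover = {"x": Fraction(1, 2), "y": 0, "z": Fraction(1, 2),
         "w": 0, "v": Fraction(1, 2)}
e = share_exponents(cover)
print(e)  # x, z, v get exponent 1/3; y, w get exponent 0
# Server grid: [p] = [p^(1/3)] x [p^0] x [p^(1/3)] x [p^0] x [p^(1/3)]
```

Variables with exponent 0 get a single coordinate value, so tuples missing a variable with positive exponent are replicated along that dimension (as S was along z above).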
Examples
- Cycle query C_k(x1, ..., xk) = S1(x1, x2), ..., Sk(xk, x1): τ* = k/2, ε = 1 − 2/k
- Star query T_k(z, x1, ..., xk) = S1(z, x1), ..., Sk(z, xk): τ* = 1, ε = 0
- Line query L_k(x0, x1, ..., xk) = S1(x0, x1), ..., Sk(x(k-1), xk): τ* = ⌈k/2⌉, ε = 1 − 1/⌈k/2⌉
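These space exponents are easy to tabulate. A small Python sketch of the three formulas (the function names are ours, not from the talk):

```python
from fractions import Fraction
from math import ceil

def eps_cycle(k):
    """Cycle query C_k: tau* = k/2, so eps = 1 - 2/k."""
    return 1 - Fraction(2, k)

def eps_star(k):
    """Star query T_k: weight 1 on the center z covers every atom,
    so tau* = 1 and eps = 0."""
    return 0

def eps_line(k):
    """Line query L_k: tau* = ceil(k/2), so eps = 1 - 1/ceil(k/2)."""
    return 1 - Fraction(1, ceil(Fraction(k, 2)))

print(eps_cycle(3))  # 1/3, matching the triangle query
print(eps_line(4))   # 1/2
```

Note that eps_cycle(3) = 1/3 recovers the replication rate of the 1-round triangle algorithm, since the triangle query is the cycle C_3.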
Outline
1. Warm-up: The Triangle Query
2. One Communication Round
3. Multiple Communication Rounds
Multiple Rounds
Our results apply to a weaker model (tuple-based MPC):
- only join tuples can be sent in rounds > 1, e.g. {R(a,b), S(b,c)}
- the routing of each tuple t depends only on t

Theorem: for every tree-like query Q, any tuple-based MPC algorithm requires at least log(diam(Q)) / log(2/(1−ε)) rounds.
This lower bound agrees with the upper bound to within 1 round.

Definitions:
- diam(Q): largest distance between two vertices in the hypergraph of Q
- tree-like queries: #variables + #atoms − Σ(arities) = 1
Example
Line query L_k(x0, x1, ..., xk) = S1(x0, x1), ..., Sk(x(k-1), xk):
- tree-like: #variables = k+1, #atoms = k, Σ(arities) = 2k, so (k+1) + k − 2k = 1
- diam(L_k) = k
- For space exponent ε = 0, we need at least log(k) / log(2/(1−0)) = log2(k) rounds
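The round lower bound is straightforward to evaluate numerically. A small Python sketch (the rounding to whole rounds and the floating-point log are ours; the formula is from the theorem above):

```python
from math import ceil, log

def round_lower_bound(diam, eps):
    """Rounds required by the theorem: log(diam(Q)) / log(2/(1-eps)),
    rounded up to a whole number of rounds. Uses floating-point log."""
    return ceil(log(diam) / log(2 / (1 - eps)))

# Line query L_5 (diameter 5) with no replication (eps = 0):
print(round_lower_bound(5, 0.0))  # 3
# More replication (larger eps) buys fewer required rounds:
print(round_lower_bound(9, 0.5))  # 2
```

The second call illustrates the tradeoff: raising ε from 0 to 1/2 changes the log base from 2 to 4, roughly halving the round requirement.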
Connected Components
As a corollary of our results for multiple rounds, we obtain lower bounds beyond conjunctive queries:
Theorem: any tuple-based MPC algorithm that computes the Connected Components of an undirected graph with any space exponent ε < 1 requires Ω(log p) communication rounds.
Conclusions
- Tight lower and upper bounds for one communication round in the MPC model
- The first lower bounds for multiple communication rounds
- Connected components cannot be computed in a constant number of rounds
Open problems:
- Lower and upper bounds for skewed data
- Lower bounds for > 1 rounds in the general model
Thank you!