1
Separations in Query Complexity Based on Pointer Functions. Andris Ambainis (University of Latvia). Joint work with: Kaspars Balodis, Aleksandrs Belovs, Troy Lee, Miklos Santha, and Juris Smotrovs.
2
Models of Computation: Deterministic | Randomized | Quantum
3
Query algorithms. Task: compute f(x_1, ..., x_N). Input variables x_i are accessed via queries. Complexity = number of queries. Provable lower bounds.
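To make "complexity = number of queries" concrete, here is a minimal Python sketch (mine, not from the talk) of an oracle wrapper that counts how many input bits an algorithm actually reads; the OR algorithm is purely illustrative.

    class Oracle:
        """Wraps the input x and counts how many positions are queried."""
        def __init__(self, x):
            self.x = x
            self.queries = 0

        def query(self, i):
            self.queries += 1
            return self.x[i]

    def algorithm_or(oracle, n):
        """Toy query algorithm: computes OR(x_1, ..., x_n), stopping at the first 1."""
        for i in range(n):
            if oracle.query(i) == 1:
                return 1
        return 0

    x = [0, 0, 1, 0]
    oracle = Oracle(x)
    print(algorithm_or(oracle, len(x)), oracle.queries)  # output 1 after 3 queries on this input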
4
Computation Models. D: Deterministic (decision tree). [Figure: a decision tree querying x_1 first, then x_2 or x_3.] Complexity on an input: number of queries (the length of the path taken), here 2 or 3. Complexity in total: worst input (the depth of the tree), here 3.
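A small Python sketch of evaluating a decision tree while counting queries; the particular tree TREE below is illustrative (it also needs 2 or 3 queries and has depth 3), not necessarily the one drawn on the slide.

    # A decision tree node is either a leaf holding an output value
    # or an internal node: (variable_index, subtree_if_0, subtree_if_1).
    TREE = (0, (1, 0, 1),          # if x_0 = 0, query x_1 next
               (2, (1, 0, 1), 1))  # if x_0 = 1, query x_2 (and possibly x_1)

    def evaluate(tree, x):
        """Returns (output, number_of_queries) for input x."""
        queries = 0
        while isinstance(tree, tuple):
            var, if0, if1 = tree
            queries += 1
            tree = if1 if x[var] == 1 else if0
        return tree, queries

    def depth(tree):
        """Worst-case number of queries = depth of the tree."""
        if not isinstance(tree, tuple):
            return 0
        return 1 + max(depth(tree[1]), depth(tree[2]))

    print(evaluate(TREE, [1, 0, 1]))  # (1, 2): this input needs only 2 queries
    print(evaluate(TREE, [1, 1, 0]))  # (1, 3): this input needs 3
    print(depth(TREE))                # worst case over all inputs: 3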
5
Computation Models. R: Randomized (a probability distribution on decision trees). [Figure: two decision trees over x_1, x_2, x_3.] Complexity on an input: expected number of queries, here 2 or 8/3. Complexity in total: worst input.
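The slide's figure is not preserved here, but one standard example matching its numbers (2 or 8/3 expected queries) is 3-bit majority evaluated by querying a random pair of positions first; a sketch, my code:

    import itertools
    from fractions import Fraction

    def majority3_expected_queries(x):
        """Randomized algorithm for MAJ(x_0, x_1, x_2): query two random distinct
        positions; if they agree, that is the answer, otherwise query the third.
        Returns the exact expected number of queries on input x."""
        total = Fraction(0)
        pairs = list(itertools.combinations(range(3), 2))
        for i, j in pairs:                      # the random choice, uniform over pairs
            total += 2 if x[i] == x[j] else 3
        return total / len(pairs)

    print(majority3_expected_queries([1, 1, 1]))  # 2
    print(majority3_expected_queries([1, 1, 0]))  # 8/3, the worst-case expected cost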
6
Computation Models. Q: Quantum (quantum query algorithms). U_0, U_1, ..., U_T are unitaries independent of x_1, ..., x_N. Q is the query operation: |i⟩ → (-1)^{x_i} |i⟩. [Figure: the circuit U_0, Q, U_1, Q, ..., Q, U_T.]
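A minimal numpy sketch (mine, not from the talk) of the phase query Q: |i⟩ → (-1)^{x_i}|i⟩ interleaved with input-independent unitaries; random_unitary is only a placeholder for the algorithm's fixed U_t.

    import numpy as np

    def phase_oracle(x):
        """Diagonal query operator: |i> -> (-1)^{x_i} |i>."""
        return np.diag([(-1) ** xi for xi in x]).astype(complex)

    def random_unitary(n, rng):
        """Placeholder for an input-independent unitary U_t (via QR of a random matrix)."""
        q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
        return q

    def run_circuit(x, T, rng):
        """Applies the product U_T Q ... Q U_1 Q U_0 to |0> and returns the amplitudes."""
        n = len(x)
        Q = phase_oracle(x)
        state = np.zeros(n, dtype=complex)
        state[0] = 1.0
        state = random_unitary(n, rng) @ state          # U_0
        for _ in range(T):
            state = Q @ state                           # one query
            state = random_unitary(n, rng) @ state      # U_t
        return state

    rng = np.random.default_rng(0)
    amps = run_circuit([0, 1, 1, 0], T=2, rng=rng)
    print(np.abs(amps) ** 2)   # measurement probabilities; they sum to 1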
7
Computation Models. Q_2 vs R_2 vs D?
D – deterministic (decision tree).
R – randomized:
– R_0 – zero error;
– R_1 – one-sided error;
– R_2 – bounded error.
Q – quantum:
– Q_E – exact;
– Q_2 – bounded error.
8
Separations for partial functions. Example: the (modified) Deutsch-Jozsa problem. Accept if exactly half of the variables are ones (e.g. 01011001); reject if all input variables are zeroes (e.g. 00000000). Q_E = 1, R_1 = 1, D = N/2 + 1.
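The R_1 = 1 claim can be illustrated with a one-query sketch (my code, not the talk's): sample one random position; on a half-ones input it accepts with probability 1/2 and it never errs on the all-zeroes input.

    import random

    def one_query_test(x, rng=random):
        """One-sided-error, one-query test for the promise problem:
        accept if exactly half the bits are 1, reject if all bits are 0.
        Never accepts the all-zeroes input; accepts a half-ones input w.p. 1/2."""
        i = rng.randrange(len(x))
        return x[i] == 1

    half_ones = [1, 0] * 8          # N = 16, exactly half ones
    all_zero = [0] * 16
    print(sum(one_query_test(half_ones) for _ in range(10000)) / 10000)  # ~0.5
    print(any(one_query_test(all_zero) for _ in range(10000)))           # False

    # Deterministically, N/2 answers that are all 0 are still consistent with
    # both cases, so D = N/2 + 1 queries are needed in the worst case.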
9
Separations for partial functions. [Aaronson, A, tomorrow]: FORRELATION (testing correlation between u and Fv); Q_2 = 1, R_2 = Ω̃(√N). Total functions???
10
Total functions. Q vs D (1996): Grover search on N elements, Q_2 = O(√N), D = N. R vs D (1986): R_0 = O(N^0.7537...), D = N.
11
AND-OR Tree. [Figure: a depth-3 alternating AND/OR tree on x_1, ..., x_8.] [Snir'85, Saks & Wigderson'86]: R_0 = R_1 = R_2 = O(n^0.7537...), D = n.
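The O(n^0.7537...) bound comes from the standard randomized short-circuit evaluation: at each gate, evaluate a uniformly random child first and skip the sibling when the first child already determines the gate. A sketch (my code; the specific layout, alternating AND/OR with an AND at the root over x_1, ..., x_8, is an assumption about the figure):

    import random

    def eval_tree(gate, children, x, counter):
        """Randomized evaluation of an alternating AND/OR tree.
        `children` is either a leaf index into x, or a pair of subtrees whose
        gate type is the opposite of `gate`. counter[0] counts queries made."""
        if isinstance(children, int):            # leaf: query the variable
            counter[0] += 1
            return x[children]
        left, right = children
        first, second = (left, right) if random.random() < 0.5 else (right, left)
        other = 'OR' if gate == 'AND' else 'AND'
        v = eval_tree(other, first, x, counter)
        if (gate == 'AND' and v == 0) or (gate == 'OR' and v == 1):
            return v                             # short-circuit: skip the sibling
        return eval_tree(other, second, x, counter)

    # depth-3 tree on x_1..x_8: AND of ORs of ANDs of leaves
    tree = (((0, 1), (2, 3)), ((4, 5), (6, 7)))
    x = [1, 0, 1, 1, 0, 1, 1, 1]
    counter = [0]
    print(eval_tree('AND', tree, x, counter), 'queries:', counter[0])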
12
Total functions (* up to log N factors). [Table of known separations not preserved in this transcript.]
13
Two notes
14
Quantum exact vs. classical.
[A, 2013]: Q_E = O(N^0.8675...), R_2 = R_0 = D = N.
Our result 1: Q_E = O(√N), R_0 = D = N.
Our result 2: Q_E = O(N^(2/3)), R_2 = N.
15
Classical result. Our result: R_2 = O(√N), R_0 = N. The first separation between two types of randomized complexity (bounded error vs. zero error).
16
Göös-Pitassi-Watson
17
Paper
18
Goal. Clique vs Independent Set. A function f with the following properties:
– D is large;
– f = 1 can be certified by the values of a small number of variables;
– the certificates are unambiguous.
19
D versus 1-certificates. A function of nm variables, arranged as a grid with n rows and m columns. f = 1 iff there exists a unique all-1 column. D = nm. There are short 1-certificates, BUT they are not unambiguous.
20
D versus 1-certificates. The same function of nm variables: f = 1 iff there exists a unique all-1 column; D = nm. To make certificates unambiguous, a certificate should specify which 0 to choose from each of the other columns.
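A direct evaluation of this nm-variable function is straightforward; a sketch, where the row-major grid representation is my choice:

    def unique_all_one_column(grid):
        """grid[i][j] is the bit in row i, column j (n rows, m columns).
        Returns 1 iff exactly one column consists entirely of ones.
        Reading every bit, consistent with D = nm."""
        n, m = len(grid), len(grid[0])
        all_one_columns = [j for j in range(m)
                           if all(grid[i][j] == 1 for i in range(n))]
        return 1 if len(all_one_columns) == 1 else 0

    grid = [[1, 0, 1],
            [1, 1, 0],
            [1, 0, 1]]
    print(unique_all_one_column(grid))  # 1: only column 0 is all ones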
21
Pointers. f = 1 iff:
– there is an all-1 column b;
– in b, there is a unique r with a non-zero pointer;
– following the pointers from r, we traverse exactly one zero in all other columns.
22
Pointers. D = nm and unambiguous short 1-certificates.
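A sketch of checking such an unambiguous 1-certificate, under an assumed representation where each cell stores a bit and an optional pointer to another cell (the encoding cells[(i, j)] = (bit, pointer) is my assumption; the paper's exact encoding differs in details):

    def check_certificate(cells, b, r, n, m):
        """cells[(i, j)] = (bit, pointer), pointer = None or a (row, col) pair.
        Checks the 1-certificate: column b is all ones, cell (r, b) is the unique
        cell in column b carrying a pointer, and following pointers from (r, b)
        visits exactly one zero in every other column."""
        # column b must be all ones, with (r, b) the only cell carrying a pointer
        for i in range(n):
            bit, ptr = cells[(i, b)]
            if bit != 1 or (ptr is not None) != (i == r):
                return False
        # walk the pointer chain and record which columns contribute a zero
        seen = set()
        pos = cells[(r, b)][1]
        while pos is not None:
            bit, ptr = cells[pos]
            col = pos[1]
            if bit != 0 or col == b or col in seen:
                return False
            seen.add(col)
            pos = ptr
        return len(seen) == m - 1   # exactly one zero in each of the other columns

Such a certificate touches the n cells of column b plus one zero in each other column, i.e. about n + m - 1 variables, while D stays nm.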
23
Features. Highly elusive (flexible). Still traversable (if we know where to start).
24
Our Modifications
25
Binary Tree. Instead of a list, we use a balanced binary tree. More elusive; random access.
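A toy sketch (entirely my own layout, not the paper's encoding) of why a balanced binary tree over the zero cells gives short root-to-leaf paths T(j) and random access, unlike a linked list that must be walked end to end:

    def build_balanced_tree(items):
        """Arrange `items` (e.g. one zero cell per column j != b) as a balanced
        binary tree: (item, left_subtree, right_subtree), or None if empty."""
        if not items:
            return None
        mid = len(items) // 2
        return (items[mid], build_balanced_tree(items[:mid]),
                            build_balanced_tree(items[mid + 1:]))

    def path_T(tree, key, keyfunc):
        """Root-to-leaf path T(j): the cells visited when descending to key j."""
        path = []
        while tree is not None:
            item, left, right = tree
            path.append(item)
            if keyfunc(item) == key:
                return path
            tree = left if key < keyfunc(item) else right
        return None

    # columns 0..6 other than b; each holds its designated zero cell (row, col)
    cells = [(row, col) for col, row in enumerate([2, 0, 1, 3, 2, 0, 1])]
    tree = build_balanced_tree(cells)
    print(path_T(tree, 5, keyfunc=lambda cell: cell[1]))  # path of length O(log m)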
26
Definition. f = 1 iff:
– there is a (unique) all-1 column b;
– in b, there is a unique element r with non-zero pointers;
– for each j ≠ b, following the path T(j) from r gives a zero in the j-th column.
27
Q_2 / R_0 versus D
28
Definition. Back pointers to columns: f = 1 iff all the leaves back-point to the all-1 column b.
29
Summary. [Figure: the pointer-function construction from the previous slides.]
30
Lower Bound. Adversary method. Let n = 2m. If the k-th element is queried in a column: if k ≤ m, return 1; otherwise, return 0 with a back pointer to column k - m. At the end, the column contains m ones and m zeroes with back pointers to all columns 1, 2, ..., m.
31
Lower Bound. At the end, the column contains m ones and m zeroes with back pointers to all columns 1, 2, ..., m. The algorithm does not know the value of the function until it has queried more than m elements in each of m columns.
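A sketch of the adversary's answering strategy from the two slides above; the Adversary class and its per-column counter are my bookkeeping, and cell values are fabricated on the fly, so this only illustrates the answering rule.

    class Adversary:
        """Answers queries so that every column looks the same until it has been
        queried more than m times: the first m answers in a column are 1, the next
        m answers are 0 with back pointers to columns 1, ..., m (here n = 2m)."""
        def __init__(self, m):
            self.m = m
            self.queries_in_column = {}   # column -> how many cells answered so far

        def answer(self, column):
            k = self.queries_in_column.get(column, 0) + 1
            self.queries_in_column[column] = k
            if k <= self.m:
                return (1, None)                       # a one, no back pointer
            return (0, k - self.m)                     # a zero pointing to column k - m

    adv = Adversary(m=4)
    print([adv.answer(column=1) for _ in range(8)])
    # first 4 answers: (1, None); next 4: zeroes with back pointers to columns 1..4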
32
Upper Bound: Informal. Each column contains a back pointer to the all-1 column, BUT there can be several back pointers. Which is the right one?
33
Upper Bound: Informal. We try each back pointer by querying a few elements in the column it points to, and proceed to a column where no zeroes were found. Even if this is not the all-1 column, with high probability we find a column with fewer zeroes.
34
Upper Bound: Informal. Column with M zeroes → column with M/2 zeroes → column with M/4 zeroes → ...
35
Upper Bound: Formal.

Algorithm:
    Let c be the first column, and k ← n.
    While k > 1:
        let c ← ProcessColumn(c, k), and k ← k/2.

ProcessColumn(column c, integer k):
    Query all elements in column c.
    If there are no zeroes, verify column c.
    For each zero a:
        let j be the back pointer of a;
        query O(n/k) elements in column j
        (probability < 1/(nm)^2 that no zero is found if there are > k/2 of them);
        if no zero was found, return j.
    Reject.
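The pseudocode above can be rendered as runnable Python; this is a loose sketch (my code and data layout, not the paper's), meant only to show the control flow, with the verification step simplified and the sampling budget chosen by me.

    import math
    import random

    def r0_sketch(cells, n, m, rng=random):
        """Loose rendering of the algorithm above. cells[(i, j)] = (bit,
        back_pointer_column or None); columns are indexed 0..m-1. The final
        verification is simplified to 'the column is all ones'; the real function
        would also check the pointer structure. The sampling budget adds a log
        factor (my choice) so that sampling rarely misses a zero."""
        def process_column(c, k):
            zeros = [cells[(i, c)] for i in range(n) if cells[(i, c)][0] == 0]
            if not zeros:
                return None                              # column c verified: accept
            budget = math.ceil((n / k) * math.log(n * m + 1)) + 1
            for _bit, j in zeros:                        # try each zero's back pointer
                if j is None:
                    continue
                sample = [cells[(rng.randrange(n), j)][0] for _ in range(budget)]
                if 0 not in sample:                      # column j looks all-one so far
                    return j
            return 'reject'

        c, k = 0, n
        while True:
            result = process_column(c, k)
            if result is None:
                return 1
            if result == 'reject' or k <= 1:
                return 0
            c, k = result, k // 2

    # toy instance: column m-1 is all ones, every zero back-points to it
    n, m = 8, 4
    cells = {(i, j): (1, None) if j == m - 1 else ((0, m - 1) if i % 2 else (1, None))
             for i in range(n) for j in range(m)}
    print(r0_sketch(cells, n, m))   # 1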
36
Summary.
– Lower bound: a D algorithm is Ω(nm).
– Upper bound: an R_0 algorithm is O(n + m).
– Quadratic separation between R_0 and D.
– 4th power separation between Q_2 and D.
37
Conclusion
38
Results.
– Q_2 = O(D^(1/4))
– Q_2 = O(R_0^(1/3))
– Q_E = O(R_0^(1/2))
– Q_E = O(R_1^(2/3))
– R_1 = O(R_0^(1/2))
– R_0 = O(D^(1/2)) (optimal)
39
Open Problems.
– Can we resolve R_2 ↔ D? Known: R_2 = Ω(D^(1/3)) and R_2 = O(D^(1/2)).
– Can we separate R_2 from R_1?
– The same about Q ↔ D. Known: Q = Ω(D^(1/6)) and Q = O(D^(1/4)).
– And Q_E ↔ D? Known: Q_E = Ω(D^(1/3)) and Q_E = O(D^(1/2)).
40
Any questions?