Computability & Complexity II
Chris Umans, Caltech

Complexity Theory
Classify problems according to the computational resources required:
–running time
–storage space
–parallelism
–randomness
–rounds of interaction, communication, others…
Attempt to answer: what is computationally feasible with limited resources?

The central questions
–Is finding a solution as easy as recognizing one? (P = NP?)
–Is every sequential algorithm parallelizable? (P = NC?)
–Can every efficient algorithm be converted into one that uses a tiny amount of memory? (P = L?)
–Are there small Boolean circuits for all problems that require exponential running time? (EXP ⊆ P/poly?)
–Can every randomized algorithm be converted into a deterministic one? (P = BPP?)

Central questions
We think we know the answers to all of these questions… but no one has been able to prove that even a small part of this "world-view" is correct.
If we're wrong on any one of these, then computer science will change dramatically.

Outline
We'll look at three of these problems:
–P vs. NP: "the power of nondeterminism"
–L vs. P: "the power of sequential computation"
–P vs. BPP: "the power of randomness"

P vs. NP

Completeness
In an ideal world, given a language L:
1. state an algorithm deciding L
2. prove that no algorithm does better
We are pretty good at part 1. We are currently completely helpless when it comes to part 2, for most problems that we care about.

Completeness
In place of part 2 we can:
–relate the difficulty of problems to each other via reductions
–prove that a problem is a "hardest" problem in a complexity class via completeness
This is a powerful, successful surrogate for lower bounds.

Recall: reductions
–a "many-one" reduction from language A to language B is a function f that maps yes-instances of A to yes-instances of B and no-instances of A to no-instances of B
–now, we additionally require f to be computable in polynomial time
(picture: f mapping instances of A to instances of B, preserving yes/no answers)
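Written out symbolically (a standard restatement, consistent with the definition above):

  A ≤_p B  ⟺  there is a polynomial-time computable f such that for all x: x ∈ A ⟺ f(x) ∈ B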

Completeness
Fix a complexity class C (a set of languages). A language L is C-complete if
–L is in C
–every language in C reduces to L
This formalizes "L is the hardest problem in complexity class C":
–L in P implies every language in C is in P
Related concept: a language L is C-hard if
–every language in C reduces to L

Some problems we care about
–3SAT = { φ : 3-CNF formula φ is satisfiable }
–TSP = { ⟨G, k⟩ : there is a tour of all nodes of G with weight ≤ k }
–SUBSET-SUM = { ⟨x_1, x_2, …, x_n, B⟩ : some subset of {x_1, x_2, …, x_n} sums to B }
–IS = { ⟨G, k⟩ : there is a subset of at least k vertices of G with no edges between them }
We only know exponential-time algorithms for these problems.
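As an illustration of that exponential-time state of the art, here is a minimal brute-force decider for SUBSET-SUM (an illustrative sketch, not from the slides; the helper name and test values are made up):

from itertools import combinations

def subset_sum(xs, B):
    """Brute-force decider for SUBSET-SUM: try every subset of xs.
    Runs in time roughly 2^n for n numbers -- exponential in the input size."""
    n = len(xs)
    for r in range(n + 1):
        for subset in combinations(xs, r):
            if sum(subset) == B:
                return True   # the subset itself is a certificate
    return False

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5 = 9
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False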

From last week…
We have defined the complexity classes P (polynomial time) and EXP (exponential time).
(picture: nested classes P ⊆ EXP ⊆ decidable languages ⊆ all languages, with RE and HALT marked and "some language" outside)

Why not EXP?
3SAT, TSP, SUBSET-SUM, IS, … Why not show these are EXP-complete?
They all possess a positive feature that some problems in EXP probably don't have: a solution can be recognized in polynomial time.

NP
Definition: NP = all languages decidable by a Nondeterministic TM in Polynomial time.
Theorem: a language L is in NP if and only if it is expressible as
  L = { x : ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }
where R is a language in P.
–the poly-time TM M_R deciding R is a "verifier"
–y is a "witness" or "certificate"
–such an L is "efficiently verifiable"

Poly-time verifiers
Example: 3SAT is expressible as
  3SAT = { φ : φ is a 3-CNF formula for which ∃ an assignment A with (φ, A) ∈ R }
  R = { (φ, A) : A is a satisfying assignment for φ }
–the satisfying assignment A is a "witness" of the satisfiability of φ (it "certifies" satisfiability of φ)
–R is decidable in poly-time
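A minimal sketch of such a verifier in Python, assuming a 3-CNF formula is given as a list of clauses of signed integer literals (this encoding is an assumption for illustration, not from the slides):

def verify_3sat(clauses, assignment):
    """Decide the relation R = {(phi, A) : A satisfies phi}.

    clauses: list of clauses, each a tuple of up to 3 nonzero ints;
             literal i means variable i is true, -i means it is false.
    assignment: dict mapping variable index -> bool (the witness A).
    Runs in time linear in the size of phi -- the verifier is in P.
    """
    for clause in clauses:
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False  # this clause is unsatisfied
    return True

# phi = (x1 or x2 or not x3) and (not x1 or x3)
phi = [(1, 2, -3), (-1, 3)]
print(verify_3sat(phi, {1: True, 2: False, 3: True}))   # True
print(verify_3sat(phi, {1: True, 2: False, 3: False}))  # False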

June 25, 2004CBSSS16 all languages EXP decidable NP P P  NP  EXP believe all containments proper assuming P ≠ NP, can prove a problem is hard by showing it is NP-complete

NP-complete problems
L is NP-complete if
–L is in NP
–every problem in NP reduces to L
Theorem (Cook): 3SAT is NP-complete.
–opened the door to showing that thousands of problems we care about are NP-complete

NP-complete problems
Example reduction:
–3SAT = { φ : φ is a 3-CNF Boolean formula that has a satisfying assignment } (3-CNF = AND of ORs of ≤ 3 literals)
–IS = { ⟨G, k⟩ : G is a graph with an independent set V' ⊆ V of size ≥ k } (ind. set = set of vertices, no 2 of which are connected by an edge)

Ind. Set is NP-complete
The reduction f: given
  φ = (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z) ∧ … ∧ (…)
we produce the graph G_φ with:
–one triangle for each of the m clauses, its vertices labeled by that clause's literals
–an edge between every pair of contradictory literals (such as x and ¬x)
–set k = m

Ind. Set is NP-complete
  φ = (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z) ∧ … ∧ (…)
Claim: φ has a satisfying assignment if and only if G_φ has an independent set of size at least k.
–Proof?
IS ∈ NP, so the reduction shows IS is NP-complete.
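The reduction itself is easy to write down; here is a sketch using the same clause encoding as the verifier above (the vertex and edge representations are made-up conventions for illustration):

def sat_to_independent_set(clauses):
    """Karp reduction from 3SAT to INDEPENDENT-SET (a sketch).

    Input: clauses as in verify_3sat (tuples of signed ints).
    Output: (vertices, edges, k) such that the formula is satisfiable
            iff the graph has an independent set of size k = #clauses.
    """
    vertices = []              # one vertex per literal occurrence: (clause_idx, literal)
    for i, clause in enumerate(clauses):
        for lit in clause:
            vertices.append((i, lit))

    edges = set()
    for u in vertices:
        for v in vertices:
            if u >= v:
                continue
            same_clause = (u[0] == v[0])       # triangle within a clause
            contradictory = (u[1] == -v[1])    # x paired with not-x
            if same_clause or contradictory:
                edges.add((u, v))

    return vertices, edges, len(clauses)

phi = [(1, 2, -3), (-1, 3)]
V, E, k = sat_to_independent_set(phi)
print(len(V), len(E), k)   # 5 vertices, 6 edges, k = 2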

Interlude: Transforming Turing Machine computations into Boolean circuits

Configurations
Useful convention: Turing Machine configurations. Any point in the computation is represented by a string:
  C = x_1 x_2 … x_i q x_{i+1} x_{i+2} … x_m
The start configuration of a single-tape TM on input x is:
  q_start x_1 x_2 … x_n
(picture: tape contents x_1 … x_m with the head on cell i+1 and the machine in state q)

Turing Machine tableau
Tableau (configurations written in an array) for machine M on input x = x_1 … x_n.
(picture: each row is one configuration, a cell holding a tape symbol and possibly the state; the first row is x_1/q_start x_2 … x_n _ …, and the last row contains the accepting state)
–height = time taken
–width = space used

Turing Machine tableau
Important observation: the contents of a cell in the tableau are determined by 3 cells above it (the cell directly above and its two neighbors).
(picture: two examples of a cell determined by the three cells above it)

Turing Machine tableau
Can build a Boolean circuit STEP:
–input: (binary encoding of) 3 cells
–output: (binary encoding of) 1 cell
Each output bit is some Boolean function of the inputs, so we can build a circuit for each output bit; the size of STEP is independent of the size of the tableau.

Turing Machine tableau
w copies of STEP compute row i of a width-w tableau for M on input x from row i-1.
(picture: two consecutive rows of the tableau connected by a layer of STEP circuits)

June 25, 2004CBSSS27 x 1/ q s x2x2 …xnxn _ … STEP iff cell contains q accept ignore these Circuit C built from TM M C(x) = 1 iff M accepts input x x1x1 x2x2 xnxn Size = O(width x height)

June 25, 2004CBSSS28 Back to NP-completeness

Circuit-SAT is NP-complete
Circuit-SAT: given a Boolean circuit (gates ∧, ∨, ¬) with variables y_1, y_2, …, y_m, is there some assignment that makes it output 1?
Theorem: Circuit-SAT is NP-complete.
Proof:
–clearly in NP

NP-completeness
–given L ∈ NP of the form L = { x : ∃ y such that (x, y) ∈ R }, take the circuit produced from the TM deciding R: its inputs are x_1, x_2, …, x_n, y_1, y_2, …, y_m and it outputs 1 iff (x, y) ∈ R
–hardwire the input x; leave y as variables

3SAT is NP-complete
Idea: an auxiliary variable g_i for each gate in the circuit (with inputs x_1, x_2, x_3, x_4, …, x_n), plus clauses forcing g_i to equal that gate's output:
–¬ gate with input z: (g_i ∨ z) ∧ (¬z ∨ ¬g_i)   [encodes g_i ↔ ¬z]
–∨ gate with inputs z_1, z_2: (¬z_1 ∨ g_i) ∧ (¬z_2 ∨ g_i) ∧ (¬g_i ∨ z_1 ∨ z_2)   [encodes g_i ↔ z_1 ∨ z_2]
–∧ gate with inputs z_1, z_2: (¬g_i ∨ z_1) ∧ (¬g_i ∨ z_2) ∧ (¬z_1 ∨ ¬z_2 ∨ g_i)   [encodes g_i ↔ z_1 ∧ z_2]
Finally, add the one-literal clause asserting that the output gate's variable is true; the resulting 3-CNF is satisfiable iff the circuit is.
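A short sketch of this gate-by-gate translation (the circuit encoding, one tuple per gate, is a made-up convention for illustration):

def circuit_to_3cnf(gates, output):
    """Translate a Boolean circuit into clauses over the circuit inputs and
    one auxiliary variable per gate, following the gate rules above.

    gates: list of (g, op, inputs) with op in {"NOT", "OR", "AND"};
           wire names are strings.  A literal is (name, True/False) for a
           positive/negative occurrence.
    Returns a list of clauses (each a list of literals) that is
    satisfiable iff the circuit has a satisfying assignment.
    """
    pos, neg = (lambda v: (v, True)), (lambda v: (v, False))
    clauses = []
    for g, op, ins in gates:
        if op == "NOT":
            z, = ins
            clauses += [[pos(g), pos(z)], [neg(g), neg(z)]]              # g <-> not z
        elif op == "OR":
            z1, z2 = ins
            clauses += [[neg(z1), pos(g)], [neg(z2), pos(g)],
                        [neg(g), pos(z1), pos(z2)]]                      # g <-> z1 or z2
        elif op == "AND":
            z1, z2 = ins
            clauses += [[neg(g), pos(z1)], [neg(g), pos(z2)],
                        [neg(z1), neg(z2), pos(g)]]                      # g <-> z1 and z2
    clauses.append([pos(output)])   # force the circuit's output gate to 1
    return clauses

# circuit computing (y1 AND y2) OR (NOT y1)
gates = [("g1", "AND", ["y1", "y2"]), ("g2", "NOT", ["y1"]), ("g3", "OR", ["g1", "g2"])]
print(len(circuit_to_3cnf(gates, "g3")))   # 9 clauses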

P vs. NP
Most "hard" problems we care about have turned out to be NP-complete:
–L is in NP
–every problem in NP reduces to L
It is not known whether P ≠ NP; this is arguably the most significant open problem in Computer Science and Mathematics.

L vs. P

Multitape TMs
A useful variant: the k-tape TM.
(picture: finite control in state q_0 with a read-only input tape and k-1 read/write "work tapes", one head per tape)

Space complexity
SPACE(f(n)) = languages decidable by a multi-tape TM that touches at most f(n) squares of its work tapes, where n is the input length and f : N → N.
Important space class: L = SPACE(O(log n))

L and P
–configuration graph: nodes are configurations; edge (C, C') iff C yields C' in one step
–the number of configurations of a TM that runs in space O(log n) is at most n^k for some constant k
–can determine whether q_accept or q_reject is reached from the start configuration by exploring the configuration graph (e.g. use DFS)
Conclude: L ⊆ P
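The configuration count can be made explicit (a standard back-of-the-envelope bound, not spelled out on the slide): for a machine with state set Q, work alphabet Γ and space bound c·log n, a configuration records the state, the input-head position, the work-head position, and the work-tape contents, so

  # configurations ≤ |Q| · n · (c log n) · |Γ|^{c log n} = |Q| · n · (c log n) · n^{c log |Γ|} = n^{O(1)},

and DFS over this polynomial-size graph runs in polynomial time.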

L ⊆ P ⊆ NP ⊆ EXP
(picture: nested classes L ⊆ P ⊆ NP ⊆ EXP ⊆ decidable ⊆ all languages)
–believe all containments are proper

Logspace
A question:
–given a Boolean formula with n nodes, can we evaluate it using O(log n) space?
(picture: a small formula tree with ∧, ∨, ¬ gates and 0/1 leaves)
–a depth-first traversal requires storing intermediate values
–idea: short-circuit ANDs and ORs when possible

Logspace
(picture: a small Boolean circuit with ∧, ∨, ¬ gates and 0/1 inputs)
Can we evaluate an n-node Boolean circuit using O(log n) space?
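To make the short-circuiting idea concrete, here is a small recursive evaluator for formula trees (a sketch with an ad-hoc node encoding; note that its recursion stack can still use far more than O(log n) space, which is exactly why the question on these slides is nontrivial):

def evaluate(node):
    """Evaluate a Boolean formula tree with short-circuiting.

    node is either 0, 1, a tuple ("AND"/"OR", left, right), or ("NOT", child).
    ANDs and ORs stop as soon as the first child decides the answer,
    so some subtrees are never visited -- the 'short-circuit' idea.
    """
    if node in (0, 1):
        return node
    op = node[0]
    if op == "NOT":
        return 1 - evaluate(node[1])
    left = evaluate(node[1])
    if op == "AND" and left == 0:
        return 0          # short-circuit: right subtree not needed
    if op == "OR" and left == 1:
        return 1
    return evaluate(node[2])

# ((1 OR 0) AND (NOT 0)) = 1
print(evaluate(("AND", ("OR", 1, 0), ("NOT", 0))))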

A P-complete problem
We don't know how to prove L ≠ P. But we can identify the problems in P least likely to be in L, using P-completeness.
–this needs a stronger notion of reduction (why?)
(picture: f mapping instances of L_1 to instances of L_2, preserving yes/no answers)

A P-complete problem
–logspace reduction: f computable by a TM that uses O(log n) space
Theorem: if L_2 is P-complete, then L_2 ∈ L implies L = P.

A P-complete problem
Circuit Value (CVAL): given a variable-free Boolean circuit (gates ∧, ∨, ¬, constants 0, 1), does it output 1?
Theorem: CVAL is P-complete.
Proof:
–already argued that it is in P

A P-complete problem
–let L be an arbitrary language in P, decided by a TM M in n^k steps
–take the circuit produced from the TM deciding L: its inputs are x_1, x_2, x_3, …, x_n and it outputs 1 iff x ∈ L
–hardwire the input x
–this is a poly-size circuit, and the reduction is computable in logspace

L vs. P
It is not known whether L ≠ P.
Interesting connection to parallelizability:
–a small-space algorithm automatically gives an efficient parallel algorithm
–an efficient parallel algorithm automatically gives a small-space algorithm
P-complete problems are the least likely to be parallelizable.

P vs. BPP

Communication complexity
–two parties: Alice and Bob; a function f : {0,1}^n × {0,1}^n → {0,1}
–Alice holds x ∈ {0,1}^n; Bob holds y ∈ {0,1}^n
–goal: compute f(x, y) while communicating as few bits as possible between Alice and Bob
–count the number of bits exchanged (computation is free)
–at each step: one party sends bits that are a function of its held input and the bits received so far

Communication complexity
A simple function (equality): EQ(x, y) = 1 iff x = y.
A simple protocol:
–Alice sends x to Bob (n bits)
–Bob sends EQ(x, y) to Alice (1 bit)
–total: n + 1 bits

Communication complexity
Can we do better?
–with a deterministic protocol?
–with a probabilistic protocol? (at each step one party sends bits that are a function of its held input, the bits received so far, and the result of some coin tosses; the protocol is required to output f(x, y) with high probability over all coin tosses)

Communication complexity
Theorem: no deterministic protocol can compute EQ(x, y) while exchanging fewer than n+1 bits.
Proof:
–"input matrix": rows indexed by X = {0,1}^n, columns indexed by Y = {0,1}^n, with entry (x, y) equal to f(x, y)

Communication complexity
–assume without loss of generality that 1 bit is sent at a time
–Alice sends 1 bit depending only on x: this splits the rows into the inputs x causing Alice to send 1 and the inputs x causing Alice to send 0

Communication complexity
–Bob sends 1 bit depending only on y and the received bit: this splits the columns into the inputs y causing Bob to send 1 and the inputs y causing Bob to send 0

Communication complexity
–at the end of a protocol involving k bits of communication, the matrix is partitioned into at most 2^k combinatorial rectangles
–the bits sent in the protocol are the same for every input (x, y) in a given rectangle
–conclude: f(x, y) must be constant on each rectangle

Communication complexity
–the matrix for EQ is the identity matrix: 1s on the diagonal (x = y), 0s elsewhere
–any partition into combinatorial rectangles on which f(x, y) is constant must have at least 2^n + 1 rectangles: each diagonal 1-entry needs its own rectangle (a rectangle containing two of them would also contain a 0-entry), plus at least one rectangle for the 0s
–a protocol that exchanges ≤ n bits can only create 2^n rectangles, so it must exchange at least n+1 bits

Communication complexity
A protocol for EQ employing randomness?
–Alice picks a random prime p in {1, …, 4n^2} and sends p and (x mod p)
–Bob sends (y mod p)
–the players output 1 if and only if (x mod p) = (y mod p)

Communication complexity
–O(log n) bits exchanged
–if x = y, always correct
–if x ≠ y, incorrect if and only if p divides |x – y|
–the number of primes in the range {1, …, 4n^2} is ≥ 2n
–the number of primes dividing |x – y| is ≤ n (since |x – y| < 2^n)
–probability of being incorrect ≤ 1/2
Randomness gives an exponential advantage!
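A runnable sketch of this fingerprinting protocol (the trial-division prime sampler is just a stand-in for illustration):

import random

def is_prime(m):
    if m < 2:
        return False
    return all(m % d for d in range(2, int(m ** 0.5) + 1))

def eq_protocol(x_bits, y_bits):
    """Randomized protocol for EQ on n-bit strings.

    Alice picks a random prime p <= 4n^2 and sends p and (x mod p);
    Bob replies with (y mod p).  Output 1 iff the residues match.
    Communication: O(log n) bits; error probability <= 1/2 when x != y.
    """
    n = len(x_bits)
    x, y = int(x_bits, 2), int(y_bits, 2)
    primes = [m for m in range(2, 4 * n * n + 1) if is_prime(m)]
    p = random.choice(primes)          # Alice's coin tosses
    return int(x % p == y % p)         # compare the two fingerprints

print(eq_protocol("101101", "101101"))  # always 1
print(eq_protocol("101101", "101100"))  # 0 with probability >= 1/2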

BPP
Model: the probabilistic Turing Machine
–a deterministic TM with an additional read-only tape containing "coin flips"
BPP (Bounded-error Probabilistic Poly-time)
–L ∈ BPP if there is a p.p.t. TM M such that
  x ∈ L ⇒ Pr_y[M(x, y) accepts] ≥ 2/3
  x ∉ L ⇒ Pr_y[M(x, y) rejects] ≥ 2/3
–"p.p.t." = probabilistic polynomial time

June 25, 2004CBSSS57 all languages EXP decidable P L  P  BPP  EXP L BPP

P vs. BPP
Theorem: if E (= TIME(2^{O(n)})) contains a hard problem (a language that requires exponential-size Boolean circuits), then P = BPP.
"For decision problems, randomness probably does not help."