Intro to Theory of Computation


Intro to Theory of Computation
LECTURE 20
Last time: computation history method; review, test.
Today: recursion theorem; complexity theory.
CS 332, Sofya Raskhodnikova

Recursion Theorem
Making TMs that can obtain their own descriptions, with applications to computer viruses.
10/13/2019

A TM P_w that prints w
There is a computable function q that, on input w, outputs 〈P_w〉, where P_w is a TM that prints w:
P_w = ``Erase input. Print w and halt.''
TM computing q: ``On input w, print 〈P_w〉 and halt.''
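As a sketch, the map w ↦ P_w can be mimicked in Python: here q returns the source text of a program that prints w. The function name q follows the slide; the Python framing is an analogy for illustration, not the TM construction itself.

```python
import io
import contextlib

def q(w: str) -> str:
    """Return the source of a program P_w that prints w and halts."""
    return f"print({w!r})"

# Running the produced "description" of P_w prints exactly w.
p_w = q("hello")            # the source text "print('hello')"
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(p_w)               # simulate running P_w
```

Note that q itself never runs P_w when computing 〈P_w〉; it only writes the description down, just as on the slide.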

TM S that prints 〈S〉
Theorem. There is a TM S that erases its input, prints 〈S〉, and halts.
Proof: Let q(w) = 〈P_w〉. Define
S = ``Erase input. Run TM A. Run TM B. Halt.''
A = P_〈B〉
B = ``On input 〈M〉, where M is a TM:
  1. Compute 〈P_〈M〉〉 = q(〈M〉).
  2. Construct TM S′ = ``Erase input. Run TM P_〈M〉. Run TM M. Halt.''
  3. Output 〈S′〉 and halt.''

In English
Write this sentence.
Write two copies of the following, the second one in quotes:
``Write two copies of the following, the second one in quotes:''
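The same self-reference works in code. A minimal Python quine, printing its own source, mirrors the English sentence above:

```python
# s is the quoted half (the data); the print writes two copies of it,
# the second one "in quotes" via the %r (repr) conversion.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

The output of this two-line program is the two-line program itself, playing the roles of A (the data) and B (the instructions) from the previous slide.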

Recursion Theorem
If there is a TM T that computes a function t(w, 〈M〉), then there is a TM R that computes r(w) = t(w, 〈R〉).
Punchline: ``Obtain your own description'' is a valid step in an algorithmic description of a TM.
Construction: R = ``On input w: place # after w. Run TM A. Run TM B. Run TM T.''
A = P′_〈B,T〉, where P′ is P modified to leave w# on the tape; R accepts if T accepts and rejects if T rejects.

Recursion Theorem (continued)
B = ``On input w#〈M₁, M₂〉, where M₁, M₂ are TMs:
  1. Compute 〈P′_〈M₁,M₂〉〉 = q′(〈M₁〉, 〈M₂〉).
  2. Construct TM R′ = ``On input w: place # after w. Run TM P′_〈M₁,M₂〉. Run TM M₁. Run TM M₂.''
  3. Output w#〈R′〉 and halt.''

Application of the recursion theorem
Give an alternative proof that A_TM is undecidable. (on the board)

Application of the recursion theorem
A TM M is minimal if there is no TM equivalent to M that has a shorter description than 〈M〉.
MIN_TM = {〈M〉 ∣ M is a minimal TM}.
Show that MIN_TM is not Turing-recognizable. (on the board)

Complexity Theory
Already covered: automata theory, computability theory.
Last unit: complexity theory. First topic: time complexity.
- Measuring complexity (as in Algorithms)
- Asymptotic notation (as in Algorithms)
- Relationships between models

How much time/memory is needed to decide a language?
Example: Consider A = {0^m 1^m ∣ m ≥ 0}. How much time does a 1-tape TM need?
M₁ = ``On input w:
  1. Scan the input and reject if it is not of the form 0*1*.
  2. Repeat while both 0s and 1s remain on the tape: cross off one 0 and one 1.
  3. Accept if no 0s and no 1s are left; otherwise reject.''
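M₁'s stages translate directly into code. A Python sketch of the same crossing-off procedure (the name decide_A is mine):

```python
def decide_A(w: str) -> bool:
    """Decide A = {0^m 1^m : m >= 0} by M1's crossing-off procedure."""
    tape = list(w)
    # Stage 1: scan the input; reject if it is not of the form 0*1*.
    seen_one = False
    for c in tape:
        if c not in "01":
            return False
        if c == "1":
            seen_one = True
        elif seen_one:          # a 0 appearing after a 1
            return False
    # Stage 2: repeatedly cross off one 0 and one 1.
    while "0" in tape and "1" in tape:
        tape[tape.index("0")] = "x"
        tape[tape.index("1")] = "x"
    # Stage 3: accept iff no 0s and no 1s remain.
    return "0" not in tape and "1" not in tape
```

As with the 1-tape TM, stage 2 makes O(n) passes and each pass scans O(n) cells (the `index` calls walk the tape), for O(n²) steps in total.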

Running time analysis
Focus on the worst case: an upper bound on the running time over all inputs of a given length.
The exact time depends on the computer, so instead we measure asymptotic growth.
Definition: If M is a TM and f : ℕ → ℕ, then ``M runs in time f(n)'' means that for every input w ∈ Σ* of length n, M on w halts within f(n) steps.

Asymptotic notation
O-notation (upper bounds): f(n) = O(g(n)) means there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀. (These are statements about functions, not values.)
EXAMPLE: 2n² = O(n³) (c = 2, n₀ = 1)
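The constants c and n₀ in the definition are witnesses that can be spot-checked numerically. A small sketch (the helper name is mine; a finite check over a range supports, but does not prove, the asymptotic claim):

```python
def is_O_witness(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for all n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n^2 = O(n^3) with witnesses c = 2, n0 = 1, as on the slide:
print(is_O_witness(lambda n: 2 * n * n, lambda n: n ** 3, c=2, n0=1))
```

A claim with no valid witnesses, such as n³ = O(n²), fails the check for any fixed c once n is large enough.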

Asymptotic Notation
One-sided equality: T(n) = O(f(n)). The ``='' is not a true equality: for f(n) = 5n³ and g(n) = 3n² we have f(n) = O(n³) and g(n) = O(n³), but f(n) ≠ g(n).
Alternative notation: f(n) ∈ O(g(n)).

Examples
10⁶·n³ + 2n² − n + 10 = O(n³)
√n + log n = O(√n)
√n·(log n + √n) = O(n), which is also O(n²)

Ω-notation (lower bounds)
O-notation is an upper-bound notation; it makes no sense to say f(n) is at least O(n²).
f(n) = Ω(g(n)) means there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀.
EXAMPLE: √n = Ω(log n) (c = 1, n₀ = 16)
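The witnesses c = 1, n₀ = 16 can be spot-checked the same way as for O-notation (here taking log base 2, so that equality holds exactly at n = 16; a finite check, not a proof):

```python
import math

# sqrt(n) >= 1 * log2(n) for all n >= 16; both sides equal 4 at n = 16.
ok = all(math.sqrt(n) >= 1 * math.log2(n) for n in range(16, 50_000))
print(ok)
```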

Ω-notation (lower bounds)
Be careful: ``Any comparison-based sorting algorithm requires at least O(n log n) comparisons'' is meaningless! Use Ω for lower bounds: it requires Ω(n log n) comparisons.

Θ-notation (tight bounds)
Θ(g(n)) means both O(g(n)) and Ω(g(n)).
EXAMPLE: Polynomials are simple: a_d·n^d + a_{d−1}·n^{d−1} + ⋯ + a₁·n + a₀ = Θ(n^d) (for a_d > 0).

o-notation and -notation O-notation and -notation are like  and . o-notation and -notation are like < and >. f(n) = o(g(n)) means for every constant c > 0, there exists a constants n0 > 0 such that 0  f(n)  cg(n) for all n  n0. EXAMPLE: 2n2 = o(n3) (n0 = 2/c) 10/13/2019

Overview. The last column describes lim_{n→∞} f(n)/g(n).

Notation        | Means                                       | Think              | Example              | Limit
f(n) = O(g(n))  | ∃ c>0, n₀>0, ∀ n > n₀: 0 ≤ f(n) < c·g(n)    | Upper bound        | 100n² = O(n³)        | if it exists, it is < ∞
f(n) = Ω(g(n))  | ∃ c>0, n₀>0, ∀ n > n₀: 0 ≤ c·g(n) < f(n)    | Lower bound        | 2ⁿ = Ω(n¹⁰⁰)         | if it exists, it is > 0
f(n) = Θ(g(n))  | both of the above: f = Ω(g) and f = O(g)    | Tight bound        | log(n!) = Θ(n log n) | if it exists, it is > 0 and < ∞
f(n) = o(g(n))  | ∀ c>0, ∃ n₀>0, ∀ n > n₀: 0 ≤ f(n) < c·g(n)  | Strict upper bound | n² = o(2ⁿ)           | limit exists and is = 0
f(n) = ω(g(n))  | ∀ c>0, ∃ n₀>0, ∀ n > n₀: 0 ≤ c·g(n) < f(n)  | Strict lower bound | n² = ω(log n)        | limit exists and is = ∞

Common Functions: Asymptotic Bounds
Polynomials. a₀ + a₁n + … + a_d·n^d is Θ(n^d) if a_d > 0.
Logarithms. log_a n = Θ(log_b n) for all constants a, b > 1, so we can avoid specifying the base. For every x > 0, log n = o(n^x): log grows slower than every polynomial.
Exponentials. For all r > 1 and all d > 0, n^d = o(r^n): every exponential grows faster than every polynomial.
Factorial. n! = n(n−1)⋯1. By Stirling's formula, log(n!) = Θ(n log n).
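These relative growth rates show up numerically as vanishing ratios. A quick sketch of the two o(·) claims (the sample points and exponents are my choices; a numeric illustration, not a proof):

```python
import math

# log n = o(n^x) for every x > 0: try x = 0.1; the ratio shrinks as n grows.
log_ratios = [math.log(n) / n ** 0.1 for n in (10**6, 10**9, 10**12)]

# n^d = o(r^n) for every r > 1, d > 0: try d = 3, r = 1.01.
poly_ratios = [n**3 / 1.01**n for n in (1_000, 2_000, 5_000)]

print(log_ratios)
print(poly_ratios)
```

Even a tiny exponent like x = 0.1 eventually beats log n, and even a base as close to 1 as r = 1.01 eventually beats n³.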

Sort by asymptotic order of growth
n log n, log n, n², 2ⁿ, n, log log n, n!, n^1,000,000, n^(1/log n), log(n!), n/2, 2^(n/2), 2^(2n)
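Plugging in one large fixed n gives a first guess at the ordering for the slower-growing entries. A finite-n comparison only suggests the asymptotic order, and ties in Θ-class can land either way; note that n^(1/log n) is actually a constant (with natural log it equals e for every n > 1):

```python
import math

n = 2**10
values = {
    "log log n": math.log(math.log(n)),
    "n**(1/log n)": n ** (1 / math.log(n)),   # = e for every n > 1
    "log n": math.log(n),
    "n": n,
    "log(n!)": math.lgamma(n + 1),            # log(n!) = Theta(n log n)
    "n log n": n * math.log(n),
    "n**2": n ** 2,
}
order = sorted(values, key=values.get)
print(order)
```

The huge entries (2ⁿ, n!, 2^(2n)) are left out here because their exact values overflow floats long before the ordering stabilizes; comparing their logarithms instead would handle them.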

Time complexity classes
TIME(f(n)) is a class of languages: A ∈ TIME(f(n)) means that some 1-tape TM M that runs in time O(f(n)) decides A.

The class P
P is the class of languages decidable in polynomial time on a deterministic 1-tape TM: P = ⋃_k TIME(n^k).
The class stays the same even if we substitute another reasonable deterministic model. Roughly, P is the class of problems realistically solvable on a computer.