1 Jeff Edmonds York University COSC 2011 Abstract Data Types Positions and Pointers Loop Invariants System Invariants Time Complexity Classifying Functions Adding Made Easy Understand Quantifiers Recursion Midterm Review
2-9 Midterm Review Review the slides. Review the assignment notes and solutions! Review 3101: Steps0: Basic Math (First Order Logic, Time Complexity, Logs and Exp, Growth Rates, Adding Made Easy, Recurrence Relations); Steps1: Loop Invariants; Steps2: Recursion.
10 Jeff Edmonds York University COSC 2011 Lecture 1 Abstractions (Hierarchy) Elements Sets Lists, Stacks, & Queues Trees Graphs Iterators Abstract Positions/Pointers Abstract Data Types
11 Software Engineering Software must be: Readable and understandable: allows correctness to be verified and software to be easily updated. Correct and complete: works correctly for all expected inputs. Robust: capable of handling unexpected inputs. Adaptable: all programs evolve over time; programs should be designed so that re-use, generalization and modification are easy. Portable: easily ported to new hardware or operating system platforms. Efficient: makes reasonable use of time and memory resources. (James Elder)
12 Abstract Data Types (ADTs) An ADT is a model of a data structure that specifies the type of data stored and the operations supported on these data. An ADT does not specify how the data are stored or how the operations are implemented. The abstraction of an ADT facilitates: Design of complex systems. Representing complex data structures by concise ADTs facilitates reasoning about and designing large systems of many interacting data structures. Encapsulation/Modularity. If I just want to use an object / data structure, all I need to know is its ADT (not its internal workings). (James Elder)
13 Abstract Data Types Restricted Data Structures: sometimes we limit which operations are allowed, for efficiency or for ease of understanding. Stack: a list, but elements can only be pushed onto and popped from the top. Queue: a list, but elements can only be added at the end and removed from the front. Important in handling jobs. Priority Queue: the "highest priority" element is handled next.
14 Data Structures Implementations Array List –(Extendable) Array Node List –Singly or Doubly Linked List Stack –Array –Singly Linked List Queue –Circular Array –Singly or Doubly Linked List Priority Queue –Unsorted doubly-linked list –Sorted doubly-linked list –Heap (array-based) Adaptable Priority Queue –Sorted doubly-linked list with location-aware entries –Heap with location-aware entries Tree –Linked Structure Binary Tree –Linked Structure –Array
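As a concrete instance of one row of this table, a Stack backed by a singly linked list can be sketched as follows (a minimal illustration; the class and method names here are made up, not the course's exact API):

```java
// Stack implemented as a singly linked list: both operations are O(1).
class LinkedStack<E> {
    private static class Node<E> {
        E element;
        Node<E> next;
        Node(E e, Node<E> n) { element = e; next = n; }
    }

    private Node<E> top = null;   // invariant: top points at the most recently pushed node
    private int size = 0;

    public void push(E e) { top = new Node<>(e, top); size++; }

    public E pop() {
        if (top == null) throw new java.util.EmptyStackException();
        E e = top.element;
        top = top.next;           // unlink the old top
        size--;
        return e;
    }

    public int size() { return size; }
}
```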
15 Jeff Edmonds York University COSC 2011 Lecture 2 Abstract Positions/Pointers Positions in an Array Pointers in C References in Java Implementing Positions in Trees Building Trees Positions and Pointers
16 High Level Positions/Pointers Positions: given a data structure, we want to have one or more current elements that we are considering. Conceptualizations: fingers in pies, pins on maps. See Goodrich Sec 7.3, Positional Lists.
17 Positions/Pointers: Implementations of Positions/Pointers Now let's redo it in Java. (Diagram: head holds address 2039; the node at address 2039 holds element 5 and a next field; a second node sits at address 2182.) In an assignment such as head.next = ..., the right-hand side of the "=" specifies a memory location, and so does its left-hand side; the action is to put the value contained in the first into the second.
18 Implementing Positions in Trees
class LinkedBinaryTree<E> {
  class Node {
    E element;
    Node parent;
    Node left;
    Node right;
  }
  private Node root = null;
}
19 Implementing Positions in Trees At any time the user can move a position to the sibling: p3 = tree.sibling(p2);
class LinkedBinaryTree<E> {
  Position sibling(Position p) {
    Node n = p;
    if (n.parent != null)
      if (n.parent.right == n) return n.parent.left;
      else return n.parent.right;
    else throw new IllegalArgumentException("p is the root");
  }
}
20 Implementing Positions in Trees At any time the user can add a position/node to the right of a position: p3 = tree.addRight(p2, "Toronto");
class LinkedBinaryTree<E> {
  Position addRight(Position p, E e) {
    Node n = p;
    if (n.right == null) {
      n.right = new Node(e, n, null, null);
      return n.right;
    } else throw new IllegalArgumentException("p already has a right child");
  }
}
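Slides 18-20 can be collected into one runnable sketch. The addRoot and addLeft helpers are my additions (not on the slides) so the example is self-contained:

```java
// Runnable sketch of the slides' LinkedBinaryTree with sibling and addRight.
class LinkedBinaryTree<E> {
    static class Node<E> {
        E element; Node<E> parent, left, right;
        Node(E e, Node<E> p) { element = e; parent = p; }
    }

    Node<E> root = null;

    Node<E> addRoot(E e) { root = new Node<>(e, null); return root; }

    Node<E> addLeft(Node<E> n, E e) {
        if (n.left != null) throw new IllegalArgumentException("p already has a left child");
        return n.left = new Node<>(e, n);
    }

    // Slide 20: add a new node to the right of p.
    Node<E> addRight(Node<E> n, E e) {
        if (n.right != null) throw new IllegalArgumentException("p already has a right child");
        return n.right = new Node<>(e, n);
    }

    // Slide 19: note the == comparison and the root check first.
    Node<E> sibling(Node<E> n) {
        if (n.parent == null) throw new IllegalArgumentException("p is the root");
        return (n.parent.right == n) ? n.parent.left : n.parent.right;
    }
}
```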
21 Implementing Positions in Trees Defining the class of trees whose nodes can have many children: we use a Set or List data structure to store the Positions of a node's children. class LinkedTree { ...
22 Jeff Edmonds York University COSC 2011 Lecture 3 Contracts Assertions Loop Invariants The Sum of Objects Insertion and Selection Sort Binary Search Like Examples Bucket (Quick) Sort for Humans Reverse Polish Notation (Stack) Who's Blocking Your View (Stack) Parsing (Stack) Data Structure Invariants Stack and Queue in an Array Linked Lists Contracts, Assertions, and Invariants
23 One Step at a Time Precondition and Postcondition: I implored you to not worry about the entire computation; it can be difficult to understand where computations go. Trust who passes you the baton and go around once.
24 Iterative Algorithm with Loop Invariants Precondition: what is true about the input. Postcondition: what is true about the output.
25 Iterative Algorithm with Loop Invariants Goal: prove that no matter what the input is, as long as it meets the precondition, and no matter how many times your algorithm iterates, as long as eventually the exit condition is met, the postcondition is guaranteed to be achieved. This proves that IF the program terminates then it works.
26 Iterative Algorithm with Loop Invariants Loop Invariant: Picture of what is true at top of loop.
27 Iterative Algorithm with Loop Invariants Establishing the Loop Invariant: our computation has just begun; all we know is that we have an input instance that meets the precondition. Being lazy, we want to do the minimum work (codeA), and then prove that it follows that the Loop Invariant is made true.
28 Iterative Algorithm with Loop Invariants Maintaining the loop invariant (while making progress): we arrive at the top of the loop knowing only that the Loop Invariant is true and the Exit Condition is not. We take one step/iteration (codeB), making some kind of progress, and then prove that the Loop Invariant will again be true when we arrive back at the top of the loop.
29 Iterative Algorithm with Loop Invariants Obtain the Postcondition: we know the Loop Invariant is true because we have maintained it; we know the Exit Condition is true because we exited. We do a little extra work (codeC), and then prove that it follows that the Postcondition is true.
30 Iterative Algorithm with Loop Invariants 88 14 98 25 62 52 79 30 23 31 14,23,25,30,31,52,62,79,88,98 Precondition: What is true about input Post condition: What is true about output. Insertion Sort
31 Iterative Algorithm with Loop Invariants Loop Invariant: Picture of what is true at top of loop. 14 98 25 62 79 30 23,31,52,88 Sorted sub-list
32 Iterative Algorithm with Loop Invariants Making progress while maintaining the loop invariant: 6 elements still "walking to school". Take 62 out of the unsorted part (14, 98, 25, 62, 79, 30 with sorted sub-list 23,31,52,88) and insert it into the sorted sub-list (leaving 14, 98, 25, 79, 30 with sorted sub-list 23,31,52,62,88).
33 Iterative Algorithm with Loop Invariants Beginning & Ending: at the start, all n elements of 88 14 98 25 62 52 79 30 23 31 are unsorted (n elements to school); at the end, the list is 14,23,25,30,31,52,62,79,88,98 (0 elements to school).
34 Iterative Algorithm with Loop Invariants Running Time: 1 + 2 + 3 + ... + n = Θ(n²).
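The quadratic-time insertion sort described above can be sketched in Java, with the loop invariant recorded as a comment (a minimal version, not the course's exact code):

```java
// Insertion sort: Θ(n^2) in the worst case.
class InsertionSort {
    // Loop invariant: a[0..i-1] is a sorted sub-list;
    // a[i..n-1] are the elements still "walking to school".
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {  // shift larger elements one slot right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;                 // invariant restored: a[0..i] sorted
        }
    }
}
```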
35 Iterative Algorithm with Loop Invariants The steps: Define Problem; Define Loop Invariants; Define Measure of Progress (e.g. 79 km to school); Define Step; Define Exit Condition; Maintain Loop Invariant; Make Progress; Initial Conditions; Ending.
36 Iterative Algorithm with Loop Invariants This proves that IF the program terminates then it works: codeA establishes the Loop Invariant; codeB maintains the Loop Invariant while the Exit Condition is false; codeC cleans up loose ends after the Exit.
37 Iterative Algorithm with Loop Invariants Binary Search Precondition: what is true about the input: a sorted list and a key (key = 25; list = 3, 5, 6, 13, 18, 21, 25, 36, 43, 49, 51, 53, 60, 72, 74, 83, 88, 91, 95). Postcondition: what is true about the output.
38 Iterative Algorithm with Loop Invariants Loop Invariant: picture of what is true at the top of the loop: if the key is contained in the original list, then the key is contained in the sub-list.
39 Iterative Algorithm with Loop Invariants Making progress while maintaining the loop invariant: compare the key with the middle element. If key ≤ mid, then the key is in the left half. If key > mid, then the key is in the right half.
40 Iterative Algorithm with Loop Invariants Running Time: the sub-list is of size n, n/2, n/4, n/8, ..., 1. Each step takes Θ(1) time. Total = Θ(log n).
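The narrowing sub-list can be written out as code; the invariant comment matches slide 38 (a minimal sketch, assuming the list is an int array):

```java
// Binary search: Θ(log n) since the sub-list halves each iteration.
class BinarySearch {
    // Loop invariant: if key is in a[], then key is in the sub-list a[lo..hi].
    static int search(int[] a, int key) {
        if (a.length == 0) return -1;
        int lo = 0, hi = a.length - 1;
        while (lo < hi) {
            int mid = (lo + hi) / 2;
            if (key <= a[mid]) hi = mid;   // key, if present, is in the left half
            else lo = mid + 1;             // key, if present, is in the right half
        }
        return (a[lo] == key) ? lo : -1;   // sub-list of size 1: check it
    }
}
```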
41 Iterative Algorithm with Loop Invariants Beginning & Ending: the same key = 25 example at the start (the sub-list is the whole sorted list) and at the exit (0 km to go).
42 Parsing with a Stack Input: a string of brackets. Output: each "(", "{", or "[" must be paired with a matching ")", "}", or "]". Loop Invariant: a prefix has been read; matched brackets have been matched and removed; unmatched open brackets are on the stack (e.g. the stack holds "[ ("). Opening bracket: push it on the stack. Closing bracket: if it matches the bracket on top of the stack, pop and match; else return(unmatched).
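A minimal Java version of this bracket matcher, with the loop invariant as a comment (the class and method names are my own):

```java
import java.util.ArrayDeque;
import java.util.Deque;

class Brackets {
    // Loop invariant: in the prefix read so far, matched pairs have been
    // removed; unmatched opening brackets sit on the stack, deepest first.
    static boolean matched(String s) {
        Deque<Character> stack = new ArrayDeque<>();
        for (char c : s.toCharArray()) {
            if (c == '(' || c == '[' || c == '{') {
                stack.push(c);                     // opening bracket: push
            } else if (c == ')' || c == ']' || c == '}') {
                if (stack.isEmpty()) return false; // closing with nothing to match
                char open = stack.pop();
                if ((c == ')' && open != '(') ||
                    (c == ']' && open != '[') ||
                    (c == '}' && open != '{')) return false;
            }
        }
        return stack.isEmpty();                    // leftover openings are unmatched
    }
}
```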
43 Dude! You have been teaching 3101 too long. This is not a course on Algorithms, but on Data Structures! Data Structure Invariants The importance of invariants is the same. Differences: 1. An algorithm must terminate with an answer, while systems and data structures may run forever. 2. An algorithm gets its full input at the beginning, while a data structure gets a continuous stream of instructions from the user. Both have invariants that must be maintained.
44 Data Structure Invariants Assume we fly in from Mars and the data structure invariants hold at time t. Assume the user correctly calls the Push operation, so preCond(Push) holds: the input is the info for a new element. The implementer must ensure postCond(Push): the element is pushed on top of the stack and the data structure invariants hold at time t+1. This mirrors maintaining a loop invariant.
45 Data Structure Invariants Push operation (maintaining the invariant from time t to time t+1): top = top + 1; A[top] = info;
46 Data Structure Invariants Queue: add and remove from opposite ends.
Algorithm dequeue():
  if isEmpty() then throw EmptyQueueException
  else
    e ← A[bottom]
    bottom ← (bottom + 1) mod N
    return e
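A runnable sketch of the circular-array queue. It uses front/size bookkeeping rather than the slide's exact fields (an assumption, made so emptiness and fullness are easy to test), but the mod-N wraparound is the same:

```java
// Circular-array queue: enqueue and dequeue both O(1).
class ArrayQueue {
    private final int[] A;
    private int front = 0, size = 0;  // invariant: elements occupy A[front .. front+size-1 mod N]

    ArrayQueue(int capacity) { A = new int[capacity]; }

    void enqueue(int e) {
        if (size == A.length) throw new IllegalStateException("full");
        A[(front + size) % A.length] = e;  // add at the rear, wrapping mod N
        size++;
    }

    int dequeue() {
        if (size == 0) throw new IllegalStateException("empty");  // EmptyQueueException on the slide
        int e = A[front];
        front = (front + 1) % A.length;    // advance the front, wrapping mod N
        size--;
        return e;
    }
}
```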
47 Data Structure Invariants Don't panic. Just draw the pictures and move the pointers. (Pictures: the Push operation takes the invariants at time t, through preCond(Push) and postCond(Push), to the invariants at time t+1; the same works for the Special Case: Empty.)
50 Data Structure Invariants How about removing an element from the rear? Is it so easy??? No: last must point at the second-last element. How do we find it? You have to walk there from first! Time Θ(# of elements) instead of constant.
51 Data Structure Invariants Stack: add and remove from the same end. Front: add element in constant time, remove element in constant time. Rear: time Θ(n). Actually, for a Stack the last pointer is not needed.
52 Data Structure Invariants Stack: add and remove from the same end. Queue: add and remove from opposite ends. (Front operations: constant time. Removing from the rear: time Θ(n).)
53 Data Structure Invariants Doubly-linked lists (header and trailer sentinel nodes/positions holding the elements) allow more flexible list operations: adding and removing an element at either end takes constant time.
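A sketch of why the doubly linked version fixes the Θ(n) rear removal: with header/trailer sentinels and prev pointers, the second-to-last node is one hop away (names here are illustrative, not the course's exact class):

```java
// Doubly linked list with sentinels: addLast and removeLast are both O(1).
class DoublyLinkedList {
    static class Node { int element; Node prev, next; }

    private final Node header = new Node(), trailer = new Node();
    private int size = 0;

    DoublyLinkedList() { header.next = trailer; trailer.prev = header; }

    void addLast(int e) {
        Node n = new Node();
        n.element = e;
        n.prev = trailer.prev; n.next = trailer;  // splice in just before the trailer
        trailer.prev.next = n; trailer.prev = n;
        size++;
    }

    int removeLast() {
        if (size == 0) throw new IllegalStateException("empty");
        Node n = trailer.prev;                    // last real node, reached in O(1) via prev
        n.prev.next = trailer; trailer.prev = n.prev;
        size--;
        return n.element;
    }
}
```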
54 Data Structure Invariants Exit
55 Jeff Edmonds York University COSC 2011 Lecture 4 Asymptotic Analysis of Time Complexity History of Classifying Problems Growth Rates Time Complexity Linear vs Constant Time Binary Search Time Θ(log n) Insertion Sort Time (Quadratic) Don't Redo Work Test (Linear) vs Search (Exponential) Multiplying (Quadratic vs Exponential) Bits of Input Cryptography Amortized Time Complexity Worst Case Input Classifying Functions (BigOh) Adding Made Easy Logs and Exponentials Understand Quantifiers
56 Some Math Time Complexity: t(n) = Θ(n²) (input size vs time). Classifying Functions: f(n) = n^Θ(1). Logs and Exps: 2^a × 2^b = 2^(a+b); 2^(log n) = n. Adding Made Easy: ∑_{i=1..n} f(i). Logic Quantifiers: ∃g ∀b Loves(b,g) vs ∀b ∃g Loves(b,g). Recurrence Relations: T(n) = a·T(n/b) + f(n).
57 The Time Complexity of an Algorithm Specifies how the running time depends on the size of the input: a function mapping the "size" of the input (work for me to give you the instance) to the "time" T(n) executed (work for you to solve it).
58 History of Classifying Problems Computable. Exp = 2ⁿ: brute force (infeasible). Poly = n^c: considered feasible (the mathematicians' dream). Quadratic = n²: slow sorting. n log n: fast sorting. Linear = n: look at the input. log n: binary search. Constant: time does not depend on the input. Halting: impossible.
59 Growth Rates (Plot of time vs input size for 5, log n, n, n², and 2ⁿ, with the labels from the previous slide: constant time, binary search, look at the input, slow sorting, brute force.)
60 Linear vs Constant Time (# of records = n) Search: Input: a linked list. Output: find the end. Alg: walk there. Time = n. Insert Front: Input: a linked list. Output: add a record to the front. Alg: play with pointers. Time = 4.
61 Linear vs Constant Time Time = 4 = constant time = O(1): time does not "depend" on the input. Formally: ∃ a Java program J, ∃ an integer k, ∀ inputs I, Time(J,I) ≤ k. Is a running time that varies with the input still "constant time" = O(1)? Yes, as long as it is bounded by a constant.
62 Test vs Search Test/Evaluate: Input: a circuit and an assignment (e.g. x₁=F, x₂=F, x₃=T into a circuit of ANDs, ORs, and NOTs). Output: the value at the output. Alg: let the values percolate down. Time: Θ(# of gates). Search/Satisfiability: Input: a circuit. Output: an assignment giving true. Alg: try all assignments (brute force). Time: 2ⁿ.
63 Grade School vs Kindergarten Kindergarten: a × b = a + a + a + ... + a (b copies); running time T = Time(multiply) = θ(b) = linear in the value. Grade school (the digit-by-digit algorithm): T(n) = Time(multiply) = θ(n²) = quadratic in the number of digits n. Which is faster? Example: 92834765225674897 × 83883977590110394875.
64 Size of Input Instance Size of paper: 2 inches (intuitive). # of bits: n = 17 bits (formal). # of digits: n = 5 digits (reasonable). Value: n = 83920 (unreasonable). # of bits = log₂(Value); Value = 2^(# of bits).
65 The Time Complexity of an Algorithm Specifies how the running time depends on the size of the input: a function mapping the "size" of the input (work for me to give you the instance) to the "time" T(n) executed (work for you to solve it).
66 Grade School vs Kindergarten Which is faster? 92834765225674897 × 8388397759011039475. Grade school: n = # digits = 20, so Time ≈ 20² ≈ 400. Kindergarten: b = value = 8388397759011039475, so Time ≈ 8388397759011039475.
67 Grade School vs Kindergarten n = # digits = 20. Grade school: Time ≈ 20² ≈ 400. Kindergarten: b = value ≈ 10ⁿ, so Time ≈ 10ⁿ ≈ exponential!!!
68 Grade School vs Kindergarten n = # digits = 20. Grade school: Time ≈ 20² ≈ 400. Kindergarten: adding a single digit multiplies the time by 10!
69 Time Complexity of Algorithm The time complexity of an algorithm is the largest time required on any input of size n. O(n²): prove that for every input of size n, the algorithm takes no more than cn² time. Ω(n²): find one input of size n for which the algorithm takes at least this much time. θ(n²): do both.
70 Time Complexity of Problem The time complexity of a problem is the time complexity of the fastest algorithm that solves the problem. O(n²): provide an algorithm that solves the problem in no more than this time. Ω(n²): prove that no algorithm can solve it faster. θ(n²): do both.
71 Classifying Functions Constant: θ(1). Poly-logarithmic: (log n)^θ(1), e.g. (log n)⁵. Polynomial: n^θ(1), e.g. n⁵. Exponential: 2^θ(n), e.g. 2^(5n), and 2^(n^θ(1)), e.g. 2^(n⁵). Double exponential: 2^(2^θ(n)), e.g. 2^(2^(n⁵)). Each class is << the next.
72 Classifying Functions Polynomial = n^θ(1): Linear θ(n), Quadratic θ(n²), Cubic θ(n³), θ(n⁴), ... Others, e.g. θ(n³ log⁷(n)): the log(n) is not absorbed because it is not a multiplicative constant.
73 BigOh and Theta? 5n² + 8n + 2 log n = θ(n²); 5n² log n + 8n + 2 log n = θ(n² log n). Drop low-order terms; drop the multiplicative constant.
74 Notations Theta: f(n) = θ(g(n)) means f(n) ≈ c·g(n). BigOh: f(n) = O(g(n)) means f(n) ≤ c·g(n). Omega: f(n) = Ω(g(n)) means f(n) ≥ c·g(n). Little Oh: f(n) = o(g(n)) means f(n) << c·g(n). Little Omega: f(n) = ω(g(n)) means f(n) >> c·g(n).
75 Definition of Theta f(n) = θ(g(n)): f(n) is sandwiched between c₁·g(n) and c₂·g(n), for some sufficiently small c₁ (e.g. 0.0001) and some sufficiently large c₂ (e.g. 1000), for all sufficiently large n (for some definition of "sufficiently large").
80 Adding Made Easy Gauss, Arithmetic Sum: ∑_{i=1..n} i = 1 + 2 + 3 + ... + n = θ(# of terms · last term) = θ(n²).
81 Adding Made Easy Arithmetic Sum: ∑_{i=1..n} i³ = 1³ + 2³ + 3³ + ... + n³ = θ(# of terms · last term) = θ(n⁴). True whenever the terms increase slowly.
82 Adding Made Easy Geometric Increasing: ∑_{i=0..n} rⁱ = r⁰ + r¹ + r² + ... + rⁿ = θ(biggest term).
83 Adding Made Easy (for +, -, ×, ÷, exp, log functions f(n)) Geometric Like: if f(n) ≥ 2^Ω(n), then ∑_{i=1..n} f(i) = θ(f(n)). Arithmetic Like: if f(n) = n^(θ(1)-1), then ∑_{i=1..n} f(i) = θ(n · f(n)). Harmonic: if f(n) = 1/n, then ∑_{i=1..n} f(i) = log_e n + θ(1). Bounded Tail: if f(n) ≤ n^(-1-Ω(1)), then ∑_{i=1..n} f(i) = θ(1). This may seem confusing, but it is really not. It should help you compute most sums easily.
84 Logs and Exp Properties of logarithms: log_b(xy) = log_b x + log_b y; log_b(x/y) = log_b x − log_b y; log_b(x^a) = a·log_b x; log_b a = log_x a / log_x b. Properties of exponentials: a^(b+c) = a^b · a^c; a^(bc) = (a^b)^c; a^b / a^c = a^(b−c); b = a^(log_a b); b^c = a^(c·log_a b).
85 Easy. I choose a trillion trillion. Say, I have a game for you. We will each choose an integer. You win if yours is bigger. I am so nice, I will even let you go first. Well done. That is big! Understand Quantifiers!!! But I choose a trillion trillion and one so I win.
86 You laugh but this is a very important game in theoretical computer science. You choose the size of your Java program. Then I choose the size of the input. Likely |I| >> |J| So you better be sure your Java program can handle such long inputs. Understand Quantifiers!!!
87 Understand Quantifiers!!! In first-order logic we can state that I win the game: ∀x, ∃y, y > x. The proof: let x be an arbitrary integer; let y = x+1; note y = x+1 > x. Good game. Let me try again. I will win this time!
88 Understand Quantifiers!!! ∃ politician, ∀ voters, Loves(v, p): one politician. ∀ voters, ∃ politician, Loves(v, p): could be a different politician for each voter.
89 Understand Quantifiers!!! ∃ politician, ∀ voters, Loves(v, p): "There is a politician that is loved by everyone." This statement is "about" a politician: the existence of such a politician; we claim that this politician is "loved by everyone". ∀ voters, ∃ politician, Loves(v, p): "Every voter loves some politician." This statement is "about" voters: something is true about every voter; we claim that each "loves some politician."
90 Understand Quantifiers!!! A Computational Problem P states, for each possible input I, what the required output P(I) is (e.g. Sorting). An Algorithm/Program/Machine M is a set of instructions (described by a finite string "M") that, on a given input I, follows the instructions and produces the output M(I), or runs forever (e.g. Insertion Sort).
91 Understand Quantifiers!!! Problem P is computable if ∃M, ∀I, M(I) = P(I): there exists a single algorithm/machine that solves P for every input. Play the following game to prove it!
92 Understand Quantifiers!!! Problem P is computable if ∃M, ∀I, M(I) = P(I). Two players: a prover and a disprover.
93 Understand Quantifiers!!! Problem P is computable if ∃M, ∀I, M(I) = P(I). They read the statement left to right. I produce the object when it is a ∃; you produce the object when it is a ∀. I can always win if and only if the statement is true. The order the players go REALLY matters.
94 Understand Quantifiers!!! Problem P is computable if ∃M, ∀I, M(I) = P(I). Prover: "I have a machine M that I claim works." Disprover: "Oh yeah? I have an input I for which it does not." The prover wins if M on input I gives the correct output. This is what we have been doing all along.
95 Understand Quantifiers!!! Problem P is computable if ∃M, ∀I, M(I) = P(I). Problem P is uncomputable if ∀M, ∃I, M(I) ≠ P(I). "I have a machine M that I claim works." "I find one counter-example input I for which his machine M fails us." I win if M on input I gives the wrong output. Generally very hard to do.
96 Understand Quantifiers!!! ∃M, ∀I, M(I) = Sorting(I): true (Sorting is computable). ∀M, ∃I, M(I) ≠ Halting(I): true (Halting is uncomputable). The order the players go REALLY matters. If you don't know whether a statement is true or not, trust the game.
97 Understand Quantifiers!!! A tricky one: ∀I, ∃M, M(I) = Halting(I) is true! Given I, either Halting(I) = yes or Halting(I) = no. You give me an input I; M_yes(I) says yes, M_no(I) says no. I don't know which, but one of these two machines does the trick on your I. (Contrast: ∃M, ∀I, M(I) = Halting(I) is false.)
98 Understand Quantifiers!!! Problem P is computable in polynomial time: ∃M, ∃c, ∃n₀, ∀I, M(I) = P(I) & (|I| < n₀ or Time(M,I) ≤ |I|^c). Problem P is not computable in polynomial time: ∀M, ∀c, ∀n₀, ∃I, M(I) ≠ P(I) or (|I| ≥ n₀ & Time(M,I) > |I|^c). Problem P is computable in exponential time: ∃M, ∃c, ∃n₀, ∀I, M(I) = P(I) & (|I| < n₀ or Time(M,I) ≤ 2^(c|I|)). The computational class "Exponential Time" is strictly bigger than the computational class "Polynomial Time": ∃P, [∀M, ∀c, ∀n₀, ∃I, M(I) ≠ P(I) or (|I| ≥ n₀ & Time(M,I) > |I|^c)] & [∃M, ∃c, ∃n₀, ∀I, M(I) = P(I) & (|I| < n₀ or Time(M,I) ≤ 2^(c|I|))].
99 Jeff Edmonds York University COSC 2011 Lecture 5 One Step at a Time Stack of Stack Frames Friends and Strong Induction Recurrence Relations Towers of Hanoi Check List Merge & Quick Sort Simple Recursion on Trees Binary Search Tree Things not to do Heap Sort & Priority Queues Trees Representing Equations Pretty Print Parsing Iterate over all s-t Paths Recursive Images Ackermann's Function Recursion
100 One Step at a Time Precondition and Postcondition: I implored you to not worry about the entire computation; it can be difficult to understand where computations go.
Strange(x, y):
  x1 = ⌊x/4⌋; y1 = 3y; f1 = Strange(x1, y1);
  x2 = x − 3·x1; y2 = y; f2 = Strange(x2, y2);
  return(f1 + f2);
101 Friends & Strong Induction Consider your input instance. If it is small enough, solve it on your own. Allocate work: construct one or more sub-instances; each must be smaller and meet the precondition. Assume by magic your friends give you the answers for these. Use this help to solve your own instance. Do not worry about anything else: do not micro-manage friends by tracing out what they and their friends' friends do, and do not worry about who your boss is.
Know Precond: ints x, y. Postcond: ???
Strange(x, y):
  if x < 4 then return(xy);
  x1 = ⌊x/4⌋; y1 = 3y; f1 = Strange(x1, y1);
  x2 = x − 3·x1; y2 = y; f2 = Strange(x2, y2);
  return(f1 + f2);
Example: x = 30, y = 5: friend 1 gets x1 = 7, y1 = 15 and returns f1 = 105; friend 2 gets x2 = 9, y2 = 5 and returns f2 = 45; return 150.
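Tracing the friends' contract reveals the mystery postcondition: f1 + f2 = 3·⌊x/4⌋·y + (x − 3·⌊x/4⌋)·y = x·y, so Strange(x, y) returns x·y (for non-negative x). A direct transcription:

```java
class Strange {
    // Precond: ints x >= 0, y. Postcond (the slide's "???"): returns x*y.
    static int strange(int x, int y) {
        if (x < 4) return x * y;       // small enough: solve it on your own
        int x1 = x / 4, y1 = 3 * y;
        int f1 = strange(x1, y1);      // friend 1 returns 3*(x/4)*y
        int x2 = x - 3 * x1, y2 = y;
        int f2 = strange(x2, y2);      // friend 2 returns (x - 3*(x/4))*y
        return f1 + f2;                // sum telescopes to x*y
    }
}
```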
102 Recurrence Relations Time of a recursive program:
procedure Eg(I_n):
  n = |I_n|
  if (n ≤ 1) then put "Hi"
  else
    loop i = 1..n^c: put "Hi"
    loop i = 1..a:
      I_{n/b} = I_n cut into b pieces
      Eg(I_{n/b})
T(1) = 1; T(n) = a·T(n/b) + n^c. Here n is the "size" of our input, a is the number of "friends", n/b is the "size" of a friend's input, and n^c is the work I personally do.
103 (Recursion-tree picture for T(n): a root of size n, children of size n/2, then n/4, ..., ending in rows of 1s; T(n) is the total.)
104 Evaluating: T(n) = aT(n/b) + f(n). Level i has instances of size n/bⁱ, work f(n/bⁱ) per stack frame, and aⁱ stack frames, for level work aⁱ·f(n/bⁱ). Level 0: 1·f(n). Level 1: a·f(n/b). Level 2: a²·f(n/b²). ... Level h = log n / log b: n^(log a / log b) stack frames of work T(1) each, i.e. n^(log a / log b)·T(1). Total work: T(n) = ∑_{i=0..h} aⁱ·f(n/bⁱ).
105 Evaluating: T(n) = aT(n/b)+f(n)
106 Evaluating: T(n) = aT(n/b) + n^c = 4T(n/2) + n. Time for top level: n^c = n¹. Time for base cases: θ(n^(log a / log b)) = θ(n^(log 4 / log 2)) = θ(n²). Dominated? c = 1 < 2 = log a / log b. Hence T(n) = θ(base cases) = θ(n^(log a / log b)) = θ(n²). If we reduce the number of friends from 4 to 3, is the savings just 25%?
107 Evaluating: T(n) = aT(n/b) + n^c = 3T(n/2) + n. Time for top level: n^c = n¹. Time for base cases: θ(n^(log a / log b)) = θ(n^(log 3 / log 2)) = θ(n^1.58). Dominated? c = 1 < 1.58 = log a / log b. Hence T(n) = θ(base cases) = θ(n^1.58). Not just a 25% savings! θ(n²) vs θ(n^1.58...).
108 Evaluating: T(n) = aT(n/b) + n^c = 3T(n/2) + n². Time for top level: n², c = 2. Time for base cases: θ(n^(log 3 / log 2)) = θ(n^1.58). Dominated? c = 2 > 1.58 = log a / log b. Hence T(n) = θ(top level) = θ(n²).
109 Evaluating: T(n) = aT(n/b) + n^c = 3T(n/2) + n^1.58. Time for top level: n^1.58, c = 1.58. Time for base cases: θ(n^(log 3 / log 2)) = θ(n^1.58). c = 1.58 = log a / log b: all θ(log n) levels require θ(n^1.58) work each, and the sum of the levels is no longer geometric. Hence T(n) = θ(n^1.58 log n).
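The contrast between 4 friends and 3 friends can be felt numerically by evaluating the recurrences themselves (a toy evaluator over exact values, not asymptotics; n is assumed to be a power of 2):

```java
class Recurrence {
    // Evaluate T(n) = a*T(n/2) + n with T(1) = 1, for n a power of 2.
    // a = 4 gives the closed form 2n^2 - n (quadratic);
    // a = 3 grows like n^(log 3 / log 2) ≈ n^1.58, visibly slower.
    static long T(int a, long n) {
        if (n <= 1) return 1;
        return a * T(a, n / 2) + n;
    }
}
```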
110 Evaluating: T(n) = aT(n/b) + f(n). (The same level-by-level table: level i has aⁱ stack frames, each doing f(n/bⁱ) work.) All levels the same: Top Level to Base Cases.
111 Evaluating: T(n) = aT(n/b)+f(n)
112 Check Lists for Recursive Programs This is the format of “all” recursive programs. Don’t deviate from this. Or else!
113 Merge Sort 88 14 98 25 62 52 79 30 23 31 Split Set into Two (no real work) 25,31,52,88,98 Get one friend to sort the first half. 14,23,30,62,79 Get one friend to sort the second half.
114 Merge Sort Merge two sorted lists into one 25,31,52,88,98 14,23,30,62,79 14,23,25,30,31,52,62,79,88,98
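The merge step is the only real work in merge sort; a sketch (assuming int arrays, not the course's exact code):

```java
class Merge {
    // Merge two sorted arrays into one sorted array.
    static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length)
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];  // take the smaller front element
        while (i < a.length) out[k++] = a[i++];           // copy leftovers from a
        while (j < b.length) out[k++] = b[j++];           // copy leftovers from b
        return out;
    }
}
```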
115 Java Implementation (merge sort code shown on slides; credit: Andranik Mirzaian)
117 Quick Sort 88 14 98 25 62 52 79 30 23 31. Partition the set into two using a randomly chosen pivot (52): 14 25 30 23 31 ≤ 52 ≤ 88 98 62 79.
118 Quick Sort 14 25 30 23 31 ≤ 52 ≤ 88 98 62 79. Get one friend to sort the first part (14,23,25,30,31) and another friend to sort the second part (62,79,88,98).
119 Quick Sort Glue the pieces together (no real work): 14,23,25,30,31 + 52 + 62,79,88,98 gives 14,23,25,30,31,52,62,79,88,98. Faster because it is all done "in place", i.e. in the input array.
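The partition-then-recurse structure can be sketched as follows (a standard randomized in-place version; the slides' actual code is by Andranik Mirzaian and may differ in detail):

```java
import java.util.Random;

class QuickSort {
    private static final Random rand = new Random();

    static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;           // 0 or 1 elements: already sorted
        int p = partition(a, lo, hi);
        sort(a, lo, p - 1);             // friend sorts the <= pivot part
        sort(a, p + 1, hi);             // friend sorts the >= pivot part
    }

    // Partition a[lo..hi] around a randomly chosen pivot; return its final index.
    private static int partition(int[] a, int lo, int hi) {
        swap(a, lo + rand.nextInt(hi - lo + 1), hi);  // move random pivot to a[hi]
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++)
            if (a[j] <= pivot) swap(a, i++, j);       // small elements go left, in place
        swap(a, i, hi);
        return i;
    }

    private static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}
```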
120 Java Implementation (quick sort code shown on slide; credit: Andranik Mirzaian)
121 Recursion on Trees (define) A binary tree is: the empty tree, or a node with a right and a left sub-tree.
122 Recursion on Trees number of nodes = ? 3 8 1 32 2 7 6 5 9 4 1 6 5 Get help from friends (friends)
123 Recursion on Trees number of nodes 3 8 1 32 2 7 6 5 9 4 1 = number on left + number on right + 1 = 6 + 5 + 1 = 12 6 5 (friends)
124 Recursion on Trees Base Case ? 3 8 1 32 2 7 6 5 9 4 1 number of nodes 0 Base case! (base case)
125 Recursion on Trees 3 8 1 32 2 7 6 5 9 4 1 (communication) Being lazy, I will only consider my root node and my communication with my friends. I will never look at my children subtrees but will trust them to my friends. 6
126 Recursion on Trees (code shown on slide)
127 Recursion on Trees (cases) Designing Program/Test Cases: a generic tree (subtree sizes n₁ and n₂: answer n₁ + n₂ + 1), one empty subtree (n₁ + 0 + 1), two empty subtrees (0 + 0 + 1 = 1), and the base case, the empty tree. Try the same code on each: the same code works!
128 Recursion on Trees (time) One stack frame for each node in the tree and for each empty tree hanging off, = θ(n) frames, and constant work per stack frame. Time: T(n) = ∑_{stack frames} (work done by the frame) = θ(n) × θ(1) = θ(n).
129 Recursion on Trees number of nodes = One friend for each sub-tree. 3 8 1 32 2 7 6 5 9 4 1 4 4 2 2 4 Many Children 4 + 2 + 4 + 2 + 1 = 13 (friends) (mult-children)
130 Recursion on Trees (cases, many children) Designing Program/Test Cases: a generic node with subtrees n₁, n₂, n₃ (answer n₁ + n₂ + n₃ + 1), a node with one subtree (n₁ + 1), a leaf (0 + 1 = 1). The same code works! But is a separate base case needed, if the empty tree is not a valid input?
131 Recursion on Trees (many-children code shown on slide)
132 Recursion on Trees (time, many children) Time: T(n) = ∑_{stack frames} (work done by the frame) = ∑_{stack frames} θ(# subroutine calls) = θ(# edges in the tree) = θ(n), since one stack frame is made per node and its work is proportional to its number of children (edges).
133 Recursion on Trees We pass the recursive program a "binary tree". But what type is it really? This confused Jeff at first.
class LinkedBinaryTree<E> {
  class Node { E element; Node parent; Node left; Node right; }
  Node root = null;
}
134 One would think tree is of type LinkedBinaryTree, as in left_Tree(LinkedBinaryTree tree). Then getting its left subtree would be confusing.
136 It is easier to have tree be of type Node, as in (Node tree), but thought of as the subtree rooted at the node pointed at. The left child is then tree.left, or tree.Getleft(), or Tree.leftSub(tree), or leftSub(tree).
137 But the outside user does not know about pointers to nodes.
class LinkedBinaryTree<E> {
  class Node { E element; Node parent; Node left; Node right; }
  Node root = null;
  public int NumberNodes() { return NumberNodesRec( root ); }
  private int NumberNodesRec( Node tree ) { ... }
}
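A possible completion of NumberNodesRec, consistent with the counting recursion of slides 122-128 (the recursive body is my reconstruction, since the slide elides it):

```java
class LinkedBinaryTree<E> {
    static class Node<E> { E element; Node<E> parent, left, right; }

    Node<E> root = null;

    // Public wrapper: the outside user never sees Node pointers.
    public int NumberNodes() { return NumberNodesRec(root); }

    // "tree" is a Node, read as the subtree rooted at that node.
    private int NumberNodesRec(Node<E> tree) {
        if (tree == null) return 0;          // base case: the empty tree
        return NumberNodesRec(tree.left)     // friend counts the left subtree
             + NumberNodesRec(tree.right)    // friend counts the right subtree
             + 1;                            // plus the root node itself
    }
}
```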
138 End Midterm Review