COSC 3101A - Design and Analysis of Algorithms
Lecture 2: Asymptotic Notations Continued; Proof of Correctness: Loop Invariants; Designing Algorithms: Divide and Conquer



5/11/2004, Lecture 2, COSC 3101A

Typical Running Time Functions
1 (constant running time):
–Instructions are executed once or a few times
log N (logarithmic):
–A big problem is solved by cutting the original problem down by a constant fraction at each step
N (linear):
–A small amount of processing is done on each input element
N log N:
–A problem is solved by dividing it into smaller problems, solving them independently, and combining the solutions

Typical Running Time Functions (cont.)
N^2 (quadratic):
–Typical for algorithms that process all pairs of data items (doubly nested loops)
N^3 (cubic):
–Processing of triples of data (triply nested loops)
N^k (polynomial)
2^N (exponential):
–Few exponential algorithms are appropriate for practical use

Logarithms
In algorithm analysis we often use the notation "log n" without specifying the base:
–Binary logarithm: lg n = log_2 n
–Natural logarithm: ln n = log_e n
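The base-independence the slide alludes to can be checked directly. A quick sketch in Python (the value 1024 is just an illustration): by the change-of-base identity, any two log bases differ only by a constant factor, which is why asymptotic notation may omit the base.

```python
import math

# Change of base: log_2 n = ln n / ln 2, so logarithms in any two
# bases differ only by the constant factor 1/ln 2 -- irrelevant
# inside O, Omega, and Theta.
n = 1024.0
lg = math.log2(n)          # binary logarithm, lg n
ln = math.log(n)           # natural logarithm, ln n

assert math.isclose(lg, ln / math.log(2))
print(lg)  # -> 10.0
```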

Review: Asymptotic Notations (1)
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }

Review: Asymptotic Notations (2)
f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))

Review: Asymptotic Notations (3)
A way to describe the behavior of functions in the limit:
–How we indicate running times of algorithms
–Describes the running time of an algorithm as n grows to ∞
O notation: asymptotic "less than": f(n) "≤" g(n)
Ω notation: asymptotic "greater than": f(n) "≥" g(n)
Θ notation: asymptotic "equality": f(n) "=" g(n)

Big-O Examples (1)
–2n^2 = O(n^3): 2n^2 ≤ cn^3 ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
–n^2 = O(n^2): n^2 ≤ cn^2 ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
–1000n + 1000 = O(n^2): 1000n + 1000 ≤ 2000n ≤ 2000n^2 ⇒ c = 2000 and n0 = 1
–n = O(n^2): n ≤ cn^2 ⇒ cn ≥ 1 ⇒ c = 1 and n0 = 1
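The witness constants on this slide can be spot-checked numerically. A small illustrative sketch (the cutoff 10^4 is arbitrary; a finite check does not prove the bound, it only corroborates the chosen c and n0):

```python
# Each claim f(n) = O(g(n)) comes with constants (c, n0) such that
# f(n) <= c * g(n) for all n >= n0; we spot-check a range of n.
claims = [
    # (f, g, c, n0): 2n^2 = O(n^3), n^2 = O(n^2),
    # 1000n + 1000 = O(n^2), n = O(n^2)
    (lambda n: 2 * n**2,        lambda n: n**3, 1,    2),
    (lambda n: n**2,            lambda n: n**2, 1,    1),
    (lambda n: 1000 * n + 1000, lambda n: n**2, 2000, 1),
    (lambda n: n,               lambda n: n**2, 1,    1),
]

for f, g, c, n0 in claims:
    assert all(f(n) <= c * g(n) for n in range(n0, 10**4))
print("all witness constants verified for n in [n0, 10^4)")
```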

Big-O Examples (2)
E.g.: prove that n^2 ≠ O(n)
–Assume ∃ c and n0 such that ∀ n ≥ n0: n^2 ≤ cn
–Choose n = max(n0, c + 1)
–Then n^2 = n · n ≥ n(c + 1) > cn ⇒ n^2 > cn — contradiction!

More on Asymptotic Notations
There is no unique set of values for n0 and c in proving the asymptotic bounds.
Prove that 100n + 5 = O(n^2):
–100n + 5 ≤ 100n + n = 101n ≤ 101n^2 for all n ≥ 5 ⇒ n0 = 5 and c = 101 is a solution
–100n + 5 ≤ 100n + 5n = 105n ≤ 105n^2 for all n ≥ 1 ⇒ n0 = 1 and c = 105 is also a solution
We must find SOME constants c and n0 that satisfy the asymptotic notation relation.

Big-Ω Examples
–5n^2 = Ω(n): ∃ c, n0 such that 0 ≤ cn ≤ 5n^2; cn ≤ 5n^2 ⇒ c = 1 and n0 = 1
–100n + 5 ≠ Ω(n^2): assume ∃ c, n0 such that 0 ≤ cn^2 ≤ 100n + 5; since 100n + 5 ≤ 105n (∀ n ≥ 1), cn^2 ≤ 105n ⇒ n(cn − 105) ≤ 0; since n is positive, cn − 105 ≤ 0 ⇒ n ≤ 105/c — contradiction: n cannot be bounded by a constant
–n = Ω(2n), n^3 = Ω(n^2), n = Ω(log n)

Θ Examples
–n^2/2 − n/2 = Θ(n^2):
½n^2 − ½n ≤ ½n^2 for all n ≥ 0 ⇒ c2 = ½
½n^2 − ½n ≥ ½n^2 − ½n · ½n (∀ n ≥ 2) = ¼n^2 ⇒ c1 = ¼
–n ≠ Θ(n^2): c1 n^2 ≤ n ≤ c2 n^2 only holds for n ≤ 1/c1
–6n^3 ≠ Θ(n^2): c1 n^2 ≤ 6n^3 ≤ c2 n^2 only holds for n ≤ c2/6
–n ≠ Θ(log n): c1 log n ≤ n ≤ c2 log n ⇒ c2 ≥ n/log n, ∀ n ≥ n0 — impossible
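The two-sided bound in the first Θ example can likewise be corroborated numerically (an illustrative sketch; the range is arbitrary and a finite check is not a proof):

```python
# Check (1/4) n^2 <= n^2/2 - n/2 <= (1/2) n^2 for all n >= 2,
# i.e. the constants c1 = 1/4, c2 = 1/4... c2 = 1/2 from the slide.
def f(n):
    return n * n / 2 - n / 2

assert all(0.25 * n * n <= f(n) <= 0.5 * n * n for n in range(2, 10_001))
print("Theta(n^2) sandwich holds for n in [2, 10^4]")
```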

Comparisons of Functions
Theorem: f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
Transitivity:
–f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
–Same for O and Ω
Reflexivity:
–f(n) = Θ(f(n))
–Same for O and Ω
Symmetry:
–f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
Transpose symmetry:
–f(n) = O(g(n)) if and only if g(n) = Ω(f(n))

More Examples (1)
For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
–f(n) = log n^2; g(n) = log n + 5 → f(n) = Θ(g(n))
–f(n) = n; g(n) = log n^2 → f(n) = Ω(g(n))
–f(n) = log log n; g(n) = log n → f(n) = O(g(n))
–f(n) = n; g(n) = log^2 n → f(n) = Ω(g(n))
–f(n) = n log n + n; g(n) = log n → f(n) = Ω(g(n))
–f(n) = 10; g(n) = log 10 → f(n) = Θ(g(n))
–f(n) = 2^n; g(n) = 10n^2 → f(n) = Ω(g(n))
–f(n) = 2^n; g(n) = 3^n → f(n) = O(g(n))

More Examples (2)
Θ notation:
–n^2/2 − n/2 = Θ(n^2)
–(6n^3 + 1) lg n / (n + 1) = Θ(n^2 lg n)
–n vs. n^2: n ≠ Θ(n^2)
Ω notation:
–n vs. 2n: n = Ω(2n)
–n^3 vs. n^2: n^3 = Ω(n^2)
–n vs. log n: n = Ω(log n)
–n vs. n^2: n ≠ Ω(n^2)
O notation:
–2n^2 vs. n^3: 2n^2 = O(n^3)
–n^2 vs. n^2: n^2 = O(n^2)
–n^3 vs. n log n: n^3 ≠ O(n lg n)

Asymptotic Notations in Equations
On the right-hand side:
–Θ(n^2) stands for some anonymous function in Θ(n^2)
–2n^2 + 3n + 1 = 2n^2 + Θ(n) means: there exists a function f(n) ∈ Θ(n) such that 2n^2 + 3n + 1 = 2n^2 + f(n)
On the left-hand side:
–2n^2 + Θ(n) = Θ(n^2)
–No matter how the anonymous function is chosen on the left-hand side, there is a way to choose the anonymous function on the right-hand side to make the equation valid.

Limits and Comparisons of Functions
Using limits for comparing orders of growth: compare ½n(n − 1) and n^2.
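The limit computation for this comparison is standard and can be written out as:

```latex
\lim_{n \to \infty} \frac{\tfrac{1}{2}n(n-1)}{n^2}
  = \frac{1}{2}\,\lim_{n \to \infty}\frac{n^2 - n}{n^2}
  = \frac{1}{2}\,\lim_{n \to \infty}\Bigl(1 - \frac{1}{n}\Bigr)
  = \frac{1}{2}
```

Since the limit is a finite positive constant, the two functions have the same order of growth: ½n(n − 1) = Θ(n^2).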

Limits and Comparisons of Functions
L'Hôpital's rule: when lim f(n) = lim g(n) = ∞ and the derivatives exist, lim f(n)/g(n) = lim f′(n)/g′(n); this can be used to compare the orders of growth of two functions whose ratio is indeterminate.

Loop Invariant
A loop invariant is a relation among program variables that
–is true when control enters a loop,
–remains true each time the program executes the body of the loop,
–and is still true when control exits the loop.
Understanding loop invariants can help us
–analyze algorithms,
–check for errors,
–and derive algorithms from specifications.

Proving Loop Invariants
Proving loop invariants works like induction:
Initialization (base case):
–The invariant is true prior to the first iteration of the loop
Maintenance (inductive step):
–If it is true before an iteration of the loop, it remains true before the next iteration
Termination:
–When the loop terminates, the invariant (usually along with the reason the loop terminated) gives us a useful property that helps show that the algorithm is correct
–Stop the induction when the loop terminates

Loop Invariant for Insertion Sort (1)
Alg.: INSERTION-SORT(A)
  for j ← 2 to n
    do key ← A[j]
       ▷ Insert A[j] into the sorted sequence A[1 .. j−1]
       i ← j − 1
       while i > 0 and A[i] > key
         do A[i + 1] ← A[i]
            i ← i − 1
       A[i + 1] ← key
Invariant: at the start of the for loop, the elements in A[1 .. j−1] are in sorted order.
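The pseudocode above translates directly into runnable Python (0-based indexing instead of the pseudocode's 1-based arrays), with the loop invariant checked by an assert at the top of every iteration:

```python
def insertion_sort(a):
    for j in range(1, len(a)):
        # Loop invariant: a[0 .. j-1] is sorted.
        assert all(a[k] <= a[k + 1] for k in range(j - 1))
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]   # shift larger elements one slot right
            i -= 1
        a[i + 1] = key        # insert key into its proper position
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # -> [1, 2, 3, 4, 5, 6]
```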

Loop Invariant for Insertion Sort (2)
Initialization:
–Just before the first iteration, j = 2: the subarray A[1 .. j−1] = A[1] (the element originally in A[1]) is sorted

Loop Invariant for Insertion Sort (3)
Maintenance:
–The inner while loop moves A[j−1], A[j−2], A[j−3], and so on, one position to the right until the proper position for key (which holds the value that started out in A[j]) is found
–At that point, the value of key is placed into this position

Loop Invariant for Insertion Sort (4)
Termination:
–The outer for loop ends when j > n (i.e., j = n + 1) ⇒ j − 1 = n
–Replacing j − 1 with n in the loop invariant: the subarray A[1 .. n] consists of the elements originally in A[1 .. n], but in sorted order
–The entire array is sorted!

Steps in Designing Algorithms (1)
1. Understand the problem
–Specify the range of inputs the algorithm should handle
2. Learn about the model of the implementation technology
–RAM (random-access machine), sequential execution
3. Choose between an exact and an approximate solution
–Some problems cannot be solved exactly: nonlinear equations, evaluating definite integrals
–Exact solutions may be unacceptably slow
4. Choose the appropriate data structures

Steps in Designing Algorithms (2)
5. Choose an algorithm design technique
–A general approach to solving problems algorithmically, applicable to a variety of computational problems
–Provides guidance for developing solutions to new problems
6. Specify the algorithm
–Pseudocode: a mixture of natural and programming language
7. Prove the algorithm's correctness
–The algorithm yields the correct result for any legitimate input, in a finite amount of time
–Mathematical induction, loop invariants

Steps in Designing Algorithms (3)
8. Analyze the algorithm
–Predict the amount of resources required:
  memory: how much space is needed?
  computational time: how fast does the algorithm run?
–Fact: running time grows with the size of the input
–Input size (number of elements in the input): size of an array, polynomial degree, number of elements in a matrix, number of bits in the binary representation of the input, vertices and edges in a graph
–Def: running time = the number of primitive operations (steps) executed before termination: arithmetic operations (+, −, *), data movement, control, decision making (if, while), comparison

Steps in Designing Algorithms (4)
9. Code the algorithm
–Verify the ranges of the input
–Efficient vs. inefficient implementation
–It is hard to prove the correctness of a program (typically done by testing)

Classification of Algorithms
By problem type:
–Sorting
–Searching
–String processing
–Graph problems
–Combinatorial problems
–Geometric problems
–Numerical problems
By design paradigm:
–Divide-and-conquer
–Incremental
–Dynamic programming
–Greedy algorithms
–Randomized/probabilistic

Divide-and-Conquer
Divide the problem into a number of subproblems
–Similar subproblems of smaller size
Conquer the subproblems
–Solve the subproblems recursively
–If a subproblem is small enough, solve it in a straightforward manner
Combine the solutions to the subproblems
–Obtain the solution for the original problem

Merge Sort Approach
To sort an array A[p .. r]:
Divide:
–Divide the n-element sequence to be sorted into two subsequences of n/2 elements each
Conquer:
–Sort the subsequences recursively using merge sort
–When the size of a sequence is 1, there is nothing more to do
Combine:
–Merge the two sorted subsequences

Merge Sort
Alg.: MERGE-SORT(A, p, r)
  if p < r                       ▷ check for base case
    then q ← ⌊(p + r)/2⌋         ▷ divide
         MERGE-SORT(A, p, q)     ▷ conquer
         MERGE-SORT(A, q + 1, r) ▷ conquer
         MERGE(A, p, q, r)       ▷ combine
Initial call: MERGE-SORT(A, 1, n)
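As a runnable sketch, the divide/conquer/combine structure looks like this in Python (0-based indexing, with slices standing in for the bounds p, q, r; a simple two-pointer merge stands in for MERGE, whose sentinel-based pseudocode appears later in the lecture):

```python
def merge_sort(a):
    if len(a) <= 1:              # base case: a size-1 array is sorted
        return a
    q = len(a) // 2              # divide at the midpoint
    left = merge_sort(a[:q])     # conquer left half
    right = merge_sort(a[q:])    # conquer right half
    return merge(left, right)    # combine

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # take the smaller "top card"
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])         # one pile is empty:
    out.extend(right[j:])        # place the rest onto the output
    return out

print(merge_sort([4, 7, 2, 6, 1, 4, 7, 3]))  # -> [1, 2, 3, 4, 4, 6, 7, 7]
```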

Example – n a Power of 2
(figure: the divide and merge phases of merge sort on an array whose length is a power of 2)

Example – n Not a Power of 2
(figure: the same process on an array whose length is not a power of 2, with uneven splits)

Merging
Input: array A and indices p, q, r such that p ≤ q < r
–Subarrays A[p .. q] and A[q+1 .. r] are sorted
Output: one single sorted subarray A[p .. r]

Merging
Idea for merging:
–Two piles of sorted cards
–Choose the smaller of the two top cards
–Remove it and place it in the output pile
–Repeat the process until one pile is empty
–Take the remaining input pile and place it face-down onto the output pile

Merge - Pseudocode
Alg.: MERGE(A, p, q, r)
 1. Compute n1 and n2 (n1 ← q − p + 1; n2 ← r − q)
 2. Copy the first n1 elements into L[1 .. n1 + 1] and the next n2 elements into R[1 .. n2 + 1]
 3. L[n1 + 1] ← ∞; R[n2 + 1] ← ∞
 4. i ← 1; j ← 1
 5. for k ← p to r
 6.   do if L[i] ≤ R[j]
 7.        then A[k] ← L[i]
 8.             i ← i + 1
 9.        else A[k] ← R[j]
10.             j ← j + 1
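A direct Python translation of this MERGE, keeping the ∞ sentinels (0-based indexing; p, q, r are indices into A, and A[p .. q] and A[q+1 .. r] must each already be sorted):

```python
import math

def merge(A, p, q, r):
    L = A[p:q + 1] + [math.inf]      # left run plus sentinel
    R = A[q + 1:r + 1] + [math.inf]  # right run plus sentinel
    i = j = 0
    for k in range(p, r + 1):
        # The sentinels guarantee i and j never run past the ends,
        # so no explicit emptiness checks are needed.
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

A = [1, 6, 8, 26, 9, 32, 42, 43]  # sorted runs [1,6,8,26] and [9,32,42,43]
merge(A, 0, 3, 7)
print(A)  # -> [1, 6, 8, 9, 26, 32, 42, 43]
```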

Example: MERGE(A, 9, 12, 16)
(figure: step-by-step merge of the sorted runs A[9 .. 12] and A[13 .. 16] into A[9 .. 16], ending with "Done!")

Running Time of Merge
Initialization (copying into the temporary arrays):
–Θ(n1 + n2) = Θ(n)
Adding the elements to the final array (the last for loop):
–n iterations, each taking constant time ⇒ Θ(n)
Total time for MERGE:
–Θ(n)

Analyzing Divide-and-Conquer Algorithms
The recurrence is based on the three steps of the paradigm:
–T(n): running time on a problem of size n
–Divide the problem into a subproblems, each of size n/b: takes D(n)
–Conquer (solve) the subproblems: takes aT(n/b)
–Combine the solutions: takes C(n)

T(n) = Θ(1)                      if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)     otherwise

MERGE-SORT Running Time
Divide:
–Compute q as the average of p and r: D(n) = Θ(1)
Conquer:
–Recursively solve 2 subproblems, each of size n/2 ⇒ 2T(n/2)
Combine:
–MERGE on an n-element subarray takes Θ(n) time ⇒ C(n) = Θ(n)

T(n) = Θ(1)              if n = 1
T(n) = 2T(n/2) + Θ(n)    if n > 1
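The slides stop at the recurrence; unrolling it (a standard argument, not shown on the original slide) gives the familiar bound. Writing the Θ(n) term as cn and expanding:

```latex
T(n) = 2T(n/2) + cn
     = 2\bigl(2T(n/4) + c\,n/2\bigr) + cn
     = 4T(n/4) + 2cn
     = \dots
     = 2^{k} T(n/2^{k}) + k\,c\,n
```

Taking k = lg n so that n/2^k = 1 gives T(n) = nT(1) + cn lg n = Θ(n lg n).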

Correctness of Merge Sort
Loop invariant (at the start of the for loop):
–A[p .. k−1] contains the k − p smallest elements of L[1 .. n1 + 1] and R[1 .. n2 + 1], in sorted order
–L[i] and R[j] are the smallest elements of their arrays not yet copied back to A

Proof of the Loop Invariant
Initialization:
–Prior to the first iteration: k = p ⇒ the subarray A[p .. k−1] is empty
–A[p .. k−1] contains the k − p = 0 smallest elements of L and R
–L and R are sorted arrays and i = j = 1 ⇒ L[1] and R[1] are the smallest elements in L and R

Proof of the Loop Invariant
Maintenance:
–Assume L[i] ≤ R[j] ⇒ L[i] is the smallest element not yet copied to A
–After copying L[i] into A[k], A[p .. k] contains the k − p + 1 smallest elements of L and R
–Incrementing k (for loop) and i re-establishes the loop invariant

Proof of the Loop Invariant
Termination:
–At termination, k = r + 1
–By the loop invariant: A[p .. k−1] = A[p .. r] contains the k − p = r − p + 1 smallest elements of L and R, in sorted order
–This is exactly the number of elements to be sorted ⇒ MERGE(A, p, q, r) is correct

Readings
–Chapter 2.2, Chapter 3
–Appendix A