
Chapter 2 The Complexity of Algorithms and the Lower Bounds of Problems

The goodness of an algorithm
- Time complexity (more important)
- Space complexity
- For a parallel algorithm: time-processor product
- For a VLSI circuit: area-time (AT, AT²)

Measure the goodness of an algorithm
Time complexity of an algorithm ⟹ an efficient algorithm
- worst-case
- average-case
- amortized
We can use the number of comparisons to measure a sorting algorithm.

Measure the difficulty of a problem
- NP-complete?
- Undecidable?
- Is the algorithm the best? ⟹ an optimal algorithm
We can also use the number of comparisons to measure a sorting problem.

Asymptotic notations
Def: f(n) = O(g(n)) "at most" ⟺ ∃ c, n₀ such that |f(n)| ≤ c|g(n)| for all n ≥ n₀
e.g. f(n) = 3n² + 2, g(n) = n²: n₀ = 2, c = 4 ⟹ f(n) = O(n²)
e.g. f(n) = n³ + n = O(n³)
e.g. f(n) = 3n² + 2 = O(n³) or O(n¹⁰⁰)

Def: f(n) = Ω(g(n)) "at least", "lower bound" ⟺ ∃ c, n₀ such that |f(n)| ≥ c|g(n)| for all n ≥ n₀
e.g. f(n) = 3n² + 2 = Ω(n²) or Ω(n)
Def: f(n) = Θ(g(n)) ⟺ ∃ c₁, c₂, n₀ such that c₁|g(n)| ≤ |f(n)| ≤ c₂|g(n)| for all n ≥ n₀
e.g. f(n) = 3n² + 2 = Θ(n²)
Def: f(n) = o(g(n)) ⟺ lim_{n→∞} f(n)/g(n) = 0
e.g. f(n) = 3n² + n = o(n³)
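As a sanity check of the Θ definition, the 3n² + 2 example can be worked out explicitly in LaTeX form (the constants below are one valid choice, not the only one):

    \[
    3n^2 \le 3n^2 + 2 \le 3n^2 + n^2 = 4n^2 \quad \text{for all } n \ge 2,
    \]
    \[
    \text{so with } c_1 = 3,\ c_2 = 4,\ n_0 = 2:\quad
    c_1 |n^2| \le |3n^2 + 2| \le c_2 |n^2| \ \ \forall n \ge n_0,
    \ \text{i.e. } 3n^2 + 2 = \Theta(n^2).
    \]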

Time complexity functions vs. problem size

  f(n)       n = 10     n = 10²    n = 10³    n = 10⁴
  log₂n      3.3        6.6        10         13.3
  n          10         10²        10³        10⁴
  n log₂n    0.33×10²   0.7×10³    10⁴        1.3×10⁵
  n²         10²        10⁴        10⁶        10⁸
  2ⁿ         1024       1.3×10³⁰   >10¹⁰⁰     >10¹⁰⁰
  n!         3×10⁶      >10¹⁰⁰     >10¹⁰⁰     >10¹⁰⁰

Rate of growth of common computing time functions

Common computing time functions
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!) < O(nⁿ)
- Exponential algorithm: O(2ⁿ) and beyond
- Polynomial algorithm: O(nᶜ) for some constant c
Algorithm A: O(n³); algorithm B: O(n). Does algorithm B always run faster than A? No! That is guaranteed only when n is large enough, as the sketch below illustrates.
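A small C demonstration of that point; the step counts n³ and 1000n are hypothetical constants chosen purely for illustration:

    #include <stdio.h>

    /* Hypothetical running times: algorithm A takes n^3 steps, algorithm B
     * takes 1000n steps.  B's better asymptotic order pays off only once
     * n^3 > 1000n, i.e. n > 31. */
    int main(void) {
        for (int n = 10; n <= 50; n += 10) {
            long a = (long)n * n * n;       /* A: Theta(n^3) */
            long b = 1000L * n;             /* B: Theta(n)   */
            printf("n=%2d  A=%8ld  B=%8ld  -> %s\n",
                   n, a, b, a < b ? "A faster" : "B faster");
        }
        return 0;
    }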

Analysis of algorithms
- Best case: easiest to analyze
- Worst case
- Average case: hardest to analyze

Example: loop (1/3)
Given an integer N, determine whether N is a prime number. How can we answer this question?

Example: loop (2/3)

    /* Trial division: one % operation per loop iteration. */
    count = 0;
    for (i = 2; i < N; i++) {
        count++;                  /* # of divisions performed so far */
        if (N % i == 0)
            return IS_NOT_A_PRIME;
    }
    return IS_A_PRIME;            /* no divisor found in 2..N-1 */

Example: loop (3/3)
(Table: N, Prime?, # divide.) For a prime N the loop runs all the way through and performs N − 2 divisions (e.g. N = 11 needs 9); for a composite N it stops at the smallest factor. So the worst case is about N divisions.

Example: recursive
- T(n) = T(n−1) + 1   — selection sort
- T(n) = 2T(n−1) + 1  — Tower of Hanoi
- T(n) = T(n/2) + 1   — binary search
- T(n) = aT(n/b) + f(n) — ??
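Two of these recurrences evaluated directly in C (a sketch; the function names are mine, and each comment states the closed form the recurrence solves to):

    #include <stdio.h>

    /* T(n) = 2T(n-1) + 1, T(0) = 0: moves for the Tower of Hanoi.
     * Solves to 2^n - 1. */
    long hanoi_moves(int n) {
        return n == 0 ? 0 : 2 * hanoi_moves(n - 1) + 1;
    }

    /* T(n) = T(n/2) + 1, T(1) = 1: worst-case probes of binary search.
     * Solves to floor(log2 n) + 1. */
    int bsearch_steps(int n) {
        return n <= 1 ? 1 : bsearch_steps(n / 2) + 1;
    }

    int main(void) {
        printf("%ld\n", hanoi_moves(10));     /* 1023 */
        printf("%d\n", bsearch_steps(1024));  /* 11   */
        return 0;
    }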

A General Method of Solving Divide-and-Conquer Recurrences

Deduction 1: given T(n) = aT(n/b) + g(n), substitute n = bᵏ and rewrite it into the template T(bᵏ)/aᵏ = T(bᵏ⁻¹)/aᵏ⁻¹ + g(bᵏ)/aᵏ.

Example 1 (1)

Deduction 2: by iteration the intermediate terms cancel (the sum telescopes), and there are log_b n terms, giving T(bᵏ)/aᵏ = T(1) + Σ_{i=1}^{k} g(bⁱ)/aⁱ.
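In LaTeX form, the telescoping argument of the two deduction slides (my reconstruction of the standard derivation from the cited Bentley–Haken–Saxe paper; the slides' own equations may differ in notation):

    \[
    \frac{T(b^k)}{a^k}
      = \frac{T(b^{k-1})}{a^{k-1}} + \frac{g(b^k)}{a^k}
      = \cdots
      = T(1) + \sum_{i=1}^{k} \frac{g(b^i)}{a^i},
      \qquad n = b^k,
    \]
    \[
    T(n) = n^{\log_b a} \left( T(1) + \sum_{i=1}^{\log_b n} \frac{g(b^i)}{a^i} \right),
    \qquad \text{using } a^k = a^{\log_b n} = n^{\log_b a}.
    \]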

Deduction 3: O gives an upper bound, Ω a lower bound, Θ the exact order, and o the exact order including the constant factor. Different choices of g(n) correspond to different solutions T(n).

Example 1 (2), continuing from (1)

Example 2

General representation: in general, given T(n) = aT(n/b) + g(n) for arbitrary n (not only exact powers of b), the solution still differs by at most a constant factor, provided g(n) is monotone.

Example 3: Pan's matrix multiplication algorithm

A more general representation

Example 4

Reference: J. L. Bentley, D. Haken, and J. B. Saxe, "A general method for solving divide-and-conquer recurrences," ACM SIGACT News, Fall 1980, pp. 36–44.

Straight insertion sort
input: 7, 5, 1, 4, 3

    7, 5, 1, 4, 3
    5, 7, 1, 4, 3
    1, 5, 7, 4, 3
    1, 4, 5, 7, 3
    1, 3, 4, 5, 7

Straight insertion sort
Algorithm 2.1 Straight Insertion Sort
Input: x₁, x₂, ..., xₙ
Output: the sorted sequence of x₁, x₂, ..., xₙ

    For j := 2 to n do
    Begin
        i := j − 1
        x := xⱼ
        While i > 0 and x < xᵢ do
        Begin
            xᵢ₊₁ := xᵢ
            i := i − 1
        End
        xᵢ₊₁ := x
    End
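Algorithm 2.1 rendered as runnable C (a sketch with 0-based indexing, not the book's code):

    /* Straight insertion sort of x[0..n-1]. */
    void insertion_sort(int x[], int n) {
        for (int j = 1; j < n; j++) {       /* x[0..j-1] is already sorted */
            int key = x[j];
            int i = j - 1;
            while (i >= 0 && x[i] > key) {  /* shift larger elements right */
                x[i + 1] = x[i];
                i--;
            }
            x[i + 1] = key;                 /* drop x[j] into its place */
        }
    }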

Analysis of straight insertion sort
- best case: Θ(n) — an already sorted sequence
- worst case: Θ(n²) — reverse order
- average case: Θ(n²)

Analysis of # of exchanges
Method 1 (straightforward): xⱼ is being inserted into the sorted sequence x₁, x₂, ..., xⱼ₋₁. If xⱼ is the kth (1 ≤ k ≤ j) largest, it takes (k − 1) exchanges.
e.g. inserting 4 into 1, 5, 7 takes 2 exchanges, since 4 is the 3rd largest.
Each k is equally likely, so the average # of exchanges required for xⱼ to be inserted is
(0 + 1 + ⋯ + (j−1))/j = (j−1)/2.

# of exchanges for sorting: Σ_{j=2}^{n} (j−1)/2 = n(n−1)/4


Binary search
e.g. search for 9 in a sorted sequence: step 1, step 2, step 3 — each step compares 9 with the middle element of the remaining range and halves the range.
- best case: 1 step = Θ(1)
- worst case: (⌊log₂n⌋ + 1) steps = Θ(log n)
- average case: Θ(log n) steps

There are n cases for a successful search and n+1 cases for an unsuccessful search.
Average # of comparisons done in the binary decision tree:
A(n) = (1/(2n+1)) (Σ_{i=1}^{k} i·2^{i−1} + k(n+1)), where k = ⌊log n⌋ + 1

A(n) ≈ k when n is very large, and k ≈ log n, so A(n) = Θ(log n).
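The search just analyzed, as runnable C (a sketch; 0-based indices, −1 signals an unsuccessful search):

    /* Binary search for key in the sorted array a[0..n-1].
     * Worst case: floor(log2 n) + 1 comparisons. */
    int binary_search(const int a[], int n, int key) {
        int low = 0, high = n - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;  /* avoids (low+high) overflow */
            if (a[mid] == key) return mid;
            if (a[mid] < key)  low = mid + 1;  /* key is in the right half */
            else               high = mid - 1; /* key is in the left half  */
        }
        return -1;                             /* unsuccessful search */
    }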


Straight selection sort: comparison vs. data movement
- For each number: comparison Θ(n), data movement Θ(1)
- In the worst case: comparison Θ(n²), data movement Θ(n)


Quick sort: pick a splitting element, split the list into two sublists around it, then recursively apply the same procedure to each sublist.

Best case: Θ(n log n) — the list is split into two sublists of almost equal size, so log n rounds are needed, and each round requires n comparisons (ignoring the element used to split).

Worst case: Θ(n²) — in each round, the element used to split is either the smallest or the largest, so the rounds cost (n−1) + (n−2) + ⋯ + 1 = n(n−1)/2 comparisons.

Average case: Θ(n log n), obtained by averaging over all possible positions of the splitting element.
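A C sketch of the procedure (the first element serves as the splitting element, as in the slides; the partition details are my own):

    /* Quicksort a[lo..hi].  Average case Theta(n log n); worst case
     * Theta(n^2), e.g. already-sorted input with this splitting rule. */
    void quick_sort(int a[], int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[lo], i = lo, j = hi + 1;
        for (;;) {
            while (a[++i] < pivot && i < hi) ;  /* find element >= pivot */
            while (a[--j] > pivot) ;            /* find element <= pivot */
            if (i >= j) break;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
        int t = a[lo]; a[lo] = a[j]; a[j] = t;  /* pivot to final position */
        quick_sort(a, lo, j - 1);               /* recurse on both sublists */
        quick_sort(a, j + 1, hi);
    }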



2-D ranking finding
Def: Let A = (a₁, a₂) and B = (b₁, b₂). A dominates B iff a₁ > b₁ and a₂ > b₂.
Def: Given a set S of n points, the rank of a point x is the number of points in S dominated by x.
e.g. for the five points A–E in the figure: rank(A) = 0, rank(B) = 1, rank(C) = 1, rank(D) = 3, rank(E) = 0

Straightforward algorithm: compare all pairs of points: O(n²) (see the sketch below).
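The straightforward algorithm in C (a sketch; the struct and names are mine):

    typedef struct { double x, y; } Point;

    /* rank[i] = number of points that p[i] dominates: O(n^2) pairs. */
    void all_ranks(const Point p[], int rank[], int n) {
        for (int i = 0; i < n; i++) {
            rank[i] = 0;
            for (int j = 0; j < n; j++)
                if (p[j].x < p[i].x && p[j].y < p[i].y)
                    rank[i]++;
        }
    }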

Divide-and-conquer 2-D ranking finding
Input: a set S of planar points P₁, P₂, …, Pₙ
Output: the rank of every point in S
Step 1: (Split the points along the median line L into A and B.)
  a. If S contains only one point, return its rank as 0.
  b. Otherwise, choose a cut line L perpendicular to the x-axis such that n/2 points of S have X-values ≤ L (call this set A) and the remaining points have X-values > L (call this set B). Note that L is a median X-value of this set.
Step 2: Find the ranks of the points in A and the ranks of the points in B, recursively.
Step 3: Sort the points in A and B according to their y-values. Scan these points sequentially and determine, for each point in B, the number of points in A whose y-values are less than its y-value. The rank of this point is equal to its rank among the points in B, plus the number of points in A whose y-values are less than its y-value.
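A compact C sketch of Steps 1–3 (my own arrangement, not the book's code: splitting at the median x is obtained by pre-sorting on x, and the "sort by y and scan" of Step 3 is folded into a merge-sort-style merge; all coordinates are assumed distinct, and error handling is omitted):

    #include <stdlib.h>

    typedef struct { double x, y; int rank; } Pt;

    static int by_x(const void *a, const void *b) {
        const Pt *p = a, *q = b;
        return (p->x > q->x) - (p->x < q->x);
    }

    /* Ranks p[lo..hi), which must be sorted by x; as a side effect the
     * stretch ends up sorted by y.  The merge counts, for each point of
     * the right half (larger x), the left-half points below it. */
    static void rank_rec(Pt p[], int lo, int hi, Pt tmp[]) {
        if (hi - lo <= 1) return;
        int mid = (lo + hi) / 2;              /* Step 1: median split */
        rank_rec(p, lo, mid, tmp);            /* Step 2: recurse */
        rank_rec(p, mid, hi, tmp);
        int i = lo, j = mid, k = lo;          /* Step 3: merge by y */
        while (i < mid || j < hi) {
            if (j >= hi || (i < mid && p[i].y < p[j].y))
                tmp[k++] = p[i++];
            else {
                p[j].rank += i - lo;          /* left points with smaller y */
                tmp[k++] = p[j++];
            }
        }
        for (k = lo; k < hi; k++) p[k] = tmp[k];
    }

    void find_ranks(Pt p[], int n) {
        Pt *tmp = malloc(n * sizeof *tmp);
        qsort(p, n, sizeof *p, by_x);
        for (int i = 0; i < n; i++) p[i].rank = 0;
        rank_rec(p, 0, n, tmp);
        free(tmp);
    }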


Lower bound
Def: A lower bound of a problem is the least time complexity required for any algorithm which can be used to solve this problem.
☆ worst case lower bound
☆ average case lower bound
The lower bound for a problem is not unique: e.g. Ω(1), Ω(n), and Ω(n log n) are all lower bounds for sorting (Ω(1) and Ω(n) are trivial).

At present, if the highest known lower bound of a problem is Ω(n log n) and the time complexity of the best known algorithm is O(n²), then:
- we may try to find a higher lower bound;
- we may try to find a better algorithm;
- both the lower bound and the algorithm may be improved.
If the present lower bound is Ω(n log n) and there is an algorithm with time complexity O(n log n), then the algorithm is optimal.

The worst case lower bound of sorting
3! = 6 permutations for 3 data elements a₁, a₂, a₃:
(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)

Straight insertion sort:
input data: (2, 3, 1)
(1) a₁:a₂
(2) a₂:a₃, exchange a₂ ↔ a₃
(3) a₁:a₂, exchange a₁ ↔ a₂
input data: (2, 1, 3)
(1) a₁:a₂, exchange a₁ ↔ a₂
(2) a₂:a₃

Decision tree for straight insertion sort

Decision tree for bubble sort

Lower bound of sorting
- To find the lower bound, we have to find the smallest depth of a binary decision tree.
- n! distinct permutations ⟹ n! leaf nodes in the binary decision tree.
- A balanced tree has the smallest depth: ⌈log(n!)⌉ = Ω(n log n).
- Lower bound for sorting: Ω(n log n)

Method 1:
n! = n(n−1)(n−2)⋯2·1 ≥ n(n−1)⋯⌈n/2⌉ ≥ (n/2)^{n/2}
⟹ log n! ≥ (n/2) log(n/2) = Ω(n log n)

Method 2: Stirling approximation:
n! ≈ √(2πn) (n/e)ⁿ
log n! ≈ (1/2) log(2πn) + n log(n/e) ≈ n log n = Θ(n log n)


Heapsort — an optimal sorting algorithm
A heap: parent ≥ son

Heapsort has two phases: construction, then output — repeatedly output the maximum and restore the heap.

Phase 1: construction
input data: 4, 37, 26, 15, 48
- restore the subtree rooted at A(2)
- restore the tree rooted at A(1)

Phase 2: output

Implementation: use a linear array, not a binary tree — the sons of A(h) are A(2h) and A(2h+1).
Time complexity: O(n log n)
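Both phases as C on a 0-based linear array (a sketch; with 0-based indexing the sons of a[i] are a[2i+1] and a[2i+2] instead of A(2h) and A(2h+1)):

    /* Restore the heap property for the subtree rooted at i (heap size n). */
    void restore(int a[], int i, int n) {
        for (;;) {
            int largest = i, l = 2*i + 1, r = 2*i + 2;
            if (l < n && a[l] > a[largest]) largest = l;
            if (r < n && a[r] > a[largest]) largest = r;
            if (largest == i) return;             /* parent >= sons: done */
            int t = a[i]; a[i] = a[largest]; a[largest] = t;
            i = largest;
        }
    }

    void heap_sort(int a[], int n) {
        /* Phase 1: construction, bottom-up restores. */
        for (int i = n/2 - 1; i >= 0; i--)
            restore(a, i, n);
        /* Phase 2: output the maximum and restore, n-1 times. */
        for (int i = n - 1; i > 0; i--) {
            int t = a[0]; a[0] = a[i]; a[i] = t;  /* max to the end */
            restore(a, 0, i);
        }
    }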

Time complexity
Phase 1: construction. With d = ⌊log n⌋ the depth of the heap, the # of comparisons is at most Σ_{L=0}^{d−1} 2^L · 2(d − L) = 2^{d+1} Σ_{k=1}^{d} k/2^k ≤ 4n = O(n).

Time complexity
Phase 2: output. There are n − 1 outputs; restoring the heap after an output with i elements remaining takes at most 2⌊log i⌋ comparisons, so the total is Σ_{i=1}^{n−1} 2⌊log i⌋ = O(n log n).

Average case lower bound of sorting: by the binary decision tree
The average time complexity of a sorting algorithm is at least the external path length of the binary decision tree divided by n!.
The external path length is minimized if the tree is balanced (all leaf nodes on level d or level d−1).

Average case lower bound of sorting

Compute the min external path length
1. Depth of a balanced binary tree with c leaf nodes: d = ⌈log c⌉
2. Leaf nodes can appear only on level d or d−1.
3. x₁ leaf nodes on level d−1, x₂ leaf nodes on level d:
   x₁ + x₂ = c
   x₁ + x₂/2 = 2^{d−1}
   ⟹ x₁ = 2^d − c, x₂ = 2(c − 2^{d−1})

External path length:
M = x₁(d−1) + x₂·d = (2^d − c)(d−1) + 2(c − 2^{d−1})d
  = c(d−1) + 2(c − 2^{d−1}),  with d−1 = ⌊log c⌋
  = c⌊log c⌋ + 2(c − 2^{⌊log c⌋})
4. c = n!:
M = n!⌊log n!⌋ + 2(n! − 2^{⌊log n!⌋})
M/n! = ⌊log n!⌋ + 2 − 2^{⌊log n!⌋+1}/n! = ⌊log n!⌋ + c′, 0 ≤ c′ ≤ 1
     = Ω(n log n)
Average case lower bound of sorting: Ω(n log n)

Quicksort & Heapsort
- Quicksort is optimal in the average case (Θ(n log n) on average).
- (i) The worst case time complexity of heapsort is Θ(n log n).
- (ii) The average case lower bound is Ω(n log n).
- ⟹ the average case time complexity of heapsort is Θ(n log n): heapsort is optimal in the average case.

Improving a lower bound through oracles
Problem P: merge two sorted sequences A and B with lengths m and n.
(1) Binary decision tree:
There are C(m+n, m) ways the two sequences can interleave ⟹ C(m+n, m) leaf nodes in the binary decision tree.
⟹ the lower bound for merging: ⌈log C(m+n, m)⌉ ≤ m + n − 1 (the comparison count of conventional merging)


(2) Oracle: the oracle tries its best to cause the algorithm to work as hard as it might (it supplies a very hard data set).
Sorted sequences of equal length (m = n):
A: a₁ < a₂ < … < a_m
B: b₁ < b₂ < … < b_m
The very hard case: a₁ < b₁ < a₂ < b₂ < … < a_m < b_m

We must compare:
a₁ : b₁
b₁ : a₂
a₂ : b₂
⋮
b_{m−1} : a_m
a_m : b_m
Otherwise, we may get a wrong result for some input data. e.g. if b₁ and a₂ are not compared, we cannot distinguish
a₁ < b₁ < a₂ < b₂ < … < a_m < b_m  and  a₁ < a₂ < b₁ < b₂ < … < a_m < b_m.
Thus, at least 2m − 1 comparisons are required. The conventional merging algorithm is optimal for m = n.
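For reference, conventional merging in C (a sketch): it uses at most m + n − 1 comparisons, matching the oracle's 2m − 1 bound when m = n:

    /* Merge sorted a[0..m-1] and b[0..n-1] into out[0..m+n-1]. */
    void merge(const int a[], int m, const int b[], int n, int out[]) {
        int i = 0, j = 0, k = 0;
        while (i < m && j < n)                  /* one comparison per output */
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
        while (i < m) out[k++] = a[i++];        /* no comparisons needed */
        while (j < n) out[k++] = b[j++];
    }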

Finding lower bounds by problem transformation
Problem A reduces to problem B (A ∝ B) iff A can be solved by using any algorithm which solves B. If A ∝ B, B is at least as difficult as A.
Note: if T(tr₁) + T(tr₂) < T(B), where tr₁ transforms A's input into B's input and tr₂ transforms B's output back, then
T(A) ≤ T(tr₁) + T(tr₂) + T(B) = O(T(B)).

The lower bound of the convex hull problem
sorting ∝ convex hull
   A            B
an instance of A: (x₁, x₂, …, xₙ)
↓ transformation
an instance of B: {(x₁, x₁²), (x₂, x₂²), …, (xₙ, xₙ²)}
All the transformed points lie on the parabola y = x²; if x₁ < x₂ < ⋯ < xₙ, every point is a hull vertex, and traversing the hull reads the xᵢ off in sorted order.

If the convex hull problem can be solved, we can also solve the sorting problem. The lower bound of sorting is Ω(n log n), so the lower bound of the convex hull problem is also Ω(n log n).

The lower bound of the Euclidean minimal spanning tree (MST) problem
sorting ∝ Euclidean MST
   A            B
an instance of A: (x₁, x₂, …, xₙ)
↓ transformation
an instance of B: {(x₁, 0), (x₂, 0), …, (xₙ, 0)}
If x₁ < x₂ < ⋯ < xₙ, then there is an edge between (xᵢ, 0) and (xᵢ₊₁, 0) in the MST, where 1 ≤ i ≤ n − 1.

If the Euclidean MST problem can be solved, we can also solve the sorting problem. The lower bound of sorting is Ω(n log n), so the lower bound of the Euclidean MST problem is also Ω(n log n).