CS 3343: Analysis of Algorithms


1 CS 3343: Analysis of Algorithms
Quick sort and average running time analysis
2/25/2019

2 Quick sort
Another divide-and-conquer sorting algorithm, like merge sort.
Anyone remember the basic idea? The worst-case and average-case running times?
We will also learn some new algorithm-analysis tricks.

3 Quick sort
Quicksort an n-element array:
Divide: Partition the array into two subarrays around a pivot x such that elements in the lower subarray ≤ x ≤ elements in the upper subarray.
Conquer: Recursively sort the two subarrays.
Combine: Trivial.
[ ≤ x | x | ≥ x ]
Key: linear-time partitioning subroutine.

4 Partition
All the action takes place in the partition() function:
Rearranges the subarray in place.
End result: two subarrays, with all values in the first subarray ≤ all values in the second.
Returns the index of the "pivot" element separating the two subarrays.
p  [ ≤ x | x | ≥ x ]  r   (pivot ends at index q)

5 Pseudocode for quicksort
QUICKSORT(A, p, r)
  if p < r
    then q ← PARTITION(A, p, r)
         QUICKSORT(A, p, q–1)
         QUICKSORT(A, q+1, r)
Initial call: QUICKSORT(A, 1, n)
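Translated to Python (0-based indexing), the recursion can be sketched as follows. This is a minimal sketch of the divide-and-conquer structure only: it uses a simple out-of-place partition so the example is self-contained, not the in-place PARTITION shown later on slide 10.

```python
def quicksort(a):
    """Sketch of QUICKSORT's recursion; partitioning is done out of place."""
    if len(a) <= 1:
        return a                             # base case: nothing to sort
    x = a[0]                                 # pivot: the first element
    lower = [v for v in a[1:] if v <= x]     # elements <= pivot
    upper = [v for v in a[1:] if v > x]      # elements > pivot
    return quicksort(lower) + [x] + quicksort(upper)

print(quicksort([6, 10, 5, 8, 13, 3, 2, 11]))   # [2, 3, 5, 6, 8, 10, 11, 13]
```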

6 Idea of partition
If we are allowed to use a second array, it would be easy:
6 10  5  8 13  3  2 11    input (pivot x = 6)
6  5  3  2 11 13  8 10    copy elements ≤ x behind the pivot, elements > x to the back
2  5  3  6 11 13  8 10    swap the pivot with the last of the small elements

7 Another idea
Keep two iterators: one from the head, one from the tail.
6 10  5  8 13  3  2 11    input (pivot x = 6)
6  2  5  3 13  8 10 11    swap out-of-place pairs until the iterators meet
3  2  5  6 13  8 10 11    swap the pivot into the middle

8 In-place Partition
[Animation: the two-iterator partition from the previous slide carried out step by step on the example array.]

9 Partition In Words
Partition(A, p, r):
Select an element to act as the "pivot" (which one?)
Grow two regions, A[p..i] and A[j..r], such that:
  all elements in A[p..i] ≤ pivot
  all elements in A[j..r] ≥ pivot
Increment i until A[i] > pivot
Decrement j until A[j] < pivot
Swap A[i] and A[j]
Repeat until i ≥ j
Swap A[j] and A[p]
Return j
Note: different from the book's partition(), which uses two iterators that both move forward.

10 Partition Code
Partition(A, p, r)
  x = A[p];        // pivot is the first element
  i = p;
  j = r + 1;
  while (TRUE) {
    repeat i++; until A[i] > x or i >= j;
    repeat j--; until A[j] < x or j < i;
    if (i < j)  Swap(A[i], A[j]);
    else        break;
  }
  Swap(A[p], A[j]);
  return j;
What is the running time of partition()?
partition() runs in Θ(n) time.
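As a sketch, the pseudocode above can be rendered in Python (0-based indices; distinct elements assumed, as on slide 13). The final lines reproduce the worked example from slide 11.

```python
def partition(a, p, r):
    """Partition a[p..r] around pivot a[p]; return the pivot's final index."""
    x = a[p]                        # pivot is the first element
    i, j = p, r + 1
    while True:
        i += 1                      # repeat i++ until a[i] > x or i >= j
        while i < j and a[i] <= x:
            i += 1
        j -= 1                      # repeat j-- until a[j] < x
        while a[j] > x:             # a[p] == x acts as a sentinel, so j >= p
            j -= 1
        if i < j:
            a[i], a[j] = a[j], a[i]
        else:
            break
    a[p], a[j] = a[j], a[p]         # put the pivot between the two halves
    return j

a = [6, 10, 5, 8, 13, 3, 2, 11]
q = partition(a, 0, len(a) - 1)
print(q, a)    # 3 [3, 2, 5, 6, 13, 8, 10, 11]
```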

11 Partition example
x = 6 (pivot); p = first index, r = last index
6 10  5  8 13  3  2 11    initial
6 10  5  8 13  3  2 11    scan: i stops at 10, j stops at 2
6  2  5  8 13  3 10 11    swap A[i] and A[j]
6  2  5  8 13  3 10 11    scan: i stops at 8, j stops at 3
6  2  5  3 13  8 10 11    swap A[i] and A[j]
6  2  5  3 13  8 10 11    scan: j crosses below i, so stop
3  2  5  6 13  8 10 11    final swap: pivot 6 lands at index q

12 Quick sort example
6 10  5  8 11  3  2 13
3  2  5  6 11  8 10 13
2  3  5  6 10  8 11 13
2  3  5  6  8 10 11 13
2  3  5  6  8 10 11 13

13 Analysis of quicksort
Assume all input elements are distinct.
(In practice, there are better partitioning algorithms for when duplicate input elements may exist.)
Let T(n) = worst-case running time on an array of n elements.
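One common remedy for duplicate keys, not from the slides but mentioned here as a hypothetical sketch, is a three-way ("Dutch national flag") partition that groups all keys equal to the pivot together, so equal runs are never recursed into.

```python
def partition3(a, x):
    """Three-way partition of list a around pivot value x.
    Afterwards: a[:lt] < x, a[lt:gt] == x, a[gt:] > x."""
    lt, i, gt = 0, 0, len(a)
    while i < gt:
        if a[i] < x:
            a[lt], a[i] = a[i], a[lt]   # grow the "< x" region
            lt += 1
            i += 1
        elif a[i] > x:
            gt -= 1
            a[i], a[gt] = a[gt], a[i]   # grow the "> x" region from the back
        else:
            i += 1                      # equal to pivot: leave in the middle
    return lt, gt

a = [4, 1, 4, 7, 4, 2, 9, 4]
lt, gt = partition3(a, 4)
print(lt, gt, a)
```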

14 Worst-case of quicksort
Input sorted or reverse-sorted: we partition around the min or max element, so one side of the partition always has no elements.
T(n) = T(0) + T(n–1) + Θ(n) = Θ(n²)   (arithmetic series)
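The arithmetic series can be seen concretely with a small instrumented sketch (my own, not the slides' code) that charges each partition of m elements m–1 comparisons against the pivot; on already-sorted input of size n it performs exactly n(n–1)/2.

```python
def quicksort_count(a):
    """Count pivot comparisons made by first-element-pivot quicksort.
    Each partition of m elements is charged m-1 comparisons."""
    a = list(a)                     # work on a copy
    count = 0

    def sort(lo, hi):
        nonlocal count
        if lo >= hi:
            return
        x = a[lo]
        lower = [v for v in a[lo + 1:hi + 1] if v <= x]
        upper = [v for v in a[lo + 1:hi + 1] if v > x]
        count += hi - lo            # one comparison per non-pivot element
        a[lo:hi + 1] = lower + [x] + upper
        sort(lo, lo + len(lower) - 1)
        sort(lo + len(lower) + 1, hi)

    sort(0, len(a) - 1)
    return count

n = 200
print(quicksort_count(list(range(n))), n * (n - 1) // 2)   # both 19900
```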

15–21 Worst-case recursion tree
T(n) = T(0) + T(n–1) + n
Expand the tree one level at a time: the root costs n and has children T(0) and T(n–1); the T(n–1) node expands to cost (n–1) with children T(0) and T(n–2); and so on down to T(0). The result is a path of height n with a T(0) leaf hanging off each level.

22 Worst-case recursion tree
T(n) = T(0) + T(n–1) + n
Height = n. The level costs are n, (n–1), (n–2), …, 1, and each of the n leaves costs Θ(1).
T(n) = Θ(n) + Θ(n²) = Θ(n²)

23 Best-case analysis (For intuition only!)
If we're lucky, PARTITION splits the array evenly:
T(n) = 2T(n/2) + Θ(n) = Θ(n log n)   (same as merge sort)
What if the split is always 1/10 : 9/10?
T(n) = T(n/10) + T(9n/10) + Θ(n)
What is the solution to this recurrence?

24–27 Analysis of "almost-best" case
Expand the recursion tree for T(n) = T(n/10) + T(9n/10) + Θ(n): every level costs at most n; the shallowest leaf is at depth log10 n (following the 1/10 branch), the deepest at depth log10/9 n (following the 9/10 branch); there are O(n) leaves, each costing Θ(1).

28 Analysis of "almost-best" case
n log10 n ≤ T(n) ≤ n log10/9 n + O(n)
Therefore T(n) = Θ(n log n).
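The two bounds can be sanity-checked numerically. Here is a small sketch of my own: the recurrence is evaluated with integer rounding and an arbitrary small-n base case (both my choices, for illustration only), and the result is compared against n·log10 n and n·log10/9 n.

```python
from functools import lru_cache
from math import log

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/10) + T(9n/10) + n, with integer rounding and a
    small-n base case chosen for illustration."""
    if n < 10:
        return n
    return T(n // 10) + T(n - n // 10) + n

n = 100000
# The ratio T(n) / (n log n) settles near a constant: Theta(n log n) growth.
print(T(n) / (n * log(n)))
```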

29 Quicksort Runtimes
Best-case runtime: Tbest(n) ∈ Θ(n log n)
Worst-case runtime: Tworst(n) ∈ Θ(n²)
Worse than merge sort? Why is it called quicksort, then?
Its average runtime: Tavg(n) ∈ Θ(n log n)
Better yet: the expected runtime of randomized quicksort is Θ(n log n).

30 Randomized quicksort
Randomly choose an element as the pivot:
Every time we need to do a partition, roll a die to decide which element to use as the pivot.
Each element has probability 1/n of being selected.
Rand-Partition(A, p, r)
  d = random();                    // a random number between 0 and 1
  index = p + floor((r-p+1) * d);  // p <= index <= r
  swap(A[p], A[index]);
  Partition(A, p, r);              // now do partition using A[p] as pivot
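In Python, the random pivot selection can be sketched as below (0-based indices; `random.randint(p, r)` draws each index with probability 1/(r–p+1), which is equivalent to the random()/floor computation above). The partition routine is the one from slide 10, repeated so this block is self-contained.

```python
import random

def partition(a, p, r):
    """First-element-pivot partition (as on slide 10); returns pivot index."""
    x = a[p]
    i, j = p, r + 1
    while True:
        i += 1
        while i < j and a[i] <= x:
            i += 1
        j -= 1
        while a[j] > x:             # a[p] == x acts as a sentinel
            j -= 1
        if i < j:
            a[i], a[j] = a[j], a[i]
        else:
            break
    a[p], a[j] = a[j], a[p]
    return j

def rand_partition(a, p, r):
    """Pick the pivot uniformly at random, move it to a[p], then partition."""
    k = random.randint(p, r)        # each element chosen with prob 1/(r-p+1)
    a[p], a[k] = a[k], a[p]
    return partition(a, p, r)

def rand_quicksort(a, p=0, r=None):
    if r is None:
        r = len(a) - 1
    if p < r:
        q = rand_partition(a, p, r)
        rand_quicksort(a, p, q - 1)
        rand_quicksort(a, q + 1, r)

random.seed(0)
a = list(range(100, 0, -1))
rand_quicksort(a)
print(a == sorted(a))    # True
```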

31 Running time of randomized quicksort
T(n) =
  T(0) + T(n–1) + dn    if the split is 0 : n–1,
  T(1) + T(n–2) + dn    if the split is 1 : n–2,
  ⋮
  T(n–1) + T(0) + dn    if the split is n–1 : 0.
The expected running time is the average over all of these cases.
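Averaging the cases above gives E(n) = (n–1) + (2/n)·Σ_{k=0}^{n–1} E(k) for the expected number of comparisons, if we take the dn partition cost to be the n–1 comparisons partition makes (that cost model is my assumption here). A small dynamic-programming sketch evaluates the recurrence exactly and compares it with 2n ln n:

```python
from math import log

def expected_comparisons(n_max):
    """Exact E(n) from the averaged recurrence:
    E(n) = (n - 1) + (2/n) * sum(E(k) for k in range(n)),  E(0) = E(1) = 0."""
    E = [0.0] * (n_max + 1)
    s = 0.0                          # running sum E(0) + ... + E(n-1)
    for n in range(2, n_max + 1):
        E[n] = (n - 1) + 2.0 * s / n
        s += E[n]
    return E

E = expected_comparisons(1000)
n = 1000
print(round(E[n]), round(2 * n * log(n)))   # E(n) grows like 2 n ln n
```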

32 Expected running time of randomized quicksort
E[T(n)] = (1/n) Σ_{q=0}^{n–1} [T(q) + T(n–1–q)] + dn = (2/n) Σ_{k=0}^{n–1} E[T(k)] + Θ(n)

33 Solving recurrences
Recursion tree (iteration) method: good for guessing an answer.
Substitution method: the generic method; rigorous, but may be hard.
Master method: easy to learn, but useful only in limited cases (some tricks may help in other cases).

34 Substitution method
The most general method to solve a recurrence (prove O and Ω separately):
Guess the form of the solution (e.g. using recursion trees or expansion).
Verify by induction (the inductive step).

35 Expected running time of Quicksort
Guess: E[T(n)] = O(n log n).
We need to show that E[T(n)] ≤ cn log n for some c and sufficiently large n.
Use T(n) instead of E[T(n)] for convenience.

36 Need to show: T(n) ≤ c n log (n)
Fact: Need to show: T(n) ≤ c n log (n) Assume: T(k) ≤ ck log (k) for 0 ≤ k ≤ n-1 Proof: using the fact that if c ≥ 4. Therefore, by defintion, T(n) =  (nlogn) 2/25/2019
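The fact about the key summation can be spot-checked numerically before proving it; a small sketch of my own:

```python
from math import log2

def key_sum(n):
    """Sum_{k=1}^{n-1} k lg k (the k = 1 term is zero)."""
    return sum(k * log2(k) for k in range(2, n))

for n in (10, 100, 1000):
    bound = 0.5 * n * n * log2(n) - n * n / 8   # (1/2) n^2 lg n - (1/8) n^2
    print(n, key_sum(n) <= bound)               # True for each n
```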

37 Tightly Bounding The Key Summation
Σ_{k=1}^{n–1} k lg k = Σ_{k=1}^{⌈n/2⌉–1} k lg k + Σ_{k=⌈n/2⌉}^{n–1} k lg k
  (split the summation for a tighter bound)
≤ Σ_{k=1}^{⌈n/2⌉–1} k lg k + Σ_{k=⌈n/2⌉}^{n–1} k lg n
  (the lg k in the second term is bounded by lg n)
= Σ_{k=1}^{⌈n/2⌉–1} k lg k + lg n Σ_{k=⌈n/2⌉}^{n–1} k
  (move the lg n outside the summation)

38 Tightly Bounding The Key Summation
The summation bound so far:
Σ_{k=1}^{n–1} k lg k ≤ Σ_{k=1}^{⌈n/2⌉–1} k lg k + lg n Σ_{k=⌈n/2⌉}^{n–1} k
≤ Σ_{k=1}^{⌈n/2⌉–1} k lg(n/2) + lg n Σ_{k=⌈n/2⌉}^{n–1} k
  (the lg k in the first term is bounded by lg(n/2))
= (lg n – 1) Σ_{k=1}^{⌈n/2⌉–1} k + lg n Σ_{k=⌈n/2⌉}^{n–1} k
  (lg(n/2) = lg n – 1; move (lg n – 1) outside the summation)

39 Tightly Bounding The Key Summation
The summation bound so far:
Σ_{k=1}^{n–1} k lg k ≤ (lg n – 1) Σ_{k=1}^{⌈n/2⌉–1} k + lg n Σ_{k=⌈n/2⌉}^{n–1} k
= lg n Σ_{k=1}^{⌈n/2⌉–1} k – Σ_{k=1}^{⌈n/2⌉–1} k + lg n Σ_{k=⌈n/2⌉}^{n–1} k
  (distribute the (lg n – 1))
= lg n Σ_{k=1}^{n–1} k – Σ_{k=1}^{⌈n/2⌉–1} k
  (the summations overlap in range; combine them)
= (1/2) n (n–1) lg n – Σ_{k=1}^{⌈n/2⌉–1} k
  (the Gaussian series)

40 Tightly Bounding The Key Summation
The summation bound so far:
Σ_{k=1}^{n–1} k lg k ≤ (1/2) n (n–1) lg n – Σ_{k=1}^{⌈n/2⌉–1} k
≤ (1/2) n² lg n – (1/2) n lg n – (1/2)(n/2)(n/2 – 1)
  (rearrange the first term; place a bound on the second via the Gaussian series)
= (1/2) n² lg n – (1/2) n lg n – (1/8) n² + (1/4) n
  (multiply it all out)
≤ (1/2) n² lg n – (1/8) n²
  (since (1/2) n lg n ≥ (1/4) n for n ≥ 2)

41 Tightly Bounding The Key Summation
Done: Σ_{k=1}^{n–1} k lg k ≤ (1/2) n² lg n – (1/8) n², which is exactly the fact used in the substitution proof on slide 36.

