Administrivia, Lecture 4
HW #2 assigned this weekend, due Thurs of Week 4
HWs will be due Thurs of Weeks 2, 4, 6, 7, 9, 10
HW #1 solutions should be posted tonight (TAs have compiled them)
Reading for next week: Chapter (linear-time selection), Chapter (convex hull, closest-pair), Chapter 23 (minimum spanning trees)

Divide and Conquer for Sorting (2.3/1.3)
Divide (into two equal parts)
Conquer (solve each part separately)
Combine the separate solutions
Mergesort
–Divide into two equal parts
–Sort each part using Mergesort (recursion!!!)
–Merge the two sorted subsequences

Merging Two Subsequences
x[1] ≤ x[2] ≤ … ≤ x[K]
y[1] ≤ y[2] ≤ … ≤ y[L]
Key fact: if y[i] > x[j], then y[i+1] > x[j] as well (both subsequences are sorted)
Merging traverses at most K+L−1 "edges" between consecutive output elements, so #(comparisons) ≤ K+L−1 ⇒ linear time
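A minimal Python sketch of the two slides above (the names merge and merge_sort are illustrative, not from the lecture); the merge step makes at most K+L−1 comparisons:

def merge(x, y):
    """Merge two sorted lists; at most len(x) + len(y) - 1 comparisons."""
    result, i, j = [], 0, 0
    while i < len(x) and j < len(y):
        if x[i] <= y[j]:              # one comparison per element appended
            result.append(x[i]); i += 1
        else:
            result.append(y[j]); j += 1
    result.extend(x[i:])              # one side is exhausted; copy the rest
    result.extend(y[j:])
    return result

def merge_sort(a):
    """Divide into two equal parts, sort each recursively, then merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([5, 2, 9, 4, 7, 1]))   # [1, 2, 4, 5, 7, 9]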

Merge Sort Execution Example

Recursion Tree
n comparisons per level
log n levels
Total runtime = n log n

Master Method (4.3)
Recurrence: T(n) = a·T(n/b) + f(n), a ≥ 1, b > 1
1) If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
2) If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n)
3) If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1, then T(n) = Θ(f(n))

Master Method Examples
Mergesort: T(n) = 2T(n/2) + Θ(n); a = 2, b = 2, f(n) = Θ(n^(log_2 2)) ⇒ case 2 ⇒ T(n) = Θ(n log n)
Strassen (28.2): T(n) = 7T(n/2) + Θ(n²); a = 7, b = 2, n² = O(n^(log_2 7 − ε)) ⇒ case 1 ⇒ T(n) = Θ(n^(log_2 7)) ≈ Θ(n^2.81)
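When f(n) = Θ(n^d), the three cases reduce to comparing d with log_b a. A small Python sketch of that simplified form (the function name master is mine, not the course's):

import math

def master(a, b, d):
    """Simplified master theorem for T(n) = a*T(n/b) + Theta(n^d)."""
    crit = math.log(a, b)               # critical exponent log_b(a)
    if d < crit:
        return f"Theta(n^{crit:.2f})"   # recursion dominates (case 1)
    if d == crit:                       # exact equality; fine for these simple cases
        return f"Theta(n^{d} log n)"    # balanced (case 2)
    return f"Theta(n^{d})"              # f(n) dominates (case 3)

print(master(2, 2, 1))   # Mergesort               -> Theta(n^1 log n)
print(master(7, 2, 2))   # Strassen                -> Theta(n^2.81)
print(master(4, 2, 1))   # naive integer multiply  -> Theta(n^2.00)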

Quicksort ( / )
Sorts in place, like insertion sort and unlike merge sort
Divide into two parts such that
–elements of the left part < elements of the right part
Conquer: recursively solve each part separately
Combine: trivial, do not do anything
Quicksort(A,p,r)
  if p < r then
    q ← Partition(A,p,r)    //divide
    Quicksort(A,p,q)        //conquer left
    Quicksort(A,q+1,r)      //conquer right

Divide = Partition
PARTITION(A,p,r)
  //Partition the subarray A[p..r] around the pivot x = A[p]
  //Result: returns j such that every element of A[p..j] is ≤ every element of A[j+1..r]
  x = A[p]
  i = p - 1
  j = r + 1
  repeat forever
    repeat j = j - 1 until A[j] ≤ x
    repeat i = i + 1 until A[i] ≥ x
    if i < j then exchange A[i] ↔ A[j]
    else return j
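A runnable Python rendering of the two pseudocode slides above (a sketch; names and index conventions follow the slides):

def partition(a, p, r):
    """Hoare-style partition of a[p..r] with pivot a[p]; returns split index j
    such that every element of a[p..j] <= every element of a[j+1..r]."""
    x = a[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while a[j] > x:          # repeat j = j-1 until a[j] <= x
            j -= 1
        i += 1
        while a[i] < x:          # repeat i = i+1 until a[i] >= x
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]
        else:
            return j

def quicksort(a, p=0, r=None):
    """In-place quicksort over a[p..r] (inclusive bounds, as in the slides)."""
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)   # divide
        quicksort(a, p, q)       # conquer left
        quicksort(a, q + 1, r)   # conquer right

data = [13, 19, 9, 5, 12, 8, 7, 4, 11, 2, 6, 21]
quicksort(data)
print(data)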

How It Works
(step-by-step trace of PARTITION on an example array, with indices i and j scanning inward from the left and right ends until they cross; trace not reproduced here)

Runtime of Quicksort
Worst case:
–every time, there is nothing to move
–pivot = left (or right) end of the subarray
–O(n²)
Recursion tree of QSort in this case: a path with subproblem sizes n, n−1, n−2, …

Runtime of Quicksort
Best case:
–every time, the partition is into (almost) equal parts
–or at least no worse than a given fixed proportion
–O(n log n)
Average case = ?

What is the D/Q Recurrence for QSort?
t(n) = Θ(n) + (1/n) · Σ_{j=1 to n} ( t(j−1) + t(n−j) )
  (all n pivot ranks are equiprobable; j−1 and n−j are the lengths of the two subproblems)
     = Θ(n) + (2/n) · Σ_{k=0 to n−1} t(k)
Guess t(n) ∈ O(n log n):
t(n) ≤ Θ(n) + (2/n) [ Σ_{i=2 to n−1} c·i·log i ] ≤ c·n·log n − cn/2 + Θ(n)
so t(n) ≤ c·n·log n once c is chosen large enough for the cn/2 slack to absorb the Θ(n) term
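A quick numerical check of this recurrence (a sketch; the Θ(n) partitioning cost is taken to be exactly n). The printed ratio t(n)/(n·log2 n) stays bounded as n grows, consistent with t(n) ∈ Θ(n log n):

import math

def qsort_recurrence(nmax):
    """t(n) = n + (2/n) * sum_{k=0}^{n-1} t(k), with t(0) = t(1) = 0."""
    t = [0.0] * (nmax + 1)
    prefix = 0.0                      # running sum t(0) + ... + t(n-1)
    for n in range(2, nmax + 1):
        prefix += t[n - 1]
        t[n] = n + 2.0 * prefix / n
    return t

t = qsort_recurrence(10000)
for n in (100, 1000, 10000):
    print(n, round(t[n], 1), round(t[n] / (n * math.log2(n)), 3))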

Another QSort Analysis (Shift / Cancel)
(1) T(n) = n − 1 + (2/n) Σ_{i=1 to n−1} T(i),   n ≥ 2, T(1) = 0
(2) T(n+1) = n + (2/(n+1)) Σ_{i=1 to n} T(i)
(1) ⇒ (3) n·T(n) = n(n−1) + 2 Σ_{i=1 to n−1} T(i)
(2) ⇒ (4) (n+1)·T(n+1) = (n+1)n + 2 Σ_{i=1 to n} T(i)
(4) − (3) ⇒ (n+1)·T(n+1) − n·T(n) = 2n + 2·T(n)
⇒ T(n+1) = ((n+2)/(n+1))·T(n) + 2n/(n+1)
⇒ T(n+1) ≤ ((n+2)/(n+1))·T(n) + 2

Shift / Cancel (cont.)
Unroll: H_n = 1 + 1/2 + 1/3 + … + 1/n = ln n + γ + O(1/n)
γ = Euler's constant = 0.577…
⇒ T(n) ≤ 2(n+1)(ln n + γ − 3/2) + O(1)
⇒ T(n) ∈ O(n log n)

Randomized Analysis of QSort
Probability space
–Set Ω of elementary events ≡ experiment outcomes
–Family F of subsets of Ω, called events
–Probability measure Pr, a real-valued function on members of F
–where A ⊆ Ω, A ∈ F ⇒ A^c ∈ F, and F is closed under union and intersection
For all A ∈ F, 0 ≤ Pr(A) ≤ 1
Pr(Ω) = 1
For disjoint events A_1, A_2, …: Pr(∪ A_i) = Σ Pr(A_i)
A random variable is a function from elements of Ω into ℝ
–e.g., the event (X = x) ≡ the set of elements of Ω for which X assumes the fixed value x
For an integer-valued r.v. X, the expectation E[X] = Σ_i i·Pr(X = i)

Randomized Analysis of QSort (cont.)
At each level, we compare each element to the splitter (pivot) of its partition
Def. A successful pivot has rank p in its subarray with n/8 < p < 7n/8
Three facts
–(1) E[Σ X_i] = Σ E[X_i]   (linearity of expectation)
–(2) If Pr(Heads) = q, the expected # of tosses up to and including the first Head is 1/q   // family-size puzzle
–(3) Pr(∪_i A_i) ≤ Σ_i Pr(A_i)   (probability of a union ≤ sum of the probabilities)
Given that Pr(successful pivot) = 3/4, a single element e_i participates in at most how many successful partitioning steps?
–Ans: log_{8/7} n

Randomized Analysis of QSort (cont.)
What is the expected number of partition steps between the kth and (k+1)st successful pivots?
–by (2): 4/3 steps
–by (1): at most (4/3)·log_{8/7} n partition steps per element overall
–so each element expects to participate in O(log n) pivots (comparisons)
Define r.v.'s giving the # of comparisons for each element
–by (1): O(n log n) expected comparisons in total
So we have another analysis that gives the O(n log n) expected complexity of QSort

How Badly Can We Deviate From Expectation?
E.g., what is Pr(the ith element sees ≥ 20 log n pivots)?
–How many successes are possible?
–We know: at most log_{8/7} n
Pivots are independent (Bernoulli trials)
–Pr(success) = 3/4, Pr(failure) = 1/4
–To see 20 log n pivots, we would need at least 20 log n − log_{8/7} n failures in 20 log n tries
Can use (3) to bound the probability of so many "bad" events, since in general the events may not be independent
–Can get: Pr(total # of comparisons ≤ 20 n log n) = 1 − O(n^(−6))

Selection
Best pivot: the median
–exercise: analyze the complexity when bad pivots are chosen
–exact median is too expensive ⇒ use a random splitter
This leads to SELECTION:
–select(L, k) returns the kth-smallest element of L
–e.g., k = |L|/2 ⇒ the median
What is an efficient algorithm?
–O(n log n)? sorting
D/Q Recursion (N.B.: this is RSelect, below):
–pick a pivot as in QSort; let i be its rank
–if i < k, look in the right part for the (k−i)th smallest
–if i > k, look in the left part for the kth smallest
–if i = k, the pivot itself is the answer
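A minimal Python sketch of this recursion (it partitions into new lists rather than in place, and the name rselect is mine, not the course's):

import random

def rselect(L, k):
    """Return the k-th smallest element of L (1-indexed), randomized selection."""
    pivot = random.choice(L)
    left  = [x for x in L if x < pivot]
    right = [x for x in L if x > pivot]
    equal = len(L) - len(left) - len(right)          # copies of the pivot value
    if k <= len(left):                               # answer lies in the left part
        return rselect(left, k)
    if k <= len(left) + equal:                       # answer is the pivot itself
        return pivot
    return rselect(right, k - len(left) - equal)     # look right for a smaller rank

data = [7, 1, 9, 3, 14, 5, 8]
print(rselect(data, len(data) // 2 + 1))   # median -> 7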

Randomized Selection
Worst case:
–Θ(n²), as in the QSort analysis
Suppose we can guarantee a "good" pivot
–e.g., pivot rank i with n/4 ≤ i ≤ 3n/4
–⇒ subproblem size ≤ 3n/4
–Let s(n) ≡ time to find a good pivot
–⇒ t(n) ≤ s(n) + cn + t([3n/4])
   (find the pivot) + (pivot and form the subproblem) + (solve the subproblem)

Randomized Selection
Suppose further: s(n) ≤ dn for some d; then t(n) ≤ (c+d)n + t([3n/4])
Claim: t(n) ≤ kn for some k would follow
–Constructive induction, or "substitution"
–Ind. Hyp.: t(m) ≤ km for m ≤ n−1
–Ind. Step: t(n) ≤ (c+d)n + k(3n/4) = (c + d + 3k/4)·n, which we want to be ≤ kn
–This holds if k ≥ 4(c+d)

Break
Celebrity Problem: Given n people and a "knows" relation, is there a celebrity (someone who knows nobody but is known by everyone)?
–Notation: directed graph (and let's say there can be 0, 1, or 2 directed edges between any two vertices (== people))
What is the obvious algorithm?
–Test each person's celebrityhood
Induction
–Hyp: can tell whether there is a celebrity among the first n−1 people
–Induction Step:
  there is a celebrity among the first n−1 people ⇒ two queries suffice to verify him or her against the nth person
  no celebrity among the first n−1 people ⇒ check whether the nth person is a celebrity (2(n−1) queries needed); else there is no celebrity
⇒ Θ(n²) queries

Break (cont.)
Celebrity Problem: Can you do better?
–Hint: (n−1) + 2(n−1) queries suffice
Why are we focusing on identifying the celebrity?
Key idea: eliminate a non-celebrity with each query (a sketch follows below)
–K_ij = 0 ⇒ j is not a celebrity (not everyone knows j)
–K_ij = 1 ⇒ i is not a celebrity (i knows someone)
–(We've seen this "complement" idea in DQ-MAXMIN)
Another question: Given the adjacency matrix of an undirected graph, describe a fast algorithm that finds all triangles in the graph.
–Q: Why am I asking this now?
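A Python sketch of the elimination idea with the hinted query budget (the function name and the knows(i, j) interface are mine, not from the lecture):

def find_celebrity(knows, n):
    """Return the index of the celebrity (knows nobody, known by all), or None.
    knows(i, j) is the query "does i know j?"."""
    candidate = 0
    for i in range(1, n):                 # n-1 queries, each eliminates a non-celebrity
        if knows(candidate, i):           # candidate knows someone -> not a celebrity
            candidate = i
        # else: candidate does not know i -> i is not known by everyone
    for j in range(n):                    # verify with at most 2(n-1) more queries
        if j != candidate and (knows(candidate, j) or not knows(j, candidate)):
            return None
    return candidate

# Example: person 2 is the celebrity in this "knows" matrix.
K = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
print(find_celebrity(lambda i, j: K[i][j] == 1, 3))   # -> 2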

D/Q for Arithmetic
Multiplying Large Integers
–A = a_0 + a_1·r + … + a_{n−1}·r^{n−1}, r ≡ radix
–"classic" approach ⇒ Θ(n²) work
Can we apply D/Q?
–Let n = 2s, r = 10 ≡ radix; write A = 10^s·w + x and B = 10^s·y + z
–AB = xz + 10^s·(wz + xy) + 10^{2s}·wy
–T(n) = 4T(n/2) + Θ(n) ⇒ a = 4, b = 2 in the Master Method
–T(n) ∈ Θ(n²)
–Need to reduce the # of subproblems, i.e., want a < 4
Observation: r' = (w+x)(y+z) = wy + (wz+xy) + xz
–r' ← (w+x)(y+z)
–p ← wy
–q ← xz
–return 10^{2s}·p + 10^s·(r'−p−q) + q
–T(n) = 3T(n/2) + Θ(n) ⇒ T(n) ∈ O(n^{log_2 3}) = O(n^{1.59})
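A runnable Python sketch of the three-multiplication scheme (base-10 split, as on the slide; karatsuba is the standard name for this trick, not the lecture's):

def karatsuba(a, b):
    """Multiply nonnegative integers with 3 recursive products instead of 4."""
    if a < 10 or b < 10:
        return a * b
    s = max(len(str(a)), len(str(b))) // 2   # number of low-order digits to split off
    w, x = divmod(a, 10 ** s)                # a = 10^s * w + x
    y, z = divmod(b, 10 ** s)                # b = 10^s * y + z
    p = karatsuba(w, y)                      # wy
    q = karatsuba(x, z)                      # xz
    r = karatsuba(w + x, y + z)              # wy + (wz + xy) + xz
    return 10 ** (2 * s) * p + 10 ** s * (r - p - q) + q

print(karatsuba(1234, 5678), 1234 * 5678)    # both 7006652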

Matrix Multiplication
A = [a_ij], B = [b_ij] are n×n matrices; a_11, a_12, etc. are n/2 × n/2 submatrices
M = AB
–where m_11 = a_11·b_11 + a_12·b_21, etc.
–Evaluation requires 8 block multiplies, 4 block adds
T(n) = 8T(n/2) + O(n²) ⇒ Θ(n³)
Strassen:
–p_1 = (a_21 + a_22 − a_11)(b_22 − b_12 + b_11)
–p_2 = a_11·b_11
–p_3 = a_12·b_21
–p_4 = (a_11 − a_21)(b_22 − b_12)
–p_5 = (a_21 + a_22)(b_12 − b_11)
–p_6 = (a_12 − a_21 + a_11 − a_22)·b_22
–p_7 = a_22·(b_11 + b_22 − b_12 − b_21)

Strassen's Matrix Multiplication
p_1 = (a_21 + a_22 − a_11)(b_22 − b_12 + b_11)
p_2 = a_11·b_11
p_3 = a_12·b_21
p_4 = (a_11 − a_21)(b_22 − b_12)
p_5 = (a_21 + a_22)(b_12 − b_11)
p_6 = (a_12 − a_21 + a_11 − a_22)·b_22
p_7 = a_22·(b_11 + b_22 − b_12 − b_21)
AB_11 = p_2 + p_3
AB_12 = p_1 + p_2 + p_5 + p_6
AB_21 = p_1 + p_2 + p_4 − p_7
AB_22 = p_1 + p_2 + p_4 + p_5
T(n) = Θ(n^{log_2 7}) = Θ(n^{2.81})   // 7 multiplies, 24 adds
–Can get down to 15 adds by reusing common subexpressions
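A quick numerical check of the seven products and four combinations above, doing one level of blocking with numpy (a sketch; a full Strassen implementation would recurse on the block products):

import numpy as np

def strassen_once(A, B):
    """One level of the 7-product scheme on an n x n matrix, n even."""
    n = A.shape[0] // 2
    a11, a12, a21, a22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
    b11, b12, b21, b22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]
    p1 = (a21 + a22 - a11) @ (b22 - b12 + b11)
    p2 = a11 @ b11
    p3 = a12 @ b21
    p4 = (a11 - a21) @ (b22 - b12)
    p5 = (a21 + a22) @ (b12 - b11)
    p6 = (a12 - a21 + a11 - a22) @ b22
    p7 = a22 @ (b11 + b22 - b12 - b21)
    c11 = p2 + p3
    c12 = p1 + p2 + p5 + p6
    c21 = p1 + p2 + p4 - p7
    c22 = p1 + p2 + p4 + p5
    return np.block([[c11, c12], [c21, c22]])

A = np.random.randint(-9, 10, (8, 8))
B = np.random.randint(-9, 10, (8, 8))
print(np.array_equal(strassen_once(A, B), A @ B))   # True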