Analysis of Algorithms CS 477/677 Midterm Exam Review Instructor: George Bebis

2 General Advice for Study
Understand how the algorithms work
– Work through the examples we did in class
– "Narrate" for yourselves the main steps of each algorithm in a few sentences
Know when, and for what problems, each algorithm is applicable
Do not memorize algorithms

3 Asymptotic notations
A way to describe the behavior of functions in the limit
– Abstracts away low-order terms and constant factors
– How we indicate the running times of algorithms
– Describes the running time of an algorithm as n goes to ∞
O-notation: asymptotic "less than": f(n) "≤" g(n)
Ω-notation: asymptotic "greater than": f(n) "≥" g(n)
Θ-notation: asymptotic "equality": f(n) "=" g(n)

4 Exercise
Order the following functions in increasing order of their growth rates: n log n, log₂ n, n², 2ⁿ, n
Answer: log₂ n < n < n log n < n² < 2ⁿ

5 Running Time Analysis
Algorithm Loop1(n)
  p = 1
  for i = 1 to 2n
    p = p * i
Running time: O(n)

Algorithm Loop2(n)
  p = 1
  for i = 1 to n²
    p = p * i
Running time: O(n²)

6 Running Time Analysis
Algorithm Loop3(n)
  s = 0
  for i = 1 to n
    for j = 1 to i
      s = s + i
Running time: O(n²)
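For self-checking, here is a minimal runnable Python version of the loops on slides 5 and 6 (the names loop1/loop2/loop3 mirror the slides; the counting comments are my own):

def loop1(n):
    # slide 5: the loop body runs 2n times, so the running time is O(n)
    p = 1
    for i in range(1, 2 * n + 1):
        p = p * i
    return p

def loop2(n):
    # slide 5: the loop body runs n^2 times, so the running time is O(n^2)
    p = 1
    for i in range(1, n * n + 1):
        p = p * i
    return p

def loop3(n):
    # slide 6: the body runs 1 + 2 + ... + n = n(n+1)/2 times, so O(n^2)
    s = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            s = s + i
    return s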

7 Recurrences
Def.: A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs, and one or more base cases
Recurrences arise when an algorithm contains recursive calls to itself
Methods for solving recurrences
– Substitution method
– Iteration method
– Recursion-tree method
– Master method
Unless a method is explicitly stated, choose the simplest one that applies

8 Analyzing Divide-and-Conquer Algorithms
The recurrence is based on the three steps of the paradigm (T(n) = running time on a problem of size n):
– Divide the problem into a subproblems, each of size n/b: takes D(n)
– Conquer (solve) the subproblems: takes aT(n/b)
– Combine the solutions: takes C(n)

T(n) = Θ(1)                        if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)       otherwise

9 Master's method
Used for solving recurrences of the form
T(n) = aT(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) > 0
Compare f(n) with n^(log_b a):
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n)
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n))
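As a study aid, here is a hedged sketch (not from the slides) that applies the three cases when f(n) is a plain polynomial n^k, in which case the Case 3 regularity condition holds automatically; the name master_polynomial is my own:

import math

def master_polynomial(a, b, k):
    """Solve T(n) = a*T(n/b) + Theta(n^k), with a >= 1, b > 1, k >= 0."""
    e = math.log(a, b)                     # critical exponent log_b(a)
    if math.isclose(k, e):
        return f"Theta(n^{k:g} lg n)"      # Case 2: f(n) matches n^(log_b a)
    elif k < e:
        return f"Theta(n^{e:g})"           # Case 1: f(n) is polynomially smaller
    else:
        return f"Theta(n^{k:g})"           # Case 3: f(n) is polynomially larger

# Examples: master_polynomial(2, 2, 1) -> "Theta(n^1 lg n)"  (merge sort)
#           master_polynomial(9, 3, 1) -> "Theta(n^2)"
#           master_polynomial(1, 2, 0) -> "Theta(n^0 lg n)", i.e. Theta(lg n)  (binary search)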

10 Sorting
Insertion sort
– Design approach: incremental
– Sorts in place: Yes
– Best case: Θ(n)
– Worst case: Θ(n²)
Bubble sort
– Design approach: incremental
– Sorts in place: Yes
– Running time: Θ(n²)

11 Analysis of Insertion Sort
INSERTION-SORT(A)                                            cost   times
  for j ← 2 to n                                             c1     n
    do key ← A[j]                                            c2     n-1
       ▷ Insert A[j] into the sorted sequence A[1..j-1]      0      n-1
       i ← j - 1                                             c4     n-1
       while i > 0 and A[i] > key                            c5     Σ_{j=2..n} t_j
         do A[i+1] ← A[i]                                    c6     Σ_{j=2..n} (t_j - 1)
            i ← i - 1                                        c7     Σ_{j=2..n} (t_j - 1)
       A[i+1] ← key                                          c8     n-1
In the worst case: ≈ n²/2 comparisons and ≈ n²/2 exchanges
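A runnable Python version of the same algorithm, for checking your hand-traces (a sketch; 0-based indexing instead of the slide's 1-based pseudocode):

def insertion_sort(A):
    # Best case Theta(n) on sorted input, worst case Theta(n^2) on reverse-sorted input.
    for j in range(1, len(A)):
        key = A[j]                       # A[0..j-1] is already sorted
        i = j - 1
        while i >= 0 and A[i] > key:     # shift elements larger than key to the right
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return A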

12 Analysis of Bubble-Sort
BUBBLESORT(A)
  for i ← 1 to length[A]
    do for j ← length[A] downto i + 1
         do if A[j] < A[j-1]
              then exchange A[j] ↔ A[j-1]

T(n) = c1(n+1) + c2 Σ_{i=1..n} (n-i+1) + c3 Σ_{i=1..n} (n-i) + c4 Σ_{i=1..n} (n-i)
     = Θ(n) + (c2 + c3 + c4) Σ_{i=1..n} (n-i)
     = Θ(n²)
Comparisons: ≈ n²/2
Exchanges: ≈ n²/2 (worst case)
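The corresponding Python sketch (again 0-based; the behavior matches the pseudocode above):

def bubble_sort(A):
    # Theta(n^2) comparisons regardless of the input order.
    n = len(A)
    for i in range(n):
        for j in range(n - 1, i, -1):    # bubble the smallest of A[i..n-1] down to position i
            if A[j] < A[j - 1]:
                A[j], A[j - 1] = A[j - 1], A[j]
    return A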

13 Sorting
Selection sort
– Design approach: incremental
– Sorts in place: Yes
– Running time: Θ(n²)
Merge sort
– Design approach: divide and conquer
– Sorts in place: No
– Running time: Θ(n lg n)

14 Analysis of Selection Sort
SELECTION-SORT(A)                               cost   times
  n ← length[A]                                 c1     1
  for j ← 1 to n - 1                            c2     n
    do smallest ← j                             c3     n-1
       for i ← j + 1 to n                       c4     Σ_{j=1..n-1} (n-j+1)
         do if A[i] < A[smallest]               c5     Σ_{j=1..n-1} (n-j)
              then smallest ← i                 c6     Σ_{j=1..n-1} (n-j)
       exchange A[j] ↔ A[smallest]              c7     n-1
≈ n²/2 comparisons and ≈ n exchanges
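A Python sketch of the same idea (0-based indexing):

def selection_sort(A):
    # Always about n^2/2 comparisons, but at most n-1 exchanges.
    n = len(A)
    for j in range(n - 1):
        smallest = j
        for i in range(j + 1, n):        # find the minimum of A[j..n-1]
            if A[i] < A[smallest]:
                smallest = i
        A[j], A[smallest] = A[smallest], A[j]
    return A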

15 MERGE-SORT Running Time
Divide:
– compute q as the average of p and r: D(n) = Θ(1)
Conquer:
– recursively solve 2 subproblems, each of size n/2 ⇒ 2T(n/2)
Combine:
– MERGE on an n-element subarray takes Θ(n) time ⇒ C(n) = Θ(n)

T(n) = Θ(1)                 if n = 1
T(n) = 2T(n/2) + Θ(n)       if n > 1
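A compact Python sketch showing the three steps (it returns a new list rather than sorting in place, consistent with merge sort not being an in-place sort):

def merge_sort(A):
    # T(n) = 2T(n/2) + Theta(n) = Theta(n lg n)
    if len(A) <= 1:
        return A
    q = len(A) // 2                                      # Divide: Theta(1)
    left, right = merge_sort(A[:q]), merge_sort(A[q:])   # Conquer: 2T(n/2)
    merged, i, j = [], 0, 0                              # Combine (MERGE): Theta(n)
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]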

16 Quicksort
Idea: Partition the array A[p..r] into 2 subarrays A[p..q] and A[q+1..r], such that each element of A[p..q] is smaller than or equal to each element in A[q+1..r]; then sort the subarrays recursively
Design approach: divide and conquer
Sorts in place: Yes
Best case: Θ(n lg n)
Worst case: Θ(n²)
Partition
– Running time: Θ(n)
Randomized Quicksort
– Running time: Θ(n lg n) on average, Θ(n²) in the worst case
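A Python sketch of quicksort with a Hoare-style partition, which matches the description above (it returns q such that every element of A[p..q] is ≤ every element of A[q+1..r]); this is one common formulation, not necessarily the exact version from the slides:

def partition(A, p, r):
    x = A[p]                         # pivot
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while A[j] > x:              # scan from the right for an element <= pivot
            j -= 1
        i += 1
        while A[i] < x:              # scan from the left for an element >= pivot
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            return j                 # A[p..j] <= A[j+1..r]

def quicksort(A, p=0, r=None):
    # Theta(n lg n) on average, Theta(n^2) in the worst case (e.g., already-sorted input).
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q)
        quicksort(A, q + 1, r)
    return A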

17 Analysis of Quicksort

18 The Heap Data Structure
Def.: A heap is a nearly complete binary tree with the following two properties:
– Structural property: all levels are full, except possibly the last one, which is filled from left to right
– Order (heap) property: for any node x, Parent(x) ≥ x (in a max-heap)

19 Array Representation of Heaps
A heap can be stored as an array A:
– Root of the tree is A[1]
– Parent of A[i] is A[⌊i/2⌋]
– Left child of A[i] is A[2i]
– Right child of A[i] is A[2i + 1]
– heap-size[A] ≤ length[A]
The elements in the subarray A[(⌊n/2⌋+1)..n] are leaves
The root is the max/min element of the heap
A heap is a binary tree that is filled in level order
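The index arithmetic as runnable Python helpers (1-based, as on the slide; the 0-based equivalents used with Python lists are noted in the comments):

def parent(i):
    return i // 2        # 0-based: (i - 1) // 2

def left(i):
    return 2 * i         # 0-based: 2 * i + 1

def right(i):
    return 2 * i + 1     # 0-based: 2 * i + 2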

20 Operations on Heaps
(useful for sorting and priority queues)
– MAX-HEAPIFY: O(lg n)
– BUILD-MAX-HEAP: O(n)
– HEAP-SORT: O(n lg n)
– MAX-HEAP-INSERT: O(lg n)
– HEAP-EXTRACT-MAX: O(lg n)
– HEAP-INCREASE-KEY: O(lg n)
– HEAP-MAXIMUM: O(1)
You should be able to show how these algorithms perform on a given heap, and tell their running time
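For reference, a self-contained Python sketch (0-based indexing) of the two operations most exam questions build on, MAX-HEAPIFY and BUILD-MAX-HEAP:

def max_heapify(A, i, heap_size):
    # Float A[i] down until the subtree rooted at i is a max-heap: O(lg n).
    # Assumes the subtrees rooted at the children of i are already max-heaps.
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A):
    # Calls max_heapify on every non-leaf node, bottom-up: O(n) overall.
    for i in range(len(A) // 2 - 1, -1, -1):
        max_heapify(A, i, len(A))
    return A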

21 Lower Bound for Comparison Sorts
Theorem: Any comparison sort algorithm requires Ω(n lg n) comparisons in the worst case.
Proof idea: Model the algorithm as a decision tree of height h. How many leaves does the tree have?
– At least n! (each of the n! permutations of the input appears as some leaf)
– At most 2^h
⇒ n! ≤ 2^h ⇒ h ≥ lg(n!) = Θ(n lg n)
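The final step uses lg(n!) = Θ(n lg n); a short justification (my own filling-in, not spelled out on the slide):

\lg(n!) = \sum_{k=1}^{n} \lg k \;\ge\; \sum_{k=\lceil n/2 \rceil}^{n} \lg k \;\ge\; \frac{n}{2}\,\lg\frac{n}{2} = \Omega(n \lg n),
\qquad
\lg(n!) \le \sum_{k=1}^{n} \lg n = n \lg n = O(n \lg n).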

22 Linear Time Sorting
We can achieve a better running time for sorting if we can make certain assumptions about the input data:
– Counting sort: each of the n input elements is an integer in the range [0..r], with r = O(n)

23 Analysis of Counting Sort
COUNTING-SORT(A, B, n, r)
  for i ← 0 to r                              Θ(r)
    do C[i] ← 0
  for j ← 1 to n                              Θ(n)
    do C[A[j]] ← C[A[j]] + 1
  ▷ C[i] now contains the number of elements equal to i
  for i ← 1 to r                              Θ(r)
    do C[i] ← C[i] + C[i-1]
  ▷ C[i] now contains the number of elements ≤ i
  for j ← n downto 1                          Θ(n)
    do B[C[A[j]]] ← A[j]
       C[A[j]] ← C[A[j]] - 1
Overall time: Θ(n + r)
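A runnable Python sketch of the same procedure (0-based; stability comes from the right-to-left final pass):

def counting_sort(A, r):
    # Stable sort for integers in [0..r]; Theta(n + r) time, Theta(n + r) extra space.
    n = len(A)
    C = [0] * (r + 1)
    for a in A:                       # Theta(n): C[i] = number of elements equal to i
        C[a] += 1
    for i in range(1, r + 1):         # Theta(r): C[i] = number of elements <= i
        C[i] += C[i - 1]
    B = [0] * n
    for j in range(n - 1, -1, -1):    # Theta(n): place each element at its final position
        C[A[j]] -= 1
        B[C[A[j]]] = A[j]
    return B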

24 RADIX-SORT
RADIX-SORT(A, d)
  for i ← 1 to d
    do use a stable sort to sort array A on digit i
▷ digit 1 is the lowest-order digit, digit d is the highest-order digit
Running time: Θ(d(n + k)), where each digit takes k possible values and the stable sort runs in Θ(n + k)
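A Python sketch for non-negative integers with d decimal digits; it uses per-digit buckets as the stable sort, which the slide leaves unspecified:

def radix_sort(A, d):
    # Theta(d(n + k)) with k = 10 buckets per decimal digit.
    for digit in range(d):                               # digit 0 is the lowest-order digit
        buckets = [[] for _ in range(10)]
        for a in A:
            buckets[(a // 10 ** digit) % 10].append(a)   # stable: preserves current order
        A = [a for bucket in buckets for a in bucket]
    return A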

25 Analysis of Bucket Sort
BUCKET-SORT(A, n)
  for i ← 1 to n                                                   Θ(n)
    do insert A[i] into list B[⌊n·A[i]⌋]
  for i ← 0 to n - 1                                               Θ(n) expected
    do sort list B[i] with quicksort
  concatenate lists B[0], B[1], ..., B[n-1] together in order      Θ(n)
  return the concatenated lists
Overall expected time: Θ(n), assuming the inputs are uniformly distributed over [0, 1)
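A Python sketch under the same uniform-input assumption; Python's built-in sorted() stands in for the per-bucket quicksort on the slide:

def bucket_sort(A):
    # Expected Theta(n) time when the n inputs are uniform over [0, 1).
    n = len(A)
    B = [[] for _ in range(n)]
    for x in A:
        B[int(n * x)].append(x)      # element x goes into bucket floor(n*x)
    out = []
    for bucket in B:                 # each bucket holds O(1) elements in expectation
        out.extend(sorted(bucket))
    return out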