September 29, 2002. Algorithms and Data Structures, Lecture V. Simonas Šaltenis, Aalborg University.

This Lecture
- Sorting algorithms
  - Quicksort: a popular algorithm, very fast on average
  - Heapsort
- Heap data structure

Why Sorting?
"When in doubt, sort" is one of the principles of algorithm design. Sorting is used as a subroutine in many algorithms:
- Searching in databases: we can do binary search on sorted data
- Element uniqueness, duplicate elimination
- A large number of computer graphics and computational geometry problems, e.g., closest pair

Why Sorting? (2)
- A large number of sorting algorithms have been developed, representing different algorithm design techniques
- The lower bound for sorting, Ω(n log n), is used to prove lower bounds for other problems

Sorting Algorithms so far
- Insertion sort, selection sort, bubble sort: worst-case running time Θ(n²); in-place
- Merge sort: worst-case running time Θ(n log n), but requires additional memory Θ(n)

Quick Sort Characteristics
- Like insertion sort, but unlike merge sort, sorts in-place, i.e., does not require an additional array
- Very practical: average sort performance O(n log n) (with small constant factors), but worst case O(n²)

Quick Sort: the Principle
To understand quick-sort, let's look at a high-level description of the algorithm. It is a divide-and-conquer algorithm:
- Divide: partition the array into two subarrays such that each element in the lower part is <= each element in the higher part
- Conquer: recursively sort the two subarrays
- Combine: trivial, since sorting is done in place

Partitioning
A linear-time partitioning procedure (the pivot must be A[p], the first element of the subarray, so that the returned index j is always less than r and the recursion below terminates):

Partition(A,p,r)
01  x ← A[p]
02  i ← p-1
03  j ← r+1
04  while TRUE
05      repeat j ← j-1
06      until A[j] ≤ x
07      repeat i ← i+1
08      until A[i] ≥ x
09      if i < j
10      then exchange A[i] ↔ A[j]
11      else return j

(The slide traces this procedure on an example array with pivot x = 10.)

Quick Sort Algorithm

Quicksort(A,p,r)
01  if p < r
02  then q ← Partition(A,p,r)
03       Quicksort(A,p,q)
04       Quicksort(A,q+1,r)

Initial call: Quicksort(A, 1, length[A])
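The pseudocode above can be sketched in Python (0-based indexing; the pivot is taken as the first element of the subarray so that Hoare's partition always returns j < r):

```python
def partition(a, p, r):
    """Hoare partition around the pivot x = a[p].

    Rearranges a[p..r] so that every element of a[p..j] is <= every
    element of a[j+1..r], and returns j (always j < r)."""
    x = a[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while a[j] > x:
            j -= 1
        i += 1
        while a[i] < x:
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]   # exchange A[i] <-> A[j]
        else:
            return j

def quicksort(a, p=0, r=None):
    """Sort a[p..r] in place."""
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)
        quicksort(a, p, q)
        quicksort(a, q + 1, r)
```

Calling quicksort(xs) sorts the list xs in place, with no auxiliary array.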

Analysis of Quicksort
- Assume that all input elements are distinct
- The running time depends on the distribution of splits

Best Case
If we are lucky, Partition splits the array evenly:
T(n) = 2T(n/2) + Θ(n) = Θ(n log n)

Worst Case
What is the worst case? One side of the partition has only one element:
T(n) = T(1) + T(n-1) + Θ(n) = T(n-1) + Θ(n) = Θ(n²)

Worst Case (2)

Worst Case (3)
When does the worst case appear?
- input is sorted
- input is reverse sorted
This is the same recurrence as for the worst case of insertion sort. However, sorted input yields the best case for insertion sort!

Analysis of Quicksort
Suppose the split is always 1/10 : 9/10:
T(n) = T(n/10) + T(9n/10) + Θ(n) = Θ(n log n)
Any split of constant proportionality yields an Θ(n log n) running time.

An Average Case Scenario
Suppose we alternate lucky and unlucky cases to get an average behavior. An unlucky split of n into 1 and n-1 costs Θ(n) and merely hands an (n-1)-element subproblem to a lucky split into two halves of (n-1)/2 elements each:
L(n) = 2U(n/2) + Θ(n)   (lucky)
U(n) = L(n-1) + Θ(n)    (unlucky)
Solving: L(n) = 2L(n/2 - 1) + Θ(n) = Θ(n log n)

An Average Case Scenario (2)
How can we make sure that we are usually lucky?
- Partition around the "middle" (n/2-th) element?
- Partition around a random element (works well in practice)
Randomized algorithm:
- running time is independent of the input ordering
- no specific input triggers worst-case behavior
- the worst case is determined only by the output of the random-number generator

Randomized Quicksort
- Assume all elements are distinct
- Partition around a random element. Consequently, all splits (1:n-1, 2:n-2, ..., n-1:1) are equally likely, each with probability 1/n
- Randomization is a general tool to improve algorithms with a bad worst case but good average-case complexity

Randomized Quicksort (2)

Randomized-Partition(A,p,r)
01  i ← Random(p,r)
02  exchange A[p] ↔ A[i]    ▷ move the random pivot to position p, where Partition reads it
03  return Partition(A,p,r)

Randomized-Quicksort(A,p,r)
01  if p < r then
02      q ← Randomized-Partition(A,p,r)
03      Randomized-Quicksort(A,p,q)
04      Randomized-Quicksort(A,q+1,r)
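A runnable sketch of the randomized variant, again with 0-based indexing; the random pivot is exchanged into the front position, where the deterministic Hoare partition reads it:

```python
import random

def hoare_partition(a, p, r):
    # deterministic partition around x = a[p], as on the earlier slide
    x = a[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while a[j] > x:
            j -= 1
        i += 1
        while a[i] < x:
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]
        else:
            return j

def randomized_quicksort(a, p=0, r=None):
    """Sort a[p..r] in place, partitioning around a random element."""
    if r is None:
        r = len(a) - 1
    if p < r:
        k = random.randint(p, r)     # Random(p, r), bounds inclusive
        a[p], a[k] = a[k], a[p]      # move random pivot to position p
        q = hoare_partition(a, p, r)
        randomized_quicksort(a, p, q)
        randomized_quicksort(a, q + 1, r)
```

Whatever the random choices, the output is the same sorted array; only the running time varies, and no fixed input can force the worst case.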

Selection Sort

Selection-Sort(A[1..n]):
    for i ← n downto 2
        A: find the largest element among A[1..i]
        B: exchange it with A[i]

Step A takes Θ(n) and step B takes Θ(1): Θ(n²) in total.
Idea for improvement: use a smart data structure to do both A and B in Θ(1), spending only O(lg n) time in each iteration on reorganizing the structure, for a total running time of O(n log n).
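The selection-sort pseudocode above, as a minimal Python sketch (0-based indexing):

```python
def selection_sort(a):
    """Repeatedly move the largest element of a[0..i] into position i."""
    for i in range(len(a) - 1, 0, -1):
        # step A: find the index of the largest element among a[0..i]
        largest = max(range(i + 1), key=lambda k: a[k])
        # step B: exchange it with a[i]
        a[largest], a[i] = a[i], a[largest]
```

Step A scans Θ(n) elements each iteration, which is exactly the cost a heap will later reduce to O(lg n).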

Heap Sort
Binary heap data structure:
- an array that can be viewed as a nearly complete binary tree: all levels, except possibly the lowest one, are completely filled
- the key in the root is greater than or equal to the keys of its children, and the left and right subtrees are again binary heaps
- two attributes: length[A] and heap-size[A]

Heap Sort (3)
Parent(i): return ⌊i/2⌋
Left(i): return 2i
Right(i): return 2i+1

Heap property: A[Parent(i)] ≥ A[i]

Heap Sort (4)
- Notice the implicit tree links: the children of node i are 2i and 2i+1
- Why is this useful? In a binary representation, multiplication/division by two is a left/right shift
- Adding 1 can be done by setting the lowest bit
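The shift tricks can be written directly (1-based heap indices as on the slides; the function names parent, left, right mirror the slide's Parent/Left/Right):

```python
def parent(i):
    return i >> 1          # floor(i/2): right shift by one bit

def left(i):
    return i << 1          # 2*i: left shift by one bit

def right(i):
    return (i << 1) | 1    # 2*i + 1: left shift, then set the lowest bit
```

Both siblings 2i and 2i+1 thus share the same parent, since the right shift discards the lowest bit.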

Heapify
- i is an index into the array A
- the binary trees rooted at Left(i) and Right(i) are heaps
- but A[i] might be smaller than its children, violating the heap property
- the method Heapify makes A a heap once more by moving A[i] down the heap until the heap property is satisfied again

Heapify (2)
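The Heapify pseudocode itself did not survive transcription; the following is a sketch of the standard recursive Max-Heapify it describes, using 0-based indexing (so the children of i are 2i+1 and 2i+2):

```python
def max_heapify(a, i, heap_size):
    """Float a[i] down until the subtree rooted at i satisfies the
    max-heap property, assuming the subtrees rooted at its children
    already do."""
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < heap_size and a[l] > a[largest]:
        largest = l
    if r < heap_size and a[r] > a[largest]:
        largest = r
    if largest != i:
        # a[i] is smaller than a child: swap and continue down that subtree
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, heap_size)
```

Each recursive step descends one level, so the running time is bounded by the height of the subtree, O(lg n).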

Heapify Example

Heapify: Running Time
The running time of Heapify on a subtree of size n rooted at node i is:
- Θ(1) to determine the relationship between the elements, plus
- the time to run Heapify on a subtree rooted at one of the children of i, where 2n/3 is the worst-case size of this subtree
This gives T(n) ≤ T(2n/3) + Θ(1), and hence T(n) = O(lg n).
Alternatively: the running time on a node of height h is O(h).

Building a Heap
- Convert an array A[1..n], where n = length[A], into a heap
- Notice that the elements in the subarray A[(⌊n/2⌋+1)..n] are already 1-element heaps to begin with!

Building a Heap
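The Build-Heap pseudocode is likewise missing from the transcript; a sketch of the standard bottom-up construction (0-based indexing; sift_down is an iterative version of Heapify):

```python
def build_max_heap(a):
    """Turn a into a max-heap in place. The elements a[n//2 .. n-1] are
    leaves, i.e., 1-element heaps, so it suffices to sift down each
    internal node, from n//2 - 1 up to the root."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)

def sift_down(a, i, n):
    # iterative Max-Heapify: children of i are 2i+1 and 2i+2
    while True:
        l, r = 2 * i + 1, 2 * i + 2
        largest = i
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest
```

The loop order matters: when sift_down runs at node i, both of i's subtrees are already heaps, which is exactly Heapify's precondition.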

Building a Heap: Analysis
- Correctness: induction on i; all trees rooted at m > i are heaps
- Running time: n calls to Heapify = n · O(lg n) = O(n lg n)
- This is good enough for an O(n lg n) bound on Heapsort, but sometimes we build heaps for other reasons, so it would be nice to have a tight bound
- Intuition: most of the time Heapify works on heaps much smaller than n elements

Building a Heap: Analysis (2)
Definitions:
- height of a node: longest path from the node to a leaf
- height of a tree: height of the root
- time to Heapify a subtree rooted at i = O(height of that subtree)
- assume n = 2^k - 1 (a complete binary tree, k = ⌊lg n⌋)

Building a Heap: Analysis (3)
How? By summing the Heapify cost over node heights and using the "trick" that the series Σ h/2^h converges (to 2). Therefore the Build-Heap time is O(n).
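The slide's equations were lost in transcription; in the standard analysis the bound can be reconstructed as follows. There are at most ⌈n/2^(h+1)⌉ nodes of height h, each costing O(h) to Heapify:

```latex
\sum_{h=0}^{\lfloor \lg n \rfloor}
  \left\lceil \frac{n}{2^{h+1}} \right\rceil O(h)
  \;=\; O\!\Bigl( n \sum_{h=0}^{\infty} \frac{h}{2^{h}} \Bigr)
  \;=\; O(n),
\quad\text{since}\quad
\sum_{h=0}^{\infty} h x^{h} = \frac{x}{(1-x)^{2}}
\;\Rightarrow\;
\sum_{h=0}^{\infty} \frac{h}{2^{h}} = \frac{1/2}{(1/2)^{2}} = 2 .
```

The closed form for Σ h x^h follows from differentiating the geometric series Σ x^h = 1/(1-x) and multiplying by x.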

Heap Sort
The total running time of heap sort is O(n lg n) plus the Build-Heap(A) time, which is O(n).

Heap Sort
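The Heapsort pseudocode is missing from the transcript; a self-contained Python sketch of the standard algorithm (build a max-heap, then repeatedly move the maximum to the end and shrink the heap):

```python
def heapsort(a):
    """In-place heapsort: O(n) Build-Heap, then n-1 O(lg n) extractions."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # Build-Heap
        max_heapify(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # move current max behind the heap
        max_heapify(a, 0, end)            # restore heap on a[0..end-1]

def max_heapify(a, i, size):
    # sift a[i] down within a[0..size-1]; 0-based children are 2i+1, 2i+2
    while True:
        l, r = 2 * i + 1, 2 * i + 2
        largest = i
        if l < size and a[l] > a[largest]:
            largest = l
        if r < size and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest
```

Like the pseudocode on the earlier slides, this sorts in place; the sorted suffix grows from the right as the heap shrinks from the left.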

Heap Sort: Summary
- Heap sort uses a heap data structure to improve selection sort and make the running time asymptotically optimal
- Running time is O(n log n): like merge sort, but unlike selection, insertion, or bubble sorts
- Sorts in place: like insertion, selection, or bubble sorts, but unlike merge sort

Next Week
ADTs and Data Structures:
- definition of ADTs
- elementary data structures