1
Outline
Priority Queues
Binary Heaps
Randomized Mergeable Heaps
2
Priority Queues
A priority queue stores a collection of Comparable elements.
Operations:
add(x) : add x to the collection
findMin() : return the smallest element in the collection
deleteMin() : remove the smallest element in the collection
In the JCF Queue interface, these are:
add(x)/offer(x) : add(x)
peek() : findMin()
remove()/poll() : deleteMin()
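A quick illustration of this mapping using java.util.PriorityQueue (a minimal sketch; the values are made up):

import java.util.PriorityQueue;
import java.util.Queue;

public class PriorityQueueDemo {
    public static void main(String[] args) {
        Queue<Integer> pq = new PriorityQueue<>();
        pq.offer(4);                        // add(x)
        pq.offer(1);
        pq.offer(3);
        System.out.println(pq.peek());      // findMin() -> 1
        System.out.println(pq.poll());      // deleteMin() -> 1
        System.out.println(pq.peek());      // new minimum -> 3
    }
}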
3
Priority Queues: Applications
We can sort a set of elements using a priority queue. How? We just insert them all into a priority queue and then remove them one by one; they come out in sorted order.

public static <T extends Comparable<T>> void sort(T[] a) {
    Queue<T> pq = new PriorityQueue<T>();
    for (T x : a)
        pq.add(x);
    for (int i = 0; i < a.length; i++)
        a[i] = pq.remove();
}
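For example (with made-up values):

Integer[] a = {5, 1, 4, 2, 3};
sort(a);    // a is now {1, 2, 3, 4, 5}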
4
Priority Queues: Applications In simulations, priority queues store events ordered by time of occurrence
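As a sketch of the idea (the Event class and its fields are made up for illustration), a discrete-event simulation repeatedly pulls the earliest pending event off a priority queue and may schedule new, later events while handling it:

import java.util.PriorityQueue;

public class Simulation {
    // Hypothetical event type, ordered by time of occurrence
    static class Event implements Comparable<Event> {
        final double time;
        final String name;
        Event(double time, String name) { this.time = time; this.name = name; }
        public int compareTo(Event other) { return Double.compare(time, other.time); }
    }

    public static void main(String[] args) {
        PriorityQueue<Event> agenda = new PriorityQueue<>();
        agenda.add(new Event(3.0, "arrival"));
        agenda.add(new Event(1.5, "departure"));
        while (!agenda.isEmpty()) {
            Event e = agenda.poll();                    // earliest pending event
            System.out.println(e.time + ": " + e.name);
            // handling e may agenda.add(...) new, later events here
        }
    }
}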
9
Priority Queues
Which structure should we use to implement a priority queue?
Priority queues can be implemented as (binary) heaps.
A binary heap is a binary tree where each node u stores a value u.prio.
Heap property: u.prio < u.left.prio and u.prio < u.right.prio
10
Priority Queues
A binary heap is a binary tree where each node u stores a value u.prio.
Heap property: u.prio < u.left.prio and u.prio < u.right.prio
[Figure: example binary heap with 0.1 at the root and priorities increasing along every root-to-leaf path]
11
Priority Queues
A complete heap uses a complete binary tree:
[Figure: the example heap stored as a complete binary tree]
A complete binary tree of height d has up to 2^(d+1) - 1 nodes.
To store n nodes, we require 2^(d+1) - 1 ≥ n, so
2^(d+1) ≥ n + 1
d + 1 ≥ log₂(n + 1)
d ≥ log₂(n + 1) - 1 = O(log n)
A complete heap of size n has height O(log n).
12
Priority Queues
How can we map the nodes of a complete binary tree to the positions of an array?
Eytzinger method:
a[0] is the root
the left child of a[i] is a[2*i+1]
the right child of a[i] is a[2*i+2]
the parent of a[i] is a[(i-1)/2]
13
Eytzinger method
[Figure: a complete binary tree with nodes numbered 0-14 in breadth-first order and the corresponding array positions a[0..14]]
14
Eytzinger method
[Figure: the same tree, highlighting the root i = 0 with left child a[2*0+1] = a[1] and right child a[2*0+2] = a[2]]
15
Eytzinger method
[Figure: the tree relabelled a-o, highlighting node i = 5 with left child a[2*5+1] = a[11] and right child a[2*5+2] = a[12]]
16
Implicit Binary Heap
An implicit binary heap represents a complete binary heap in an array a using the Eytzinger method.
[Figure: the example heap drawn both as a tree and as the array a, with the minimum 0.1 stored at a[0]]
No extra pointers, all the data lives in a.
17
Implicit Binary Heap

public class BinaryHeap<T extends Comparable<T>> extends AbstractQueue<T> {
    protected T[] a;
    protected int n;
    protected int left(int i)   { return 2*i + 1; }
    protected int right(int i)  { return 2*i + 2; }
    protected int parent(int i) { return (i - 1)/2; }
    ...
}
18
Finding the minimum
How do we find the minimum value in a heap?
Finding the minimum value in a heap is easy: it is stored at the root, which is a[0].

public T peek() {
    return a[0];
}

Runs in O(1) time.
19
Inserting elements
How do we insert an element into a heap?
To insert x into an (implicit binary) heap, we
1. add x as a leaf
2. while x is smaller than parent(x), swap x with its parent
Runs in O(log n) time.
20
Inserting elements
1. add x as a leaf
[Figure: 0.2 is appended as the next leaf of the example heap]
21
Inserting elements
1. add x as a leaf
2. while x is smaller than parent(x), swap x with its parent
[Figure: 0.2 is swapped with its parent 0.8]
22
Inserting elements
1. add x as a leaf
2. while x is smaller than parent(x), swap x with its parent
[Figure: 0.2 is swapped with its parent 0.3 and the heap property is restored]
23
Inserting elements

public boolean offer(T x) {
    if (n + 1 > a.length) grow();
    a[n++] = x;
    bubbleUp(n - 1);
    return true;
}

protected void bubbleUp(int i) {
    int p = parent(i);
    while (i > 0 && a[i].compareTo(a[p]) < 0) {
        swap(i, p);
        i = p;
        p = parent(i);
    }
}
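The grow() helper is not shown on these slides; a minimal sketch, assuming it simply doubles the backing array (using java.util.Arrays.copyOf), could be:

protected void grow() {
    // Double the capacity of the backing array (at least 1) and copy the elements over
    a = java.util.Arrays.copyOf(a, Math.max(2 * a.length, 1));
}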
24
Removing the minimum
How can we remove the minimum value from a heap?
To delete the minimum from an (implicit binary) heap, we
1. copy x = a[n-1] into a[0]
2. while x is larger than either of its children, swap x with the smaller of its children
Runs in O(log n) time.
25
Removing the minimum
1. copy x = a[n-1] into a[0]
[Figure: the last leaf of the heap is copied into the root, replacing the old minimum]
26
Removing the minimum
1. copy x = a[n-1] into a[0]
2. while x is larger than either of its children, swap x with the smaller of its children
[Figure: 0.8, now at the root, is swapped with its smaller child 0.2]
27
Removing the minimum
1. copy x = a[n-1] into a[0]
2. while x is larger than either of its children, swap x with the smaller of its children
[Figure: 0.8 is swapped with its smaller child 0.3 and the heap property is restored]
28
Removing the minimum

public T poll() {
    T x = a[0];
    swap(0, --n);
    trickleDown(0);
    return x;
}
29
protected void trickleDown(int i) {
    do {
        int j = -1;
        int r = right(i);
        if (r < n && a[r].compareTo(a[i]) < 0) {
            int l = left(i);
            if (a[l].compareTo(a[r]) < 0) {
                j = l;
            } else {
                j = r;
            }
        } else {
            int l = left(i);
            if (l < n && a[l].compareTo(a[i]) < 0) {
                j = l;
            }
        }
        if (j >= 0) swap(i, j);
        i = j;
    } while (i >= 0);
}
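The swap() helper used by poll() and trickleDown() is likewise not shown; a minimal sketch inside BinaryHeap:

protected void swap(int i, int j) {
    // Exchange the elements at array positions i and j
    T tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}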
30
Summary
Theorem: An (implicit) binary heap supports the operations add(x) and deleteMin() in O(log n) (amortized) time and the findMin() operation in O(1) time.
31
Building a heap
How can we build a heap from an unsorted array a? How quickly can we make a into an implicit binary heap?
We can insert the elements a[0],...,a[n-1] one at a time.
This takes O(1 + log 1) + O(1 + log 2) + ... + O(1 + log n) = O(n log n) time.

public BinaryHeap(T[] ia) {
    a = ia;
    n = 1;                        // a[0] by itself is already a heap
    for (int i = 1; i < a.length; i++)
        add(a[i]);                // places a[i] at position n++ (= i) and bubbles it up
}

Can we do better?
32
Building a heap
We can do it in O(n) time. How? By working bottom up:
First build n/2 heaps of size 1
Next build n/4 heaps of size 3
Next build n/8 heaps of size 7
...
Finally build 1 heap of size n

public BinaryHeap(T[] ia) {
    a = ia;
    n = a.length;
    for (int i = n/2; i >= 0; i--) {
        trickleDown(i);
    }
}
33
Building a heap in linear time (analysis)
We call trickleDown(i) on roughly n/2^j nodes i that are roots of subheaps of size 2^j - 1.
Each such call takes O(log(2^j - 1)) = O(j) time.
The total running time is
(n/4)*O(1) + (n/8)*O(2) + (n/16)*O(3) + ... + 1*O(log n) = O(n)
34
Heapsort
The heapsort algorithm for sorting an array a of length n:
Make a into a heap (O(n) time)
Repeat n times: delete the minimum
Each deletion takes O(log n) time.
Runs in O(n log n) time.
Doesn't require any extra space (it does all the work inside the input array a); see the sketch below.

public T deleteMin() {
    swap(0, --n);
    trickleDown(0);
    return a[n];    // the deleted minimum now sits just past the end of the heap
}
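A minimal sketch of the full routine (my own method name, inside BinaryHeap, reusing trickleDown() and swap() from above; with a min-heap the repeated deletions leave the array in descending order, so a final reversal gives ascending order):

public void heapSort(T[] ia) {
    a = ia;
    n = a.length;
    // Build the heap bottom-up in O(n) time
    for (int i = n / 2; i >= 0; i--)
        trickleDown(i);
    // Repeatedly move the current minimum just past the end of the shrinking heap
    while (n > 1) {
        swap(0, --n);
        trickleDown(0);
    }
    // The array is now in descending order; reverse it for ascending order
    for (int i = 0, j = ia.length - 1; i < j; i++, j--) {
        T tmp = ia[i]; ia[i] = ia[j]; ia[j] = tmp;
    }
}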
35
Merging two heaps
We know how to add one element to a heap. What can we do if we want to add a whole other heap? In other words, how can we merge 2 heaps?
The obvious approach moves the elements of h2 into h1 one at a time:

while (h2.size() > 0) {
    T u = h2.peek();
    h2.deleteMin();
    h1.add(u);
}

The cost of deleteMin() and add() is O(log n), and we perform those operations n times, so the total cost is O(n log n).
36
Merging two heaps
Can we do better? Actually, we can do it in O(log n) expected time. How?
merge(h1,h2) can be defined recursively:
1. If either of h1 or h2 is nil, we just return the other one.
2. Otherwise, take the heap whose root has the larger minimum value and merge it with a randomly chosen child (left or right) of the other heap's root.
3. Continue the process recursively.
37
Merging two heaps
What is the complexity of this operation, and how can we reduce it?
In an implicit binary heap, merging would cost O(n): we would need to copy all the elements into a new array.
Using a pointer-based, heap-ordered binary tree structure we can reduce the cost to O(log n) expected time.
38
Merging two heaps

Node merge(Node h1, Node h2) {
    if (h1 == nil) return h2;
    if (h2 == nil) return h1;
    if (compare(h2.x, h1.x) < 0) return merge(h2, h1);
    // now we know h1.x <= h2.x
    if (rand.nextBoolean()) {
        h1.left = merge(h1.left, h2);
        h1.left.parent = h1;
    } else {
        h1.right = merge(h1.right, h2);
        h1.right.parent = h1;
    }
    return h1;
}
39
Merging two heaps
Why do we need randomness when merging the heaps? (The random-walk lemma below explains why it keeps the expected merge cost logarithmic.)
How can we define the add() and remove() operations on a heap-ordered binary tree?

public boolean add(T x) {
    Node u = newNode();
    u.x = x;
    r = merge(u, r);
    r.parent = nil;
    n++;
    return true;
}

public T remove() {
    T x = r.x;
    r = merge(r.left, r.right);
    if (r != nil) r.parent = nil;
    n--;
    return x;
}
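For completeness, findMin() on this structure just reads the root; a one-line sketch, assuming the same root reference r and sentinel nil used above:

public T peek() {
    return r == nil ? null : r.x;   // the minimum is stored at the root
}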
40
Analysis of merge(h1,h2)
Lemma: The expected length of a random walk from the root of a binary tree with n nodes (at each step, descend to a uniformly random child until falling off the tree) is at most log(n+1), where logs are base 2.
Proof by induction on n:
For n = 0 it holds, since log(0+1) = 0.
Suppose it holds for every n' < n. Let n1 be the size of the left subtree and n2 = n - n1 - 1 the size of the right subtree. Then
E[W] = 1 + log(n1+1)/2 + log(n2+1)/2
     ≤ 1 + log((n-1)/2 + 1)
     = 1 + log((n+1)/2)
     = log(n+1)
(the inequality uses the concavity of log).
41
Summary
Lemma: The expected running time of merge(h1,h2) is O(log n), where n = h1.size() + h2.size().
Theorem: A MeldableHeap implements the (priority) Queue interface and supports the operations add(x), remove() and merge(h1,h2) in O(log n) expected time per operation.
42
Recall Quicksort
quicksort(S):
- Pick a random element p from S
- Compare every element to p to get 3 sets
  S< = {x ∈ S : x < p}
  S= = {x ∈ S : x = p}
  S> = {x ∈ S : x > p}
- quicksort(S<)
- quicksort(S>)
[Figure: the example array partitioned around the pivot p = 5]
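A minimal sketch of this recursive scheme in Java (using extra lists for S<, S= and S> rather than the in-place partition shown on the next slide; the class and method names are my own):

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class QuickSortSketch {
    static final Random rand = new Random();

    static <T extends Comparable<T>> List<T> quicksort(List<T> s) {
        if (s.size() <= 1) return s;
        T p = s.get(rand.nextInt(s.size()));      // pick a random pivot p
        List<T> less = new ArrayList<>(), equal = new ArrayList<>(), greater = new ArrayList<>();
        for (T x : s) {                           // compare every element to p
            int c = x.compareTo(p);
            if (c < 0) less.add(x);
            else if (c == 0) equal.add(x);
            else greater.add(x);
        }
        List<T> result = new ArrayList<>(quicksort(less));   // recurse on S<
        result.addAll(equal);                                 // S= is already in place
        result.addAll(quicksort(greater));                    // recurse on S>
        return result;
    }
}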
43
Quicksort
[Figure: in-place partitioning around the pivot p = 5; indices i and j sweep toward each other, swapping elements that lie on the wrong side of the pivot, until they meet]
44
Quicksort
[Figure: the recursion continues on S< and S> until the whole array is sorted]
45
Recall Heapsort
The heapsort algorithm for sorting an array a of length n:
Make a into a heap (O(n) time)
Repeat n times: delete the minimum
[Figure: the example array, the heap built from it, and the final sorted array]
46
Recall Heapsort
[Figure: heapsort trace, repeatedly deleting the minimum from the shrinking heap]
47
[Figure: heapsort trace, continued until the array is fully sorted]
48
Mergesort
The mergesort algorithm for sorting an array a of length n:
Divide a into 2 parts a1 and a2
Recursively sort a1 and a2 with mergesort
Merge a1 and a2
[Figure: the example array split into two halves, each half sorted recursively, and the two sorted halves merged]
49
Merging process
[Figure: merging the two sorted halves with indices i0 and i1; the smaller of a1[i0] and a2[i1] is copied to the output and that index is advanced]
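A sketch of that merging step on two already-sorted arrays a1 and a2 (the index names i0 and i1 follow the slide; out is assumed to have length a1.length + a2.length):

static <T extends Comparable<T>> T[] merge(T[] a1, T[] a2, T[] out) {
    int i0 = 0, i1 = 0;
    for (int k = 0; k < out.length; k++) {
        if (i0 == a1.length)                    out[k] = a2[i1++];   // a1 exhausted
        else if (i1 == a2.length)               out[k] = a1[i0++];   // a2 exhausted
        else if (a1[i0].compareTo(a2[i1]) <= 0) out[k] = a1[i0++];   // a1[i0] <= a2[i1]
        else                                    out[k] = a2[i1++];   // a2[i1] < a1[i0]
    }
    return out;
}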
50
Analysis of Mergesort
The mergesort algorithm for sorting an array a of length n:
1. Divide a into 2 parts a1 and a2
2. Recursively sort a1 and a2 with mergesort
3. Merge a1 and a2
The merging process (step 3) costs O(n) time per level of recursion, and there are log n levels, so the total cost is O(n log n).
Theorem: The Mergesort algorithm can sort an array of n items in O(n log n) time.
51
Sorting Algorithms So far, we have seen 3 sorting algorithms: Quicksort: O(n log n) expected time Heapsort: O(n log n) time Mergesort: O(n log n) time Is there a faster sorting algorithm (maybe O(n) time)? Answer: No and yes
52
Comparison based Sorting Algorithms
The algorithms Quicksort, Heapsort and Mergesort sort by comparing elements.
Every comparison-based sorting algorithm takes Ω(n log n) time for some input.
A comparison tree is a full binary tree:
each internal node u is labelled with a pair of indices u.i and u.j
each leaf is labelled with a permutation of {0,..., n-1}
53
Comparison based Sorting Algorithms
[Figure: the comparison tree for sorting three elements a[0], a[1], a[2]; each internal node compares two elements, and each leaf is labelled with one of the 3! = 6 possible sorted orders]
54
Comparison based Sorting Algorithms
Lemma: Every comparison tree that sorts any input of length n has at least n! leaves.
Theorem: Every comparison tree that sorts any input of length n has height at least (n/2)log₂(n/2).
A binary tree with n! leaves has height at least log₂(n!). Since n! ≤ n^n, log₂(n!) ≤ log₂(n^n) = n log₂(n), and since n! ≥ (n/2)^(n/2), log₂(n!) ≥ (n/2)log₂(n/2); so the height is Θ(n log n).
55
Sorting in O(n) time In-class problem: Design an algorithm that takes an array a of n integers in the range {0,..., k-1} and sorts them in O(n + k) time
56
Counting sort

int[] countingSort(int[] a, int k) {
    int[] c = new int[k];
    for (int i = 0; i < a.length; i++)
        c[a[i]]++;
    for (int i = 1; i < k; i++)
        c[i] += c[i-1];
    int[] b = new int[a.length];
    for (int i = a.length - 1; i >= 0; i--)
        b[--c[a[i]]] = a[i];
    return b;
}
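A quick usage example (made-up values in the range {0,...,3}, so k = 4):

int[] a = {3, 1, 0, 2, 1, 0};
int[] b = countingSort(a, 4);   // b is {0, 0, 1, 1, 2, 3}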
57
Sorting in O(n) time
1. First count the number of occurrences of each possible value.
2. Compute the starting position of each value (prefix sums of the counts).
3. Copy the numbers into the output array.
58
Counting sort (example)
[Figure: counting sort on an array of digits 0-9: the count array c, its prefix sums, and the sorted output array]
59
Radix sort
Radix-sort uses the counting sort algorithm to sort integers one "digit" at a time:
the integers have w bits
each "digit" has d bits
it uses w/d passes of counting-sort
It starts by sorting on the least-significant digit and works up to the most-significant digit.
Correctness depends on the fact that counting sort is stable:
if a[i] = a[j] and i < j, then a[i] appears before a[j] in the output.
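A minimal sketch of the pass structure (my own code, keyed on the d-bit digits of non-negative w-bit ints and reusing the counting-sort idea from above; assume d divides w):

static int[] radixSort(int[] a, int w, int d) {
    // Sort non-negative w-bit integers, d bits per pass, least-significant digit first
    for (int p = 0; p < w / d; p++) {
        int[] c = new int[1 << d];
        int[] b = new int[a.length];
        for (int i = 0; i < a.length; i++)
            c[(a[i] >> (d * p)) & ((1 << d) - 1)]++;           // count occurrences of each digit
        for (int i = 1; i < (1 << d); i++)
            c[i] += c[i - 1];                                   // prefix sums give end positions
        for (int i = a.length - 1; i >= 0; i--)                 // right-to-left scan keeps the pass stable
            b[--c[(a[i] >> (d * p)) & ((1 << d) - 1)]] = a[i];
        a = b;
    }
    return a;
}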
60
Radix sort
Theorem: The radix-sort algorithm can sort an array a of n w-bit integers in O((w/d)(n + 2^d)) time.
Theorem: The radix-sort algorithm can sort an array a of n integers in the range {0,..., n^c - 1} in O(cn) time.
61
Summary
Comparison based sorting algorithms:
Quicksort: O(n log n) expected time
Heapsort: O(n log n) time
Mergesort: O(n log n) time
Non-comparison-based sorting algorithms:
counting-sort: can sort an array a of n integers in the range {0,..., k-1} in O(n + k) time
radix-sort: can sort an array a of n integers in the range {0,..., n^c - 1} in O(cn) time