1 Priority queues
CS310 – Data Structures, Professor Roch
Weiss, Chapters 6.9 and 21
All figures marked with a chapter and section number are copyrighted © 2006 by Pearson Addison-Wesley unless otherwise indicated. All rights reserved.
2 Priority Queues
A priority queue is a collection where each item is assigned a measure of importance:
– assigned by the user, or
– derived from the item.
Priority queues are an excellent abstract data type for scheduling when tasks differ in importance.
3 Priority queue operators
Basic operators:
– enqueue(item): add an item
– dequeue(): get and remove the most important item
– isEmpty()
Other typical operators:
– peek(): get the oldest instance of the most important item without removing it
– clear(): reinitialize the queue
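As a concrete picture of these operators, a minimal Java interface sketch might look like the following; the names mirror the bullets above rather than any particular library:

```java
// Hypothetical priority-queue interface mirroring the operators listed above.
// Items are ordered by comparing them (smaller key = more important here).
public interface PriorityQueue<AnyType extends Comparable<? super AnyType>> {
    void enqueue(AnyType item);   // add an item
    AnyType dequeue();            // get and remove the most important item
    AnyType peek();               // get the most important item without removing it
    boolean isEmpty();
    void clear();                 // reinitialize the queue
}
```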
4 Binary heaps
Proposed by Williams in 1964 as part of a sorting algorithm called heapsort.
Binary heaps are commonly used to implement priority queues.
Complexity of binary heap operations:
– insert (enqueue): average O(1), worst case O(log N)
– peek(): O(1)
– dequeue(): O(log N)
5 Binary heaps
Your text uses the following equivalent operators:
– findMin() for peek()
– deleteMin() for dequeue()
– insert() for enqueue()
We will refer to an item’s key as the value we use to determine priority.
6 Binary heap structure
Given that we quoted O(log N) for worst-case insertion and for average-case removal, one would expect an implementation to use some sort of balanced tree.
7 Complete binary trees
In particular, we will expect a complete binary tree, which has the following properties:
– Every level, except possibly the last, is completely full.
– Leaves in the last level are added from left to right, without skipping any positions.
Figure 21.3 (a)
8 Implicit representation
Complete trees can easily be represented by an array:
– left(i) = 2*i
– right(i) = 2*i + 1
– parent(i) = floor(i/2)
Figure 21.1
Look Ma, no links!
9 Implicit representation
In addition to the array, we need to maintain a count of the nodes.
It will be convenient to have a pseudo-root node (much like we saw for red-black trees); we can place it in position 0.
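A small sketch of this index arithmetic, assuming 1-based indexing with slot 0 reserved for the pseudo-root as just described:

```java
// Index arithmetic for a complete binary tree stored in an array.
// Slot 0 is reserved; the root lives at index 1.
public final class HeapIndex {
    static int left(int i)   { return 2 * i; }
    static int right(int i)  { return 2 * i + 1; }
    static int parent(int i) { return i / 2; }   // integer division = floor(i/2)

    public static void main(String[] args) {
        // The children of the root (index 1) are at indices 2 and 3,
        // and the parent of index 3 is index 1.
        System.out.println(left(1) + " " + right(1) + " " + parent(3));  // 2 3 1
    }
}
```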
10 Heap-order property
For all nodes C with parent P: key(P) ≤ key(C)
11 Binary heaps
To be a binary heap, a tree must:
– have the heap-order property, and
– be a complete tree.
Note that we can also create an inverted heap (a max heap), where the heap-order property is that a child must be smaller than or equal to its parent.
Let us look at an implementation…
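A minimal sketch of how an array-based min-heap class might start; the names and details here are illustrative rather than the textbook's exact code:

```java
import java.util.NoSuchElementException;

// Sketch of an array-based binary min-heap (1-based indexing, slot 0 unused).
// insert() and deleteMin() are sketched on the later slides.
public class BinaryHeapSketch<AnyType extends Comparable<? super AnyType>> {
    private static final int DEFAULT_CAPACITY = 100;
    private int currentSize;      // number of items currently in the heap
    private AnyType[] array;      // array[1..currentSize] holds the heap

    @SuppressWarnings("unchecked")
    public BinaryHeapSketch() {
        array = (AnyType[]) new Comparable[DEFAULT_CAPACITY + 1];
        currentSize = 0;
    }

    public boolean isEmpty() {
        return currentSize == 0;
    }

    // peek()/findMin(): return the minimum item without removing it.
    public AnyType findMin() {
        if (isEmpty())
            throw new NoSuchElementException("empty heap");
        return array[1];          // the root always holds the minimum
    }
}
```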
14 peek() is called element() in this implementation.
15 Insertion or enqueueing
To maintain the complete-tree structure, there is only one position where a new node can be added.
Figure 21.3 (a)
16 Insertion
Let us refer to the position where a new item can be placed as the hole.
Suppose we wanted to insert an item whose key was 14:
17 Insertion Inserting an item with key 14 into the hole would violate the heap property, so we percolate the hole upwards:
18 Percolating up…
19 Insertion code
We have already stated the worst-case complexity; how can we justify it?
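A sketch of how the hole-based insertion might be coded for the heap class sketched earlier (illustrative, not the book's exact routine):

```java
// Insertion by percolating a hole up (a method of the min-heap sketch above).
// The hole starts at the next free array slot and moves toward the root as
// long as the new item is smaller than the item in the hole's parent.
public void insert(AnyType x) {
    if (currentSize + 1 == array.length)
        array = java.util.Arrays.copyOf(array, array.length * 2);  // grow if full

    int hole = ++currentSize;
    for ( ; hole > 1 && x.compareTo(array[hole / 2]) < 0; hole /= 2)
        array[hole] = array[hole / 2];   // slide the parent down into the hole
    array[hole] = x;                     // drop the new item into its final position
}
```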
20 Complexity
Worst case: O(log N)
Average case:
– It has been shown (beyond the scope of this class) that on average 2.6 comparisons are required, implying that an item moves up about 1.6 levels.
– Constant time: O(1).
21 deleteMin() or dequeue()
Finding the minimum… trivial.
Deleting the minimum:
– We cannot leave a hole at the root.
– The last item must be moved elsewhere.
– The heap-order property must still apply.
– Strategy: percolate down.
22 Percolating down
23 Percolating down (continued)
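A hedged sketch of deleteMin() and percolateDown() as methods of the heap class sketched earlier (illustrative names and details):

```java
// Remove and return the minimum: move the last item into the root's hole,
// then percolate it down until the heap-order property holds again.
public AnyType deleteMin() {
    AnyType minItem = findMin();          // the root holds the minimum
    array[1] = array[currentSize--];      // last item fills the hole at the root
    percolateDown(1);
    return minItem;
}

// Push the item at position hole down until neither child is smaller.
private void percolateDown(int hole) {
    AnyType tmp = array[hole];
    while (2 * hole <= currentSize) {
        int child = 2 * hole;                                   // left child
        if (child != currentSize
                && array[child + 1].compareTo(array[child]) < 0)
            child++;                                            // use the smaller child
        if (array[child].compareTo(tmp) < 0)
            array[hole] = array[child];                         // slide the child up
        else
            break;
        hole = child;
    }
    array[hole] = tmp;
}
```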
25 deleteMin(): average and worst-case complexity is O(log N).
26 Constructing a heap
Suppose we want to create a heap of N items from a complete tree.
Given that we have seen an insertion routine with an average cost of O(1), what would we expect the cost to be?
27 Linear time heap construction
Suppose that we have a complete tree, and suppose that its left and right subtrees are heaps. What can we do to make the whole tree a heap?
28 Recursive view
Suppose T1 and T2 are heaps, the left and right subtrees of a root R with children C1 and C2 (the roots of T1 and T2).
If R < C1 and R < C2, this is a heap; otherwise, we could percolate R downwards.
29 Recursive view
Of course, T1 and T2 are probably not heaps if we are building a heap from an arbitrary set of keys.
Suppose we recursively called makeHeap on the left and right subtrees and then called percolateDown on the root…
30 Implementation
The preceding recursive approach would work, but we will discuss an iterative implementation inspired by the recursion.
The recursive version works its way down the tree and builds heaps for the lowest levels first; we implement the same bottom-up processing strategy iteratively, as sketched below.
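A sketch of the iterative, bottom-up construction, reusing the percolateDown() sketch from the deleteMin slides (again, illustrative rather than the book's code):

```java
// Bottom-up heap construction: percolate down every node that has children,
// starting from the last parent (index currentSize / 2) and working back
// toward the root.  Each subtree becomes a heap before its parent is processed.
private void buildHeap() {
    for (int i = currentSize / 2; i >= 1; i--)
        percolateDown(i);
}
```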
31 Heap construction Consider nodes with no children. Does the heap property hold?
32 Heap construction
As childless nodes are heaps, we only need to consider nodes with children.
Exercise:
– Suppose we have 14 nodes in a complete tree.
– How many of them have children?
33 A complete tree of 14 nodes
In general, floor(N/2) of the nodes have children: node i has a child exactly when 2i ≤ N, so the parents are the nodes at indices 1 through floor(N/2) (7 of the 14 nodes here). We could prove this formally…
34 Heap construction
Start with the last parent node. It has 1 or 2 children. If we call percolateDown(parent), will its subtree be a heap?
Some possible subtrees for the last parent node…
35 Heap construction Consequently, we can make the last parent a heap. The same logic can be applied to all of the nodes on the same level as the last parent. Each percolate operation on this level requires up to three comparisons.
38 Heap construction
Each node in level L-1 now roots a heap. We can now process level L-2 using the strategy we outlined recursively, even though we are not using recursion.
39 After L-2 is processed, we process L-3
40 Linear time heap construction
41 Proof that our heap building routine is linear We need to establish a bound on the number of comparisons made. In the previous slides, we drew a dashed line to the smallest child to indicate that we had compared the node and its children. We will use this idea of writing on the graph to construct what is known as a marking proof.
42 Proof that our heap building routine is linear
We will prove this for a perfect tree:
– A perfect tree is a tree in which every level, including the last, is completely full, so roughly N/2 of its nodes are on the last level.
While a perfect tree may require more operations than a complete tree, this gives an upper bound on the number of operations.
43 Marking proof for linear heap construction As the leaves of a binary heap are themselves heaps, nothing needs to be done for leaf nodes. Let us consider the first interior level. To construct a heap for the subtrees rooted at this level, 3 comparisons need to be made for each node. We will denote this by shading the edge between the parent and its leftmost child.
44 Marking proof
In this 4-level example, 8 edges were marked, indicating that no more than 24 comparisons need to be made to process level 3.
45 Marking proof
Consider the next level up. percolateDown needs to make decisions about the level below it, which we can show by marking the left child:
46 Marking proof However, we also need to take into account that percolateDown() may move the item down to the next level. We can do this by marking the right child of the left subchild: Note that it does not matter that percolateDown may have used the right subtree.
47 Example of marking level L-2
48 Processing the next level
For each additional level:
– Mark the leftmost child.
– Mark each right child until we reach a leaf.
49 Sample marking of a tree of height 4.
50 Things to notice
For each node of height ≥ 1:
– We marked the leftmost edge to indicate the 3 comparisons needed to possibly move the node down to the next level.
– We marked an additional height - 1 edges for the comparisons needed to possibly percolate the item all the way down to a leaf.
– Remember: we don’t care which path we would actually have used when percolating down; it is the length of the marked path that is important.
No edge is ever marked twice. Every edge except those along the rightmost path is marked.
51 Marking proof
If we count the number of darkened edges, it tells us how many groups of 3 comparisons are needed. So, if we can compute the total number of edges and subtract off the number of edges in the rightmost path, we will know how many comparisons are needed.
52 Marking proof
As every node except the root has exactly one edge to its parent, there are exactly N-1 edges in a tree of N nodes. The rightmost path (the only path that is not marked) has H edges, where H is the height of the tree. Therefore, there are N-1-H marked edges, meaning at most 3(N-1-H) comparisons, which is O(N).
53 Other heap operations
decreaseKey():
– Increases the priority by lowering the key value.
– Can be implemented by changing the key value and percolating up (sketched below).
merge(otherHeap):
– With an array-based implementation, there is not a pretty way to do this.
– Copy otherHeap into the array and rebuild the heap. Complexity?
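A sketch of decreaseKey() for the heap class sketched earlier; the position argument p is assumed to be an index into the heap array, and the names are illustrative:

```java
// Lower the key at heap position p to newVal, then percolate it up.
// newVal must not be larger than the current value at p.
public void decreaseKey(int p, AnyType newVal) {
    if (newVal.compareTo(array[p]) > 0)
        throw new IllegalArgumentException("new key is larger than current key");
    int hole = p;
    for ( ; hole > 1 && newVal.compareTo(array[hole / 2]) < 0; hole /= 2)
        array[hole] = array[hole / 2];   // same percolate-up step as insert
    array[hole] = newVal;
}
```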
54 Heap sort
Remember, heaps were originally proposed as part of a sorting algorithm.
1. Add the items as a group and build a heap.
2. Remove each item in turn, producing the sorted list.
What is the complexity?
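A runnable sketch of the two steps, written here with java.util.PriorityQueue so it stands alone; with the array-based class sketched earlier, step 1 would be the O(N) buildHeap and step 2 would be N calls to deleteMin, giving O(N log N) overall:

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public final class HeapSortSketch {
    // Step 1: build a heap from the whole group; step 2: repeatedly remove the minimum.
    public static Integer[] heapSort(Integer[] items) {
        PriorityQueue<Integer> heap = new PriorityQueue<>(Arrays.asList(items));
        Integer[] sorted = new Integer[items.length];
        for (int i = 0; i < sorted.length; i++)
            sorted[i] = heap.remove();       // smallest remaining item
        return sorted;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(
                heapSort(new Integer[] { 5, 1, 4, 2, 3 })));   // [1, 2, 3, 4, 5]
    }
}
```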
55 A few details…
In most cases, we might not want to actually copy the elements into a second heap data structure, as this requires additional space.
Note that since Java is just copying a reference to each object, the space requirement is not really a problem here, but it could be for other languages.
56 Saving space
We can get around this by simply storing indices into the input array and referencing that array during the heap routines (this requires code modifications).
57 Saving space
As elements are removed from the heap, array entries become available. If we were really interested in saving space, we could put the indices of the sorted items into the freed space. With a min-heap, this of course means that the elements end up in descending order. How could we change the order?
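One way to realize this in-place idea, and to change the order, is to use an inverted (max) heap, as mentioned on the earlier binary heaps slide: build the max heap directly in the input array, then swap the maximum into each freed slot, which leaves the array in ascending order. A hedged sketch (names illustrative, 0-based indexing so it works directly on a plain array):

```java
import java.util.Arrays;

// Sketch of an in-place heap sort: build a max heap directly in the input
// array, then repeatedly swap the maximum into the freed slot at the end.
public final class InPlaceHeapSortSketch {
    public static void heapSort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--)      // bottom-up buildHeap (0-based)
            percolateDown(a, i, n);
        for (int end = n - 1; end > 0; end--) {
            int tmp = a[0]; a[0] = a[end]; a[end] = tmp;  // move the max into the freed slot
            percolateDown(a, 0, end);                     // restore the heap on a[0..end-1]
        }
    }

    // Percolate a[hole] down within a[0..size-1]; children are at 2i+1 and 2i+2.
    private static void percolateDown(int[] a, int hole, int size) {
        int tmp = a[hole];
        for (int child; 2 * hole + 1 < size; hole = child) {
            child = 2 * hole + 1;
            if (child + 1 < size && a[child + 1] > a[child])
                child++;                           // pick the larger child
            if (a[child] > tmp)
                a[hole] = a[child];                // slide the child up
            else
                break;
        }
        a[hole] = tmp;
    }

    public static void main(String[] args) {
        int[] data = { 5, 1, 4, 2, 3 };
        heapSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5]
    }
}
```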