1 Theory I Algorithm Design and Analysis (9 – Priority queues: Fibonacci heaps) T. Lauer

2 Priority Queues
Data structures for managing elements (insert, delete):
– Stack: “last in – first out”
– Queue: “first in – first out”
– Dictionary: “find any element” (lookup)
New structure: priority queue
– a priority is assigned to each inserted element
– the element with highest priority can be accessed and removed
– Real-world example: “to-do” list: new tasks are added; the most urgent/important one is carried out first

3 Priority queues: elements
The key of an element (node) in a priority queue represents its assigned priority. The priority has to be of an ordered type (e.g. int, long, float, …) so that keys can be compared. The priority is not a unique identifier of an element (e.g. for lookup); hence, several elements could have the same priority. We store the key as an int (a smaller value represents a higher priority).

class Node {
    Object content;   // content
    int key;          // priority
}

4 Priority queues: operations
Operations on a priority queue Q:
Q.insert(int k): Insert a new element with key (= priority) k.
Q.accessmin(): Return the element with minimum key (= highest priority).
Q.deletemin(): Remove the element with minimum key.
Q.decreasekey(Node N, int k): Decrease the key of node N to the value k.
Q.delete(Node N): Delete the given node N.
Q.meld(PriorityQueue P): Merge Q with the priority queue P.
Q.isEmpty(): Return true iff Q is empty.
Remark: Efficient lookup of an element with a given key is not supported in priority queues. Hence, decreasekey and delete require direct access to the respective element N (not just the key).
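As an illustration, these operations can be written down as a small Java interface over the Node class from the previous slide; the interface itself and the choice to return the inserted node (as the handle needed by decreasekey and delete) are assumptions of this sketch, not part of the slides.

interface PriorityQueue {
    Node insert(int k);                 // insert a new element with key k, return its node (handle)
    Node accessmin();                   // return the element with minimum key
    Node deletemin();                   // remove and return the element with minimum key
    void decreasekey(Node n, int k);    // decrease the key of node n to k
    void delete(Node n);                // delete the given node n
    void meld(PriorityQueue p);         // merge this queue with p
    boolean isEmpty();                  // true iff the queue is empty
}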

5 Fibonacci heaps: idea
List of multiway trees which are all heap-ordered.
Definition: A tree is called heap-ordered if the key of each node is greater than or equal to the key of its parent (if there is a parent).
– The roots of the trees are organized in a doubly-connected circular list (root list).
– The Fibonacci heap can be accessed through a pointer to a (the) node with minimum key.

6 Representation of trees Child-sibling representation (cf. Exercise sheet 2) To be able to go upward in the tree, a pointer to the parent node is added. In order to realize deletion of a child (and the concatenation of two child lists) in O(1), we use doubly-connected circular lists. Hence, each node contains 4 pointers: child, parent, left, right

7 Fibonacci heaps: node format

class FibNode {
    Object content;          // the content
    int key;                 // priority
    FibNode parent, child;   // pointers to parent and one child
    FibNode left, right;     // pointers to left and right sibling
    int rank;                // number of children of this node
    boolean mark;            // marker
}

rank indicates the number of children of the node (the outdegree of the node).
The meaning of mark will become clear later. It is true if the node has lost any of its children since it last became the child of another node.
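Beyond the nodes themselves, the heap only needs a pointer to a minimum root and a node counter. A minimal sketch of such a wrapper class (the field names min and size are assumptions used by the code sketches below, not taken from the slides):

class FibHeap {
    FibNode min;   // pointer to a root node with minimum key; null iff the heap is empty
    int size;      // total number of nodes currently in the heap
}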

8 Fibonacci heap: detailed figure

9 Fibonacci heap: simplified figure

10 Fibonacci heaps: operations
Q.accessmin(): Return the node Q.min (or null if Q is empty).
Q.insert(int k): Create a new node N with key k and insert it in the root list of Q. If k < Q.min.key, update the minimum pointer (set Q.min = N).
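A sketch of these two operations as methods of the FibHeap wrapper sketched above; splicing the new node into the doubly-connected circular root list is spelled out via the left/right pointers.

FibNode insert(int k) {
    FibNode n = new FibNode();
    n.key = k;
    n.left = n;                     // a single node forms a circular list of length 1
    n.right = n;
    if (min == null) {              // heap was empty
        min = n;
    } else {                        // splice n into the root list next to min
        n.right = min.right;
        n.left = min;
        min.right.left = n;
        min.right = n;
        if (k < min.key) min = n;   // update the minimum pointer
    }
    size++;
    return n;
}

FibNode accessmin() {
    return min;                     // null if the heap is empty
}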

11 Manipulation of trees in Fibonacci heaps
There are only three ways in which a tree can change in a Fibonacci heap:
– link: “growth” of trees. Two trees are linked into a new tree.
– cut: cut out a subtree. A subtree is cut out of a tree and inserted in the root list as a new tree.
– remove: decompose a tree at the root. Removes the root of a tree from the root list and replaces it by the list of its children, i.e. the children become new roots.

12 Tree manipulation: link
link:
Input: two root nodes of the same rank k
Method: join two trees of the same rank by making the root with the larger key a child of the root with the smaller key. Unmark the new child if it was marked. The total number of trees in the Fibonacci heap is reduced by 1; the number of nodes does not change.
Output: one root node of rank k+1
Cost: O(1)
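A possible rendering of link as a FibHeap method, a sketch built directly on the description above (the slides only state the effect, not the code):

// link two roots of equal rank; returns the surviving root
FibNode link(FibNode a, FibNode b) {
    if (b.key < a.key) { FibNode t = a; a = b; b = t; }  // a keeps the smaller key
    // remove b from the circular root list
    b.left.right = b.right;
    b.right.left = b.left;
    // make b a child of a
    b.parent = a;
    if (a.child == null) {          // b becomes the only child
        a.child = b;
        b.left = b;
        b.right = b;
    } else {                        // splice b into a's circular child list
        b.right = a.child.right;
        b.left = a.child;
        a.child.right.left = b;
        a.child.right = b;
    }
    a.rank++;                       // a gained one child
    b.mark = false;                 // unmark the new child
    return a;
}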

13 Tree manipulation: cut
cut:
Input: one non-root node
Method: separate the node (including the subtree rooted in it) from its parent and insert it as a new tree in the root list. The total number of trees is increased by 1; the number of nodes does not change.
Cost: O(1)
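A corresponding sketch of cut, again as a FibHeap method using the assumed min field:

// detach n (with its subtree) from its parent and make it a new root
void cut(FibNode n) {
    FibNode p = n.parent;
    if (p == null) return;            // n is already a root
    if (p.child == n) {               // the parent must not keep pointing at n
        p.child = (n.right != n) ? n.right : null;
    }
    n.left.right = n.right;           // unlink n from its sibling list
    n.right.left = n.left;
    p.rank--;                         // the parent lost one child
    n.parent = null;
    // splice n into the root list next to min
    n.right = min.right;
    n.left = min;
    min.right.left = n;
    min.right = n;
}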

14 Tree manipulation: remove
remove:
Input: one root node (with rank k)
Method: delete the root of a tree and insert its k children in the root list. The number of trees is increased by k−1; the number of nodes is decreased by 1.
Cost: O(1) [if the parent pointers of the children are not deleted!]
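A sketch of remove; as the slide notes, the children's parent pointers are deliberately left dangling here (they are cleaned up later, during consolidate), which is what keeps this step constant time.

// remove root r from the root list and make its children new roots
void remove(FibNode r) {
    if (r.child != null) {
        // splice r's circular child list into the root list, right after r
        FibNode first = r.child;
        FibNode last = first.left;
        FibNode next = r.right;
        r.right = first;
        first.left = r;
        last.right = next;
        next.left = last;
        // note: the children's parent pointers still point to r (dangling)
    }
    // unlink r itself from the root list
    r.left.right = r.right;
    r.right.left = r.left;
    size--;
}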

15 Further operations
By combining the basic methods
– link
– cut
– remove
we can construct the missing operations
– deletemin
– decreasekey
– delete

16 Deletion of the minimum node
Q.deletemin(): If Q is empty, return null. Otherwise:
– Delete the minimal node (using remove).
– “Consolidate” (clean up) the root list: join (link) two root nodes of the same rank, until only nodes with distinct rank appear in the root list. While doing this, remove all “dangling” parent pointers of root nodes.
– Find the new minimum among the root nodes.
– Return the deleted node.
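Putting the pieces together, deletemin might look as follows. This is a sketch: consolidate is the clean-up step whose pseudocode appears on slide 54, and it is assumed here to clear the dangling parent pointers and to leave min pointing at the root with the smallest key.

FibNode deletemin() {
    if (min == null) return null;      // empty heap
    FibNode m = min;
    // remember some node that is still a root after m is removed:
    // either another root, or (if m was the only root) one of m's children
    FibNode start = (m.right != m) ? m.right : m.child;
    remove(m);                         // m's children become roots
    if (start == null) {               // m was the only node in the heap
        min = null;
    } else {
        min = start;                   // provisional entry point into the root list
        consolidate();                 // link roots of equal rank, fix parent pointers,
                                       // and find the true minimum
    }
    return m;
}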

17–23 deletemin: example (step-by-step figures only)

24 decreasekey
Q.decreasekey(FibNode N, int k): Set the key value of N to k. If the heap order is violated (k < N.parent.key):
– Cut N off its parent (using cut).
– If the parent is marked (N.parent.mark == true), also cut it from its own parent; if that parent is marked, also cut it, etc. (“cascading cuts”)
– Mark the node whose child was last cut (unless it is a root node).
– Update the minimum pointer (if k < min.key).
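A sketch of decreasekey with cascading cuts, built on the cut operation from slide 13. Marks left behind on root nodes are simply ignored here; they are cleared by link once the node becomes a child again, as slide 12 states.

void decreasekey(FibNode n, int k) {
    n.key = k;
    FibNode p = n.parent;
    if (p != null && k < p.key) {             // heap order violated
        cut(n);                               // move n (with its subtree) to the root list
        // cascading cuts: keep cutting marked ancestors
        while (p.parent != null && p.mark) {
            FibNode next = p.parent;
            cut(p);
            p = next;
        }
        if (p.parent != null) p.mark = true;  // mark the node whose child was last cut
    }
    if (k < min.key) min = n;                 // update the minimum pointer
}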

25–34 decreasekey: example (step-by-step figures only)

35 Marking history of a node
– A node N is unmarked when it becomes the child of another node (via link).
– When N loses a child node (via cut), N is marked (unless N is a root).
– When a marked node loses another child, it is itself cut off and becomes a root.
⇒ Hence, if a node N is marked, we know that N has lost one child since N itself was last made the child of another node.
⇒ Also, a node that is not a root cannot have lost more than one child!

36 Deletion of a given node
Q.delete(FibNode N): Set the key of N to a value smaller than the current minimum, e.g. Q.decreasekey(N, −∞). Then perform Q.deletemin().
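As a sketch, with Integer.MIN_VALUE standing in for −∞ for the int keys used on these slides:

void delete(FibNode n) {
    decreasekey(n, Integer.MIN_VALUE);   // n becomes the minimum node
    deletemin();                         // ... and is removed
}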

37–43 delete: example (step-by-step figures only; the key of the deleted node is first decreased to −∞)

44 Other operations
Q.meld(FibHeap P): Append the root list of P to the root list of Q. Then update the minimum pointer of Q: if P.min.key < Q.min.key, set Q.min = P.min.
Q.isEmpty(): If Q.size == 0, return true; otherwise false.
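A sketch of both operations as FibHeap methods; meld concatenates the two circular root lists in O(1) and keeps the smaller of the two minimum pointers.

void meld(FibHeap p) {
    if (p == null || p.min == null) return;    // nothing to merge
    if (min == null) {                         // this heap was empty
        min = p.min;
    } else {
        // splice p's circular root list into this root list
        FibNode a = min.right;
        FibNode b = p.min.right;
        min.right = b;
        b.left = min;
        p.min.right = a;
        a.left = p.min;
        if (p.min.key < min.key) min = p.min;  // update the minimum pointer
    }
    size += p.size;
}

boolean isEmpty() {
    return size == 0;
}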

45 Analysis
Complexity of deletemin():
– remove: O(1)
– consolidate: ?
– updatemin: O(#root nodes after consolidate) = O(maxRank(n))
After consolidating there is only one root with any given rank (otherwise, they would have been linked). We define maxRank(n) as the largest possible rank a root node can have in a Fibonacci heap of size n. (We will calculate maxRank(n) later.)
Now we have to determine the complexity of consolidate.

46 deletemin: consolidating the root list
How can we implement consolidate efficiently?
Observations:
– Obviously, each root node has to be visited at least once.
– At the end, each rank may occur at most once.
Idea: Insert the root nodes in a temporary array. Each node is inserted at the position corresponding to its rank. If a position is occupied, we know that we have already seen another node with the same rank. We can now link the two nodes and (try to) insert the root of the resulting tree at the next higher array position.

47–53 consolidate: example (step-by-step figures only, showing the root nodes being placed into the rank array)

54 Analysis of consolidate

rankArray = new FibNode[maxRank(n)+1];       // create the array
for “each FibNode N in rootlist” {
    while (rankArray[N.rank] != null) {      // position occupied
        N = link(N, rankArray[N.rank]);      // link tree roots
        rankArray[N.rank-1] = null;          // delete the old position
    }
    rankArray[N.rank] = N;                   // insert in array
}

55 Analysis of the for-loop

for “each FibNode N in rootlist” {
    while (rankArray[N.rank] != null) {
        N = link(N, rankArray[N.rank]);
        rankArray[N.rank-1] = null;
    }
    rankArray[N.rank] = N;
}

Let r_before = #root nodes before consolidate and r_after = #root nodes after consolidate.
The total number of link operations in the loop is r_before − r_after.
The total number of array modifications in the loop is #links + r_before (because each link and each iteration of the loop ends with one modification). Since r_before = r_after + #links, we get 2·#links + r_after array modifications.
Therefore, the complexity of the for-loop is
#links·O(1) + (2·#links + r_after)·O(1) = O(#links) + O(maxRank(n)),
using r_after ≤ maxRank(n) + 1.

56 Complexity of deletemin
– remove: O(1)
– Creating the rank array: O(maxRank(n))
– for-loop: O(#links) + O(maxRank(n))
– Update minimum pointer: O(maxRank(n))
Total cost: O(#links) + O(maxRank(n))

57 Analysis
Complexity of decreasekey():
– Set key to new value: O(1)
– cut: O(1)
– Cascading cuts: #cuts · O(1)
– Mark: O(1)
Total cost: O(#cuts)

58 Analysis Complexity of delete(): Sum of the costs for decreasekey and deletemin Total cost: O(#cuts) + O(#links) + O(maxRank(n))

59 Amortized analysis
Observations:
– For deletemin, the number of link operations affects the total cost.
– For decreasekey, it is the number of cascading cuts.
Idea: Pay those costs from savings (“accounting method”)!
Assumption: the cost for each link and each cut is 1€.
(1) Make sure that for each root node there is always 1€ on the savings account, so we can pay for the link operation if that node becomes the child of another one.
(2) Make sure that for each marked node, there are always 2€ on the savings account, so we can
– pay the cut operation for that node during “cascading cuts” and
– still have 1€ left as savings for the new root node.

60 Example

61 Increase the savings
For which operations do we have to pay extra (in order to gain savings for later)?
New root nodes originate from:
– insert: attach 1€ to each new node inserted in the root list.
– decreasekey: pay 1€ extra for the cut-off node.
– deletemin: during remove, pay 1€ for each child of the removed node, i.e. up to maxRank(n) €.
Marked nodes may only result at the end of decreasekey:
– For each mark operation, pay an extra 2€ for the marked node.

62 Amortized cost of insert
– Create the node: O(1)
– Insert into root list: O(1) + O(1) (the second O(1) is the 1€ attached to the new root node)
Total amortized cost: O(1)

63 Amortized cost of deletemin
– remove: O(1) + O(maxRank(n))
– Create the rank array: O(maxRank(n))
– link operations: #links · O(1), paid from savings!
– Other insertions into the rank array: O(maxRank(n))
– Update minimum pointer: O(maxRank(n))
Total amortized cost: O(maxRank(n))

64–70 Example (step-by-step figures only). remove: pay 1€ extra for each child of the removed node; link: all link operations are paid from the savings.

71 Amortized cost of decreasekey
– Set key to new value: O(1)
– cut: O(1) + O(1)
– Cascading cuts: #cuts · O(1), paid from savings!
– Mark: O(1) + 2 · O(1)
Total amortized cost: O(1)

72–76 Example (step-by-step figures only). Pay 1€ extra for the initial cut operation; all cascading cuts are paid from savings; when marking the last parent, pay an additional 2€.

77 Amortized cost of delete Sum of the costs of decreasekey and deletemin: O(1) + O(maxRank(n)) Total amortized cost: O(maxRank(n))

78 Amortized Analysis
Amortized costs:
– Insert: O(1)
– Accessmin: O(1)
– Deletemin: O(maxRank(n))
– Decreasekey: O(1)
– Delete: O(maxRank(n))
– Meld: O(1)
To do: show that maxRank(n) = O(log n), i.e. the largest possible rank of a node in a Fibonacci heap is logarithmic in the size n of the heap.

79 Calculation of maxRank(n)
Lemma 1: Let N be a node in a Fibonacci heap and k = N.rank. Consider the children C_1, ..., C_k of N in the order in which they have been added to N (with link). Then:
(1) C_1.rank ≥ 0
(2) C_i.rank ≥ i − 2 for i = 2, ..., k
Proof:
(1) is obvious.
(2) When C_i was made a child of N, the nodes C_1, ..., C_{i−1} had already been children of N, i.e. we had N.rank ≥ i−1. Since link always links two nodes of the same rank, at the time of linking C_i.rank ≥ i−1. Since then, C_i cannot have lost more than one of its children (otherwise it would have been cut off too), hence C_i.rank ≥ i − 2.

80 Calculation of maxRank(n)
Lemma 2: Let N be a node in a Fibonacci heap and k = N.rank. Let size(N) be the number of nodes in the subtree rooted in N. Then
size(N) ≥ F_{k+2} ≥ Φ^k (F_k = k-th Fibonacci number, Φ = (1+√5)/2 ≈ 1.618),
i.e. a node with k children is root of a subtree with at least F_{k+2} nodes.
Proof: Let S(k) = min {size(N) | N with N.rank = k}, i.e. the smallest possible size of a tree whose root has rank k. (Obviously, S(0) = 1.) Again, let C_1, ..., C_k be the children of N in the order of their linking to N. Then, by Lemma 1, each C_i with i ≥ 2 is the root of a subtree with at least S(i−2) nodes, so
S(k) ≥ 2 + S(0) + S(1) + ... + S(k−2)
(the 2 counts N itself and C_1), and by induction S(k) ≥ F_{k+2} ≥ Φ^k.
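For small ranks the recurrence can be checked by direct arithmetic, which also shows where the Fibonacci numbers come from:

S(0) = 1 = F_2
S(1) = 2 = F_3
S(2) ≥ 2 + S(0) = 3 = F_4
S(3) ≥ 2 + S(0) + S(1) = 5 = F_5
S(4) ≥ 2 + S(0) + S(1) + S(2) ≥ 8 = F_6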

81 Calculation of maxRank(n)
Theorem: The maximum rank maxRank(n) of any node in a Fibonacci heap with n nodes is bounded by O(log n).
Proof: Let N be a node in a Fibonacci heap with n nodes and let k = N.rank. Then
n ≥ size(N) ≥ Φ^k (cf. Lemma 2).
Hence k ≤ log_Φ(n) = O(log n).

82 Conclusion

Operation      Linear list    (Min-)heap       Fibonacci heap
insert         O(1)           O(log n)         O(1)
accessmin      O(1)           O(1)             O(1)
deletemin      O(n)           O(log n)         O(log n)*
decreasekey    O(1)           O(log n)         O(1)*
delete         O(n)           O(log n)         O(log n)*
meld           O(1)           O(m log(n+m))    O(1)

*Amortized cost