1
CS 3343: Analysis of Algorithms Lecture 16: Binary search trees & red-black trees
2
Review: Hash tables. Problem: collisions.
[Figure: a hash table T with slots 0 .. m-1; keys are drawn from a universe U, with K the set of actual keys, |U| >> |K| and |U| >> m. Here h(k2) = h(k5), a collision.]
3
Chaining. Chaining puts elements that hash to the same slot in a linked list.
[Figure: keys k1 .. k8 from K hash into table T; keys that hash to the same slot are chained together in that slot's linked list.]
4
Hashing with Chaining
Chained-Hash-Insert(T, x)
–Insert x at the head of the list T[h(key[x])].
–Worst-case complexity: O(1).
Chained-Hash-Delete(T, x)
–Delete x from the list T[h(key[x])].
–Worst-case complexity: proportional to the length of the list with singly linked lists; O(1) with doubly linked lists.
Chained-Hash-Search(T, k)
–Search for an element with key k in the list T[h(k)].
–Worst-case complexity: proportional to the length of the list.
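Not part of the original slides: a minimal Python sketch of the three chained-hashing operations, assuming a fixed table size and Python lists as the chains (the class name ChainedHashTable is made up for this example).

class ChainedHashTable:
    """Sketch of hashing with chaining; each slot holds a Python list as its chain."""

    def __init__(self, m=8):
        self.m = m                                  # number of slots
        self.table = [[] for _ in range(m)]         # one chain per slot

    def _h(self, key):
        return hash(key) % self.m                   # stand-in hash function

    def insert(self, key, value):
        # The slides prepend to a linked list for O(1) insertion;
        # appending to a Python list is the closest simple equivalent.
        self.table[self._h(key)].append((key, value))

    def search(self, key):
        # Cost is proportional to the length of the chain for slot h(key).
        for k, v in self.table[self._h(key)]:
            if k == key:
                return v
        return None

    def delete(self, key):
        # Remove the first entry with a matching key from its chain.
        chain = self.table[self._h(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                del chain[i]
                return True
        return False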
5
Analysis of Chaining
Assume simple uniform hashing: each key is equally likely to be hashed to any slot of the table.
Given n keys and m slots in the table, the load factor α = n/m = average number of keys per slot.
Average cost of an unsuccessful search for a key is Θ(1 + α) (Theorem 11.1).
Average cost of a successful search is Θ(2 + α/2) = Θ(1 + α) (Theorem 11.2).
If the number of keys n is proportional to the number of slots in the table, then α = n/m = O(1).
–The expected cost of searching is constant if α is constant.
6
Hash Functions: The Division Method
h(k) = k mod m
–In words: hash k into a table with m slots by using the remainder of k divided by m as the slot.
–Example: m = 31 and k = 78 => h(k) = 78 mod 31 = 16.
Advantage: fast.
Disadvantage: the value of m is critical.
–Bad if the keys bear a relation to m.
–Or if the hash does not depend on all bits of k.
Pick m = a prime number not too close to a power of 2 (or 10).
7
Hash Functions: The Multiplication Method
For a constant A, 0 < A < 1:
h(k) = ⌊m (kA mod 1)⌋ = ⌊m (kA − ⌊kA⌋)⌋, where kA mod 1 is the fractional part of kA.
Advantage: the value of m is not critical.
Disadvantage: relatively slower.
Choose m = 2^p for easier implementation.
Choose A not too close to 0 or 1.
Knuth: a good choice is A = (√5 − 1)/2.
Example: m = 1024, k = 123, A ≈ 0.6180339887…
h(k) = ⌊1024 · (123 · 0.6180339887 mod 1)⌋ = ⌊1024 · 0.018169…⌋ = 18.
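For illustration only, here is a small Python sketch of the division and multiplication methods exactly as defined above; the function names and default parameters are assumptions made for the example.

import math

def hash_division(k, m=31):
    """Division method: the slot is the remainder of k divided by m."""
    return k % m

def hash_multiplication(k, m=1024, A=(math.sqrt(5) - 1) / 2):
    """Multiplication method: scale the fractional part of k*A by m and take the floor."""
    frac = (k * A) % 1.0          # fractional part of kA
    return int(m * frac)          # floor(m * (kA mod 1))

# Matching the slide's examples: hash_division(78, 31) == 16 and
# hash_multiplication(123, 1024) == 18.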
8
A Universal Hash Function
Choose a prime number p that is larger than all possible keys.
Choose a table size m ≥ n.
Randomly choose two integers a, b such that 1 ≤ a ≤ p − 1 and 0 ≤ b ≤ p − 1.
h_{a,b}(k) = ((ak + b) mod p) mod m
Example: p = 17, m = 6: h_{3,4}(8) = ((3·8 + 4) mod 17) mod 6 = 11 mod 6 = 5.
With a randomly chosen pair of parameters a, b, the chance of a collision between distinct keys x and y is at most 1/m.
Expected search time for any input is Θ(1).
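A minimal Python sketch of picking a random member of this universal family; the helper name make_universal_hash is made up, and integer keys smaller than p are assumed.

import random

def make_universal_hash(p, m):
    """Pick a random member h_{a,b} of the universal family described above.
    Assumes integer keys k with 0 <= k < p, where p is prime."""
    a = random.randint(1, p - 1)
    b = random.randint(0, p - 1)
    return lambda k: ((a * k + b) % p) % m

# With the slide's fixed parameters a = 3, b = 4, p = 17, m = 6:
# ((3 * 8 + 4) % 17) % 6 == 5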
9
Today Binary search trees Red-black trees
10
Binary Search Trees Data structures that can support dynamic set operations. –Search, Minimum, Maximum, Predecessor, Successor, Insert, and Delete. Can be used to build –Dictionaries. –Priority Queues. Basic operations take time proportional to the height of the tree – O(h).
11
BST – Representation
Represented by a linked data structure of nodes. root[T] points to the root of tree T.
Each node contains the fields:
–key
–left – pointer to the left child: root of the left subtree (may be NIL).
–right – pointer to the right child: root of the right subtree (may be NIL).
–p – pointer to the parent; p[root[T]] = NIL (optional).
–satellite data
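The following Python node sketch (the class name BSTNode is made up) mirrors the fields listed above and is reused by the later BST sketches; None plays the role of NIL.

class BSTNode:
    """Minimal BST node sketch mirroring the fields on the slide."""

    def __init__(self, key, data=None):
        self.key = key        # search key
        self.data = data      # satellite data
        self.left = None      # left child (root of the left subtree); None stands for NIL
        self.right = None     # right child (root of the right subtree)
        self.p = None         # parent pointer; None for the root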
12
Binary Search Tree Property
Stored keys must satisfy the binary search tree property:
–If y is in the left subtree of x, then key[y] ≤ key[x].
–If y is in the right subtree of x, then key[y] ≥ key[x].
[Example tree used on the following slides: root 56 with children 26 and 200; 26 has children 18 and 28; 18 has children 12 and 24; 28 has left child 27; 200 has children 190 and 213.]
13
Inorder Traversal
Inorder-Tree-Walk(x)
1. if x ≠ NIL
2.   then Inorder-Tree-Walk(left[x])
3.        print key[x]
4.        Inorder-Tree-Walk(right[x])
How long does the walk take? Θ(n).
The binary-search-tree property allows the keys of a binary search tree to be printed recursively in (monotonically increasing) sorted order.
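A direct Python transcription of the walk, assuming the BSTNode sketch above:

def inorder_tree_walk(x):
    """Print the keys of the subtree rooted at x in sorted order (Theta(n) time)."""
    if x is not None:                 # None plays the role of NIL
        inorder_tree_walk(x.left)
        print(x.key)
        inorder_tree_walk(x.right)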
14
Tree Search
Tree-Search(x, k)
1. if x = NIL or k = key[x]
2.   then return x
3. if k < key[x]
4.   then return Tree-Search(left[x], k)
5.   else return Tree-Search(right[x], k)
Running time: O(h).
Example: search for 27 in the example tree.
15
Iterative Tree Search
Iterative-Tree-Search(x, k)
1. while x ≠ NIL and k ≠ key[x]
2.   do if k < key[x]
3.        then x ← left[x]
4.        else x ← right[x]
5. return x
The iterative tree search is more efficient on most computers. The recursive tree search is more straightforward.
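Both searches in Python, again assuming the BSTNode sketch (None as NIL):

def tree_search(x, k):
    """Recursive search from node x for key k; returns the node or None."""
    if x is None or k == x.key:
        return x
    if k < x.key:
        return tree_search(x.left, k)
    return tree_search(x.right, k)

def iterative_tree_search(x, k):
    """Iterative version; typically faster in practice."""
    while x is not None and k != x.key:
        x = x.left if k < x.key else x.right
    return x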
16
Finding Min & Max
Tree-Minimum(x)
1. while left[x] ≠ NIL
2.   do x ← left[x]
3. return x
Tree-Maximum(x)
1. while right[x] ≠ NIL
2.   do x ← right[x]
3. return x
Q: How long do they take? O(h), proportional to the height of the tree.
The binary-search-tree property guarantees that:
» The minimum is located at the left-most node.
» The maximum is located at the right-most node.
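The same two procedures in Python, following the pseudocode above:

def tree_minimum(x):
    """Follow left pointers to the left-most (smallest) node."""
    while x.left is not None:
        x = x.left
    return x

def tree_maximum(x):
    """Follow right pointers to the right-most (largest) node."""
    while x.right is not None:
        x = x.right
    return x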
17
Predecessor and Successor
The successor of node x is the node y such that key[y] is the smallest key greater than key[x]. The successor of the node with the largest key is NIL.
The search consists of two cases:
–If node x has a non-empty right subtree, then x's successor is the minimum in the right subtree of x.
–If node x has an empty right subtree, then as long as we move up the tree through right children (i.e., the current node is a right child of its parent), we are visiting smaller keys. x's successor y is the node of which x is the predecessor (x is the maximum in y's left subtree). In other words, x's successor y is the lowest ancestor of x whose left child is also an ancestor of x.
18
Pseudo-code for Successor
Tree-Successor(x)
1. if right[x] ≠ NIL
2.   then return Tree-Minimum(right[x])
3. y ← p[x]
4. while y ≠ NIL and x = right[y]
5.   do x ← y
6.      y ← p[y]
7. return y
Code for the predecessor is symmetric. Running time: O(h).
Example: the successor of 56 is 190.
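A Python sketch of the successor computation, assuming the BSTNode sketch and the tree_minimum helper above:

def tree_successor(x):
    """Return the node with the smallest key greater than x.key, or None."""
    if x.right is not None:
        return tree_minimum(x.right)     # minimum of the right subtree
    y = x.p
    while y is not None and x is y.right:
        x = y                            # climb while x is a right child
        y = y.p
    return y                             # lowest ancestor whose left child is also an ancestor of x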
19
Pseudo-code for Successor (continued)
Example: the successor of 28 is 56, the lowest ancestor of 28 whose left child is also an ancestor of 28.
20
BST Insertion – Pseudocode
Tree-Insert(T, z)
1. y ← NIL
2. x ← root[T]
3. while x ≠ NIL
4.   do y ← x
5.      if key[z] < key[x]
6.        then x ← left[x]
7.        else x ← right[x]
8. p[z] ← y
9. if y = NIL
10.  then root[T] ← z
11.  else if key[z] < key[y]
12.    then left[y] ← z
13.    else right[y] ← z
Changes the dynamic set represented by the BST while ensuring that the binary-search-tree property still holds after the change. Similar to Tree-Search: walk down to a NIL and insert z in its place.
Example: insert 195 (it becomes the right child of 190). Running time: O(h).
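A Python sketch of the insertion, assuming the BSTNode sketch above; unlike the pseudocode, it takes and returns the root node rather than mutating a tree object T (a simplification for the example).

def tree_insert(root, z):
    """Insert node z into the BST rooted at root; returns the (possibly new) root."""
    y = None
    x = root
    while x is not None:          # walk down, remembering the parent y
        y = x
        x = x.left if z.key < x.key else x.right
    z.p = y
    if y is None:                 # tree was empty
        return z
    if z.key < y.key:
        y.left = z
    else:
        y.right = z
    return root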
21
Tree-Delete(T, x)
If x has no children → case 0: just remove x.
If x has one child → case 1: make p[x] point to that child.
If x has two children (subtrees) → case 2: swap x with its successor, then perform case 0 or case 1 to delete it.
TOTAL: O(h) time to delete a node.
22
Case 0: x has no children. E.g. delete 190 (a leaf of the example tree): simply remove it.
23
Case 1: x has one child. E.g. delete 28: its only child, 27, is linked to 28's parent in its place.
24
Case 2: x has two children. E.g. delete 26, which has children 18 and 28.
25
Case 2 (continued): swap 26 with its successor 27. The key 27 now sits where 26 was, and 26 sits at 27's old position as a leaf.
26
Case 2 (continued): now delete 26 as in case 0, since it is a leaf.
27
Case 2, second example: delete 26 from a variant of the tree in which 26's right child is 33, 33's left child is 27, and 27 has a right child 28.
28
Case 2 (continued): swap 26 with its successor 27 (the minimum of 26's right subtree); 26 now occupies 27's old position and still has the right child 28.
29
Case 2 (continued): now delete 26 as in case 1, since it has exactly one child, 28.
30
Case 2 (continued): result of case 1: 27 has replaced 26 near the root, and 28 has been spliced up into 27's old position.
31
Correctness of Tree-Delete
How do we know case 2 goes to case 0 or case 1 instead of back to case 2?
–Because when x has two children, its successor is the minimum of its right subtree, and that successor has no left child (hence 0 or 1 children).
Equivalently, we could swap with the predecessor instead of the successor. It might be good to alternate between the two to avoid creating a lopsided tree.
32
Deletion – Pseudocode
Tree-Delete(T, z)
/* Determine which node to splice out: either z or z's successor. */
1. if left[z] = NIL or right[z] = NIL
2.   then y ← z                      // case 0 or 1
3.   else y ← Tree-Successor(z)      // case 2
/* Set x to a non-NIL child of y, or to NIL if y has no children. */
4. if left[y] ≠ NIL
5.   then x ← left[y]
6.   else x ← right[y]
/* y is removed from the tree by manipulating the pointers of p[y] and x. */
7. if x ≠ NIL
8.   then p[x] ← p[y]
/* Continued on next slide */
33
Deletion – Pseudocode
Tree-Delete(T, z) (contd. from previous slide)
9.  if p[y] = NIL
10.   then root[T] ← x
11.   else if y = left[p[y]]
12.     then left[p[y]] ← x
13.     else right[p[y]] ← x
/* If z's successor was spliced out, copy its data into z. */
14. if y ≠ z
15.   then key[z] ← key[y]
16.        copy y's satellite data into z
17. return y
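Putting both slides together, a Python sketch of the splice-out deletion, assuming the BSTNode and tree_successor sketches above and the same root-in/root-out convention as the insertion sketch:

def tree_delete(root, z):
    """Delete node z from the BST rooted at root; returns the (possibly new) root."""
    # Determine which node y to splice out: z itself (0 or 1 children) or z's successor.
    if z.left is None or z.right is None:
        y = z
    else:
        y = tree_successor(z)          # has no left child, so at most one child
    # x is y's only child, or None.
    x = y.left if y.left is not None else y.right
    if x is not None:
        x.p = y.p
    if y.p is None:                    # y was the root
        root = x
    elif y is y.p.left:
        y.p.left = x
    else:
        y.p.right = x
    # If we spliced out z's successor, copy its key and satellite data into z.
    if y is not z:
        z.key = y.key
        z.data = y.data
    return root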
34
Querying a Binary Search Tree
All dynamic-set search operations can be supported in O(h) time.
h = Θ(lg n) for a balanced binary tree (and for an average tree built by inserting nodes in random order).
h = Θ(n) for an unbalanced tree that resembles a linear chain of n nodes, the worst case.
35
Red-black trees: Overview Red-black trees are a variation of binary search trees to ensure that the tree is balanced. –Height is O(lg n), where n is the number of nodes. Operations take O(lg n) time in the worst case.
36
Red-black Tree
A binary search tree + 1 bit per node: the attribute color, which is either red or black.
All other attributes of BSTs are inherited: key, left, right, and p.
All empty trees (leaves) are colored black.
–We use a single sentinel, nil[T], for all the leaves of a red-black tree T, with color[nil[T]] = black.
–The root's parent is also nil[T].
37
Red-black Tree – Example
[Figure: an example red-black tree with keys 17, 26, 30, 38, 41, 47, 50; every leaf is the sentinel nil[T].]
Remember: every internal node has two children, even though nil leaves are not usually shown.
38
Red-black Properties 1.Every node is either red or black. 2.The root is black. 3.Every leaf (nil) is black. 4.If a node is red, then both its children are black. 5.For each node, all paths from the node to descendant leaves contain the same number of black nodes.
39
Height of a Red-black Tree
Height of a node:
–The number of edges in a longest path to a leaf.
Black-height of a node x, bh(x):
–The number of black nodes (including nil[T]) on the path from x down to a leaf, not counting x itself.
The black-height of a red-black tree is the black-height of its root.
–By property 5, black-height is well defined.
40
Height of a Red-black Tree
Example: [Figure: the example tree from the previous slide, annotated with h and bh at every node; e.g. the root has h = 4 and bh = 2.]
Height of a node: h(x) = number of edges in a longest path to a leaf.
Black-height of a node: bh(x) = number of black nodes on the path from x to a leaf, not counting x.
How are they related? bh(x) ≤ h(x) ≤ 2 bh(x).
41
Height of a red-black tree
Theorem. A red-black tree with n keys has height h ≤ 2 lg(n + 1).
Proof. (The book uses induction. Read carefully.)
INTUITION: Merge red nodes into their black parents.
46
Height of a red-black tree
INTUITION: Merge red nodes into their black parents. This process produces a tree in which each node has 2, 3, or 4 children, and this 2-3-4 tree has uniform leaf depth h′.
47
Proof (continued)
We have h′ ≥ h/2, since at most half the nodes on any root-to-leaf path are red.
The number of leaves in each tree is n + 1, and a 2-3-4 tree of uniform leaf depth h′ has at least 2^h′ leaves, so n + 1 ≥ 2^h′, i.e. n ≥ 2^h′ − 1.
Hence h′ ≤ lg(n + 1), and combining this with h′ ≥ h/2 gives h ≤ 2 lg(n + 1).
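The chain of inequalities from the proof, restated compactly (no new content):

\[
  h' \ge \frac{h}{2}, \qquad n + 1 \ge 2^{h'} \;\Rightarrow\; h' \le \lg(n+1) \;\Rightarrow\; h \le 2h' \le 2\lg(n+1).
\]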
48
Operations on RB Trees All operations can be performed in O(lg n) time. The query operations, which don’t modify the tree, are performed in exactly the same way as they are in BSTs. Insertion and Deletion are not straightforward. Why?
49
Rotations
[Figure: Left-Rotate(T, x) turns node x with right child y into y with left child x; Right-Rotate(T, y) undoes it.]
50
Rotations
Rotations are the basic tree-restructuring operation for almost all balanced search trees.
A rotation takes a red-black tree and a node, changes pointers to change the local structure, and does not violate the binary-search-tree property.
Left rotation and right rotation are inverses.
[Figure: as on the previous slide.]
51
Left Rotation – Pseudo-code
Left-Rotate(T, x)
1. y ← right[x]                 // Set y.
2. right[x] ← left[y]           // Turn y's left subtree into x's right subtree.
3. if left[y] ≠ nil[T]
4.   then p[left[y]] ← x
5. p[y] ← p[x]                  // Link x's parent to y.
6. if p[x] = nil[T]
7.   then root[T] ← y
8.   else if x = left[p[x]]
9.     then left[p[x]] ← y
10.    else right[p[x]] ← y
11. left[y] ← x                 // Put x on y's left.
12. p[x] ← y
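A Python sketch of the left rotation, assuming a tree object T with root and nil attributes (the nil sentinel, unlike the earlier BST sketches which used None) and nodes with left/right/p pointers; right_rotate would be the mirror image.

def left_rotate(T, x):
    """Rotate left around x; x.right must not be the nil sentinel."""
    y = x.right
    x.right = y.left              # turn y's left subtree into x's right subtree
    if y.left is not T.nil:
        y.left.p = x
    y.p = x.p                     # link x's parent to y
    if x.p is T.nil:
        T.root = y
    elif x is x.p.left:
        x.p.left = y
    else:
        x.p.right = y
    y.left = x                    # put x on y's left
    x.p = y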
52
Rotation
The pseudo-code for Left-Rotate assumes that
–right[x] ≠ nil[T], and
–the root's parent is nil[T].
A left rotation on x makes x the left child of y and turns y's old left subtree into x's right subtree.
The pseudocode for Right-Rotate is symmetric: exchange left and right everywhere.
Time: O(1) for both Left-Rotate and Right-Rotate, since only a constant number of pointers are modified.
53
Reminder: Red-black Properties 1.Every node is either red or black. 2.The root is black. 3.Every leaf (nil) is black. 4.If a node is red, then both its children are black. 5.For each node, all paths from the node to descendant leaves contain the same number of black nodes.
54
Insertion in RB Trees
Insertion must preserve all red-black properties.
Should an inserted node be colored red? Black?
Basic steps:
–Use Tree-Insert from BSTs (slightly modified) to insert a node x into T: procedure RB-Insert(x).
–Color the node x red.
–Fix up the modified tree by re-coloring nodes and performing rotations, to restore the red-black properties: procedure RB-Insert-Fixup.
55
Insertion
RB-Insert(T, z)
1. y ← nil[T]
2. x ← root[T]
3. while x ≠ nil[T]
4.   do y ← x
5.      if key[z] < key[x]
6.        then x ← left[x]
7.        else x ← right[x]
8. p[z] ← y
9. if y = nil[T]
10.  then root[T] ← z
11.  else if key[z] < key[y]
12.    then left[y] ← z
13.    else right[y] ← z
14. left[z] ← nil[T]
15. right[z] ← nil[T]
16. color[z] ← RED
17. RB-Insert-Fixup(T, z)
How does it differ from the Tree-Insert procedure of BSTs?
Which of the RB properties might be violated?
Fix the violations by calling RB-Insert-Fixup.
56
Insertion – Fixup Problem: we may have one pair of consecutive reds where we did the insertion. Solution: rotate it up the tree and away… Three cases have to be handled…
57
Case 1 – uncle y is red
p[p[z]] (z's grandparent) must be black, since z and p[z] are both red and there are no other violations of property 4.
Make p[z] and y black: now z and p[z] are not both red. But property 5 might now be violated.
Make p[p[z]] red: this restores property 5.
What's the new problem now? p[p[z]] and its parent may both be red, so the next iteration has p[p[z]] as the new z (i.e., z moves up 2 levels). When to stop?
[Figure: z is a right child and p[z] is a left child here. Similar if z is a left child or if p[z] is a right child.]
58
Case 2 – y is black, z is a right child
Left-rotate around p[z]; p[z] and z switch roles. Now z is a left child, and both z and p[z] are red.
This takes us immediately to case 3.
Similar if z is a left child and p[z] is a right child.
[Figure: the rotation turns the case-2 configuration into the case-3 configuration.]
59
Case 3 – y is black, z is a left child
Make p[z] black and p[p[z]] red, then right-rotate on p[p[z]].
This ensures property 4 is maintained: we no longer have two reds in a row.
p[z] is now black, so there are no more iterations.
Similar if both z and p[z] are right children.
[Figure: recolor, then Right-Rotate(T, p[p[z]]).]
60
Insertion – Fixup
RB-Insert-Fixup(T, z)
1. while color[p[z]] = RED
2.   do if p[z] = left[p[p[z]]]            // p[z] is a left child
3.     then y ← right[p[p[z]]]             // y: uncle of z
4.          if color[y] = RED
5.            then color[p[z]] ← BLACK     // Case 1
6.                 color[y] ← BLACK        // Case 1
7.                 color[p[p[z]]] ← RED    // Case 1
8.                 z ← p[p[z]]             // Case 1
61
Insertion – Fixup
RB-Insert-Fixup(T, z) (contd.)
9.            else if z = right[p[z]]      // color[y] ≠ RED
10.                then z ← p[z]           // Case 2
11.                     Left-Rotate(T, z)  // Case 2
12.              color[p[z]] ← BLACK       // Case 3
13.              color[p[p[z]]] ← RED      // Case 3
14.              Right-Rotate(T, p[p[z]])  // Case 3
15.    else (if p[z] = right[p[p[z]]])
16.       (same as lines 3-14 with "right" and "left" exchanged)
17. color[root[T]] ← BLACK
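A Python sketch of the fix-up loop, assuming the same tree/node conventions as the left_rotate sketch above and a symmetric right_rotate helper (not shown); the string color constants are an assumption for this example.

RED, BLACK = "RED", "BLACK"

def rb_insert_fixup(T, z):
    """Restore the red-black properties after inserting the red node z.
    Assumes T has `root` and a black `nil` sentinel, and nodes have
    color / left / right / p attributes (as in the slides' pseudocode)."""
    while z.p.color == RED:
        if z.p is z.p.p.left:                 # parent is a left child
            y = z.p.p.right                   # y is z's uncle
            if y.color == RED:                # Case 1: recolor, z moves up two levels
                z.p.color = BLACK
                y.color = BLACK
                z.p.p.color = RED
                z = z.p.p
            else:
                if z is z.p.right:            # Case 2: rotate to fall into Case 3
                    z = z.p
                    left_rotate(T, z)
                z.p.color = BLACK             # Case 3: recolor, rotate the grandparent
                z.p.p.color = RED
                right_rotate(T, z.p.p)
        else:                                 # symmetric: parent is a right child
            y = z.p.p.left
            if y.color == RED:
                z.p.color = BLACK
                y.color = BLACK
                z.p.p.color = RED
                z = z.p.p
            else:
                if z is z.p.left:
                    z = z.p
                    right_rotate(T, z)
                z.p.color = BLACK
                z.p.p.color = RED
                left_rotate(T, z.p.p)
    T.root.color = BLACK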
62
Algorithm Analysis
O(lg n) time to get through RB-Insert up to the call of RB-Insert-Fixup.
Within RB-Insert-Fixup:
–Each iteration takes O(1) time.
–Each iteration but the last moves z up 2 levels.
–O(lg n) levels ⇒ O(lg n) time.
–Thus, insertion into a red-black tree takes O(lg n) time.
–Note: there are at most 2 rotations overall.
63
RB-Insert Example
IDEA: Insert x in the tree and color x red. Only red-black property 4 might be violated. Move the violation up the tree by recoloring until it can be fixed with rotations and recoloring.
Example: Insert x = 15 into a red-black tree containing the keys 3, 7, 8, 10, 11, 18, 22, 26; the new red node 15 ends up with a red parent, violating property 4.
64
Insertion into a red-black tree
Example: Insert x = 15. Recolor, move the violation up the tree (Case 1: the uncle is red, so the parent and uncle are recolored black, the grandparent red, and z moves up two levels).
66
Insertion into a red-black tree
Example: Insert x = 15. Recolor, move the violation up the tree. Right-Rotate(18) (Case 2).
68
Insertion into a red-black tree
Example: Insert x = 15. Recolor, move the violation up the tree. Right-Rotate(18). Left-Rotate(7) (Case 3).
71
Insertion into a red-black tree
Example: Insert x = 15. Recolor, move the violation up the tree. Right-Rotate(18). Left-Rotate(7) and recolor. Done!
72
Deletion
Deletion, like insertion, should preserve all the RB properties.
Which properties may be violated depends on the color of the deleted node:
–Red – OK. Why? (Removing a red node changes no black-heights and cannot create two reds in a row.)
–Black? Property 5 (black-heights), among others, may be violated.
Steps:
–Do regular BST deletion.
–Fix any violations of RB properties that may result.
–We will skip the details. Read on your own.
73
Analysis O(lg n) time to get through RB-Delete up to the call of RB-Delete-Fixup. Within RB-Delete-Fixup: –O(lg n) time.