Augmenting AVL trees.

How we’ve thought about trees so far
- Good for determining ancestry
- Can be good for quickly finding an element

Enter: the idea of augmenting a tree
Other kinds of uses? Any thoughts?
- Finding a minimum/maximum… (heaps are probably just as good or better)
- Finding an average?
- More complicated things?

Augmenting
We can quickly compute many global properties that seem to need knowledge of the whole tree! Examples:
- size of any sub-tree
- height of any sub-tree
- averages of keys/values in a sub-tree
- min/max of keys/values in any sub-tree, …
In general, we can quickly compute any function f(u) as long as f(u) depends only on f(u.left) and f(u.right) (plus the data stored at u itself)!
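As a concrete instance of this pattern, here is a minimal Python sketch (the Node class and its field names are illustrative, not from the slides) that maintains a subtree-size field computed only from the children's stored values:

```python
# Minimal sketch: an augmented field (subtree size) computed only from
# the children's fields. Node and its field names are illustrative.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right
        # f(u) depends only on f(u.left) and f(u.right):
        self.size = 1 + size(left) + size(right)

def size(u):
    """Size of the subtree rooted at u; an empty subtree has size 0."""
    return u.size if u is not None else 0
```

Building a tree bottom-up, e.g. Node(2, Node(1), Node(3)), yields size 3 at the root without ever walking the whole tree.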

Augmenting an AVL tree
- We can augment any kind of tree, but only balanced trees are guaranteed to be fast.
- After augmenting an AVL tree to compute f(u), we can still do all operations in O(lg n)!

Simple first example
We are going to do one simple example; then, you will help with a harder one!
Problem: augment an AVL tree so we can do:
- Insert(key): add key in O(lg n)
- Delete(key): remove key in O(lg n)
- Height(node): get the height of the sub-tree rooted at node in O(1)
A regular AVL tree already handles Insert and Delete. How do we get Height in O(1)? Store some extra data at each node… but what?

Can we compute this function quickly?
Function we want to compute: Height(u) = H(u)
If someone gives us H(uL) and H(uR), can we compute H(u)? What formula should we use?
- If u is a leaf, then H(u) = 0
- Else, H(u) = max{H(uL), H(uR)} + 1
(Diagram: node u with subtrees uL and uR of heights H(uL) and H(uR); we want H(u).)
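The recurrence above can be written directly in Python (a sketch; the Node type here is a stand-in with just the two child pointers). Using -1 for an empty subtree makes a leaf come out to height 0, matching the slide:

```python
from collections import namedtuple

# Stand-in node with just the two child pointers.
Node = namedtuple("Node", ["left", "right"])

def height(u):
    """Height per the slide's recurrence: a leaf has height 0."""
    if u is None:
        return -1  # empty subtree; a leaf then gets max(-1, -1) + 1 = 0
    return 1 + max(height(u.left), height(u.right))
```

Note that this recomputes the height in O(n); the whole point of augmenting is to store this value at each node so it is available in O(1).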

Augmenting an AVL tree to compute H(u)
Each node u contains:
- key: the key (the usual stuff)
- left, right: child pointers (the usual stuff)
- h: height of the sub-tree rooted at u (the secret sauce!)
(Diagram: a sequence of operations — Insert(d), Insert(a), Insert(e), Insert(b), Insert(c) — showing each node labeled with its key and its stored height.)

Algorithm idea
From the last slide, we develop an algorithm.
Insert(key):
1. BST search for where to put key.
2. Insert key into place as in a regular AVL tree.
3. Fix balance factors and rotate as you would in AVL insert, but fix heights at the same time.
   - Remember to fix heights all the way to the root; don’t stop before reaching it!
   - When you rotate, remember to fix the heights of all nodes involved, just like you fix balance factors!
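Step 3’s height fix can be sketched as a small O(1) helper, called on every node along the path back to the root and on every node involved in a rotation (the field names key/left/right/h are illustrative):

```python
class Node:
    """Illustrative AVL node with a stored height h."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.h = 0  # a freshly inserted node is a leaf

def fix_height(u):
    """Recompute u.h from the children's stored heights in O(1)."""
    hl = u.left.h if u.left is not None else -1
    hr = u.right.h if u.right is not None else -1
    u.h = 1 + max(hl, hr)
```

After linking a new leaf, you would call fix_height on each ancestor while walking back up to the root (and on every node touched by a rotation).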

Harder problem: scheduling conflicts
Your calendar contains a bunch of time intervals [lo, hi] during which you are busy. We want to be able to quickly tell whether a new booking conflicts with an existing one.

Breaking the problem down
You must design a data structure D that efficiently supports:
- Insert(D, x): insert interval x into D
- Delete(D, x): delete interval x from D
- Search(D, x): if D contains an interval that overlaps with x, return any such interval; otherwise, return null (the hard part!)
All operations must run in O(lg n).

Figuring out the data structure - 1
This is an iterative process; it is HARD to get right the first time!
We need a way to insert intervals into the tree: use the low endpoint of each interval as the key.
(Example tree: the intervals [8, 16], [26, 36], [29, 36], [30, 34], [48, 52], [60, 80], stored in a BST keyed by low endpoint with [30, 34] at the root.)

Figuring out the data structure - 2
- What function do we want to compute? Does an interval x intersect any interval in the tree?
- What info should we store at each node u? Mhi(u) = the maximum high endpoint of any node in the subtree rooted at u.
- How can we use the info stored at each node to compute the desired function (by looking at a small number of nodes)? Start by computing whether an interval x intersects any interval in a subtree.
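The intersection test used throughout is the standard one for closed intervals (the helper name is ours): two intervals overlap exactly when each one starts no later than the other ends.

```python
def overlaps(lo1, hi1, lo2, hi2):
    """True iff closed intervals [lo1, hi1] and [lo2, hi2] intersect."""
    return lo1 <= hi2 and lo2 <= hi1
```

For example, [1, 5] and [4, 9] overlap, while [1, 2] and [3, 4] do not; touching endpoints such as [1, 3] and [3, 5] count as overlapping for closed intervals.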

Algorithm for Search within a subtree
Returns an interval in the subtree rooted at u that intersects [lo, hi].
Search(lo, hi, u):
  if u is null then return null

Algorithm for Search within a subtree
Returns an interval in the subtree rooted at u that intersects [lo, hi].
Search(lo, hi, u):
  if u is null then return null
  if [lo, hi] intersects [lo(u), hi(u)] then return [lo(u), hi(u)]

Algorithm for Search within a subtree
Returns an interval in the subtree rooted at u that intersects [lo, hi].
Search(lo, hi, u):
  if u is null then return null
  if [lo, hi] intersects [lo(u), hi(u)] then return [lo(u), hi(u)]
  else (no intersection):
    if lo < lo(u) then return Search(lo, hi, left(u))
(Diagram: since there is no intersection and lo < lo(u), we must have hi < lo(u), so every node v in the right subtree has lo(v) > hi — nothing on the right can intersect.)

Algorithm for Search within a subtree
Returns an interval in the subtree rooted at u that intersects [lo, hi].
Search(lo, hi, u):
  if u is null then return null
  if [lo, hi] intersects [lo(u), hi(u)] then return [lo(u), hi(u)]
  else (no intersection):
    if lo < lo(u) then return Search(lo, hi, left(u))
    else (lo ≥ lo(u)):
      if lo > Mhi(left(u)) then return Search(lo, hi, right(u))
(Diagram: since lo > Mhi(left(u)), every node v in the left subtree has hi(v) ≤ Mhi(left(u)) < lo — nothing on the left can intersect.)

Algorithm for Search within a subtree
Returns an interval in the subtree rooted at u that intersects [lo, hi].
Search(lo, hi, u):
  if u is null then return null
  if [lo, hi] intersects [lo(u), hi(u)] then return [lo(u), hi(u)]
  else (no intersection):
    if lo < lo(u) then return Search(lo, hi, left(u))
    else (lo ≥ lo(u)):
      if lo > Mhi(left(u)) then return Search(lo, hi, right(u))
      else (lo ≤ Mhi(left(u))) return Search(lo, hi, left(u))
(Diagram: some node v in the left subtree has hi(v) = Mhi(left(u)) ≥ lo, and since v is in the left subtree, lo(v) ≤ lo(u) ≤ lo. So v itself intersects [lo, hi], and searching left is guaranteed to succeed.)

Final algorithm for Search
Search(lo, hi, u):
  if u is null then return null
  if [lo, hi] intersects [lo(u), hi(u)] then return [lo(u), hi(u)]
  else (no intersection):
    if lo < lo(u) then return Search(lo, hi, left(u))
    else (lo ≥ lo(u)):
      if lo > Mhi(left(u)) then return Search(lo, hi, right(u))
      else (lo ≤ Mhi(left(u))) return Search(lo, hi, left(u))

Search(D, x=[lo, hi]):
  return Search(lo, hi, root(D))
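The final Search translates almost line-for-line into Python. This is a sketch under assumed field names (lo, hi, mhi, left, right standing in for the slides’ lo(u), hi(u), Mhi(u)); the Node constructor computes mhi bottom-up for demonstration only — in the real structure, Insert and Delete maintain it. Note the extra guard for a null left child, which the pseudocode glosses over.

```python
class Node:
    """Interval-tree node keyed by lo; mhi = max high endpoint in subtree."""
    def __init__(self, lo, hi, left=None, right=None):
        self.lo, self.hi = lo, hi
        self.left, self.right = left, right
        self.mhi = max(hi,
                       left.mhi if left is not None else hi,
                       right.mhi if right is not None else hi)

def search(lo, hi, u):
    """Return some interval in u's subtree intersecting [lo, hi], else None."""
    if u is None:
        return None
    if lo <= u.hi and u.lo <= hi:           # [lo,hi] intersects [lo(u),hi(u)]
        return (u.lo, u.hi)
    if lo < u.lo:                           # every right node starts past hi
        return search(lo, hi, u.left)
    if u.left is None or lo > u.left.mhi:   # nothing on the left ends >= lo
        return search(lo, hi, u.right)
    return search(lo, hi, u.left)           # some left interval ends >= lo
```

Building the example tree from the earlier slide ([30, 34] at the root), search(35, 40, root) finds [26, 36], while search(17, 25, root) correctly reports no conflict.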

Algorithms for Insert and Delete
Insert(D, x=[lo, hi]):
- Do a regular AVL insertion of key lo, also storing hi.
- Set Mhi of the new node to hi.
- Fix balance factors and perform rotations as usual, but also update Mhi(u) whenever you update the balance factor of a node u.
- Update Mhi(u) for all ancestors, and for every node involved in a rotation, using the formula: Mhi(u) = max{hi(u), Mhi(left(u)), Mhi(right(u))}.
Delete(D, x=[lo, hi]): similar to Insert.
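The Mhi maintenance formula is the same kind of O(1) bottom-up fix as for heights. A sketch (Node and its field names are illustrative), to be called on each ancestor and on each node involved in a rotation:

```python
class Node:
    """Illustrative node carrying hi, mhi, and child pointers."""
    def __init__(self, hi, left=None, right=None):
        self.hi = hi
        self.mhi = hi  # a fresh leaf's subtree max is its own high endpoint
        self.left, self.right = left, right

def fix_mhi(u):
    """Recompute Mhi(u) = max{hi(u), Mhi(left(u)), Mhi(right(u))} in O(1)."""
    u.mhi = u.hi
    if u.left is not None:
        u.mhi = max(u.mhi, u.left.mhi)
    if u.right is not None:
        u.mhi = max(u.mhi, u.right.mhi)
```

Because fix_mhi reads only the children’s stored mhi values, calling it bottom-up along the insertion path keeps every ancestor correct.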

Why O(lg n) time?
Insert/Delete:
- The normal AVL operation is O(lg n).
- PLUS: update Mhi(u) for each u on the path to the root. The length of this path is at most the tree height, so this is O(lg n) in an AVL tree.
- PLUS: update Mhi(u) for each node involved in a rotation. There are at most O(lg n) rotations (at most one per node on the path, whose length is at most the tree height), and each rotation involves a constant number of nodes. Therefore, this is a constant times O(lg n), which is O(lg n).
Search:
- Constant work plus a single recursive call on a child.
- A single recursive call per level means O(tree height) = O(lg n).