Introduction to Analysis of Algorithms, CAS CS 330, Lecture 16. Shang-Hua Teng. Thanks to Charles E. Leiserson and Silvio Micali of MIT for these slides.

Data-structure augmentation

Methodology (e.g., order-statistics trees):
1. Choose an underlying data structure (AVL/red-black trees).
2. Determine additional information to be stored in the data structure (subtree sizes).
3. Make sure this information can be maintained for modifying operations (INSERT, DELETE: rotations).
4. Develop new dynamic-set operations that use the information (OS-SELECT, ...).
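Not part of the original slides, but the four steps are easy to see in code. Below is a minimal sketch of an order-statistics tree in Python, using a plain (unbalanced) BST instead of a red-black tree for brevity; the names Node, insert, and os_select are illustrative.

```python
class Node:
    """BST node augmented with the size of its subtree (step 2)."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.size = 1  # number of keys in the subtree rooted here

def size(x):
    return x.size if x is not None else 0

def insert(root, key):
    """Ordinary BST insertion; size fields are fixed on the way down (step 3)."""
    if root is None:
        return Node(key)
    root.size += 1
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def os_select(x, i):
    """New operation (step 4): node holding the i-th smallest key, 1-indexed."""
    r = size(x.left) + 1      # rank of x within its own subtree
    if i == r:
        return x
    if i < r:
        return os_select(x.left, i)
    return os_select(x.right, i - r)
```

For example, after inserting 5, 1, 9, 3 into an empty tree, os_select(root, 2) returns the node with key 3, in time proportional to the height of the tree.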

Interval trees

Goal: To maintain a dynamic set of intervals, such as time intervals. An interval i is stored by its endpoints, low[i] and high[i] (e.g., low[i] = 7, high[i] = 10).

Query: For a given query interval i, find an interval in the set that overlaps i.
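As a small aside (not in the original slides), the overlap test used by the query can be written as below, with an interval represented as a (low, high) pair of closed endpoints.

```python
def overlaps(a, b):
    """Closed intervals a and b overlap iff each starts no later than the other ends."""
    return a[0] <= b[1] and b[0] <= a[1]

# e.g. overlaps((7, 10), (9, 12)) is True, while overlaps((7, 10), (11, 12)) is False
```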

Following the methodology

1. Choose an underlying data structure: a search tree keyed on the low (left) endpoint.
2. Determine additional information to be stored in the data structure: store in each node x the largest value m[x] in the subtree rooted at x, as well as the interval int[x] corresponding to the key.

[Figure: each node carries the two fields int and m.]

Example interval tree

[Figure: a search tree keyed on low endpoints. Root [17,19] with m = 23; left child [5,11] with m = 18, right child [22,23] with m = 23; [5,11] has left child [4,8] with m = 8 and right child [15,18] with m = 18; [15,18] has left child [7,10] with m = 10.]

m[x] = max( high[int[x]], m[left[x]], m[right[x]] )
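A hedged sketch (not from the slides) of how the augmented node and the m rule above might look in Python; IntervalNode and update_m are illustrative names.

```python
class IntervalNode:
    """Interval-tree node: keyed on the low endpoint, augmented with int[x] and m[x]."""
    def __init__(self, low, high):
        self.int = (low, high)   # the interval stored at this node
        self.m = high            # largest high endpoint in this node's subtree
        self.left = None
        self.right = None

def update_m(x):
    """Recompute m[x] = max(high[int[x]], m[left[x]], m[right[x]])."""
    x.m = x.int[1]
    if x.left is not None:
        x.m = max(x.m, x.left.m)
    if x.right is not None:
        x.m = max(x.m, x.right.m)
```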

Modifying operations

3. Verify that this information can be maintained for modifying operations.

INSERT: fix the m's on the way down. [Figure: inserting a new interval updates m at each node along the insertion path.]

Rotations: fixup costs O(1) time per rotation, since only the m fields of the two rotated nodes change, and each can be recomputed from its children's m values. [Figure: example rotation.]

Total INSERT time = O(lg n); DELETE is similar.
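Continuing the sketch above (and still not part of the original slides), insertion can fix m on the way down, and a rotation only needs to recompute m for the two nodes involved; the balancing logic itself is omitted here.

```python
def interval_insert(root, low, high):
    """Insert [low, high] keyed on low; every node on the search path refreshes its m."""
    if root is None:
        return IntervalNode(low, high)
    root.m = max(root.m, high)   # fix m on the way down
    if low < root.int[0]:
        root.left = interval_insert(root.left, low, high)
    else:
        root.right = interval_insert(root.right, low, high)
    return root

def left_rotate(x):
    """O(1) fixup: only x and its old right child y get new children, so only their m change."""
    y = x.right
    x.right, y.left = y.left, x
    update_m(x)   # x first: it is now y's child
    update_m(y)
    return y
```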

New operations

4. Develop new dynamic-set operations that use the information.

INTERVAL-SEARCH(i)
    x ← root
    while x ≠ NIL and i and int[x] don't overlap
          ⊳ i.e., low[i] > high[int[x]] or low[int[x]] > high[i]
        do if left[x] ≠ NIL and low[i] ≤ m[left[x]]
              then x ← left[x]
              else x ← right[x]
    return x
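A direct Python transcription of INTERVAL-SEARCH (not part of the slides), assuming the IntervalNode layout sketched earlier and a (low, high) tuple for the query i; it returns None when nothing overlaps.

```python
def interval_search(root, i):
    """Follow one root-to-leaf path, steered by m of the left child."""
    x = root
    while x is not None and (i[0] > x.int[1] or x.int[0] > i[1]):  # i, int[x] don't overlap
        if x.left is not None and i[0] <= x.left.m:
            x = x.left
        else:
            x = x.right
    return x  # an overlapping node, or None
```

On the example tree above, interval_search(root, (14, 16)) should return the node storing [15,18], matching the hand trace that follows.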

Example 1: INTERVAL-SEARCH([14,16]), run on the example tree above.

x ← root. [14,16] and [17,19] don't overlap; 14 ≤ 18 = m[left[x]], so x ← left[x].

[14,16] and [5,11] don't overlap; 14 > 8 = m[left[x]], so x ← right[x].

[14,16] and [15,18] overlap, so return [15,18].

Example 2: INTERVAL-SEARCH([12,14])

x ← root. [12,14] and [17,19] don't overlap; 12 ≤ 18 = m[left[x]], so x ← left[x].

[12,14] and [5,11] don't overlap; 12 > 8 = m[left[x]], so x ← right[x].

[12,14] and [15,18] don't overlap; 12 > 10 = m[left[x]], so x ← right[x].

x = NIL, so no interval in the set overlaps [12,14].

Analysis

Time = O(h) = O(lg n), since INTERVAL-SEARCH does constant work at each level as it follows a simple path down the tree.

To list all overlapping intervals (sketched below): search, list, delete, repeat; then insert them all again at the end. Time = O(k lg n), where k is the total number of overlapping intervals. This is an output-sensitive bound; the best algorithm to date achieves O(k + lg n).
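A sketch of the list-all idea (not from the slides). It reuses interval_search and interval_insert from above; interval_delete is a hypothetical helper, a standard BST deletion that also restores the m fields on the affected path.

```python
def list_all_overlapping(root, i):
    """Report every stored interval overlapping i in O(k lg n): find one, delete it, repeat, then reinsert."""
    found = []
    x = interval_search(root, i)
    while x is not None:
        found.append(x.int)
        root = interval_delete(root, x.int)   # hypothetical: delete while maintaining m
        x = interval_search(root, i)
    for low, high in found:                   # insert them all again at the end
        root = interval_insert(root, low, high)
    return root, found
```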

Correctness?

INTERVAL-SEARCH(i)
    x ← root
    while x ≠ NIL and i and int[x] don't overlap
        do if left[x] ≠ NIL and low[i] ≤ m[left[x]]
              then x ← left[x]
              else x ← right[x]
    return x

Correctness

Theorem. Let L be the set of intervals in the left subtree of node x, and let R be the set of intervals in x's right subtree.
- If the search goes right, then { i' ∈ L : i' overlaps i } = ∅.
- If the search goes left, then { i' ∈ L : i' overlaps i } = ∅ ⟹ { i' ∈ R : i' overlaps i } = ∅.

In other words, it's always safe to take only 1 of the 2 children: we'll either find something, or nothing was to be found.

Correctness proof

Proof. Suppose first that the search goes right. If left[x] = NIL, then we're done, since L = ∅. Otherwise, the code dictates that we must have low[i] > m[left[x]]. The value m[left[x]] corresponds to the right endpoint of some interval j ∈ L, and no other interval in L can have a larger right endpoint than high(j). Hence every i' ∈ L satisfies high(i') ≤ high(j) = m[left[x]] < low(i), so i' cannot overlap i. Therefore, { i' ∈ L : i' overlaps i } = ∅.

[Figure: interval j ends at high(j) = m[left[x]], strictly to the left of low(i).]

Proof (continued). Suppose that the search goes left, and assume that { i' ∈ L : i' overlaps i } = ∅. Then, the code dictates that low[i] ≤ m[left[x]] = high[j] for some j ∈ L. Since j ∈ L, it does not overlap i, and hence high[i] < low[j]. But the binary-search-tree property implies that for all i' ∈ R, we have low[j] ≤ low[i'], so high[i] < low[i'] as well. But then { i' ∈ R : i' overlaps i } = ∅. ∎

[Figure: i lies entirely to the left of j, which in turn starts no later than any interval i' in R.]