10-1 Chapter 10 Amortized Analysis

10-2 An Example – Push and Pop
A sequence of operations: OP_1, OP_2, …, OP_m.
OP_i: several pops (from the stack) and one push (into the stack).
t_i: time spent by OP_i.
The average time per operation: t_ave = (t_1 + t_2 + … + t_m)/m.

10-3 Example: a sequence of pushes and pops (p: pop, u: push).
t_ave = (t_1 + … + t_8)/8 = 13/8 = 1.625

10-4 Another example: a sequence of pushes and pops (p: pop, u: push).
t_ave = (t_1 + … + t_8)/8 = 14/8 = 1.75

10-5 Amortized Time and Potential Function
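A brief sketch of the standard potential-function definitions assumed here, with a_i denoting the amortized time of OP_i and Φ_i the potential after OP_i:

\[
a_i = t_i + \Phi_i - \Phi_{i-1},
\qquad
\sum_{i=1}^{m} a_i \;=\; \sum_{i=1}^{m} t_i + \Phi_m - \Phi_0 .
\]

If Φ_m ≥ Φ_0 (for example Φ_0 = 0 and Φ never negative), the total amortized time upper-bounds the total actual time, so the amortized time per operation bounds the average time per operation.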

10-6 Amortized Analysis of the Push-and-Pop Sequence
Suppose that before we execute OP_i there are k elements in the stack, and OP_i consists of n pops and 1 push.
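A sketch of the computation for OP_i, assuming the potential Φ is chosen to be the number of elements currently in the stack:

\[
t_i = n + 1,
\qquad
\Phi_i - \Phi_{i-1} = (k - n + 1) - k = 1 - n,
\qquad
a_i = t_i + (\Phi_i - \Phi_{i-1}) = 2 .
\]

So each operation has amortized time 2, no matter how many pops it performs.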

10-7 By observation, at most m pops and m pushes are executed in m operations. Thus the total actual time is at most 2m, and the average time per operation is at most 2m/m = 2.
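A minimal Python sketch (the function name and the random operation mix are made up for illustration) that simulates such a sequence and checks the total actual cost against the 2m bound:

import random

def run_sequence(m, seed=0):
    # Each operation OP_i performs some number of pops followed by exactly one push.
    rng = random.Random(seed)
    stack, total_cost = [], 0
    for _ in range(m):
        pops = rng.randint(0, len(stack))   # number of pops in this operation
        for _ in range(pops):
            stack.pop()
        stack.append(object())              # the single push
        total_cost += pops + 1              # t_i = (number of pops) + 1
    return total_cost

m = 1000
cost = run_sequence(m)
print(cost, "<=", 2 * m, ":", cost <= 2 * m)   # the bound holds for every run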

10-8 Skew Heaps
Let wt(x) denote the number of descendants of node x, including x itself. A heavy node x is a non-root node with wt(x) > wt(p(x))/2, where p(x) is the parent of x. A node is light if it is not heavy. For example, in the heap shown on this slide, nodes 5 and 10 are heavy.

10-9 Skew Heaps
Meld = merge + swapping. Given two skew heaps, Step 1: merge their right paths. (In the example shown on the slide, there are 5 right heavy nodes.)

Step 2: Swap the children along the right path. In the example, no right heavy node remains afterwards.

Amortized Analysis of Skew Heaps
Meld = merge + swapping. Operations on a skew heap:
find-min(h): find the minimum of skew heap h.
insert(x, h): insert x into skew heap h.
delete-min(h): delete the minimum from skew heap h.
meld(h1, h2): meld two skew heaps h1 and h2.
The first three operations can all be implemented via melding, as sketched below.
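A minimal Python sketch of meld, written in the common top-down recursive form in which the swap is interleaved with the merge of the right paths (class and function names are illustrative, not from the slides):

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def meld(h1, h2):
    # Merge along the right paths; every node visited swaps its children.
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2.key < h1.key:
        h1, h2 = h2, h1                       # keep the smaller root on top
    h1.right = meld(h1.right, h2)             # step 1: merge the right paths
    h1.left, h1.right = h1.right, h1.left     # step 2: swap the children
    return h1

def insert(x, h):
    return meld(Node(x), h)                   # insert = meld with a one-node heap

def delete_min(h):
    return h.key, meld(h.left, h.right)       # delete-min = meld the two subtrees

h = None
for x in [5, 1, 3]:
    h = insert(x, h)
key, h = delete_min(h)                        # key == 1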

Potential Function of Skew Heaps
wt(x): # of descendants of node x, including x.
Heavy node x: wt(x) > wt(p(x))/2, where p(x) is the parent node of x.
Light node: not a heavy node.
Potential function Φ_i: # of right heavy nodes of the skew heap.

Any path in an n-node tree contains at most ⌊log₂ n⌋ light nodes, because the weight at least halves each time the path passes through a light node. The number of right heavy nodes attached to the left path is at most ⌊log₂ n⌋.

Amortized Time
For the two skew heaps being melded, with n1 and n2 nodes respectively: on one right path, # light ≤ ⌊log₂ n1⌋ and # heavy = k1; on the other, # light ≤ ⌊log₂ n2⌋ and # heavy = k2.

The actual time of the meld is therefore at most the total length of the two right paths, and the amortized time of meld (and hence of insert and delete-min) is O(log n).
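A sketch of the potential argument behind this bound, assuming Φ counts the right heavy nodes and writing n = n1 + n2:

\[
t_{\mathrm{meld}} \;\le\; (k_1 + k_2) + \lfloor\log_2 n_1\rfloor + \lfloor\log_2 n_2\rfloor + 2 ,
\]

since each right path consists only of its heavy and light nodes. After the swap step, the k1 + k2 heavy nodes that were on the old right paths are no longer right children, and at most ⌊log₂ n⌋ right heavy nodes can be attached to the new left path, so

\[
\Phi_i - \Phi_{i-1} \;\le\; \lfloor\log_2 n\rfloor - k_1 - k_2 ,
\qquad
a_{\mathrm{meld}} = t_{\mathrm{meld}} + (\Phi_i - \Phi_{i-1})
\;\le\; 3\lfloor\log_2 n\rfloor + 2 = O(\log n) .
\]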

AVL-Trees
Height balance of node v: hb(v) = (height of right subtree) − (height of left subtree).

Add a new node A. Before the insertion, hb(B) = hb(C) = hb(E) = 0, and hb(I) ≠ 0 is the first nonzero height balance encountered going up from the leaves.

Amortized Analysis of AVL-Trees
Consider a sequence of m insertions into an initially empty AVL-tree.
T_0: the empty AVL-tree.
T_i: the tree after the i-th insertion.
L_i: the length of the critical path involved in the i-th insertion.
X_1: total # of balance factors changing from 0 to +1 or −1 during these m insertions (the rebalancing cost).

Case 1: Absorption. The tree height does not increase and no rebalancing is needed. Val(T_i) = Val(T_{i-1}) + (L_i − 1).

Case 2: Rebalancing the Tree

Case 2.1: Single Rotation

Case 2.1: Single Rotation. After a right rotation on the subtree rooted at A: Val(T_i) = Val(T_{i-1}) + (L_i − 2).

Case 2.2: Double Rotation

Case 2.2: Double Rotation. After a left rotation on the subtree rooted at B and a right rotation on the subtree rooted at A: Val(T_i) = Val(T_{i-1}) + (L_i − 2).

Case 3: Height Increase. Val(T_i) = Val(T_{i-1}) + L_i, where L_i is the height of the root.

Amortized Analysis of X_1
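A sketch of the summation argument, under the assumptions that Val(T) is the potential whose per-insertion change is given by the three cases above, that Val(T_0) = 0, and that Val(T_m) ≤ m:

\[
\sum_{i=1}^{m}\bigl(\mathrm{Val}(T_i)-\mathrm{Val}(T_{i-1})\bigr)
= \mathrm{Val}(T_m)-\mathrm{Val}(T_0) \le m .
\]

In every case Val(T_i) − Val(T_{i-1}) ≥ L_i − 2, hence

\[
\sum_{i=1}^{m}(L_i - 2) \le m
\quad\Longrightarrow\quad
\sum_{i=1}^{m} L_i \le 3m .
\]

The balance factors that change from 0 to ±1 in the i-th insertion all lie on its critical path, so X_1 = O(Σ_i L_i) = O(m): the amortized rebalancing cost per insertion is O(1).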

Self-Organizing Sequential Search Heuristics
Three methods for enhancing the performance of sequential search: move-to-the-front, transposition, and count (sketched below).
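A minimal Python sketch of the three update rules on a Python list (function names are illustrative; each search is assumed to be successful):

def search_move_to_front(lst, key):
    # Sequential search; on success, move the found item to the front.
    i = lst.index(key)                      # i items are passed over before the hit
    lst.insert(0, lst.pop(i))

def search_transpose(lst, key):
    # Sequential search; on success, swap the found item with its predecessor.
    i = lst.index(key)
    if i > 0:
        lst[i - 1], lst[i] = lst[i], lst[i - 1]

def search_count(lst, counts, key):
    # Sequential search; keep the list ordered by decreasing access count.
    lst.index(key)                          # locate the item
    counts[key] = counts.get(key, 0) + 1
    lst.sort(key=lambda x: -counts.get(x, 0))   # stable sort keeps ties in place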


Analysis of the Move-to-the-Front Heuristic
Interword comparison: an unsuccessful comparison.
Intraword comparison: a successful comparison.
Pairwise independent property: for any sequence S and any pair of items P and Q, the number of interword comparisons between P and Q is exactly the number of comparisons made for the subsequence of S consisting of only P's and Q's. (See the example on the next slide.)

Pairwise Independent Property in Move-to-the-Front


We can consider each pair separately and then add the counts up. In the example, the total number of interword comparisons is 7.

Theorem for the Move-to-the-Front Heuristic
C_M(S): # of comparisons made by the move-to-the-front heuristic on sequence S.
C_O(S): # of comparisons made by the optimal static ordering on S.
Theorem: C_M(S) ≤ 2·C_O(S).

Proof:
Inter_M(S): # of interword comparisons of the move-to-the-front heuristic.
Inter_O(S): # of interword comparisons of the optimal static ordering.
First consider two items. Let S consist of a A's and b B's with a ≤ b. The optimal static ordering is BA, so Inter_O(S) = a. For move-to-the-front, Inter_M(S) ≤ 2a, hence Inter_M(S) ≤ 2·Inter_O(S).

Proof (cont.):
Now consider any sequence consisting of more than two items. By the pairwise independent property, we still have Inter_M(S) ≤ 2·Inter_O(S).
Intra_M(S): # of intraword comparisons of the move-to-the-front heuristic.
Intra_O(S): # of intraword comparisons of the optimal static ordering.
Since each access succeeds exactly once under either scheme, Intra_M(S) = Intra_O(S). Therefore
Inter_M(S) + Intra_M(S) ≤ 2·Inter_O(S) + Intra_O(S) ≤ 2·(Inter_O(S) + Intra_O(S)), i.e., C_M(S) ≤ 2·C_O(S).
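An empirical sketch (not part of the proof; the item set, frequencies, and function names are made up) that compares the two comparison counts on a long random sequence:

import random
from collections import Counter

def cost_move_to_front(seq, start_order):
    lst, cost = list(start_order), 0
    for x in seq:
        i = lst.index(x)
        cost += i + 1                       # i interword + 1 intraword comparison
        lst.insert(0, lst.pop(i))           # move the accessed item to the front
    return cost

def cost_static(seq, order):
    pos = {x: i for i, x in enumerate(order)}
    return sum(pos[x] + 1 for x in seq)

items = ["A", "B", "C", "D"]
rng = random.Random(1)
seq = rng.choices(items, weights=[8, 4, 2, 1], k=200)
optimal = [x for x, _ in Counter(seq).most_common()]   # order by decreasing frequency
cm, co = cost_move_to_front(seq, items), cost_static(seq, optimal)
print(cm, "<=", 2 * co, ":", cm <= 2 * co)             # C_M(S) <= 2 C_O(S)

For a sequence this long, the startup effect of the initial list order is negligible and the bound holds comfortably.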

The Count Heuristic
The count heuristic has a similar result: C_C(S) ≤ 2·C_O(S), where C_C(S) is the cost of the count heuristic.

The Transposition Heuristic
The transposition heuristic does not possess the pairwise independent property, so a similar upper bound cannot be derived for its cost.
