Kinetic Heaps - K, Tarjan, and Tsioutsiouliklis

Presentation transcript:

Kinetic Heaps
K, Tarjan, and Tsioutsiouliklis

Outline
Broadcast Scheduling
Parametric and Kinetic Heaps
Broadcast Scheduling Using a Kinetic Heap
The Computational Geometry View
What is known
Four New Structures: Simple Heap, Balanced Heap, Fibonacci Heap, Incremental Heap
Open problems

Broadcast Scheduling
(Lecture by Mike Franklin; research papers)
One server with many possible items to send (say, all the same length); one broadcast channel.
Users submit requests for items.
Goal: satisfy the users as well as possible, making decisions on-line (say, minimize the sum of waiting times).
[Figure: a server with many data items, a single-item broadcast channel, and users.]

Scheduling policies
Greedy = Longest Wait First (LWF): send the item with the largest sum of waiting times (vs. the number of requests or the longest single waiting time).
R x W: send the item with the largest (# requests x longest waiting time).

Results of Franklin and others: LWF schedules well "in practice" (in simulations) but is too expensive (linear time). This claim is used to justify approximations to R x W, still linear time but with a smaller (parameterized) constant.

Questions (for an algorithm guy or gal)
LWF does well compared to what? Try a competitive analysis. (Open question 1)
Can we improve the cost of LWF? What data structure? (Will talk about this.)

Parametric Heap
A collection of items, each with an associated key: key(i) = a_i x + b_i, where a_i, b_i are reals and x is a real-valued parameter (a_i = slope, b_i = constant).
Operations:
make-heap: make an empty heap.
insert(i, a_i, b_i): insert item i with key a_i x + b_i into the heap.
find-min(x_0): find an item i of minimum key for x = x_0.
delete(i): delete item i.
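To make the ADT concrete, here is a minimal, deliberately naive Python sketch of the parametric-heap interface (find-min by linear scan; the structures in this talk exist to beat this). Class and method names are illustrative, not from the paper.

```python
class ParametricHeap:
    """Naive parametric heap: item i has the linear key key_i(x) = a_i * x + b_i.

    find_min scans every item in O(n) time; the data structures in this talk
    replace the scan with sublinear machinery, but the interface is the same.
    """

    def __init__(self):
        self.items = {}                      # item -> (slope a_i, constant b_i)

    def insert(self, item, a, b):
        self.items[item] = (a, b)

    def delete(self, item):
        del self.items[item]

    def find_min(self, x0):
        """Return an item whose key a_i * x0 + b_i is minimum (None if empty)."""
        if not self.items:
            return None
        return min(self.items, key=lambda i: self.items[i][0] * x0 + self.items[i][1])


h = ParametricHeap()
h.insert("p", 1.0, 0.0)                      # key(x) = x
h.insert("q", -1.0, 4.0)                     # key(x) = -x + 4
assert h.find_min(0.0) == "p" and h.find_min(5.0) == "q"
```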

Kinetic Heap
A parametric heap such that successive x-values of find-mins are non-decreasing. (Think of x as time.)
x_c = the largest x in a find-min so far (the current time).
Additional operation: decrease the key of item i, replacing it by a key that is no larger for all x ≥ x_c: decrease-key(i, a, b).
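One useful consequence of the kinetic restriction for linear keys: "no larger for all x ≥ x_c" can be checked with a comparison at x_c plus a slope comparison. A small illustrative helper (the function name is ours, not the paper's):

```python
def decrease_key_is_valid(a_old, b_old, a_new, b_new, x_c):
    """Kinetic decrease-key precondition for linear keys.

    The new key a_new*x + b_new must be no larger than the old key
    a_old*x + b_old for every x >= x_c (the current time).  For lines this
    holds exactly when the new key is no larger at x_c and its slope is no
    larger, since the gap can then only grow as x increases.
    """
    return a_new * x_c + b_new <= a_old * x_c + b_old and a_new <= a_old
```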

Broadcast scheduling via kinetic heap
We need a max-heap (replace find-min by find-max, decrease-key by increase-key, etc.). It can implement LWF, R x W, or any similar policy:
A broadcast decision is a find-max plus a delete.
A request is an insert (if it is the first for the item) or an increase-key (if not).
Only find-max need be real-time; the other operations can proceed concurrently with broadcasting.
Slopes are integers that count requests.

Broadcast scheduling via kinetic heap (cont.)
LWF: suppose a request for item i arrives at time t_s.
If i is inactive, then insert(i, t − t_s), i.e., give i the key t − t_s (slope 1).
If i is active with key at + b, then increase-key(i, (a+1)t + (b − t_s)).
To broadcast an item at time t_s we perform delete-max(t_s) and broadcast the item returned.
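A self-contained Python sketch of exactly this request/broadcast logic, with the kinetic max-heap replaced by a naive linear scan so the bookkeeping is visible (the dictionary representation and names are illustrative assumptions):

```python
# Naive sketch of LWF with kinetic keys a*t + b:
# a = number of pending requests for the item,
# a*t + b = current sum of their waiting times.
active = {}                                   # item -> (a, b)

def request(item, t_s):
    """A request for `item` arrives at time t_s."""
    if item not in active:                    # inactive: insert with key t - t_s
        active[item] = (1, -t_s)
    else:                                     # active: increase-key as on the slide
        a, b = active[item]
        active[item] = (a + 1, b - t_s)       # key becomes (a+1)*t + (b - t_s)

def broadcast(t_s):
    """Delete-max at time t_s: send the item with the largest sum of waits."""
    if not active:
        return None
    item = max(active, key=lambda i: active[i][0] * t_s + active[i][1])
    del active[item]
    return item

request("news", 0.0)
request("sports", 1.0)
request("news", 2.0)
assert broadcast(3.0) == "news"               # waits 3 + 1 = 4 beat sports' 2
```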

What is known about parametric and kinetic heaps?
A key is a line, so this is computational geometry. Equivalent problems:
maintain the lower envelope of a collection of lines in 2D;
by projective duality, maintain the convex hull of a set of points in 2D under insertion and deletion.
The "kinetic" restriction corresponds to a "sweep line" query constraint.
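To make the correspondence concrete, here is one standard way to state it (a sketch; the cited papers may set up the duality differently):

```latex
\[
\operatorname{find-min}(x_0)
   = \arg\min_i \,(a_i x_0 + b_i)
   = \arg\min_i \,\bigl\langle (x_0,\,1),\,(a_i,\,b_i) \bigr\rangle ,
\]
so a find-min is an extreme-point query on the lower convex hull of the points
$(a_i, b_i)$ in direction $-(x_0, 1)$; inserting or deleting a key is inserting
or deleting a point of a dynamic planar convex hull, and the kinetic
restriction (non-decreasing $x_0$) is a sweep of the query direction.
```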

Results I
Overmars and Van Leeuwen (1981): O(log n) time per query, O(log² n) time per update, worst-case.
Chazelle (1985), Hershberger and Suri (1992) (deletions only): O(log n) time per query, worst-case; O(n log n) total for n deletions.

Results II
Chan (1999), dynamic hulls and envelopes: O(log n) time per query, O(log^{1+ε} n) time per update, amortized.
Brodal and Jacob (2000): O(log n) time per query; O(log n log log log n) time per insert and O(log n log log n) time per delete, amortized.

Results III
Basch, Guibas, and Hershberger (1997): the "kinetic" data structure paradigm.

Our results
Four different data structures, each one best in a different setting: Simple Heap, Balanced Heap, Fibonacci Heap, Incremental Heap.

Simple heap (highlights)
A balanced binary tree, with the items in the leaves in left-right order by decreasing key slope; the tree is a tournament on the items by current key.
We get: O(log² n) for delete and insert; O(1) for find-min. All bounds are amortized.

Balanced heap (highlights)
A better deletions-only data structure, dynamized using the technique of Bentley and Saxe.
We get: O(log n log log n) for delete; O(log n) for find-min and insert.

Fibonacci heap (highlights)
Based on Fibonacci heaps, with lazy linking: we link only when it is safe.
We get: O(log² n) for delete; O(log n) for find-min and insert; O(log n) for decrease-key.

Incremental heap (highlights)
Assumptions: insertion of i has a_i = 1; increase-key of i does a_i := a_i + 1.
Group items by key slope; each group is an ordinary Fibonacci heap. Move items among groups as key slopes change ("displaced" items, because entire subtrees are moved).
We get: find-max and insert in O(1); the k-th increase-key in O(log min{k, n}); delete in O(log n).

Simple heap (details)
A balanced binary tree, with the items in the leaves in left-right order by decreasing key slope. The tree is a tournament on the items by current key.

[Figure: example tournament tree over items a-e; each internal node stores the winner of its subtree under the current keys.]

Simple kinetic heaps (cont.)
Add swap times at the internal nodes.

[Figure: the same tournament tree with the swap time of its two children's winners stored at each internal node.]

Simple kinetic heaps (cont.)
Add the minimum future swap time.

[Figure: each internal node also stores the minimum future swap time in its subtree.]

Simple kinetic heaps (cont.)
When we do a find-min and the current time moves forward, we act as follows: locate all nodes whose winner changes; the paths from the root to those nodes form a subtree; update this subtree. (A code sketch follows the figures below.)

[Figures: animation of a find-min that advances the current time; the nodes whose winners change form a subtree below the root, and their winners and swap times are recomputed.]
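Below is a simplified, self-contained Python sketch of the tournament mechanism from the last few slides: a static set of items, winners and swap times at the internal nodes, a minimum future swap time per subtree, and a find-min that refreshes exactly the subtrees whose earliest event has passed. Insert/delete and the amortized accounting of the real simple heap are omitted; all names are illustrative.

```python
import math

class _Node:
    __slots__ = ("left", "right", "item", "a", "b", "swap", "min_event")

class KineticTournament:
    """Static kinetic tournament over items with linear keys a*x + b.

    Leaves are ordered by decreasing slope; every internal node stores the
    winner (minimum) of its subtree at the current time, the time at which its
    two children's winners will swap, and the earliest such event anywhere in
    its subtree.  find_min must be called with non-decreasing times.
    """

    def __init__(self, items, start_time=0.0):
        items = sorted(items, key=lambda it: -it[1])      # (name, a, b), slope desc
        self.t = start_time
        self.root = self._build(items)
        self._refresh(self.root, self.t)

    def _build(self, items):
        node = _Node()
        if len(items) == 1:
            node.item, node.a, node.b = items[0]
            node.left = node.right = None
            node.swap = node.min_event = math.inf         # a leaf never changes
        else:
            mid = len(items) // 2
            node.left, node.right = self._build(items[:mid]), self._build(items[mid:])
            node.min_event = -math.inf                    # forces the first refresh
        return node

    def _refresh(self, node, t):
        """Recompute winners in every subtree whose earliest event is <= t."""
        if node.left is None or node.min_event > t:
            return                                        # leaf, or nothing changed yet
        self._refresh(node.left, t)
        self._refresh(node.right, t)
        l, r = node.left, node.right
        win, lose = (l, r) if l.a * t + l.b <= r.a * t + r.b else (r, l)
        node.item, node.a, node.b = win.item, win.a, win.b
        # the children's winners cross in the future only if the current
        # winner's key grows faster than the loser's
        node.swap = (lose.b - win.b) / (win.a - lose.a) if win.a > lose.a else math.inf
        node.min_event = min(node.swap, l.min_event, r.min_event)

    def find_min(self, t):
        assert t >= self.t, "kinetic restriction: query times are non-decreasing"
        self.t = t
        self._refresh(self.root, t)
        return self.root.item


kt = KineticTournament([("a", 2, 0), ("b", -1, 6), ("c", 0, 3)])
assert [kt.find_min(t) for t in (0, 2, 4)] == ["a", "c", "b"]
```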

Simple kinetic heaps – insert & delete
Use the regular insert/delete mechanism of (say) red-black trees.

[Figures: example of inserting a new item f as a leaf and updating winners and swap times on its root path.]

Simple kinetic heaps (analysis)
Find-min takes O(1) time if we do not advance the current time; advancing the current time can be expensive in the worst case.
Amortization: assign O(log n) potential to each node whose swap time is in the future. This potential pays for advancing the time, so find-min takes O(1) amortized time.
Insert and delete take O(log² n) amortized time.
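The slide's potential assignment can be written as follows (a sketch, with c a suitable constant; the full accounting is in the paper):

```latex
\[
\Phi \;=\; c\,\log n \cdot
  \bigl|\{\, v \ :\ \text{the swap time stored at node } v
        \text{ exceeds the current time } x_c \,\}\bigr| ,
\]
so advancing the current time is paid for by the potential released at the
nodes whose swap times it passes, giving $O(1)$ amortized find-min, while an
insert or delete touches $O(\log n)$ nodes and is charged $O(\log^2 n)$.
```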

How do we make this a parametric heap? (Overmars-Van Leeuwen)

[Figure: the lower envelopes stored at the nodes of the tree (Overmars-Van Leeuwen representation).]

It looks like we use nonlinear space? No: each node keeps only the chain of segments that are on its envelope but not on its parent's envelope. This makes the space linear.

[Figure: each node stores only the part of its envelope that is not on its parent's envelope.]

Analysis
Using search trees to represent the envelopes, sorted by slope:
Find-min takes O(log n) time.
Insert and delete take O(log² n) time: you split and concatenate O(log n) such envelopes, each in O(log n) time.

Balanced kinetic heaps
Def: the level of a line is the depth of the highest node where it is stored.
Plan: first we make a faster deletions-only structure, and then we add insertions. Our starting point is the previous data structure.

Deletion-only data structure
The level of a line is the depth of the highest node where it is stored.

[Figure: envelopes stored at the nodes; the level of a line is the depth of the highest node storing it.]

Deletion-only data structure
What happens when we delete a line?

[Figures: animation of deleting a line from the deletion-only structure; envelope pieces at the affected nodes are recomputed.]

Balanced kinetic heaps
When we delete a line, the remaining lines can only decrease their level.


Analysis
Key idea: implement the envelopes using finger search trees.
How much time does it take to delete an element? O(log n) + Σ log(# lines that get onto the replacement piece) + Σ log(# lines that get off the replacement piece).
If at all nodes the number of lines getting on/off the replacement piece is < log² n, we are OK: the delete takes O(log n log log n) time. However, it could be that the number of lines getting on/off the replacement piece is > log² n.
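In symbols, with the sum over the O(log n) affected nodes and c_v the number of lines that move onto or off of the replacement piece at node v, the finger-search-tree bound reads (a sketch of the slide's accounting):

```latex
\[
T_{\text{delete}}
 \;=\; O(\log n) \;+\; \sum_{v} O\bigl(\log(c_v + 1)\bigr)
 \;=\; O(\log n \,\log\log n)
 \qquad\text{whenever every } c_v \le \log^2 n .
\]
```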

Analysis (cont.)
If a line gets onto the replacement piece, it changes its level. Since the total number of level changes is O(n log n), the number of chains of length > log² n is at most O(n / log n). So the total work in moving long chains is O(n) and can be charged to the initialization phase.

Summary so far
A better deletions-only data structure: O(n) initialization time and then O(log n log log n) per delete; O(1) find-min if we are kinetic.
Now we add insert.

Balanced kinetic heaps (finale)
Balanced heaps (step 2): use the well-known reduction of Bentley and Saxe to add insertions: maintain a binary counter of deletion-only data structures.
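A minimal generic sketch of the Bentley-Saxe "binary counter" idea (the logarithmic method), with `build` and `query` standing in for the deletion-only kinetic structure and its find-min; deletions and the kinetic details are omitted, and all names are ours:

```python
class LogMethod:
    """Bentley-Saxe 'binary counter' sketch: add insertions to a static or
    deletion-only structure for a decomposable query such as find-min.

    buckets[k] is either None or a structure built from exactly 2**k items,
    so the occupied buckets follow the binary representation of n.
    """

    def __init__(self, build, query):
        self.build = build        # build(items)        -> structure
        self.query = query        # query(structure, x) -> (key value, item)
        self.buckets = []         # structures, one per bit
        self.contents = []        # the items inside each bucket, kept for merging

    def insert(self, item):
        carry, k = [item], 0
        # like incrementing a binary counter: merge full buckets upward
        while k < len(self.buckets) and self.buckets[k] is not None:
            carry += self.contents[k]
            self.buckets[k], self.contents[k] = None, []
            k += 1
        if k == len(self.buckets):
            self.buckets.append(None)
            self.contents.append([])
        self.buckets[k], self.contents[k] = self.build(carry), carry

    def find_min(self, x):
        answers = [self.query(s, x) for s in self.buckets if s is not None]
        return min(answers) if answers else None


# usage with a trivial 'structure': a plain list of (a, b, name) lines
lm = LogMethod(build=list,
               query=lambda lines, x: min((a * x + b, name) for a, b, name in lines))
for line in [(1, 0, "p"), (-1, 4, "q"), (0, 1, "r")]:
    lm.insert(line)
assert lm.find_min(0)[1] == "p" and lm.find_min(5)[1] == "q"
```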

Balanced kinetic heaps
Delete: O(log n log log n)
Insert: O(log n)
Find-min: O(log n) (kinetic)

Fibonacci heap (highlights)
Based on Fibonacci heaps, with lazy linking: we link only when it is safe.
We get: O(log² n) for delete; O(log n) for find-min and insert; O(log n) for decrease-key.

Fibonacci Kinetic Heap
Mimic a Fibonacci heap, but only do links for permanent comparisons. Store unlinked tree roots of equal rank in a sorted list.
O(log n) find-min, insert, and decrease-key; O(log² n) delete, amortized.

Fibonacci heaps (Fredman & Tarjan 84)
Want to do decrease-key(x, h, Δ) faster than delete + insert (in O(1) time).

Heap ordered (Fibonacci) trees

Linking
Operations are defined via a basic operation on Fibonacci trees, called linking: produce an F_k from two F_{k-1}'s, keeping heap order.

Basic idea: delay the linking until it is safe.
[Figure: the lists S_0, S_1, ..., S_r of unlinked roots, one per rank, alongside the regular heap.]

Linking
The basic operation is inserting a tree of rank r into S_r, doing the appropriate linking if needed.
Insert(i): insert a new node containing i into S_0. This may cascade, but we have one tree fewer every time we do a linking.
Decrease-key(i, a', b'): decrease the key of i in the tree containing it, making cascading cuts, and link the resulting trees back into the structure.
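For linear keys, "safe" has a concrete meaning: the comparison between two roots is permanent if the current winner's key stays no larger for every time ≥ x_c, which for lines is a check at x_c plus a slope comparison. A small illustrative sketch of such a guarded link (names and fields are ours; the real structure also maintains the lists S_r and does cascading cuts):

```python
class Root:
    """A root in one of the lists S_r: rank-r tree with linear key a*x + b."""
    def __init__(self, item, a, b):
        self.item, self.a, self.b = item, a, b
        self.rank = 0
        self.children = []

def try_link(x, y, x_c):
    """Link two roots of equal rank only if the comparison is permanent.

    Permanent means one key is <= the other for every time >= x_c; for linear
    keys this is a comparison at x_c plus a slope comparison.  Returns the new
    rank-(r+1) root, or None if the link must be delayed (not yet safe).
    """
    assert x.rank == y.rank
    if x.a * x_c + x.b > y.a * x_c + y.b:
        x, y = y, x                      # make x the current winner at time x_c
    if x.a > y.a:
        return None                      # winner's key grows faster: y could win later
    x.children.append(y)                 # safe: y can never beat x again
    x.rank += 1
    return x
```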

Analysis
Similar to the analysis of F-heaps, but multiply the potential by log n.

Incremental heap (highlights)
Assumptions: insertion of i has a_i = 1; increase-key of i does a_i := a_i + 1.
Group items by key slope; each group is an ordinary Fibonacci heap. Move items among groups as key slopes change ("displaced" items, because entire subtrees are moved).
We get: find-max and insert in O(1); the k-th increase-key in O(log min{k, n}); delete in O(log n).
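The observation that makes the grouping work: within a group of equal slope a, comparing keys a t + b_1 and a t + b_2 never depends on t, so each group can be an ordinary heap keyed on the constant alone. A naive, self-contained Python sketch of that grouping (binary heaps with lazy deletion and a linear scan over groups instead of the paper's Fibonacci heaps, so the stated bounds are not achieved here; all names are illustrative):

```python
import heapq
from collections import defaultdict

class IncrementalMaxHeap:
    """Sketch of the incremental heap's grouping for keys a*t + b, where the
    slope a counts requests.  Within a group of equal slope the order of keys
    never changes with t, so each group is an ordinary heap on the constant b.
    """

    def __init__(self):
        self.key = {}                            # item -> (a, b)
        self.groups = defaultdict(list)          # slope a -> max-heap of (-b, item)

    def insert(self, item, b):
        self.key[item] = (1, b)                  # insertions arrive with slope a = 1
        heapq.heappush(self.groups[1], (-b, item))

    def increase_key(self, item, new_b):
        a, _ = self.key[item]
        self.key[item] = (a + 1, new_b)          # one more request: slope grows by 1
        heapq.heappush(self.groups[a + 1], (-new_b, item))   # old entry goes stale

    def _group_top(self, a):
        heap = self.groups[a]
        while heap:                              # pop entries that moved or were deleted
            neg_b, item = heap[0]
            if self.key.get(item) == (a, -neg_b):
                return -neg_b, item
            heapq.heappop(heap)
        return None

    def delete_max(self, t):
        best = None
        for a in list(self.groups):              # naive scan over the slope groups
            top = self._group_top(a)
            if top is not None:
                b, item = top
                if best is None or a * t + b > best[0]:
                    best = (a * t + b, item)
        if best is None:
            return None
        del self.key[best[1]]                    # lazy delete: heap entry goes stale
        return best[1]


h = IncrementalMaxHeap()
h.insert("news", b=0.0)                          # request at time 0: key(t) = t
h.insert("sports", b=-1.0)                       # request at time 1: key(t) = t - 1
h.increase_key("news", new_b=-2.0)               # second news request at time 2
assert h.delete_max(3.0) == "news"               # 2*3 - 2 = 4  beats  3 - 1 = 2
```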

Open problems
Experiments: what is the best structure in practice?
Better bounds: kinetic heaps in ordinary F-heap bounds? Parametric heaps in O(log n) amortized time per operation? Simple kinetic heaps via splay trees?