
Fractional Cascading
Fractional Cascading I: A Data Structuring Technique
Fractional Cascading II: Applications [Chazelle & Guibas 1986]
Dynamic Fractional Cascading [Mehlhorn & Näher 1990]
Elik Etzion, January 2002

Agenda
Preview
Formal problem definition & final results
Example application: intersecting a polygonal path with a line
Data structure & algorithm description
Time & space complexity analysis
Dynamization

Preview
The problem: iterative search in sorted lists.
Examples: looking up the same word in several dictionaries; geometric retrieval problems.
The solution: fractional cascading. Correlate the lists so that every search reuses the result of the previous search.
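The idea can be sketched for the simplest catalog graph, a path of sorted lists: every second element of each augmented list is promoted into the list before it, so a single binary search in the first list positions all the remaining searches, which then take O(1) pointer hops each. A minimal sketch (the function names and tuple representation are my own, not from the paper):

```python
import bisect

def build(lists):
    """aug[i] holds triples (key, own, bridge): `own` is the position of
    key's successor in the original lists[i], `bridge` the position of
    key's successor in aug[i+1]. Every 2nd element of aug[i+1] is promoted
    (merged) into aug[i]."""
    k = len(lists)
    aug = [None] * k
    aug[-1] = [(v, j, None) for j, v in enumerate(lists[-1])]
    for i in range(k - 2, -1, -1):
        promoted = [aug[i + 1][j][0] for j in range(0, len(aug[i + 1]), 2)]
        keys = sorted(lists[i] + promoted)
        nxt = [e[0] for e in aug[i + 1]]
        aug[i] = [(x, bisect.bisect_left(lists[i], x), bisect.bisect_left(nxt, x))
                  for x in keys]
    return aug

def query(lists, aug, q):
    """Successor of q in every list: one binary search, then O(1)-step hops."""
    res = []
    pos = bisect.bisect_left([e[0] for e in aug[0]], q)
    for i in range(len(lists)):
        own = aug[i][pos][1] if pos < len(aug[i]) else len(lists[i])
        res.append(lists[i][own] if own < len(lists[i]) else None)
        if i + 1 < len(lists):
            nxt = aug[i][pos][2] if pos < len(aug[i]) else len(aug[i + 1])
            while nxt > 0 and aug[i + 1][nxt - 1][0] >= q:
                nxt -= 1  # back up to the true successor of q in aug[i+1]
            pos = nxt
    return res
```

Because every second element cascades upward, at most one non-promoted element of aug[i+1] can lie between q and the bridge target, so the back-up loop runs O(1) times per hop.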

Formal Definition
U: an ordered set.
G = (V, E): the catalog graph, undirected.
For each v ∈ V, C(v) ⊆ U is the catalog of v.
For each e ∈ E, R(e) = [l(e), r(e)] is the range of e.
Locally bounded degree d: for every v ∈ V and every k ∈ U there are at most d edges e = (v, w) with k ∈ R(e).
Example: a node whose incident edges have ranges [2,7], [5,15], [20,27], [22,50] has degree 4 but locally bounded degree 2.
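The locally bounded degree at a node can be checked directly from its incident edge ranges; since the count only changes at range endpoints, it suffices to test endpoints. A minimal sketch (the function name is my own; the ranges are those of the example above):

```python
def locally_bounded_degree(ranges):
    """Maximum, over keys k, of the number of edge ranges containing k.
    The maximum is attained at some range endpoint, so only endpoints
    are checked."""
    return max(sum(1 for (lo, hi) in ranges if lo <= k <= hi)
               for r in ranges for k in r)

# Incident edge ranges at one node, as in the example above:
# degree is 4, but no single key lies in more than 2 ranges.
ranges = [(2, 7), (5, 15), (20, 27), (22, 50)]
```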

Formal Definition - Operations
Query. Input: k ∈ U and a connected subtree G′ = (V′, E′) of G with k ∈ R(e) for every e ∈ E′. Output: for each v ∈ V′, the element x ∈ C(v) that is the successor of k in C(v).
Deletion. Given a key k ∈ C(v) and its position in C(v), delete k from C(v).
Insertion. Given a key k ∈ U and its successor in C(v), insert k into C(v).

Results
n ≤ |V| is the number of catalogs queried; N is the total catalog size.
Space: O(N + |E|)
Time:
Trivial: query n·log N; insert/delete O(1)
Static/semi-dynamic: query log(N + |E|) + n; insert/delete O(1)
Dynamic: query log(N + |E|) + n·log log(N + |E|); insert/delete O(log log(N + |E|)) amortized

Example application
Problem. Input: a polygonal path P and an arbitrary query line l. Output: the intersections of P and l.
Solution complexity (k = number of intersections reported):
Trivial: space O(n), time O(n)
Using FC: space O(nlogn), time O((k+1)log[n/(k+1)])

Example application - Solution
Observation: a straight line l intersects a polygonal path P if and only if l intersects the convex hull CH(P) of P.
Notation: F(P) and S(P) are the first and second halves of the path P.
Preprocessing: recursively compute CH[F(P)], CH[S(P)], CH[P].

Example application - Algorithm

Intersect(P, l) {
  if |P| = 1 then
    compute P ∩ l directly
  else if l doesn't intersect CH(P) then
    return
  else {
    Intersect(F(P), l)
    Intersect(S(P), l)
  }
}
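The recursion above can be exercised with a small runnable stand-in. To keep the sketch short it prunes with axis-aligned bounding boxes instead of the precomputed convex hulls, a weaker filter with the same recursive structure; the names and the line representation a·x + b·y = c are my own:

```python
def line_path_intersections(path, a, b, c):
    """Count segments of the polygonal path crossed by the line a*x + b*y = c,
    using the same divide-and-conquer as Intersect(P, l), but pruning with
    bounding boxes in place of convex hulls."""
    def side(p):
        return a * p[0] + b * p[1] - c

    def rec(lo, hi):  # handles the sub-path path[lo..hi]
        if hi - lo == 1:  # single segment: test it directly
            s1, s2 = side(path[lo]), side(path[hi])
            # note: a crossing exactly at a shared vertex is counted
            # once for each incident segment
            return 1 if min(s1, s2) <= 0 <= max(s1, s2) else 0
        xs = [p[0] for p in path[lo:hi + 1]]
        ys = [p[1] for p in path[lo:hi + 1]]
        corners = [(x, y) for x in (min(xs), max(xs)) for y in (min(ys), max(ys))]
        sides = [side(p) for p in corners]
        if min(sides) > 0 or max(sides) < 0:  # box entirely on one side: prune
            return 0
        mid = (lo + hi) // 2
        return rec(lo, mid) + rec(mid, hi)

    return rec(0, len(path) - 1)
```

For a zig-zag path and a horizontal line through its middle, every segment is reported; for a line far from the path, the root box is pruned immediately.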

Example application - Algorithm
Convex hull intersection algorithm: find the 2 slopes of l in the slope sequence of the CH.
FC view:
Catalog graph: the preprocessed binary tree of convex hulls
Catalogs: the slope sequences of the CHs
Query key: the 2 slopes of l

Example application - Complexity
Space: O(nlogn), since each edge participates in at most logn CHs.
Time (static): O(logn + size of the subtree actually visited) = O((k+1)log[n/(k+1)]).

Data Structure – Illustration
[Figure: augmented catalogs A(v) and A(w) for an edge with range [l, r], showing the l-bridge and r-bridge, a block B(x, y) between neighboring bridges, proper and non-proper elements, and the count fields x.count and y.count.]

Data Structure - Definitions
For each node v, A(v) ⊇ C(v) is the augmented catalog, implemented as a doubly linked list of records. C(v) contains the proper elements; A(v) − C(v) contains the non-proper elements.
Record members: key, next, prev, kind.
Additional members of non-proper elements:
target: the node of G incident to v at the bridge's other end
pointer: a pointer to a non-proper element in A(x.target) (the other end of the bridge)
count: number of elements up to the previous bridge
in_S: whether the element lies in a non-balanced block

Bridges & Blocks
(x, y) is a bridge between nodes v and w:
x ∈ A(v) − C(v), y ∈ A(w) − C(w)
x.pointer = y, y.pointer = x
x.target = w, y.target = v
x.key = y.key
x.kind = y.kind = non-proper
Every edge e = (v, w) has at least 2 bridges: one with x.key = y.key = l(e) and one with x.key = y.key = r(e).
A block B(x, y) ⊆ A(v) ∪ A(w) consists of the elements between the bridge (x, y) and its neighboring bridge between v and w; |B(x, y)| = x.count + y.count.

FCQuery

FCQuery(G, G′, k)
  (V1, V2, …, Vn) = order of nodes in G′
  aug_succ = BinarySearch(A(V1), k)
  successor[1] = FindProper(A(V1), aug_succ)
  for i = 2 .. n
    aug_succ = FCSearch(Vi, k, successor[i-1])
    successor[i] = FindProper(A(Vi), aug_succ)
  return successor[1..n]

FCSearch & FindProper

FCSearch(w, k, x)
  x′ = x
  while x′.target ≠ w do
    x′ = x′.next          // walk right to the nearest bridge leading to w
  y = x′.pointer          // cross the bridge into A(w)
  while y.prev.key ≥ k do
    y = y.prev            // walk left to the successor of k in A(w)
  return y

FindProper: in the static case, implemented in O(1) time using a pointer from each non-proper element to its proper successor.

Block Size Tradeoff
Small blocks increase space complexity but decrease time complexity; large blocks decrease space complexity but increase time complexity.
Block Invariant: there are two constants a ≤ b such that every block B(x,y) satisfies:
|B(x,y)| ≤ b
|B(x,y)| ≥ a, or B(x,y) is the only block between A(x.target) and A(y.target)
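The invariant is easy to state as a predicate. A minimal sketch, where the representation (a list of per-edge block-size lists) and the function name are my own:

```python
def block_invariant_ok(blocks_per_edge, a, b):
    """Block Invariant with constants a <= b: every block has size <= b,
    and size >= a unless it is the only block on its edge."""
    for sizes in blocks_per_edge:   # one list of block sizes per edge
        for s in sizes:
            if s > b:
                return False        # upper bound violated
            if s < a and len(sizes) > 1:
                return False        # too small, and not the edge's only block
    return True
```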

Block Lemma
Let S be the set of all elements of the augmented catalogs A(v), v ∈ V. Then |S| ≤ 3N + 12|E|.
Proof …

Complexity Analysis (static)
Space: linear in the size of the catalog graph, by the Block Lemma.
Time:
FindProper: O(1)
FCSearch: O(1), since the block size is constant
BinarySearch: O(log|A(V1)|) = O(log(N + |E|))
FCQuery: O(log(N + |E|) + n)

Dynamization
Challenges:
FindProper can't be implemented simply by a pointer from each non-proper element to its proper successor.
Insertions & deletions violate the Block Invariant.
Solution:
A data structure based on the van Emde Boas priority queue.
Block rebalancing.

Union-Split DS
FindProper. Input: a pointer to some item x. Output: a pointer to a proper item y such that all the items between x & y are non-proper (y is the proper successor of x).
Add. Input: a pointer to some item x. Effect: adds a non-proper item immediately before x.
Erase. Input: a pointer to a non-proper item x. Effect: deletes x.
Union. Input: a pointer to a non-proper item x. Effect: changes the mark of x to proper.
Split. Input: a pointer to a proper item x. Effect: changes the mark of x to non-proper.
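The interface of the five operations above can be prototyped with a plain doubly linked list. This stand-in gives O(n) FindProper instead of the O(log log n) of the van Emde Boas based structure, but it exercises the same interface; class and member names are my own:

```python
class Item:
    """One catalog element: a proper/non-proper mark plus list links."""
    __slots__ = ("proper", "prev", "next")
    def __init__(self, proper):
        self.proper = proper
        self.prev = self.next = None

class UnionSplit:
    """Naive stand-in for the union-split structure: FindProper scans
    forward, so it costs O(n) rather than O(log log n)."""
    def add(self, x, before):        # insert non-proper item x before `before`
        x.proper = False
        x.prev, x.next = before.prev, before
        if before.prev is not None:
            before.prev.next = x
        before.prev = x
    def erase(self, x):              # unlink a non-proper item
        if x.prev is not None:
            x.prev.next = x.next
        if x.next is not None:
            x.next.prev = x.prev
    def union(self, x):              # mark x proper
        x.proper = True
    def split(self, x):              # mark x non-proper
        x.proper = False
    def find_proper(self, x):        # proper successor of x (x itself counts)
        while x is not None and not x.proper:
            x = x.next
        return x
```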

Insert – Illustration
[Figure: inserting x before y0 in A(v); the affected blocks B(y, z) and B(y′, z′) toward the neighboring catalogs A(w) and A(u).]

Insert Algorithm

Insert(x, y0)
  ADD(x, y0)
  if x.kind = proper then UNION(x)
  insert x into the doubly linked list before y0
  y = y0, A = ∅
  do b times
    w = y.target
    if (y.kind = non-proper and w ∉ A and x.key ∈ R(v,w))
      A = A ∪ {w}
      y.count++
      z = y.pointer
      if (y.in_S = false and y.count + z.count > b)
        S = S ∪ {B(y,z)}
        y.in_S = true, z.in_S = true
    y = y.next

Delete Algorithm

Delete(x)
  if x.kind = proper then SPLIT(x)
  ERASE(x)
  remove x from the doubly linked list
  y = x.next, A = ∅
  do b times
    w = y.target
    if (y.kind = non-proper and w ∉ A and x.key ∈ R(v,w))
      A = A ∪ {w}
      y.count--
      z = y.pointer
      if (y.in_S = false and y.count + z.count < a and B(z,y) isn't the only block between v and w)
        S = S ∪ {B(y,z)}
        y.in_S = true, z.in_S = true
    y = y.next

Balance Algorithm

for each block B(x,y) ∈ S do
  l = compute the size of B(x,y) by running to the previous parallel bridge   [O(l)]
  if (l > b)
    divide B(x,y) into ⌈3l/b⌉ + 1 parts by inserting 2·⌈3l/b⌉ non-proper elements   [6l/b · O(INSERT)]
  else if (l < a)
    concatenate B(x,y) with its right neighbor block B(x′,y′) by deleting the (x,y) bridge   [O(ERASE)]
    check whether B(x′,y′) ∈ S by scanning b elements until reaching the (x′,y′) bridge and checking the x′.in_S flag   [O(b)]
    // if the bridge is not reached then B(x′,y′) ∈ S
    if B(x′,y′) ∉ S
      x′.count += x.count, y′.count += y.count
      if (x′.count > b) S = S ∪ {B(x′,y′)}
  S = S − {B(x,y)}

Complexity Analysis (Dynamic)
Union-Split DS for n elements:
Space: O(n)
Time: FIND, UNION & SPLIT: O(loglogn) worst case; ADD, ERASE: O(loglogn) amortized; ADD/ERASE in the semi-dynamic case: O(1)
FC complexity:
Space: remains linear in the size of the catalog graph, because rebalancing maintains the Block Invariant.
Time:
FindProper: O(loglog(N + |E|))
FCSearch: O(1)
BinarySearch: O(log(N + |E|))
FCQuery: O(log(N + |E|) + n·loglog(N + |E|))
Insert/Delete: O(loglog(N + |E|)), or O(1) in the semi-dynamic case
Balance: O(log(N + |E|)) amortized (complex proof)