Paweł Gawrychowski, Nadav Krasnopolsky, Shay Mozes, Oren Weimann

Dispersion on Trees
Paweł Gawrychowski, Nadav Krasnopolsky, Shay Mozes, Oren Weimann

k-dispersion on trees
The Dispersion Optimization Problem: choose k nodes such that the shortest pairwise distance is maximized.
The Feasibility Test: choose k nodes such that every pairwise distance is at least λ.
The optimization problem can be solved using the feasibility test: the possible values of λ are the pairwise distances.
(The weighted case is discussed later.)
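To make this reduction concrete, here is a minimal sketch (not the talk's algorithm) that enumerates all pairwise distances explicitly and binary-searches over them with a black-box feasibility test. The name `feasible` and the O(n²) distance enumeration are assumptions for illustration only, and are far costlier than the implicit search described later.

```python
from itertools import combinations

def distances_from(adj, src):
    """Distances from src to every node of a tree given as adj = {u: [(v, length), ...]}."""
    dist = {src: 0}
    stack = [src]
    while stack:
        u = stack.pop()
        for v, w in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + w
                stack.append(v)
    return dist

def k_dispersion_via_feasibility(adj, k, feasible):
    """Binary search for the optimal lambda over the sorted pairwise distances.
    `feasible(adj, k, lam)` is assumed to answer the feasibility test.
    Assumes 2 <= k <= n, so the smallest pairwise distance is always feasible."""
    nodes = list(adj)
    dist = {u: distances_from(adj, u) for u in nodes}            # O(n^2) total on a tree
    lams = sorted({dist[u][v] for u, v in combinations(nodes, 2)})
    best, lo, hi = lams[0], 0, len(lams) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if feasible(adj, k, lams[mid]):                          # lams[mid] is achievable
            best, lo = lams[mid], mid + 1
        else:
            hi = mid - 1
    return best
```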

Previous results (feasibility test and optimization problem, for unweighted and weighted dispersion):
O(n) [Bhattacharya and Houle 1997]; Θ(n log n); O(n log³ n) [Bhattacharya and Houle 1999]; O(n log² n) using [Frederickson and Johnson 1983]; O(n log⁴ n); O(n); O(n log n) using [Frederickson and Johnson 1983].
Related problems: k-partitioning – delete k edges so as to maximize the weight of the lightest resulting subtree; p-center – place p supply centers in a graph.

Linear-time feasibility test [Bhattacharya and Houle 1997]
Bottom-up computation: for the subtree rooted at a node r, compute a subset of nodes P such that:
the minimal pairwise distance in P is ≥ λ;
|P| is maximized;
in case of a tie, min_{u ∈ P} d(r, u) is also maximized.

Linear-time feasibility test [Bhattacharya and Houle 1997]
Certain node: the closest chosen node at distance ≥ λ/2 from the root.
Candidate node: a chosen node at distance < λ/2 from the root.
There is at most one candidate in a subtree.

Linear-time feasibility test [Bhattacharya and Houle 1997]
When processing a node r, check which candidates are certain with respect to r.

Linear-time feasibility test [Bhattacharya and Houle 1997]
Find the closest certain node to r, and check whether a candidate can be taken with respect to r.
We want to use this linear feasibility test to search over the pairwise distances.
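The talk's linear-time test is summarized above. As a much simpler (and slower) reference point, the sketch below answers the same yes/no question with a deepest-first greedy, which an exchange argument shows picks a maximum-size set of nodes at pairwise distance ≥ λ on a tree. It runs in O(n²), reuses `distances_from` from the earlier sketch, and is an illustration, not the paper's algorithm.

```python
def feasible(adj, k, lam, root=None):
    """O(n^2) feasibility test: can k nodes be chosen with all pairwise distances >= lam?
    Greedy: scan nodes from deepest to shallowest and keep a node iff it is at
    distance >= lam from every node kept so far."""
    nodes = list(adj)
    root = root if root is not None else nodes[0]
    depth = distances_from(adj, root)                # distances to the chosen root
    chosen = []
    for u in sorted(nodes, key=lambda x: -depth[x]):
        du = distances_from(adj, u)                  # O(n) per node on a tree
        if all(du[c] >= lam for c in chosen):
            chosen.append(u)
            if len(chosen) >= k:
                return True
    return False
```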

Solving the optimization problem
O(n log n) time is possible by binary searching over an implicitly constructed array containing all pairwise distances [Bhattacharya and Houle 1997].
To achieve a better running time, we build a sublinear feasibility test and use a more complex search method [Frederickson 1990].

Our sublinear feasibility test
Partition the tree in linear time into O(n/b) fragments of size b.
Preprocess so that, during a feasibility test, a fragment of size b is handled in O(log b) time.
This gives an O((n/b) log b) time feasibility test.
(Each fragment has a root, a spine, and a hole.)
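For intuition about the trade-off, the per-test cost can be written out for a concrete fragment size; the choice b = log n below is only illustrative and is not claimed to be the choice made inside the search framework.

```latex
T_{\mathrm{test}}(b) = O\!\left(\tfrac{n}{b}\,\log b\right),
\qquad
b = \log n \;\;\Longrightarrow\;\;
T_{\mathrm{test}} = O\!\left(\tfrac{n \log\log n}{\log n}\right) = o(n).
```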

Preprocessing fragments
Run the linear feasibility test for the subtrees hanging off the spine, and reduce each of them to at most two nodes. We obtain a caterpillar.

Preprocessing fragments
Remove collisions. Ignore certain nodes. Prune the candidates so that their distances from the root and from the hole are monotone.

Preprocessing fragments
Compute the solution for any possible closest chosen node (in the figure, x + y + z ≥ λ is the constraint on the relevant distances x, y, z).

Solving the optimization problem in O(n) time
Using the sublinear feasibility test in a search framework due to Frederickson, we solve the optimization problem in O(n log log n) time.
We can achieve O(n log* n) time by using the same approach iteratively, partitioning the tree into larger fragments each time.
For linear time, we cannot partition the tree independently in each iteration; there are many technical details (gluing small fragments together, and tailoring the precomputed data).

Part 2 – the weighted dispersion problem
Now the input tree has both edge lengths and node weights, and we want to find a subset of weight ≥ W (instead of choosing k nodes).
We can no longer reduce entire subtrees to at most two nodes, since there may be many candidates.

Weighted feasibility test
Again, perform a bottom-up computation: generate a representation of the solution for the subtree rooted at a node, given the solutions for its children. The representation accounts for every option of choosing candidates in the subtree.
(It maps the distance of the closest chosen node to the root to the weight of the optimal solution.)

The representation - monotone polylines
We only store the breakpoints. The polyline is monotonically decreasing (weight of the optimal solution as a function of the distance of the closest chosen node to the root).
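A minimal sketch of storing such a monotone function by its breakpoints, assuming a simplified piecewise-constant reading; the paper's polylines may differ in the exact piece semantics, so treat this as an illustration of the breakpoint idea only.

```python
import bisect

class MonotonePolyline:
    """Non-increasing step function stored by breakpoints: value(d) is the best
    achievable weight when the closest chosen node is at distance >= d from the root.
    Simplified sketch; assumes the first breakpoint is at distance 0."""
    def __init__(self, breakpoints):
        # breakpoints: list of (distance, weight), sorted by distance, weights non-increasing
        self.keys = [d for d, _ in breakpoints]
        self.vals = [w for _, w in breakpoints]

    def value(self, d):
        """Weight of the piece containing distance d."""
        i = bisect.bisect_right(self.keys, d) - 1
        return self.vals[max(i, 0)]

# Example: weight 17 is achievable for distances in [0, 3), 12 for [3, 8), 5 from 8 on.
p = MonotonePolyline([(0, 17), (3, 12), (8, 5)])
assert p.value(4) == 12
```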

Constructing a polyline by merging the children
(Node v has children u1 and u2, with polylines p1 and p2.) This is easy if v has one child.

Constructing a polyline by merging the children
We describe the construction of the merged polyline; the interface and data structure come later.

The required polyline interface:
split the polyline; merge two polylines; query the value for a key; get a sorted list of the breakpoints; batched interval increase; batched value predecessor; batched interval insertions; batched interval deletions.
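For reference, the operation list above can be written as an interface skeleton; the method names and signatures below are illustrative guesses, not the paper's actual API.

```python
from abc import ABC, abstractmethod
from typing import List, Tuple

class Polyline(ABC):
    """Operations required of the breakpoint-based polylines (names are hypothetical)."""

    @abstractmethod
    def split(self, key: float) -> Tuple["Polyline", "Polyline"]: ...

    @abstractmethod
    def merge(self, other: "Polyline") -> "Polyline": ...

    @abstractmethod
    def query(self, key: float) -> float: ...

    @abstractmethod
    def breakpoints(self) -> List[Tuple[float, float]]: ...   # sorted by key

    @abstractmethod
    def batched_interval_increase(self, other: "Polyline") -> None: ...

    @abstractmethod
    def batched_value_predecessor(self, values: List[float]) -> List[float]: ...

    @abstractmethod
    def batched_interval_insert(self, points: List[Tuple[float, float]]) -> None: ...

    @abstractmethod
    def batched_interval_delete(self, intervals: List[Tuple[float, float]]) -> None: ...
```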

Batched interval increase
Given polylines p1 and p2 (where |p2| ≥ |p1|), increase the value of p2 in some intervals defined by breakpoints of p1, in O(|p1| · log(|p2|/|p1|)) time.
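To pin down the semantics, here is a naive version on plain sorted breakpoint lists; the interval list is a hypothetical input format standing in for "intervals defined by breakpoints of p1". It does not achieve the stated bound, which needs the BST machinery of the following slides.

```python
import bisect

def batched_interval_increase_naive(p2, intervals):
    """p2: list of (key, value) breakpoints sorted by key (modified in place).
    intervals: list of (lo, hi, delta); adds delta to every p2 value whose key is in [lo, hi)."""
    keys = [k for k, _ in p2]
    for lo, hi, delta in intervals:
        i = bisect.bisect_left(keys, lo)
        j = bisect.bisect_left(keys, hi)
        for t in range(i, j):              # touches every breakpoint in the interval
            k, v = p2[t]
            p2[t] = (k, v + delta)
```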

Batched interval increase
Each node in the BST stores the max and min in its subtree.

Batched interval increase
Each node only stores the difference between its parent's key and its own.
Total cost: O(|p1|) for scanning the roots, plus 2 · O(log(|p2|/|p1|)) for each interval increase.
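One way to account for the stated bound, assuming the |p1| batched operations touch disjoint contiguous segments of p2's breakpoints with sizes s_1, ..., s_{|p1|} (so the sizes sum to at most |p2|, and empty segments cost O(1)), and each operation costs O(log s_i) in the BST:

```latex
\sum_{i=1}^{|p_1|} \log s_i
\;\le\; |p_1| \cdot \log\!\left(\frac{1}{|p_1|}\sum_{i=1}^{|p_1|} s_i\right)
\;\le\; |p_1| \cdot \log\frac{|p_2|}{|p_1|},
```

by concavity of the logarithm (Jensen's inequality).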

Conclusion
Unweighted dispersion: O(n) feasibility test, O(n) optimization.
Weighted dispersion: Θ(n log n) feasibility test; a log n gap remains for the optimization problem.

Thank you!