Nondecreasing Paths in Weighted Graphs Or: How to optimally read a train schedule Virginia Vassilevska.

Nondecreasing Paths in Weighted Graphs Or: How to optimally read a train schedule Virginia Vassilevska Carnegie Mellon UniversitySODA 2008.
Traveling? Departing tomorrow after 8am; arriving as early as possible!

Routes with Multiple Stops Origins: New York, Newark. Stopovers: London, Frankfurt. Flight segments (departure – arrival): 7pm – 1:20pm, 5:30pm – 10:40am, 7:45pm – 8:30pm, 11:40am – 4:15pm, 11:35am – 1pm, 5:30pm – 7pm, 9:25pm – 9:05am, 10:30am – 6pm.

Scheduling You might need to make several connections. There are multiple possible stopover points, and multiple possible schedules. How do you choose which segments to combine?

Talk Overview Graph-theoretic abstraction. History. Two algorithms. An improved, linear-time algorithm.

Graph-Theoretic Abstraction A vertex for each flight; a vertex for each city (New York, Newark, London, Frankfurt, …); (origin city, flight) edges, weighted by the departure time; (flight, destination city) edges, weighted by the arrival time. Graph question: find a nondecreasing path with minimum last edge weight.

Versions of the problem Single source – single destination. Single source, every destination (SSNP). All pairs. We'll focus on single-source SSNP.

History G. Minty 1958: graph abstraction and the first algorithm for SSNP (polynomial time). E. F. Moore 1959: a new algorithm for shortest paths and for SSNP (cubic time).

History Dijkstra 1959. Fredman and Tarjan 1987: Fibonacci-heap implementation of Dijkstra's algorithm, running in O(m + n log n) time (m = number of edges, n = number of vertices); until now the asymptotically fastest algorithm for SSNP. Nowadays: experimental research on improving implementations of Dijkstra's algorithm.

Fibonacci Heaps Fredman and Tarjan's Fibonacci heaps maintain a set of elements with weights d[·] and support: inserting an element v with d[v] = ∞ in constant time; decreasing the weight d[v] of an element v in constant (amortized) time; returning and removing the element v of minimum weight d[v] in O(log N) time, where N is the number of elements.
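To make this interface concrete, here is a rough Python stand-in (not from the talk): a binary heap with lazy decrease-key. It matches the three operations above, except that insert and decrease-key cost O(log N) here, rather than the Fibonacci heap's constant amortized time.

```python
import heapq
from math import inf

class MinHeap:
    """Binary-heap stand-in for the Fibonacci-heap interface.
    Decrease-key is done lazily: we push a fresh entry and skip
    stale ones on extraction. A real Fibonacci heap gives O(1)
    amortized insert and decrease-key; this sketch pays O(log N)."""
    def __init__(self):
        self.heap = []
        self.d = {}          # current weight of each live element
    def insert(self, v, weight=inf):
        self.d[v] = weight
        heapq.heappush(self.heap, (weight, v))
    def decrease_key(self, v, weight):
        if weight < self.d[v]:
            self.d[v] = weight
            heapq.heappush(self.heap, (weight, v))
    def extract_min(self):
        while self.heap:
            weight, v = heapq.heappop(self.heap)
            if self.d.get(v) == weight:   # skip stale entries
                del self.d[v]
                return v
        return None
```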

Dijkstra’s Algorithm for SSNP Maintain conservative “distance” estimates for all vertices: d[S] = -∞ and d[v] = ∞ for all other v. U contains the vertices with undetermined distances; use Fibonacci heaps! Vertices outside of U are completed.

Dijkstra’s algorithm Set U = V and T = { }. Iterate: pick the u from U minimizing d[u]; set T = T ∪ {u}, U = U \ {u}. For all edges (u, v): if w(u, v) ≥ d[u], set d[v] = min(d[v], w(u, v)).
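As an illustration (not code from the talk), here is a hedged Python sketch of this modified Dijkstra relaxation, with heapq and stale-entry skipping standing in for a Fibonacci heap; the adjacency-list format is an assumption.

```python
import heapq
from math import inf

def ssnp_dijkstra(graph, source):
    """Single-source nondecreasing paths, Dijkstra-style: an edge (u, v)
    may extend a path ending at u only if w(u, v) >= d[u], and the new
    estimate for v is the last-edge weight w(u, v) itself.
    graph: {vertex: [(neighbor, weight), ...]}"""
    d = {v: inf for v in graph}
    d[source] = -inf
    heap = [(-inf, source)]        # heapq stands in for a Fibonacci heap
    while heap:
        dist, u = heapq.heappop(heap)
        if dist > d[u]:
            continue               # stale entry from lazy decrease-key
        for v, w in graph[u]:
            if w >= dist and w < d[v]:
                d[v] = w
                heapq.heappush(heap, (w, v))
    return d
```

Run on the slides' example graph (edges reconstructed from the trace that follows), this yields d[b] = 1, d[P] = 2, d[d] = 2, d[a] = 2, d[c] = 2, d[Q] = 3.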

Example Graph on vertices S, P, Q, a, b, c, d; U is the Fibonacci heap, and extracted vertices keep their final distances.
Initially: U = { S: -∞; P, Q, a, b, c, d: ∞ }.
Extract S (min of U): U = { P: ∞, Q: ∞, a: 5, b: 1, c: 3, d: ∞ }.
Extract b (d[b] = 1): U = { P: 2, Q: ∞, a: 5, c: 3, d: ∞ }.
Extract P (d[P] = 2): U = { Q: 3, a: 3, c: 3, d: 2 }.
Extract d (d[d] = 2): U = { Q: 3, a: 2, c: 3 }.
Extract a (d[a] = 2): U = { Q: 3, c: 2 }.
Extract c (d[c] = 2): U = { Q: 3 }.
Extract Q (d[Q] = 3): U is empty; final distances S: -∞, b: 1, P: 2, d: 2, a: 2, c: 2, Q: 3.

Running time of Dijkstra Inserting all n nodes in the Fibonacci heap takes O(n) time. Each d[v] update is due to some edge (u, v), and each edge is touched at most once. d[v] updates take O(m) time overall.

Running time of Dijkstra Every node is removed from the Fibonacci heap at most once, so there are at most n extract-mins, taking O(n log n) time overall; this term is the bottleneck. Final runtime: O(m + n log n). This is optimal for Dijkstra's algorithm, which visits nodes in sorted order of their distance.

More on Dijkstra’s Suppose we only maintain F vertices in the Fibonacci heap, and maintain the rest in some other way. Then the runtime due to the Fibonacci heap is O(F log F + N(F)), where N(F) is the number of edges pointing to the F vertices. For F = m/log n, this is O(m)!

ALG2: Depth-First Search
DFS(v, d[v]):
  For all (v, u) with w(v, u) ≥ d[v]:
    Remove (v, u) from the graph.
    d[u] = min(d[u], w(v, u))
    DFS(u, d[u])
Set d[S] = -∞ and start with DFS(S, d[S]).
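A small Python sketch of ALG2 (illustrative, not the talk's code): edge lists are copied so that edges can be "removed" exactly as the pseudocode above describes.

```python
from math import inf

def ssnp_dfs(graph, source):
    """ALG2: DFS that follows an edge (v, u) only when w(v, u) >= d[v],
    removing each edge once it is used so no edge is traversed twice.
    graph: {vertex: [(neighbor, weight), ...]}"""
    d = {v: inf for v in graph}
    d[source] = -inf
    edges = {v: list(es) for v, es in graph.items()}  # mutable copies

    def dfs(v, dv):
        keep = []                    # edges too light to use at threshold dv
        while edges[v]:
            u, w = edges[v].pop()
            if w >= dv:              # edge (v, u) is consumed here
                if w < d[u]:
                    d[u] = w
                dfs(u, d[u])
            else:
                keep.append((u, w))
        edges[v] = keep

    dfs(source, d[source])
    return d
```

On the example graph from the trace below, this reaches the same final distances as the Dijkstra-style algorithm, but may examine an edge list once per incoming call, which is where the O(mn) worst case comes from.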

Example Same graph; distances start at S: -∞ and ∞ elsewhere.
DFS(S, -∞) uses edge (S, b): d[b] = 1; recurse DFS(b, 1).
DFS(b, 1) uses (b, P): d[P] = 2; recurse DFS(P, 2).
DFS(P, 2) uses (P, d): d[d] = 2; recurse DFS(d, 2).
DFS(d, 2) uses (d, a): d[a] = 2; recurse DFS(a, 2).
DFS(a, 2) uses (a, c): d[c] = 2; DFS(c, 2) finds no usable outedges and returns.
Back in DFS(P, 2), edge (P, Q) gives d[Q] = 3; DFS(Q, 3) explores and removes Q's remaining edges.
The recursion unwinds to DFS(S, -∞), whose remaining outedges improve nothing.
Final distances: S: -∞, b: 1, P: 2, d: 2, a: 2, c: 2, Q: 3.

Runtime of DFS The number of times we call DFS(v, d[v]) for any particular v is at most indegree(v), and each such call may have to check all of v's outedges (is w(v, u) ≥ d[v]?). Worst-case running time: Σ_v indegree(v) × outdegree(v) ≤ n Σ_v indegree(v) = O(mn), since Σ_v indegree(v) = m.

More on DFS The runtime can be improved! Suppose for a node v and weight d[v] we can access each edge (v, u) with w(v, u)≥ d[v] in O(t) time. As each edge is accessed at most once, the runtime is O(m t).

DFS with Binary Search Trees For each vertex v, insert its outedges into a binary search tree sorted by weight. Splay trees, treaps, AVL trees, etc. support the following operations on totally ordered sets of size k in O(log k) time: insert, delete, find, predecessor, successor.

DFS with Binary Search Trees Given any weight d[v], one can find an edge (v, u) with w(v, u) ≥ d[v] in O(log deg(v)) time. All the inserts at the beginning take O(Σ_v deg(v) log deg(v)) = O(m log n) time, and the DFS itself takes O(m log n) time (note Σ_v deg(v) = 2m).
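As a sketch of this per-vertex structure (names are assumptions, not from the talk), a sorted list with binary search can stand in for the balanced BST. One caveat: a Python list's pop from the middle is O(deg(v)) in the worst case, whereas a real BST makes the deletion O(log deg(v)) as well.

```python
from bisect import bisect_left

class OutEdges:
    """Sorted outedge list for one vertex: a stand-in for the balanced
    BST (splay tree / treap / AVL tree) sorted by edge weight."""
    def __init__(self, edges):
        # edges: [(target, weight), ...]; sort once by weight
        self.edges = sorted(edges, key=lambda e: e[1])
        self.weights = [w for _, w in self.edges]
    def pop_at_least(self, threshold):
        """Return and remove one edge (u, w) with w >= threshold,
        found by binary search; None if no such edge remains."""
        i = bisect_left(self.weights, threshold)
        if i == len(self.weights):
            return None
        self.weights.pop(i)
        return self.edges.pop(i)
```

In the DFS, repeatedly calling `pop_at_least(d[v])` yields exactly the edges with w(v, u) ≥ d[v], each consumed once.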

Combining Dijkstra with DFS Recall: if F nodes are kept in the Fibonacci heap, the runtime due to the heap is O(F log F + N(F)), which is O(m + n) for F = m/log n. If DFS with binary search trees is run on a set of nodes T, the runtime is O(Σ_{v ∈ T} deg(v) log deg(v)), which is O(m log log n) for T = {v | deg(v) < log n}.

Idea! Run DFS on the vertices of low degree (< log n): O(m log log n) time. Put the rest in the Fibonacci heap and run Dijkstra on them: there are at most O(m/log n) high-degree nodes, so the time due to the Fibonacci heap is O(m). We get O(m log log n), which is better than O(m + n log n) for m = o(n log n / log log n).

But we wanted linear time… Fredman and Willard's atomic heaps: after O(n) preprocessing, a collection of O(n) sets of O(log n) size (here, the outedges of the low-degree vertices) can be maintained so that the following operations take constant time: insert; delete; given w, return an element of weight ≥ w.

Linear runtime Replace the binary search trees by atomic heaps. The time due to Dijkstra with Fibonacci heaps on O(m/log n) elements is still O(m). The time due to DFS with atomic heaps: inserting the outedges into atomic heaps takes constant time per edge, and given d[v], accessing an edge with w(v, u) ≥ d[v] takes constant time. O(m + n) time overall! But how do we combine Dijkstra and DFS?

New Algorithm Stage 1: Initialize. Find all vertices v of degree ≥ log n and insert them into the Fibonacci heap with d[v] = ∞; insert S with d[S] = -∞. For each vertex u of degree < log n, add its outedges into an atomic heap sorted by weight. This stage takes O(m + n) time.

New Algorithm Stage 2: Repeat:
1. Extract the vertex v with minimum d[v] from the Fibonacci heap.
2. For each neighbor u of v with w(v, u) ≥ d[v]: update d[u] if w(v, u) < d[u]; then run DFS(u, d[u]) on the subgraph spanned by the low-degree vertices until no more vertices can be reached.
3. If the Fibonacci heap is nonempty, go to step 1.
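A rough end-to-end sketch of the two stages, with several loudly flagged simplifications: heapq with stale-entry skipping stands in for the Fibonacci heap, plain edge lists for the atomic heaps, and outdegree for "degree". The real algorithm's data structures and bookkeeping are more careful; this only illustrates the high-degree/low-degree split.

```python
import heapq
from math import inf, log2

def ssnp_combined(graph, source):
    """Sketch: vertices of (out)degree >= log n wait in a heap and are
    processed Dijkstra-style; low-degree vertices are processed by the
    edge-consuming DFS. graph: {vertex: [(neighbor, weight), ...]}"""
    n = max(len(graph), 2)
    high = {v for v in graph if len(graph[v]) >= log2(n)}
    d = {v: inf for v in graph}
    d[source] = -inf
    edges = {v: list(es) for v, es in graph.items()}
    heap = []

    def dfs(v, dv):
        if v in high:
            return                    # high-degree vertices wait in the heap
        keep = []
        while edges[v]:
            u, w = edges[v].pop()
            if w >= dv:
                if w < d[u]:
                    d[u] = w
                    if u in high:
                        heapq.heappush(heap, (w, u))
                dfs(u, d[u])
            else:
                keep.append((u, w))
        edges[v] = keep

    # Stage 1: seed the source.
    if source in high:
        heapq.heappush(heap, (d[source], source))
    else:
        dfs(source, d[source])
    # Stage 2: alternate heap extractions with low-degree DFS bursts.
    while heap:
        dist, v = heapq.heappop(heap)
        if dist > d[v]:
            continue                  # stale entry
        for u, w in edges[v]:
            if w >= d[v]:
                if w < d[u]:
                    d[u] = w
                    if u in high:
                        heapq.heappush(heap, (w, u))
                dfs(u, d[u])
        edges[v] = []                 # v's estimate is final; edges are spent
    return d
```

On the slides' example graph this classifies S and P as high-degree and reproduces the same final distances as the two standalone algorithms.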

Example Same graph; S, P, and Q go into the Fibonacci heap, and the rest are handled by DFS.
Initially: heap = { S: -∞, P: ∞, Q: ∞ }; other distances a, b, c, d: ∞.
Extract S: a: 5, b: 1, c: 3; run DFS(b, 1), which sets P: 2 in the heap, then DFS(a, 5) and DFS(c, 3).
Extract P (d[P] = 2): heap = { Q: 3 }; a improves to 3 and d: 2; DFS(d, 2) sets a: 2, DFS(a, 2) sets c: 2, and DFS(c, 2), DFS(a, 3) find nothing new.
Extract Q (d[Q] = 3): heap empty; done.
Final distances: S: -∞, b: 1, P: 2, d: 2, a: 2, c: 2, Q: 3.

Summary We gave the first linear time algorithm for the single source nondecreasing paths problem. We did this by combining two known algorithms, and by using clever data structures. Now you can read a train schedule optimally!

Directions for future work What about the all-pairs version of the problem, i.e., scheduling the best trips between every pair of cities? Naïve algorithm: apply the linear-time algorithm from every possible source: O(mn) time, O(n^3) in the worst case. Can we do anything better? O(n^2.9)?

Conclusion With the aid of the right data structures, simple algorithms can be fast. THANK YOU!