CS420 Lecture 12: Shortest Paths. Wim Bohm, CS, CSU

Shortest Paths Problems. Given a weighted directed graph G=(V,E), find shortest paths, where the length of a path is the sum of its edge weights. The shortest-path length (SP) from u to v is ∞ if there is no path from u to v.

Variations. 1) SSSP (single-source SP): find the SP from some source node s to all nodes in the graph. 2) SPSP (single-pair SP): find the SP from some u to some v. We can use 1) to solve 2); moreover, no algorithm for 2) is known that is asymptotically faster than the best algorithms for 1).

Variations. 3) SDSP (single-destination SP): can be solved with 1) by reversing the edge directions. 4) APSP (all-pairs SP): could be solved by |V| applications of 1), but can be solved faster. We will therefore concentrate on SSSP and APSP.

Optimal Substructure. The property that a shortest path from u to v via w consists of a shortest path from u to w followed by a shortest path from w to v; equivalently, any subpath of a shortest path, say from p to q, is itself a shortest path from p to q.

Negative weight edges and cycles. Negative-length cycles make the problem ill defined. Why? Because a path could traverse such a cycle arbitrarily often, driving its length toward -∞. If some weights in the graph are negative but no cycle in the graph has negative length, the SP problem stays well defined. Some algorithms (Dijkstra) require all weights to be non-negative; others (Bellman-Ford) allow negative weights but not negative-length cycles.

Zero-length cycles. We have already ruled out negative-length cycles, and a shortest path cannot contain a positive-length cycle. What about zero-length cycles? We can remove a zero-length cycle from a path and obtain a path of the same length. Hence we can assume that shortest paths have no cycles and thus have at most |V|-1 edges.

SSSP and shortest-path trees. For each vertex v on the shortest path (or the shortest path under construction) we maintain its predecessor π(v), a node or nil. The predecessor subgraph G_π = (V_π, E_π) consists of the source s and the vertices v with π(v) ≠ nil, together with the edges (π(v), v). Shortest-path algorithms make G_π a shortest-path tree rooted at s, containing shortest paths to all vertices reachable from s.

d(v). For each vertex v we maintain d(v), the shortest-path estimate: an upper bound on the shortest-path length to v. We initialize the d and π values as follows:
Init-SingleSource(G,s) {
  for each vertex v { d[v] = ∞; π[v] = nil }
  d[s] = 0
}

Arithmetic with ∞: if a ≠ -∞, we have a + ∞ = ∞ + a = ∞; if a ≠ ∞, we have a + (-∞) = (-∞) + a = -∞.
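
A small aside, not from the slides: Python's float('inf') follows these conventions, which is why the sketches later in these notes use it for ∞.

    INF = float('inf')
    print(5 + INF)       # inf   (a + inf = inf for finite a)
    print(5 + (-INF))    # -inf  (a + (-inf) = -inf for finite a)
    print(INF + (-INF))  # nan   (this case is excluded by the rules above)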

Relax. Shortest-path algorithms use the operation of relaxing an edge (u,v): test whether the current path from s to v can be improved by going through u and, if so, update d[v] and π[v]:
Relax(u,v,w) {
  if (d[v] > (r = d[u] + w(u,v))) { d[v] = r; π[v] = u }
}
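
A minimal Python sketch of Init-SingleSource and Relax, assuming the graph is given as an adjacency dict mapping every vertex to a dict {neighbor: weight} (the names init_single_source and relax are mine, not from the slides):

    INF = float('inf')

    def init_single_source(graph, s):
        # d[v] = infinity and pi[v] = nil for every vertex, then d[s] = 0
        d = {v: INF for v in graph}
        pi = {v: None for v in graph}
        d[s] = 0
        return d, pi

    def relax(u, v, w_uv, d, pi):
        # If going to v through u improves the estimate, update d[v] and pi[v].
        if d[v] > d[u] + w_uv:
            d[v] = d[u] + w_uv
            pi[v] = u
            return True
        return False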

δ(s,v). The final value of d[v] is denoted δ(s,v), the shortest-path length from s to v. Assuming the algorithm starts with Init-SingleSource and only updates d and π through Relax, the following properties of shortest paths and Relax can be proved (Cormen et al., Ch. 24.5).

Properties. Triangle inequality: for any edge (u,v), δ(s,v) ≤ δ(s,u) + w(u,v) (optimal substructure). Upper-bound property: d[v] ≥ δ(s,v), and once d[v] reaches δ(s,v) it never changes (nature of Relax).

More properties. No-path property: if there is no path from s to v, then d[v] = δ(s,v) = ∞ (Relax never changes d[v]). Convergence property: if s ⇝ u → v is a shortest path and d[u] = δ(s,u) before relaxing edge (u,v), then d[v] = δ(s,v) afterward (optimal substructure).

And more. Path-relaxation property: if p = ⟨v_0, v_1, ..., v_k⟩ is a shortest path from s = v_0 to v_k, and its edges are relaxed in the order (v_0,v_1), (v_1,v_2), etc., then d[v_k] = δ(s,v_k) (inductive argument). Predecessor-subgraph property: once d[v] = δ(s,v) for all v in V, the predecessor subgraph is a shortest-path tree rooted at s (nature of Relax).

Bellman-Ford SSSP. Allows negative edge weights and checks for negative-length cycles: it returns false if there is a negative-length cycle; otherwise BF returns true, with the shortest-path lengths in d and the shortest-path tree in π.

Bellman-Ford(G,w,s) {
  Init-SingleSource(G,s)
  for i = 1 to |V|-1
    for each (u,v) in E
      Relax(u,v,w)
  // check for negative cycles
  for each (u,v) in E
    if (d[v] > d[u] + w(u,v)) return false
  return true
}
Complexity?

Bellman-Ford(G,w,s) {
  Init-SingleSource(G,s)
  for i = 1 to |V|-1
    for each (u,v) in E
      Relax(u,v,w)
  // check for negative cycles
  for each (u,v) in E
    if (d[v] > d[u] + w(u,v)) return false
  return true
}
Complexity: O(|V| * |E|).
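
A runnable Python sketch of Bellman-Ford, building on the init_single_source and relax helpers above; the function name bellman_ford and the construction of the edge list from the adjacency dict are my choices, not from the slides.

    def bellman_ford(graph, s):
        # Returns (ok, d, pi); ok is False iff a negative-length cycle is reachable from s.
        d, pi = init_single_source(graph, s)
        edges = [(u, v, w) for u in graph for v, w in graph[u].items()]
        for _ in range(len(graph) - 1):      # |V|-1 sweeps over all edges
            for u, v, w in edges:
                relax(u, v, w, d, pi)
        for u, v, w in edges:                # check for negative cycles
            if d[v] > d[u] + w:
                return False, d, pi
        return True, d, pi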

Correctness of Bellman-Ford. Cormen et al.: the proof for the case that there are no negative-length cycles is based on the path-relaxation property: for each shortest path, the first edge is found (at the latest) in sweep one, and the i-th edge in sweep i. If after |V|-1 sweeps there is still improvement, then there must be a negative-length cycle.

Example BF1. State: right after Init-SS. [Figure: the nodes show their d values: s=0, a=b=c=∞.] All π values are nil. BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). What happens in the first sweep? ..... Relax all edges .....

Example BF1. State: right after Init-SS. [Figure: d values s=0, a=b=c=∞.] All π values are nil. BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the first sweep? In the first sweep: d[b] > 0+6, so d[b]=6 and π[b]=s; d[c] > 0+7, so d[c]=7 and π[c]=s.

Example BF1. State: after sweep 1. [Figure: d values s=0, b=6, c=7, a=∞.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the second sweep?

Example BF1. State: after sweep 1. [Figure: d values s=0, b=6, c=7, a=∞.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the second sweep? In the second sweep: d[a] > 6+5, so d[a]=11 and π[a]=b; then d[a] > 7-3, so d[a]=4 and π[a]=c; and d[b] > 4-2, so d[b]=2 and π[b]=a.

[Figure: d values s=0, b=2, c=7, a=4.] What happens in the third sweep? What is returned?

[Figure: d values s=0, b=2, c=7, a=4.] In the third sweep nothing changes; BF1 returns true.
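
As a sanity check (my addition), here is the BF1 graph fed to the bellman_ford sketch above, with the edge weights read off from the relaxation steps: w(s,b)=6, w(s,c)=7, w(b,a)=5, w(c,a)=-3, w(a,b)=-2. The dict order is chosen so the edges are scanned in the slide's order (a,b), (b,a), (c,a), (s,b), (s,c).

    bf1 = {
        'a': {'b': -2},
        'b': {'a': 5},
        'c': {'a': -3},
        's': {'b': 6, 'c': 7},
    }
    ok, d, pi = bellman_ford(bf1, 's')
    print(ok)   # True
    print(d)    # {'a': 4, 'b': 2, 'c': 7, 's': 0}
    print(pi)   # {'a': 'c', 'b': 'a', 'c': 's', 's': None}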

Example BF2, on a slightly changed graph. State: right after Init-SS. [Figure: d values s=0, a=b=c=∞.] All π values are nil. BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the first sweep?

Example BF2. State after the first sweep. [Figure: d values s=0, b=6, c=7, a=∞.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } In the first sweep: d[b] > 0+6, so d[b]=6 and π[b]=s; d[c] > 0+7, so d[c]=7 and π[c]=s.

Example BF2. State after the first sweep. [Figure: d values s=0, b=6, c=7, a=∞.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the second sweep?

Example BF2. State after the second sweep. [Figure: d values s=0, b=6, c=7, a=1.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } In the second sweep: d[a] > 6-5, so d[a]=1 and π[a]=b.

Example BF2. State after the second sweep. [Figure: d values s=0, b=6, c=7, a=1.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } What happens in the third sweep?

Example BF2. State after the third sweep. [Figure: d values s=0, b=3, c=7, a=1.] BF does 3 sweeps over all edges; assume the order (a,b), (b,a), (c,a), (s,b), (s,c). Relax(u,v,w) { if (d[v] > (r = d[u]+w(u,v))) { d[v] = r; π[v] = u } } In sweep three we can now improve d[b], but the predecessor subgraph is broken! We could keep lowering d[a] and d[b] in subsequent sweeps. The Bellman-Ford algorithm detects this in its final check over all edges and returns false.

SSSP in a DAG. By relaxing the edges of a DAG in topological-sort order we need to relax each edge only once, because in any path p = ⟨v_0, v_1, ..., v_k⟩ the edge (v_0,v_1) topologically precedes (v_1,v_2), and so on. So the path-relaxation property implies d[v] = δ(s,v) for all v.
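
A Python sketch of this DAG algorithm, building on the helpers above; the names topological_order and dag_shortest_paths are mine, and it assumes every vertex appears as a key in the adjacency dict and that the input really is a DAG.

    def topological_order(graph):
        # Recursive DFS; a vertex is appended after everything reachable from it,
        # so reversing the finish order gives a topological order.
        seen, order = set(), []
        def visit(u):
            seen.add(u)
            for v in graph[u]:
                if v not in seen:
                    visit(v)
            order.append(u)
        for u in graph:
            if u not in seen:
                visit(u)
        return list(reversed(order))

    def dag_shortest_paths(graph, s):
        d, pi = init_single_source(graph, s)
        for u in topological_order(graph):
            for v, w in graph[u].items():
                relax(u, v, w, d, pi)       # each edge is relaxed exactly once
        return d, pi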

Dijkstra's SSSP. Dijkstra's greedy algorithm solves SSSP for directed graphs with non-negative edge weights. Much as in Prim's minimum-spanning-tree algorithm, a set S of vertices whose shortest-path distance has been determined is extended step by step, and the distances of vertices not yet reached are updated. The vertices are maintained in a min-priority queue, keyed on their d value.

Back to Prim's MST. In lecture 8 (Greedy algorithms) we sketched Prim's MST: – the growing MST A is a tree – start with an arbitrary vertex and pick the minimal emanating edge – keep adding the vertex of minimal distance to A, adjusting the minimal distances of the vertices not yet in A – store V in a min-priority queue.

MST-Prim(G,w,s) { // G=(V,E), weights w, root s
  Init-SingleSource(G,s)
  Q = V   // initial min-priority queue, e.g. heapified
  while Q not empty {
    u = extract-min(Q)   // the first one will be s
    for each v in adj(u)
      if v in Q and w(u,v) < v.d { v.d = w(u,v); v.π = u }
  }
}

Complexity of Prim. Q is a binary min-heap (see the sorting lecture) – we can build the heap in O(|V|) time. The while loop executes |V| times – each extract-min takes O(lg |V|) time, totalling O(|V| lg |V|) time.

Complexity of Prim, cont'd. The inner for loop executes |E| times in total – because each vertex is extracted only once, each edge in the adjacency list representing G is considered once – the update v.d = w(u,v) in the innermost conditional involves a decrease-key operation on the heap, taking O(lg |V|) time. Hence the complexity with a binary min-heap is O((|V|+|E|) lg |V|), which is O(|E| lg |V|) for connected graphs.
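
Here is one way this recap could look in Python. Since heapq has no decrease-key, the sketch pushes a fresh entry and skips stale ones instead (a common workaround, giving O(|E| lg |E|) but the same behavior). The name prim_mst and the adjacency-dict input (undirected edges stored in both directions) are my assumptions; it reuses INF from the earlier sketch.

    import heapq

    def prim_mst(graph, s):
        # graph: undirected, as {u: {v: weight}} with every vertex a key.
        # Returns the parent map pi describing an MST rooted at s.
        in_tree = set()
        key = {v: INF for v in graph}
        pi = {v: None for v in graph}
        key[s] = 0
        heap = [(0, s)]
        while heap:
            _, u = heapq.heappop(heap)
            if u in in_tree:                 # stale heap entry, already extracted
                continue
            in_tree.add(u)
            for v, w in graph[u].items():
                if v not in in_tree and w < key[v]:
                    key[v] = w               # "decrease-key" by pushing a fresh entry
                    pi[v] = u
                    heapq.heappush(heap, (w, v))
        return pi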

Dijkstra(G,w,s) { // very much like Prim
  S = empty; Q = V(G)   // min-priority queue
  while (Q not empty) {
    u = Extract-Min(Q); S = S ∪ {u}
    for each v adjacent to u
      Relax(u,v,w)
  }
}
The while loop keeps the invariant Q = V - S. The first extracted u is s. The Relax function decreases the d values of some vertices using the decrease-key min-priority-queue operation (see the sorting lecture). Relax builds d and π.

Dijkstra(G,w,s) {
  S = empty; Q = V(G)   // min-priority queue
  while (Q not empty) {
    u = Extract-Min(Q); S = S ∪ {u}
    for each v adjacent to u
      Relax(u,v,w)
  }
}
The correctness proof is very similar to the proof for Prim's MST.
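
The same pattern gives a Dijkstra sketch in Python, again replacing decrease-key with lazy deletion; the only substantive difference from the Prim sketch is that relaxation compares d[u] + w(u,v) rather than w(u,v). It builds on init_single_source and relax from earlier; the name dijkstra is mine, and non-negative weights are assumed.

    import heapq

    def dijkstra(graph, s):
        # Shortest paths from s in a directed graph {u: {v: weight}} with
        # non-negative weights. Returns (d, pi).
        d, pi = init_single_source(graph, s)
        done = set()                              # the set S of finished vertices
        heap = [(0, s)]
        while heap:
            _, u = heapq.heappop(heap)
            if u in done:                         # stale heap entry
                continue
            done.add(u)
            for v, w in graph[u].items():
                if v not in done and relax(u, v, w, d, pi):
                    heapq.heappush(heap, (d[v], v))   # "decrease-key" by re-pushing
        return d, pi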

Example: Dijkstra on a small four-node graph. [Figure sequence of d-value snapshots, one per Extract-Min: initially s=0, a=b=c=∞; then s=0, b=2, c=7, a=∞; then s=0, b=2, c=3, a=7; finally s=0, b=2, c=3, a=4.]

Floyd-Warshall's APSP algorithm. Floyd-Warshall's dynamic-programming APSP algorithm runs in Θ(|V|^3) time and uses the adjacency-matrix representation: w_ij = 0 if i = j; w_ij = the weight of edge (i,j) if (i,j) is in E; w_ij = ∞ otherwise. Negative-weight edges may be present, but negative-length cycles may not.

Vertex sets. Denote the vertex set V as {1,2,...,n} and let V_k be the subset {1,2,...,k}. The algorithm uses a recurrence which, in step k, considers paths from every i to every j either (not taking node k into account) with intermediate nodes from V_(k-1) only, or (taking node k into account) with intermediate nodes from V_(k-1) plus node k.

Is k an intermediate node in V_k? If k is not an intermediate node, then the shortest path from i to j in step k equals the shortest path in step k-1. If k is an intermediate node, then the shortest path from i to j in step k is the shortest path from i to k in step k-1 plus the shortest path from k to j in step k-1.

V_k: the "via" set through which paths from i to j may go. Equation form of the previous slide:
d_ij^(k) = w_ij                                        if k = 0
d_ij^(k) = min( d_ij^(k-1), d_ik^(k-1) + d_kj^(k-1) )  if k > 0

Floyd-Warshall:
Floyd-Warshall(W) {
  D = W
  for k = 1 to n
    for i = 1 to n
      for j = 1 to n
        D[i,j] = min(D[i,j], D[i,k] + D[k,j])
}

Predecessors in Floyd-Warshall. A recurrence for the predecessor π can be derived much like the recurrence for d:
π_ij^(0) = nil   if i = j or (i,j) not in E
π_ij^(0) = i     otherwise
π_ij^(k) = π_ij^(k-1)   if d_ij^(k-1) ≤ d_ik^(k-1) + d_kj^(k-1)
π_ij^(k) = π_kj^(k-1)   if d_ij^(k-1) > d_ik^(k-1) + d_kj^(k-1)
Code for computing the predecessors is easily incorporated into the Floyd-Warshall function (see the sketch below).
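
A Python sketch of Floyd-Warshall with the predecessor matrix folded in as described above; it takes an n×n weight matrix (lists of lists, INF for absent edges), updates a single matrix D in place (the usual space-saving variant), and the names floyd_warshall and pred are mine.

    def floyd_warshall(W):
        # W: n x n weight matrix with W[i][i] = 0 and INF for absent edges.
        # Returns (D, pred): shortest-path lengths and the predecessor matrix.
        n = len(W)
        D = [row[:] for row in W]
        pred = [[None if i == j or W[i][j] == INF else i for j in range(n)]
                for i in range(n)]
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if D[i][k] + D[k][j] < D[i][j]:
                        D[i][j] = D[i][k] + D[k][j]
                        pred[i][j] = pred[k][j]   # go to j via k's predecessor chain
        return D, pred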

Transitive closure. Given a graph G=(V,E), the transitive-closure graph G* = (V, E*) has an edge (i,j) if there is a path from i to j in G. G* can be computed with Floyd-Warshall by – initially assigning D_ij = 1 if (i,j) is in E (or i = j) and 0 otherwise, – and replacing min by "or" and + by "and" in the loop.
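
A sketch of that boolean variant; the name transitive_closure and the boolean adjacency-matrix input are my assumptions.

    def transitive_closure(adj):
        # adj: n x n boolean matrix, adj[i][j] True iff (i,j) in E.
        # Returns T with T[i][j] True iff j is reachable from i.
        n = len(adj)
        T = [[adj[i][j] or i == j for j in range(n)] for i in range(n)]
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    T[i][j] = T[i][j] or (T[i][k] and T[k][j])
        return T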