Algorithm Analysis Fall 2017 CS 4306/03


Algorithm Analysis Fall 2017 CS 4306/03
Shortest Paths III: All-pairs shortest paths, matrix-multiplication algorithm, Floyd-Warshall algorithm, Johnson's algorithm
Dr. Donghyun Kim, Department of Computer Science, College of Computing and Software Engineering, Kennesaw State University, Marietta, GA, USA
These slides are courtesy of Prof. Charles E. Leiserson at the Massachusetts Institute of Technology.

Shortest paths
Single-source shortest paths:
- Nonnegative edge weights: Dijkstra's algorithm, O(E + V lg V)
- General: Bellman-Ford algorithm, O(VE)
- DAG: one pass of Bellman-Ford, O(V + E)

Shortest paths
Single-source shortest paths:
- Nonnegative edge weights: Dijkstra's algorithm, O(E + V lg V)
- General: Bellman-Ford, O(VE)
- DAG: one pass of Bellman-Ford, O(V + E)
All-pairs shortest paths:
- Dijkstra's algorithm |V| times: O(VE + V² lg V)
- Three algorithms today.

All-pairs shortest paths
Input: digraph G = (V, E), where V = {1, 2, …, n}, with edge-weight function w : E → R.
Output: n × n matrix of shortest-path lengths δ(i, j) for all i, j ∈ V.

All-pairs shortest paths
Input: digraph G = (V, E), where V = {1, 2, …, n}, with edge-weight function w : E → R.
Output: n × n matrix of shortest-path lengths δ(i, j) for all i, j ∈ V.
IDEA: Run Bellman-Ford once from each vertex. Time = O(V²E). On a dense graph (Θ(n²) edges) this is Θ(n⁴) time in the worst case. Good first try!
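
A minimal Python sketch of this first try (illustrative names; the graph is assumed to be given as a vertex count n and a list of (u, v, w) edge triples with vertices numbered 0 to n−1; negative-cycle detection is omitted):

    import math

    def bellman_ford(n, edges, source):
        # Single-source shortest distances via n - 1 rounds of edge relaxation: O(VE).
        dist = [math.inf] * n
        dist[source] = 0
        for _ in range(n - 1):
            for u, v, w in edges:
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
        return dist

    def apsp_by_bellman_ford(n, edges):
        # Run Bellman-Ford once from each of the n vertices: O(V^2 E) overall.
        return [bellman_ford(n, edges, s) for s in range(n)]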

Dynamic programming
Consider the n × n adjacency matrix A = (a_ij) of the digraph, and define
  d_ij^(m) = weight of a shortest path from i to j that uses at most m edges.
Claim: We have
  d_ij^(0) = 0 if i = j, ∞ if i ≠ j;
and for m = 1, 2, …, n – 1,
  d_ij^(m) = min_k { d_ik^(m–1) + a_kj }.

Proof of claim
  d_ij^(m) = min_k { d_ik^(m–1) + a_kj }
[Figure: a shortest path from i to j using at most m edges reaches some vertex k using at most m – 1 edges and then follows the single edge (k, j).]

Proof of claim
  d_ij^(m) = min_k { d_ik^(m–1) + a_kj }
[Figure: reach some vertex k from i using at most m – 1 edges, then take edge (k, j).]
Relaxation!
  for k ← 1 to n
      do if d_ij > d_ik + a_kj
             then d_ij ← d_ik + a_kj

Proof of claim
  d_ij^(m) = min_k { d_ik^(m–1) + a_kj }
[Figure: reach some vertex k from i using at most m – 1 edges, then take edge (k, j).]
Relaxation!
  for k ← 1 to n
      do if d_ij > d_ik + a_kj
             then d_ij ← d_ik + a_kj
Note: No negative-weight cycles implies
  δ(i, j) = d_ij^(n–1) = d_ij^(n) = d_ij^(n+1) = …
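
A minimal Python sketch of this relaxation step, assuming the CLRS adjacency-matrix convention (a_ii = 0, a_ij = ∞ when edge (i, j) is absent):

    import math

    def extend(D, A):
        # One DP step: Dnew[i][j] = min over k of D[i][k] + A[k][j],
        # i.e. allow one more edge at the end of every path.
        n = len(A)
        Dnew = [[math.inf] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    if D[i][k] + A[k][j] < Dnew[i][j]:
                        Dnew[i][j] = D[i][k] + A[k][j]
        return Dnew

Starting from D^(0) (zeros on the diagonal, ∞ elsewhere), n – 1 calls to extend produce D^(n–1) = (δ(i, j)); each call costs Θ(n³), giving the Θ(n⁴) total examined on the next slides.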

Matrix multiplication
Compute C = A · B, where C, A, and B are n × n matrices:
  c_ij = Σ_{k=1}^{n} a_ik · b_kj.
Time = Θ(n³) using the standard algorithm.

Matrix multiplication
Compute C = A · B, where C, A, and B are n × n matrices:
  c_ij = Σ_{k=1}^{n} a_ik · b_kj.
Time = Θ(n³) using the standard algorithm.
What if we map "+" → "min" and "·" → "+"?

Matrix multiplication
Compute C = A · B, where C, A, and B are n × n matrices:
  c_ij = Σ_{k=1}^{n} a_ik · b_kj.
Time = Θ(n³) using the standard algorithm.
What if we map "+" → "min" and "·" → "+"?
  c_ij = min_k { a_ik + b_kj }.
Thus, D^(m) = D^(m–1) "×" A.

Matrix multiplication (continued)
The (min, +) multiplication is associative, and with the real numbers it forms an algebraic structure called a closed semiring.
Consequently, we can compute
  D^(1) = D^(0) · A = A¹
  D^(2) = D^(1) · A = A²
  ⋮
  D^(n–1) = D^(n–2) · A = A^(n–1),
yielding D^(n–1) = (δ(i, j)).
Time = Θ(n · n³) = Θ(n⁴). No better than n × Bellman-Ford.

Improved matrix multiplication algorithm
Repeated squaring: A^(2k) = A^k × A^k.
Compute A², A⁴, …, A^(2^⌈lg(n–1)⌉): O(lg n) squarings.
Note: A^(n–1) = A^n = A^(n+1) = …
Time = Θ(n³ lg n).
To detect negative-weight cycles, check the diagonal for negative values in O(n) additional time.
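
A sketch of the repeated-squaring version in Python, under the same matrix convention (a_ii = 0, ∞ for missing edges); min_plus below plays the role of the "×" on the previous slides, and the negative-cycle test here uses one extra product rather than the O(n) diagonal scan quoted above:

    import math

    def min_plus(A, B):
        # (min, +) product: C[i][j] = min over k of A[i][k] + B[k][j].
        n = len(A)
        C = [[math.inf] * n for _ in range(n)]
        for i in range(n):
            for k in range(n):
                for j in range(n):
                    if A[i][k] + B[k][j] < C[i][j]:
                        C[i][j] = A[i][k] + B[k][j]
        return C

    def apsp_by_squaring(A):
        # Square until walks of at least n - 1 edges are covered:
        # O(lg n) products of Theta(n^3) each, i.e. Theta(n^3 lg n).
        n = len(A)
        D, m = A, 1
        while m < n - 1:
            D = min_plus(D, D)
            m *= 2
        # One more extension covers closed walks of up to n edges, so a
        # negative diagonal entry signals a negative-weight cycle.
        has_negative_cycle = any(min_plus(D, A)[i][i] < 0 for i in range(n))
        return D, has_negative_cycle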

Floyd-Warshall algorithm
Also dynamic programming, but faster!
Define c_ij^(k) = weight of a shortest path from i to j with intermediate vertices belonging to the set {1, 2, …, k}.
Thus, δ(i, j) = c_ij^(n). Also, c_ij^(0) = a_ij.

Floyd-Warshall recurrence
  c_ij^(k) = min { c_ij^(k–1), c_ik^(k–1) + c_kj^(k–1) }
[Figure: a shortest path from i to j with intermediate vertices in {1, 2, …, k} either avoids vertex k entirely, or splits at k into an i-to-k piece and a k-to-j piece whose intermediate vertices lie in {1, 2, …, k – 1}.]

Pseudocode for Floyd-Warshall
  for k ← 1 to n
      do for i ← 1 to n
             do for j ← 1 to n
                    do if c_ij > c_ik + c_kj
                           then c_ij ← c_ik + c_kj    ⊳ relaxation
Notes:
- Okay to omit superscripts, since extra relaxations can't hurt.
- Runs in Θ(n³) time.
- Simple to code.
- Efficient in practice.
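
A direct, minimal Python transcription of this pseudocode (0-indexed; the input matrix uses the same convention, a_ii = 0 and ∞ for missing edges):

    import math

    def floyd_warshall(A):
        # c[i][j] starts as a_ij and is relaxed in place; the superscripts
        # can be dropped because extra relaxations never hurt.
        n = len(A)
        c = [row[:] for row in A]
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if c[i][k] + c[k][j] < c[i][j]:
                        c[i][j] = c[i][k] + c[k][j]
        return c  # c[i][j] = delta(i, j); Theta(n^3) time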

Transitive closure of a directed graph
Compute t_ij = 1 if there exists a path from i to j, 0 otherwise.
IDEA: Use Floyd-Warshall, but with (∨, ∧) instead of (min, +):
  t_ij^(k) = t_ij^(k–1) ∨ (t_ik^(k–1) ∧ t_kj^(k–1)).
Time = Θ(n³).
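
The same triple loop with Booleans, as a sketch (adj[i][j] is True iff edge (i, j) is present; each vertex is taken to reach itself):

    def transitive_closure(adj):
        # Floyd-Warshall with (or, and) in place of (min, +).
        n = len(adj)
        t = [[adj[i][j] or i == j for j in range(n)] for i in range(n)]
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    t[i][j] = t[i][j] or (t[i][k] and t[k][j])
        return t  # t[i][j] is True iff j is reachable from i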

Graph reweighting
Theorem. Given a function h : V → R, reweight each edge (u, v) ∈ E by w_h(u, v) = w(u, v) + h(u) – h(v). Then, for any two vertices, all paths between them are reweighted by the same amount.

Graph reweighting
Theorem. Given a function h : V → R, reweight each edge (u, v) ∈ E by w_h(u, v) = w(u, v) + h(u) – h(v). Then, for any two vertices, all paths between them are reweighted by the same amount.
Proof. Let p = v₁ → v₂ → … → v_k be a path in G. We have
  w_h(p) = Σ_{i=1}^{k–1} w_h(v_i, v_{i+1})
         = Σ_{i=1}^{k–1} [ w(v_i, v_{i+1}) + h(v_i) – h(v_{i+1}) ]
         = Σ_{i=1}^{k–1} w(v_i, v_{i+1}) + h(v₁) – h(v_k)    ⊳ the h terms telescope
         = w(p) + h(v₁) – h(v_k).
The correction h(v₁) – h(v_k) is the same amount for every path from v₁ to v_k.

Shortest paths in reweighted graphs
Corollary. δ_h(u, v) = δ(u, v) + h(u) – h(v).

Shortest paths in reweighted graphs
Corollary. δ_h(u, v) = δ(u, v) + h(u) – h(v).
IDEA: Find a function h : V → R such that w_h(u, v) ≥ 0 for all (u, v) ∈ E. Then, run Dijkstra's algorithm from each vertex on the reweighted graph.
NOTE: w_h(u, v) ≥ 0 iff h(v) – h(u) ≤ w(u, v).

(u, v) = h(u, v) – h(u) + h(v) . Johnson’s algorithm Find a function h : V  R such that wh(u, v)  0 for all (u, v)  E by using Bellman-Ford to solve the difference constraints h(v) – h(u)  w(u, v), or determine that a negative-weight cycle exists. Time = O(V E). Run Dijkstra’s algorithm using wh from each vertex u  V to compute h(u, v) for all v  V. Time = O(VE + V 2 lg V). For each (u, v)  V  V, compute (u, v) = h(u, v) – h(u) + h(v) . Time = O(V 2). Total time = O(VE + V 2 lg V). Fall 2016 CS 4306 Algorithm Analysis

Problem Set
Exercises 25.1-1, 25.1-2
Exercises 25.2-1, 25.2-4, 25.2-6