The Manhattan Tourist Problem Shane Wood 4/29/08 CS 329E.

Problem Summary A tourist group in Manhattan wishes to see as many sights as possible while moving southeast from the corner of 59th St. and 8th Ave. to 42nd St. and Lexington Ave. How can we achieve this using dynamic programming?

Summary (cont.) Imagine the map as a graph with a source (59th St. and 8th Ave.) and a sink (42nd St. and Lexington Ave.): [Grid figure: avenues from 8th Ave. east to Third Ave. across the top, streets from 59th St. south to 42nd St. down the side.]

[Grid figure: the same street map drawn as a graph, with a vertex at each intersection and a weighted edge along each block.] By imagining vertices representing intersections and weighted edges representing street blocks, we can reduce the Manhattan Tourist Problem to what is known as the Longest Path Problem.

Manhattan Tourist Problem:
Input: A weighted grid G with a starting source vertex and an ending sink vertex
Output: A longest path in G from source to sink

Strategy It is better to solve a generalized version of the MTP. Rather than solving only the longest path from (0,0) (the source) to (n,m) (the sink of an n x m grid), we will solve from (0,0) to an arbitrary vertex (i,j) for 0 ≤ i ≤ n and 0 ≤ j ≤ m. This works because of optimal substructure: if (i,j) is a vertex on the longest path to the sink, the longest path to (i,j) must be contained in the longest path to the sink.

Exhaustive Solution Generate ALL possible paths in grid G and output the longest one. Not feasible for even a moderately sized grid: an n x m grid contains C(n+m, n) source-to-sink paths, which grows exponentially.
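To make that blow-up concrete, here is a minimal brute-force sketch. The grid, edge weights, and function name are illustrative inventions, not from the slides; the recursion simply tries every monotone (south/east) path and keeps the best total.

```python
def brute_force_longest(down, right, i=0, j=0):
    """Recursively try every monotone path from (i, j) to the sink (n, m).

    down[i][j]  = weight of the edge from (i, j) south to (i + 1, j)
    right[i][j] = weight of the edge from (i, j) east to (i, j + 1)
    """
    n = len(down)        # the grid has n + 1 rows of vertices
    m = len(right[0])    # the grid has m + 1 columns of vertices
    if i == n and j == m:        # reached the sink
        return 0
    best = float("-inf")
    if i < n:                    # option 1: a southerly move
        best = max(best, down[i][j] + brute_force_longest(down, right, i + 1, j))
    if j < m:                    # option 2: an easterly move
        best = max(best, right[i][j] + brute_force_longest(down, right, i, j + 1))
    return best

# A tiny 2 x 2 grid with invented weights (6 possible paths in total):
down = [[1, 0, 2],
        [4, 6, 5]]     # n rows, m + 1 columns of southward edge weights
right = [[3, 2],
         [3, 2],
         [0, 7]]       # n + 1 rows, m columns of eastward edge weights
```

Each call branches up to two ways, so the work done is proportional to the C(n+m, n) path count, which is exactly why this approach fails beyond small grids.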

A Greedy Algorithm At every vertex, choose the adjacent edge with the highest weight. Easily achievable in polynomial time, but unlikely to give the optimal solution, especially for larger graphs! [Figure: a small source-to-sink grid on which the greedy choice misses the longest path.]
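A sketch of that greedy rule (function name, tie-breaking, and weights are my own choices, not from the slides). On this particular invented grid the greedy walk scores 12 while the true longest path scores 17, showing how a locally heavy edge can lead the tourist away from the best route.

```python
def greedy_longest(down, right):
    """Walk source to sink, always taking the heavier outgoing edge.

    Ties go south; at the grid border only one direction remains.
    down[i][j] and right[i][j] are the southward / eastward edge weights.
    """
    n, m = len(down), len(right[0])
    i = j = total = 0
    while i < n or j < m:
        south = down[i][j] if i < n else float("-inf")
        east = right[i][j] if j < m else float("-inf")
        if south >= east:
            total += south
            i += 1
        else:
            total += east
            j += 1
    return total

# Invented weights: the heavy 7-edge lies on a route the greedy walk never takes.
down = [[1, 0, 2],
        [4, 6, 5]]
right = [[3, 2],
         [3, 2],
         [0, 7]]
```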

DP Approach For every vertex (i,j), we want to find s_{i,j}, the weight of the longest path from (0,0) to that vertex. Base case: finding s_{0,j} and s_{i,0} for all i and j is easy, since only one monotone path (straight east along the top row, or straight south along the first column) reaches each border vertex, so each value is just a running sum of edge weights.

The tourist can now arrive at (1,1) in one of two ways: traveling south from (0,1), or traveling east from (1,0). Once s_{0,j} and s_{i,0} are computed, we can determine s_{1,1} by comparing these two possibilities and taking the max: s_{1,1} = max( s_{0,1} + weight of the edge between (0,1) and (1,1), s_{1,0} + weight of the edge between (1,0) and (1,1) )
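On a small hypothetical grid (all weights invented here), that comparison is a single max of two sums:

```python
# Hypothetical values for the two ways into vertex (1, 1):
s_0_1 = 3      # weight of the longest path from (0, 0) to (0, 1)
s_1_0 = 1      # weight of the longest path from (0, 0) to (1, 0)
w_south = 0    # weight of the edge from (0, 1) south to (1, 1)
w_east = 3     # weight of the edge from (1, 0) east to (1, 1)

# The recurrence picks the better of the two arrivals:
s_1_1 = max(s_0_1 + w_south, s_1_0 + w_east)   # max(3, 4) = 4
```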

This same logic applies more generally: s_{i,j} = max( s_{i-1,j} + weight of the edge between (i-1,j) and (i,j), s_{i,j-1} + weight of the edge between (i,j-1) and (i,j) ) We can thus compute every value s_{i,j} from this recurrence in a single pass through the grid.

Algorithm used for DP solution Let w↓_{i,j} represent the weight of a southerly move (the weight of the edge between (i-1,j) and (i,j)) and w→_{i,j} represent the weight of an easterly move (the weight of the edge between (i,j-1) and (i,j)):
1 s_{0,0} ← 0
2 for i ← 1 to n
3   s_{i,0} ← s_{i-1,0} + w↓_{i,0}
4 for j ← 1 to m
5   s_{0,j} ← s_{0,j-1} + w→_{0,j}
6 for i ← 1 to n
7   for j ← 1 to m
8     s_{i,j} ← max( s_{i-1,j} + w↓_{i,j}, s_{i,j-1} + w→_{i,j} )
9 return s_{n,m}
Running time: O(n × m) for an n × m grid
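The pseudocode above translates almost line for line into Python. This sketch (function and variable names are mine, not from the slides) fills and returns the whole s table, with s[n][m] holding the answer.

```python
def longest_path_table(down, right):
    """Fill s[i][j] = weight of the longest path from (0, 0) to (i, j).

    down[i-1][j]  = weight of the edge from (i-1, j) to (i, j)  (southerly move)
    right[i][j-1] = weight of the edge from (i, j-1) to (i, j)  (easterly move)
    """
    n, m = len(down), len(right[0])
    s = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):          # first column: only southerly moves
        s[i][0] = s[i - 1][0] + down[i - 1][0]
    for j in range(1, m + 1):          # first row: only easterly moves
        s[0][j] = s[0][j - 1] + right[0][j - 1]
    for i in range(1, n + 1):          # interior: take the better of arriving
        for j in range(1, m + 1):      # from the north or from the west
            s[i][j] = max(s[i - 1][j] + down[i - 1][j],
                          s[i][j - 1] + right[i][j - 1])
    return s

# The same invented 2 x 2 example grid:
down = [[1, 0, 2],
        [4, 6, 5]]
right = [[3, 2],
         [3, 2],
         [0, 7]]
```

The double loop touches each of the (n+1)(m+1) cells exactly once, matching the O(n × m) bound on the slide.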

Further Analysis Note that lines 1-5 of the algorithm generate the base cases that the recurrence on line 8 builds on. We can recover the longest path itself, not just its weight, by keeping track of which predecessor (south or east) was used to generate each s_{i,j} and backtracking from s_{n,m}!
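One way to sketch that backtracking, assuming the s table has already been filled by the recurrence (the grid, the literal s values, and the "S"/"E" move labels are all illustrative choices of mine): instead of storing predecessors explicitly, it re-tests at each cell which of the two choices produced s[i][j].

```python
def backtrack(down, right, s):
    """Recover the longest path by re-testing which predecessor produced s[i][j].

    Returns the moves from source to sink as a string of "S" (south) / "E" (east).
    """
    n, m = len(down), len(right[0])
    i, j, moves = n, m, []
    while i > 0 or j > 0:
        if i > 0 and s[i][j] == s[i - 1][j] + down[i - 1][j]:
            moves.append("S")    # this cell was reached from the north
            i -= 1
        else:
            moves.append("E")    # this cell was reached from the west
            j -= 1
    return "".join(reversed(moves))

# Invented example grid and its completed s table (s[i][j] as on the slides):
down = [[1, 0, 2],
        [4, 6, 5]]
right = [[3, 2],
         [3, 2],
         [0, 7]]
s = [[0, 3, 5],
     [1, 4, 7],
     [5, 10, 17]]
```

The walk back from (n, m) visits n + m edges, so path recovery adds only O(n + m) time on top of filling the table.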

Questions?? Thanks!