Example 2 You are traveling by canoe down a river, and there are n trading posts along the way. Before starting your journey, you are given, for each 1 <= i <= j <= n, the fee f_{i,j} for renting a canoe from post i to post j. These fees are arbitrary; for example, it is possible that f_{1,3} = 10 and f_{1,4} = 5. You begin at trading post 1 and must end at trading post n (using rented canoes). Your goal is to minimize the total rental cost. Give the most efficient algorithm you can for this problem. Be sure to prove that your algorithm yields an optimal solution, and analyze the time complexity.

Example 2: algorithm and proof Let m[i] be the rental cost of the best solution for going from post 1 to post i, for 1 <= i <= n. The final answer is m[n]. We can define m[i] recursively as follows:

m[i] = 0                                     if i = 1
m[i] = min_{1 <= j < i} (m[j] + f_{j,i})     otherwise

We now prove this is correct. The first canoe must be rented at post 1, and the last canoe must be returned at post n. For a canoe returned at post i, we try all possibilities for j, the post where that canoe was rented. Furthermore, since f_{j,i} is independent of how the subproblem of going from post 1 to post j is solved, we have the optimal substructure property. For the time complexity, there are n subproblems, one per post, and each takes O(n) time to solve. Thus, the overall time complexity is O(n^2).
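Below is a minimal sketch of this recurrence in Python (not from the original slides). The function name min_rental_cost and the representation of the fees as a 2-D list fee, with fee[j][i] holding f_{j,i} for 0-indexed posts, are illustrative assumptions.

```python
def min_rental_cost(fee, n):
    """Cheapest total rental cost to go from post 0 to post n-1.

    fee[j][i] is assumed to hold f_{j,i}, the cost of one rental
    from post j downstream to post i (posts 0-indexed here).
    """
    INF = float("inf")
    m = [INF] * n
    m[0] = 0  # base case: the journey starts at the first post
    for i in range(1, n):
        # try every upstream post j where the current canoe was rented
        for j in range(i):
            m[i] = min(m[i], m[j] + fee[j][i])
    return m[n - 1]

# 4 posts; f_{1,3} = 5 here, so renting 0 -> 1 then 1 -> 3 costs 1 + 5 = 6,
# which beats the direct rental f_{0,3} = 10.
fees = [
    [0, 1, 10, 10],
    [0, 0,  2,  5],
    [0, 0,  0,  4],
    [0, 0,  0,  0],
]
print(min_rental_cost(fees, 4))  # prints 6
```

The two nested loops make the O(n^2) bound explicit: the outer loop runs over the n posts, and the inner loop over the at most n candidate posts where the last canoe could have been rented.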

DP example 3: alignment Here we look at a problem from computational biology. You can think of a DNA sequence as a sequence over the alphabet {a, c, g, t}. Suppose you are given a DNA sequence D1 of n1 characters and a DNA sequence D2 of n2 characters. You want to know whether these two DNA sequences are "evolutionarily related" by aligning them. An alignment is defined by inserting any number of spaces into D1 and D2 so that the resulting strings D1' and D2' have the same length. For a particular alignment A, cost(A) is the number of mismatched columns (a space aligned against a character also counts as a mismatch). For example, one legal alignment between "ctatg" and "ttaagc" is:

ct-at-g-
-tta-agc

The cost of this alignment is 5. Give the most efficient algorithm you can (analyzed as a function of n1 and n2) to compute an alignment of minimum cost.

Example 3: algorithm & proof The general form of the subproblem we solve will be: find the best alignment for the first i characters of D1 and the first j characters of D2 for 1<=i<=n1 and 1<=j<=n2. Let D[i] be the ith character in string D. Let D[i..j] be the substring from ith character to jth charcter. Let c[i,j] be the cost of an optimal alignment for D1[1..i] and D2[1..j]. Then c[n1,n2] gives the optimal cost to the original problem. We can define c[i,j] recursively as shown: c[i,j] = i if j=0 = j if i =0 = c[i-1,j-1] if D1[i] == D2[j] = min {c[i-1,j-1], c[i-1,j], c[i, j-1]}+1 otherwise We now argue that this recursive definition is correct. You can form D1’ and D2’ (and hence the alignment) for the subproblem from the right to left as follows. In an optimal alignment either the last character of D1’ is a spacer or it is the last character i of D1and the last character of D2’ is a spacer or the last character of D2. There are thus only three possibilities for the last column of the alignment: D1[i] -, - D2[j], or D1[i] D2[j]. Hence the above recursive definition considers all possible cases that the optimal alignment could have. Since the solution to the original problem is either the value of the subproblem solution or otherwise one plus the subproblem solution, the optimal substructure property clearly holds. Thus the solution output is correct. For the time complexity it is O(n1 n2) since there are n1 * n2 subproblems each of which is solved in constant time.