Dynamic Programming Kun-Mao Chao ( 趙坤茂 ) Department of Computer Science and Information Engineering National Taiwan University, Taiwan

2 Dynamic Programming Dynamic programming is a class of solution methods for solving sequential decision problems with a compositional cost structure. Richard Bellman was one of the principal founders of this approach.

3 Two key ingredients Two key ingredients for an optimization problem to be suitable for a dynamic-programming solution: 1. Optimal substructures: each substructure is optimal (the principle of optimality). 2. Overlapping subproblems: subproblems are dependent (otherwise, a divide-and-conquer approach is the choice).

4 Three basic components The development of a dynamic-programming algorithm has three basic components: the recurrence relation (for defining the value of an optimal solution); the tabular computation (for computing the value of an optimal solution); and the traceback (for delivering an optimal solution).

5 Fibonacci numbers The Fibonacci numbers are defined by the following recurrence: F_0 = 0, F_1 = 1, and F_i = F_{i-1} + F_{i-2} for i > 1.

6 How to compute F_10? Expanding the recurrence directly, F_10 calls F_9 and F_8, F_9 calls F_8 and F_7, F_8 calls F_7 and F_6, and so on; the same values are recomputed over and over.
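
To make the recomputation concrete, here is a minimal sketch in Python (my own illustration, not from the slides; the name fib_naive and the call counter are additions used only to expose the repeated work):

from collections import Counter

calls = Counter()

def fib_naive(i):
    # Follow the recurrence directly: F_0 = 0, F_1 = 1, F_i = F_{i-1} + F_{i-2}.
    calls[i] += 1                       # record how often each F_i is requested
    if i <= 1:
        return i
    return fib_naive(i - 1) + fib_naive(i - 2)

print(fib_naive(10))                    # 55
print(sum(calls.values()))              # 177 calls in total for a single F_10
print(calls[6])                         # F_6 alone is recomputed 5 times

This exponential blow-up is exactly what the tabular computation on the next slide removes.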

7 Tabular computation The tabular computation can avoid recomputation: compute F_0, F_1, ..., F_10 once each, from left to right.
F_0  F_1  F_2  F_3  F_4  F_5  F_6  F_7  F_8  F_9  F_10
0    1    1    2    3    5    8    13   21   34   55
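
A matching bottom-up sketch (again my own code; fib_table is an assumed name), in which every F_i is computed exactly once:

def fib_table(n):
    # Fill the table from left to right, reusing the two previously stored values.
    F = [0] * (n + 1)
    if n >= 1:
        F[1] = 1
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]
    return F

print(fib_table(10))    # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55]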

8 Longest increasing subsequence (LIS) The longest-increasing-subsequence problem is to find a longest increasing subsequence of a given sequence of distinct integers a_1 a_2 ... a_n. For example, given the sequence 9, 2, 5, 3, 7, 11, 8, 10, 13, 6, the subsequence 2, 3, 7, 8, 10, 13 is increasing, whereas 9, 2 and 11, 8, 6 are not (each element must be strictly larger than the one before it). We want to find a longest one.

9 A naive approach for LIS Let L[i] be the length of a longest increasing subsequence ending at position i. Then L[i] = 1 + max_{j=0..i-1} { L[j] : a_j < a_i } (use a dummy a_0 that is smaller than every element, and set L[0] = 0).

10 A naive approach for LIS Filling in L[i] with the recurrence L[i] = 1 + max_{j=0..i-1} { L[j] : a_j < a_i }, the maximum L value is 6, and the subsequence 2, 3, 7, 8, 10, 13 is a longest increasing subsequence. Since each L[i] scans all earlier positions, this method runs in O(n^2) time.
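
A sketch of this quadratic method (my own code; the names lis_quadratic and prev are assumptions), including the traceback through predecessor pointers:

def lis_quadratic(a):
    # L[i] = 1 + max{ L[j] : j < i and a[j] < a[i] }, with L[i] = 1 when no such j exists.
    n = len(a)
    L = [1] * n
    prev = [-1] * n                       # predecessor pointers for the traceback
    for i in range(n):
        for j in range(i):
            if a[j] < a[i] and L[j] + 1 > L[i]:
                L[i] = L[j] + 1
                prev[i] = j
    # Trace back from a position achieving the maximum L value.
    i = max(range(n), key=L.__getitem__)
    result = []
    while i != -1:
        result.append(a[i])
        i = prev[i]
    return result[::-1]

print(lis_quadratic([9, 2, 5, 3, 7, 11, 8, 10, 13, 6]))   # one LIS of length 6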

11 Binary search Given an ordered sequence x_1 < x_2 < ... < x_n and a number y, a binary search finds the largest x_i such that x_i < y in O(log n) time, by repeatedly halving the search range (n, n/2, n/4, ...).

12 Binary search How many steps does a binary search need to reduce the problem size to 1? The size shrinks as n, n/2, n/4, n/8, n/16, ..., 1, so it takes O(log n) steps.
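
A small sketch of this query using Python's standard bisect module (the helper name largest_below is mine):

from bisect import bisect_left

def largest_below(xs, y):
    # xs is sorted increasingly; bisect_left returns how many entries are < y.
    pos = bisect_left(xs, y)
    return xs[pos - 1] if pos > 0 else None   # None if every entry is >= y

print(largest_below([1, 3, 5, 8, 13], 9))     # 8
print(largest_below([1, 3, 5, 8, 13], 1))     # None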

13 An O(n log n) method for LIS Define BestEnd[k] to be the smallest value that can end an increasing subsequence of length k among the elements scanned so far. Scanning the sequence from left to right maintains BestEnd[1], BestEnd[2], ..., BestEnd[6].

14 An O(n log n) method for LIS Since BestEnd[1] < BestEnd[2] < ... holds at all times, each new element can be placed by a binary search: for each position, we perform a binary search to update BestEnd. Therefore, the running time is O(n log n).
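
A sketch of the O(n log n) method (my own code; lis_length and best_end are assumed names). Because BestEnd is always increasing, bisect_left finds, in O(log n) time, the length whose best ending value the new element can improve:

from bisect import bisect_left

def lis_length(a):
    best_end = []                        # best_end[k-1] = smallest ending value of a length-k increasing subsequence
    for x in a:
        pos = bisect_left(best_end, x)   # first entry >= x (values are distinct)
        if pos == len(best_end):
            best_end.append(x)           # x extends the longest subsequence found so far
        else:
            best_end[pos] = x            # x is a better (smaller) ending value for length pos + 1
    return len(best_end)

print(lis_length([9, 2, 5, 3, 7, 11, 8, 10, 13, 6]))   # 6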

15 Longest Common Subsequence (LCS) A subsequence of a sequence S is obtained by deleting zero or more symbols from S. For example, the following are all subsequences of “president”: pred, sdn, predent. The longest common subsequence problem is to find a maximum-length common subsequence between two sequences.
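
As a quick illustration of this definition (my own snippet; is_subsequence is an assumed name), a subsequence test simply scans the longer string once:

def is_subsequence(s, t):
    # Each symbol of s must appear in t, in the same relative order.
    it = iter(t)
    return all(c in it for c in s)

print(is_subsequence("pred", "president"))    # True
print(is_subsequence("sdn", "president"))     # True
print(is_subsequence("dsn", "president"))     # False: the order matters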

16 LCS For instance: Sequence 1: president, Sequence 2: providence. Its LCS is priden.

17 LCS Another example: Sequence 1: algorithm, Sequence 2: alignment. One of its LCSs is algm.

18 How to compute LCS? Let A = a_1 a_2 ... a_m and B = b_1 b_2 ... b_n. Let len(i, j) be the length of an LCS between a_1 a_2 ... a_i and b_1 b_2 ... b_j. With the initialization len(i, 0) = len(0, j) = 0, len(i, j) can be computed as follows: len(i, j) = len(i-1, j-1) + 1 if a_i = b_j; otherwise len(i, j) = max(len(i-1, j), len(i, j-1)).
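
A sketch of the tabular computation and traceback for this recurrence (my own code; the function name lcs is an assumption):

def lcs(A, B):
    m, n = len(A), len(B)
    # length[i][j] = len(i, j); row 0 and column 0 hold the initializations (empty prefixes).
    length = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if A[i - 1] == B[j - 1]:
                length[i][j] = length[i - 1][j - 1] + 1
            else:
                length[i][j] = max(length[i - 1][j], length[i][j - 1])
    # Traceback: walk from (m, n) back towards (0, 0), collecting matched symbols.
    i, j, matched = m, n, []
    while i > 0 and j > 0:
        if A[i - 1] == B[j - 1]:
            matched.append(A[i - 1])
            i -= 1
            j -= 1
        elif length[i - 1][j] >= length[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(matched))

print(lcs("president", "providence"))   # priden
print(lcs("algorithm", "alignment"))    # one LCS of length 4, e.g. algm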

23 Longest Common Increasing Subsequence Proposed by Yang, Huang, and Chao (IPL 2005). Improvements for some special cases: Katriel and Kutz (March 2005); Chan, Zhang, Fung, Ye, and Zhu (ISAAC 2005).