§ 8 Dynamic Programming Fibonacci sequence


§ 8 Dynamic Programming Fibonacci sequence e.g. 0, 1, 1, 2, 3, 5, 8, 13, 21, …
Fi = i if i ≤ 1
Fi = Fi-1 + Fi-2 if i ≥ 2
When solved by a recursive program, much replicated computation is done; it should instead be solved by a simple loop.
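The contrast between the two approaches can be sketched in Python (the function names are ours, not from the slides):

```python
def fib_recursive(i):
    # Direct translation of the definition; recomputes the same
    # subproblems exponentially often.
    if i <= 1:
        return i
    return fib_recursive(i - 1) + fib_recursive(i - 2)

def fib_loop(i):
    # The "simple loop": each F_k is computed exactly once, and only the
    # last two values are kept, so it runs in O(i) time and O(1) space.
    a, b = 0, 1          # F_0, F_1
    for _ in range(i):
        a, b = b, a + b
    return a
```

Both return the same values, but only the loop version scales; the recursion is shown purely to illustrate the replicated computation.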

Shortest path e.g. to find a shortest path in a multi-stage graph. Applying the greedy method, the shortest path from S to T is 1 + 2 + 5 = 8.

e.g. The greedy method cannot be applied to this case: greedily taking the cheapest edge at each step from S gives a path of cost 1 + 4 + 18 = 23, but the shortest path is S → C → F → T with length 5 + 2 + 2 = 9.

Dynamic programming approach:
d(S, T) = min{1+d(A, T), 2+d(B, T), 5+d(C, T)}
d(A, T) = min{4+d(D, T), 11+d(E, T)} = min{4+18, 11+13} = 22

d(B, T) = min{9+d(D, T), 5+d(E, T), 16+d(F, T)} = min{9+18, 5+13, 16+2} = 18
d(C, T) = min{2+d(F, T)} = 2+2 = 4
d(S, T) = min{1+d(A, T), 2+d(B, T), 5+d(C, T)} = min{1+22, 2+18, 5+4} = 9
The above way of reasoning is backward reasoning.

Forward reasoning:
d(S, A) = 1
d(S, B) = 2
d(S, C) = 5
d(S, D) = min{d(S, A) + d(A, D), d(S, B) + d(B, D)} = min{1 + 4, 2 + 9} = 5
d(S, E) = min{d(S, A) + d(A, E), d(S, B) + d(B, E)} = min{1 + 11, 2 + 5} = 7
d(S, F) = min{d(S, B) + d(B, F), d(S, C) + d(C, F)} = min{2 + 16, 5 + 2} = 7
d(S, T) = min{d(S, D) + d(D, T), d(S, E) + d(E, T), d(S, F) + d(F, T)} = min{5 + 18, 7 + 13, 7 + 2} = 9
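The backward reasoning above can be sketched in Python; the edge dictionary below encodes exactly the edge costs used in the derivation (a sketch, not the slides' own code):

```python
# Edge costs of the multi-stage graph, as given in the derivation above.
EDGES = {
    'S': {'A': 1, 'B': 2, 'C': 5},
    'A': {'D': 4, 'E': 11},
    'B': {'D': 9, 'E': 5, 'F': 16},
    'C': {'F': 2},
    'D': {'T': 18},
    'E': {'T': 13},
    'F': {'T': 2},
}

def d_to_T(v, memo=None):
    """Backward reasoning: d(v, T) = min over edges (v, w) of cost + d(w, T)."""
    if memo is None:
        memo = {}
    if v == 'T':
        return 0
    if v not in memo:
        memo[v] = min(c + d_to_T(w, memo) for w, c in EDGES[v].items())
    return memo[v]
```

The memo dictionary is what distinguishes this from plain recursion: each d(v, T) is computed once, so the intermediate values d(A, T) = 22, d(B, T) = 18, d(C, T) = 4 and the answer d(S, T) = 9 all come out of a single pass.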

Principle of optimality: Suppose that in solving a problem, we have to make a sequence of decisions D1, D2, …, Dn. If this sequence is optimal, then the last k decisions, 1 ≤ k ≤ n, must also be optimal.
e.g. the shortest path problem: if i, i1, i2, …, j is a shortest path from i to j, then i1, i2, …, j must be a shortest path from i1 to j.

The longest common subsequence (LCS) problem
A string: A = b a c a d
A subsequence of A: obtained by deleting 0 or more symbols from A (not necessarily consecutive), e.g. ad, ac, bac, acad, bacad, bcd.
Common subsequences of A = b a c a d and B = a c c b a d c b: ad, ac, bac, acad.
The longest common subsequence of A and B: a c a d.

Let A = a1 a2 … am and B = b1 b2 … bn.
Let Li,j denote the length of the longest common subsequence of a1 a2 … ai and b1 b2 … bj.
L0,0 = L0,j = Li,0 = 0 for 1 ≤ i ≤ m, 1 ≤ j ≤ n.

The dynamic programming approach to calculate matrix L:
Li,j = Li-1,j-1 + 1 if ai = bj
Li,j = max{Li-1,j, Li,j-1} if ai ≠ bj
Time complexity: O(mn)

e.g. A = b a c a d, B = a c c b a d c b After all Li,j’s have been found, we can trace back to find the longest common subsequence of A and B.
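The table-filling and trace-back steps can be sketched together in Python (an illustrative implementation of the standard LCS recurrence, not the slides' own code):

```python
def lcs(A, B):
    """Fill matrix L bottom-up, then trace back to recover one LCS."""
    m, n = len(A), len(B)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if A[i - 1] == B[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    # Trace back from L[m][n]: follow matches diagonally, otherwise move
    # toward the neighbor that supplied the maximum.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if A[i - 1] == B[j - 1]:
            out.append(A[i - 1])
            i -= 1
            j -= 1
        elif L[i - 1][j] >= L[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))
```

On the slides' example, lcs("bacad", "accbadcb") recovers "acad", the LCS of length 4.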

The edit distance problem
3 edit operations: insertion, deletion, replacement
e.g. string A = ‘vintner’, string B = ‘writers’:
v-intner-
wri-t-ers
RIMDMDMMI
M: match, I: insert, D: delete, R: replace
The edit cost of each I, D, or R is 1. The edit distance between A and B: 5.

Let A = a1a2…am and B = b1b2…bn. Let Di,j denote the edit distance between a1a2…ai and b1b2…bj.
Di,0 = i, 0 ≤ i ≤ m
D0,j = j, 0 ≤ j ≤ n
Di,j = min{Di-1,j + 1, Di,j-1 + 1, Di-1,j-1 + ti,j}, 1 ≤ i ≤ m, 1 ≤ j ≤ n
where ti,j = 0 if ai = bj and ti,j = 1 if ai ≠ bj.

The dynamic programming approach to calculate matrix D: time complexity: O(mn)

e.g. A = ‘vintner’, B = ‘writers’. The 3 optimal alignments (‘-’: gap):
v-intner-    -vintner-    vintner-
wri-t-ers    wri-t-ers    writ-ers
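The recurrence for matrix D translates directly into Python (a sketch with names of our choosing):

```python
def edit_distance(A, B):
    """Edit distance with unit-cost insert, delete, and replace."""
    m, n = len(A), len(B)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i                        # delete all of a_1..a_i
    for j in range(n + 1):
        D[0][j] = j                        # insert all of b_1..b_j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            t = 0 if A[i - 1] == B[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,       # delete a_i
                          D[i][j - 1] + 1,       # insert b_j
                          D[i - 1][j - 1] + t)   # match or replace
    return D[m][n]
```

On the slides' example, edit_distance("vintner", "writers") gives 5, matching the RIMDMDMMI alignment above.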

0/1 knapsack problem
n objects, weights W1, W2, …, Wn, profits P1, P2, …, Pn, capacity M.
Maximize Σ Pi xi subject to Σ Wi xi ≤ M, xi = 0 or 1, 1 ≤ i ≤ n.
e.g. M = 10:
i  Wi  Pi
1  10  40
2   3  20
3   5  30

multi-stage graph:

The longest path represents the optimal solution: x1 = 0, x2 = 1, x3 = 1, Σ Pi xi = 20 + 30 = 50.
Let fi(Q) be the value of an optimal solution to objects 1, 2, …, i with capacity Q.
fi(Q) = max{fi-1(Q), fi-1(Q−Wi) + Pi}
The optimal solution is fn(M).
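A compact Python sketch of this recurrence (illustrative, not from the slides); it computes fi(Q) row by row, keeping only the current row of the table:

```python
def knapsack(weights, profits, M):
    """0/1 knapsack via f_i(Q) = max{f_{i-1}(Q), f_{i-1}(Q - W_i) + P_i}."""
    f = [0] * (M + 1)                      # f_0(Q) = 0 for every capacity Q
    for w, p in zip(weights, profits):
        # Descending Q so each f[Q - w] still refers to f_{i-1}:
        # every object is used at most once (the 0/1 constraint).
        for Q in range(M, w - 1, -1):
            f[Q] = max(f[Q], f[Q - w] + p)
    return f[M]                            # f_n(M)
```

On the example above, knapsack([10, 3, 5], [40, 20, 30], 10) returns 50, matching x1 = 0, x2 = 1, x3 = 1.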

Optimal binary search trees e.g. binary search trees for 3, 7, 9, 12;

n identifiers: a1 < a2 < a3 < … < an.
Pi, 1 ≤ i ≤ n: the probability that ai is searched.
Qi, 0 ≤ i ≤ n: the probability that x is searched, where ai < x < ai+1 (a0 = −∞, an+1 = ∞).
Internal node: successful search, Pi. External node: unsuccessful search, Qi.
The expected cost of a binary tree (with the level of the root being 1):
Σ1≤i≤n Pi · level(ai) + Σ0≤i≤n Qi · (level(Ei) − 1)
where Ei is the external node for the interval (ai, ai+1).

e.g. identifiers: 4, 5, 8, 10, 11, 12, 14. Fig. A Binary Tree with Added External Nodes.

Let C(i, j) denote the cost of an optimal binary search tree containing ai, …, aj. The cost of the optimal binary search tree with ak as its root: C(i, k−1) + C(k+1, j) plus the total weight Qi-1 + Σi≤m≤j (Pm + Qm) of the range, since placing ak at the root pushes every node of both subtrees down one level.

General formula:
C(i, j) = min i≤k≤j {C(i, k−1) + C(k+1, j)} + W(i, j)
where W(i, j) = Qi-1 + Σi≤m≤j (Pm + Qm) and C(i, i−1) = 0.

e.g. n = 4. Fig. Computation Relationships of Subtrees.
Time complexity: O(n3). When j−i = m, there are (n−m) C(i, j)’s to compute, and each C(i, j) with j−i = m can be computed in O(m) time.
Total time: Σ1≤m<n (n−m) · O(m) = O(n3).
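A Python sketch of this computation (the function name and array layout are our choices; the slides give only the recurrence):

```python
def obst_cost(P, Q):
    """Cost C(1, n) of an optimal binary search tree.

    P[1..n]: success probabilities (P[0] is an unused placeholder).
    Q[0..n]: failure probabilities for the gaps between identifiers.
    """
    n = len(P) - 1
    # C[i][j] and W[i][j] for ranges a_i..a_j, with C[i][i-1] = 0 (empty range).
    C = [[0.0] * (n + 1) for _ in range(n + 2)]
    W = [[0.0] * (n + 1) for _ in range(n + 2)]
    for i in range(1, n + 2):
        W[i][i - 1] = Q[i - 1]              # weight of the empty range
    for m in range(1, n + 1):               # ranges containing m identifiers
        for i in range(1, n - m + 2):
            j = i + m - 1
            W[i][j] = W[i][j - 1] + P[j] + Q[j]
            # Try every a_k as root; both subtrees drop one level,
            # which adds W[i][j] to their combined cost.
            C[i][j] = W[i][j] + min(C[i][k - 1] + C[k + 1][j]
                                    for k in range(i, j + 1))
    return C[1][n]
```

The loop over m mirrors the diagonal-by-diagonal computation order in the figure: all ranges with j − i = m − 1 are filled before any longer range is attempted.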

Matrix-chain multiplication
n matrices A1, A2, …, An with sizes p0 × p1, p1 × p2, p2 × p3, …, pn-1 × pn.
Determine the multiplication order such that the number of scalar multiplications is minimized.
To compute Ai × Ai+1, we need pi-1 pi pi+1 scalar multiplications.
e.g. n = 4, A1: 3 × 5, A2: 5 × 4, A3: 4 × 2, A4: 2 × 5.
((A1 × A2) × A3) × A4, number of scalar multiplications: 3·5·4 + 3·4·2 + 3·2·5 = 114
(A1 × (A2 × A3)) × A4, number of scalar multiplications: 3·5·2 + 5·4·2 + 3·2·5 = 100
(A1 × A2) × (A3 × A4), number of scalar multiplications: 3·5·4 + 3·4·5 + 4·2·5 = 160

Let m(i, j) denote the minimum cost for computing Ai × Ai+1 × … × Aj:
m(i, j) = 0 if i = j
m(i, j) = min i≤k<j {m(i, k) + m(k+1, j) + pi-1 pk pj} if i < j
Computation sequence: by increasing chain length j − i.
Time complexity: O(n3)
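The recurrence and its computation sequence can be sketched in Python (an illustrative implementation, with our own function name):

```python
def matrix_chain(p):
    """Minimum scalar multiplications; matrix A_i has size p[i-1] x p[i]."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]   # 1-based; m[i][i] = 0
    for length in range(2, n + 1):              # computation sequence: by chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            # Split the chain A_i..A_j after A_k for every i <= k < j.
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```

For the example above, matrix_chain([3, 5, 4, 2, 5]) gives 100, the cost of (A1 × (A2 × A3)) × A4.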