Presentation on theme: "CS 5243: Algorithms — Dynamic Programming" — Presentation transcript:


2 CS 5243: Algorithms Dynamic Programming

3 Dynamic Programming is applicable when sub-problems are dependent! In the case of Divide and Conquer they are independent. DP is often used in optimization problems.
[Figure: the recursion tree for Fib(5), showing that subproblems such as Fib(3) and Fib(2) recur many times.]

4 Development of a DP algorithm
1. Characterize the structure of an optimal solution
2. Recursively define the value of an optimal solution
3. Compute the value of an optimal solution
4. Construct an optimal solution from computed information
Note: Dynamic programming works bottom up, while a closely related technique called memoization is a recursive, top-down method.

5 Memoization
Memoization is one way to deal with overlapping subproblems:
- After computing the solution to a subproblem, store it in a table
- Subsequent calls just do a table lookup
We can modify a recursive algorithm to use memoization. Let's do this for the Fibonacci sequence:
    Fib(n) = Fib(n-1) + Fib(n-2), with Fib(2) = Fib(1) = 1

6 Fibonacci Memoization
Let's write a recursive algorithm to calculate Fib(n), and create an array to store subproblem results so that they will not need to be recalculated. In memoization we recurse down and fill the array bottom up.

    int fibArray[23] = {0, 1, 1};   // fibArray[1] = fibArray[2] = 1; the rest start at 0

    int Fib(int n) {
        if (fibArray[n] > 0) return fibArray[n];   // already computed: table lookup
        fibArray[n] = Fib(n-1) + Fib(n-2);         // compute once and store (the memoization step)
        return fibArray[n];
    }
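A self-contained, runnable version of this memoized sketch in C (the array bound MAXN and the 64-bit result type are choices made here, not part of the slide):

```c
#include <assert.h>

#define MAXN 64

/* memo[n] holds Fib(n) once computed; 0 means "not yet computed" */
static long long memo[MAXN + 1];

long long fib(int n) {
    if (n <= 2) return 1;                 /* base cases: Fib(1) = Fib(2) = 1 */
    if (memo[n] > 0) return memo[n];      /* table lookup: already solved    */
    memo[n] = fib(n - 1) + fib(n - 2);    /* solve once, store for reuse     */
    return memo[n];
}
```

Each subproblem is solved exactly once, so the call tree collapses from exponential size to O(n) distinct calls.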

7 Fibonacci Tree Traversal
[Figure: the Fib(6) recursion tree. Because the array fills as we recurse, each repeated call, such as the second Fib(4), finds Fib(k) already known when requested!]

8 The DP solution to Fibonacci

    int Fib(int n) {                   // Note that this is not recursive
        int i;
        int fibArray[23] = {0, 1, 1};  // fibArray[1] = fibArray[2] = 1; the rest start at 0
        for (i = 3; i <= n; i++)
            fibArray[i] = fibArray[i-1] + fibArray[i-2];
        return fibArray[n];
    }

9 Next Problem: Matrix Multiplication

    Matrix-Multiply(A, B)
    1. if columns[A] != rows[B]
    2.     then error "incompatible dimensions"
    3.     else for i = 1 to rows[A]
    4.         do for j = 1 to columns[B]
    5.             do C[i, j] = 0
    6.                for k = 1 to columns[A]
    7.                    do C[i, j] = C[i, j] + A[i, k]*B[k, j]
    8. return C
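The pseudocode translates directly to C; the fixed 2x3 by 3x2 sizes below are an illustration added here, not part of the slide:

```c
#include <assert.h>

#define RA 2   /* rows of A    */
#define CA 3   /* cols of A = rows of B */
#define CB 2   /* cols of B    */

/* C = A * B; the inner dimensions (CA) must match for the product to exist */
void matrix_multiply(const int A[RA][CA], const int B[CA][CB], int C[RA][CB]) {
    for (int i = 0; i < RA; i++)
        for (int j = 0; j < CB; j++) {
            C[i][j] = 0;
            for (int k = 0; k < CA; k++)   /* one scalar multiplication per step */
                C[i][j] += A[i][k] * B[k][j];
        }
}
```

Each entry C[i][j] costs CA scalar multiplications, so the whole product costs RA*CA*CB of them. This is exactly the p_(i-1) * p_i * p_(i+1) term that drives the matrix-chain problem on the next slide.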

10 Matrix-chain Multiplication Problem

Problem: Given a chain of n matrices, where for i = 1, 2, ..., n matrix A_i has dimension p_(i-1) x p_i, fully parenthesize the product A_1 A_2 ... A_n in a way that minimizes the number of scalar multiplications.

Example: Suppose we have matrices of sizes 10x20, 20x40, and 40x5. Then
    ((10x20 * 20x40) * 40x5) yields 10*20*40 + 10*40*5 = 10,000 mults, while
    (10x20 * (20x40 * 40x5)) yields 20*40*5 + 10*20*5 = 5,000 mults.
Clearly parenthesization makes a big difference!

11 How many parenthesizations are there?

Exhaustively checking all possible parenthesizations is inefficient! Let P(n) denote the number of alternative parenthesizations of a sequence of n matrices. Then P(1) = 1, and for n >= 2

    P(n) = sum over k = 1 to n-1 of P(k) * P(n-k)

since the outermost split can fall between any A_k and A_(k+1). The solution to this recurrence grows as Ω(2^n), so brute force is hopeless.
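The recurrence can be checked directly for small n (this naive sketch is itself exponential, which is the point):

```c
#include <assert.h>

/* P(n): number of ways to fully parenthesize a chain of n matrices */
long long num_parens(int n) {
    if (n == 1) return 1;                       /* a single matrix needs no parens */
    long long total = 0;
    for (int k = 1; k < n; k++)                 /* outermost split after A_k */
        total += num_parens(k) * num_parens(n - k);
    return total;
}
```

The values 1, 1, 2, 5, 14, 42, ... are the Catalan numbers, which confirms the exponential lower bound quoted on the slide.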

12 Optimal Parenthesization

Major Observation: An optimal solution to an instance of the matrix-chain multiplication problem contains within it optimal solutions to subproblems.

Suppose that an optimal parenthesization of A_i...A_j splits the product between A_k and A_(k+1). Then the parenthesization of the prefix subchain A_i...A_k must also be optimal! The same is true for the postfix subchain. Why is this true?

    (A_1 A_2 A_3 ... A_k)(A_(k+1) ... A_n)
         optimal              optimal

    Cost(A_1..A_n) = Cost(A_1..A_k) + Cost(A_(k+1)..A_n) + cost of final product

If either subchain could be computed more cheaply, substituting that cheaper parenthesization would lower the total cost, contradicting optimality of the whole.

13 Intermediate Storage in a Table

As usual in DP we will store intermediate results in a table, say M. Let M[i,j] be the minimum number of scalar multiplications needed to compute A_i..j. If the optimal solution splits A_i..j at k, then

    M[i,j] = M[i,k] + M[k+1,j] + p_(i-1) * p_k * p_j    (where A_i has dimension p_(i-1) x p_i)

Since we do not know the value of k in advance, we must check the j-i possible values of k. Hence

    M[i,j] = 0                                                        if i = j
    M[i,j] = min over i <= k < j of { M[i,k] + M[k+1,j] + p_(i-1) * p_k * p_j }   if i < j

14 Finding the Optimal Solution

Each time we find the minimizing k above, let's store it in an array s[i,j]. Hence s[i,j] is the value of k at which we split the product A_i A_(i+1) ... A_j to obtain an optimal parenthesization.

15 The Algorithm

    Matrix-Chain-Order(p)
        n <- length[p] - 1
        for i <- 1 to n
            do m[i,i] <- 0
        for l <- 2 to n                      // l is the chain length
            do for i <- 1 to n - l + 1
                do j <- i + l - 1
                   m[i,j] <- infinity
                   for k <- i to j - 1
                       do q <- m[i,k] + m[k+1,j] + p_(i-1) * p_k * p_j
                          if q < m[i,j]
                              then m[i,j] <- q; s[i,j] <- k
        return m and s

16 The m and s tables for n=6

Matrix dimensions: A1 30x35, A2 35x15, A3 15x5, A4 5x10, A5 10x20, A6 20x25,
so p = (30, 35, 15, 5, 10, 20, 25).

m table (m[i,j], minimum scalar multiplications):
          j=1     j=2     j=3     j=4     j=5     j=6
    i=1   0       15750   7875    9375    11875   15125
    i=2           0       2625    4375    7125    10500
    i=3                   0       750     2500    5375
    i=4                           0       1000    3500
    i=5                                   0       5000
    i=6                                           0

s table (s[i,j], the optimal split point k):
          j=2     j=3     j=4     j=5     j=6
    i=1   1       1       3       3       3
    i=2           2       3       3       3
    i=3                   3       3       3
    i=4                           4       5
    i=5                                   5

Example entry:
    m[2,5] = min of
        m[2,2] + m[3,5] + p1*p2*p5 = 0    + 2500 + 35*15*20 = 13000
        m[2,3] + m[4,5] + p1*p3*p5 = 2625 + 1000 + 35*5*20  = 7125
        m[2,4] + m[5,5] + p1*p4*p5 = 4375 + 0    + 35*10*20 = 11375
    so m[2,5] = 7125, with split k = 3.

17 Observations

[The m table from the previous slide, repeated.]

Note that we build this array from the bottom up, shortest chains first. Once a cell is filled we never recalculate it. The complexity is Θ(n^3). Why? (There are Θ(n^2) table entries, and each takes O(n) work to minimize over k.) What is the space complexity?

18 Printing the Parenthesization

    PrintOptimalParens(s, i, j)
        if i = j
            then print "A_i"
            else print "("
                 PrintOptimalParens(s, i, s[i,j])
                 PrintOptimalParens(s, s[i,j]+1, j)
                 print ")"

The call PrintOptimalParens(s, 1, 6) prints the solution ((A1 (A2 A3)) ((A4 A5) A6))
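Matrix-Chain-Order and the printer fit together in a short runnable C sketch; the fixed N = 6, the long long cost type, and the buffer-based printer are conveniences added here:

```c
#include <assert.h>
#include <limits.h>
#include <stdio.h>
#include <string.h>

#define N 6

long long m[N + 1][N + 1];   /* m[i][j]: min scalar mults for A_i..A_j */
int       s[N + 1][N + 1];   /* s[i][j]: optimal split point k         */

void matrix_chain_order(const int p[N + 1]) {
    for (int i = 1; i <= N; i++) m[i][i] = 0;
    for (int len = 2; len <= N; len++)            /* chain length */
        for (int i = 1; i <= N - len + 1; i++) {
            int j = i + len - 1;
            m[i][j] = LLONG_MAX;
            for (int k = i; k < j; k++) {         /* try every split */
                long long q = m[i][k] + m[k+1][j]
                            + (long long)p[i-1] * p[k] * p[j];
                if (q < m[i][j]) { m[i][j] = q; s[i][j] = k; }
            }
        }
}

/* Appends the optimal parenthesization of A_i..A_j to the string out */
void print_parens(char *out, int i, int j) {
    char tmp[16];
    if (i == j) { sprintf(tmp, "A%d", i); strcat(out, tmp); }
    else {
        strcat(out, "(");
        print_parens(out, i, s[i][j]);
        print_parens(out, s[i][j] + 1, j);
        strcat(out, ")");
    }
}
```

With the slide's dimensions p = (30, 35, 15, 5, 10, 20, 25), this reproduces m[1][6] = 15125 and the parenthesization shown above.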

19 Optimal Binary Search Trees

The next problem is another optimization problem. We are required to find a BST that minimizes the expected cost of accesses to the tree, given the probability of access for each item in the tree.

[Figure: two BSTs over the keys {do, if, while}, with node levels annotated. Assuming equal access probabilities of 1/7, the first tree has cost 15/7 and the second cost 13/7.]

20 More Examples

Assume access probabilities p(do) = .5, p(if) = .1, p(while) = .05 and failure-gap probabilities q(0) = .15, q(1) = .1, q(2) = .05, q(3) = .05, where the sorted order is q(0) : do : q(1) : if : q(2) : while : q(3).

[Figure: three BSTs over {do, if, while}, with costs 2.65, 2.05, and 1.9. The third tree has root "if" with children "do" and "while".]

Cost(third tree) = 1(.1) + 2(.5) + 2(.05) + 2(.15) + 2(.1) + 2(.05) + 2(.05) = 1.9
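The cost sum can be checked mechanically. The helper below and the level arrays encoding the third tree (root "if" at level 1, "do" and "while" at level 2, all four failure gaps one level below them) are illustrations added here:

```c
#include <assert.h>

/* Expected cost of a BST: level(a_i)*p(i) summed over the n keys,
   plus (level(E_i) - 1)*q(i) summed over the n+1 failure gaps.    */
double bst_cost(int n, const double p[], const int p_level[],
                const double q[], const int q_level[]) {
    double cost = 0.0;
    for (int i = 0; i < n; i++)  cost += p_level[i] * p[i];        /* successful searches */
    for (int i = 0; i <= n; i++) cost += (q_level[i] - 1) * q[i];  /* failed searches     */
    return cost;
}
```

For the third tree this reproduces the 1.9 computed on the slide.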

21 A DP Solution for the Optimal BST

Let the set of nodes in a tree be {a1 < a2 < a3 < ... < an} and define classes E_i, 0 <= i <= n, where E_i contains all identifiers x such that a_i < x < a_(i+1) (E_0 holds everything below a_1, E_n everything above a_n). The expected cost of a tree is then

    cost = sum over i=1..n of p(i)*level(a_i)  +  sum over i=0..n of q(i)*(level(E_i) - 1)

[Figure: a tree over a1, a2, a3 with external nodes E_0..E_3 and levels annotated; for example level(E_3) - 1 = 3.]

22 Optimal Subproblems!

[Figure: a tree with root "if", an optimal left subtree, and an optimal right subtree.]

This problem exhibits optimal substructure: if the tree above is optimal, then the left and right subtrees must be optimal! Optimal substructure often implies a dynamic-programming solution. The following algorithm is based on this observation.

23 Optimal Subtree Calculation

[Figure: root a_k with an optimal left subtree and an optimal right subtree.]

Note: considered on its own, each subtree has its root at level 1.

24 Optimal Subtree

[Figure: root a_k with an optimal left subtree and an optimal right subtree.]

The cost of the entire tree, in terms of the optimal subtrees, is

    p(k) + cost(left_tree) + w(0,k-1) + cost(right_tree) + w(k,n)

where

    w(i,j) = q(i) + sum over l=i+1..j of (q(l) + p(l))

25 w(0,k-1) is a fix for the left tree

[Figure: the tree over a1, a2, a3 with its left subtree (a1, a2 and gaps E_0, E_1, E_2) highlighted.]

Assuming p(i) is the probability of a_i, the cost of the left subtree by itself is
    1*p(1) + 2*p(2) + 1*q(0) + 2*q(1) + 2*q(2)
and w(0,2) in this case is
    p(1) + p(2) + q(0) + q(1) + q(2)
Can you see what w(0,2) does here? It adds 1 probe to each term on the left side, to account for the fact that the subtree sits one level deeper in the whole tree!

26 Minimizing over the roots

Recall that the cost of the entire tree, with a_k the root, is

    p(k) + cost(left_tree) + w(0,k-1) + cost(right_tree) + w(k,n)

All we need to do is find the value of k for which this is minimum, so we consider each tree with a_k as root, for k = 1, ..., n.

27 Minimization formula

If we use c(i,j) to represent the cost of an optimal binary search tree t_ij containing a_(i+1), ..., a_j and E_i, ..., E_j, then for the tree to be optimal k must be chosen so that the cost is minimum. The top-level expression

    p(k) + c(left_tree) + w(0,k-1) + c(right_tree) + w(k,n)

in general simplifies to

    c(i,j) = min over i < k <= j of { c(i,k-1) + c(k,j) } + w(i,j),  with c(i,i) = 0

28 The Process

The above equation can be solved bottom up: we compute all c(i,j) where j = i+1, then those with j = i+2, etc. While we do this we record the root r(i,j) of each optimal tree. This allows us to reconstruct the complete optimal tree afterwards.

29 An Example

Let n = 4 and {a1, a2, a3, a4} be the set of nodes. Let p(1:4) = (3, 3, 1, 1) and q(0:4) = (2, 3, 1, 1, 1). (The p's and q's have been multiplied by 16 for simplicity, so we can work with integers.) Initially w(i,i) = q(i), and c(i,i) = r(i,i) = 0. Then:

    w(0,1) = p(1) + q(1) + w(0,0) = 8;  c(0,1) = w(0,1) + min{c(0,0)+c(1,1)} = 8;  r(0,1) = 1
    w(1,2) = p(2) + q(2) + w(1,1) = 7;  c(1,2) = w(1,2) + min{c(1,1)+c(2,2)} = 7;  r(1,2) = 2
    w(2,3) = p(3) + q(3) + w(2,2) = 3;  c(2,3) = w(2,3) + min{c(2,2)+c(3,3)} = 3;  r(2,3) = 3
    w(3,4) = p(4) + q(4) + w(3,3) = 3;  c(3,4) = w(3,4) + min{c(3,3)+c(4,4)} = 3;  r(3,4) = 4

30 Computation Table

    j-i=0:  w00=2  c00=0  r00=0 | w11=3  c11=0  r11=0 | w22=1 c22=0 r22=0 | w33=1 c33=0 r33=0 | w44=1 c44=0 r44=0
    j-i=1:  w01=8  c01=8  r01=1 | w12=7  c12=7  r12=2 | w23=3 c23=3 r23=3 | w34=3 c34=3 r34=4
    j-i=2:  w02=12 c02=19 r02=1 | w13=9  c13=12 r13=2 | w24=5 c24=8 r24=3
    j-i=3:  w03=14 c03=25 r03=2 | w14=11 c14=19 r14=2
    j-i=4:  w04=16 c04=32 r04=2

Recall that t_ij = {a_(i+1), ..., a_j}. Reading off the roots: r(0,4) = 2, so a2 is the root of t04, with subtrees t01 and t24; r(2,4) = 3, so a3 is the root of t24, with subtrees t22 (empty) and t34.

[Figure: the resulting optimal tree, with a2 at the root, a1 its left child, a3 its right child, and a4 the right child of a3.]
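The whole table can be generated by a compact C sketch; the fixed N = 4, the array-of-arrays layout, and the function name are conveniences added here (Knuth's range restriction from the next slide is omitted):

```c
#include <assert.h>
#include <limits.h>

#define N 4

/* p[1..N]: key weights, q[0..N]: gap weights (scaled to integers) */
int w[N + 1][N + 1], c[N + 1][N + 1], r[N + 1][N + 1];

void optimal_bst(const int p[N + 1], const int q[N + 1]) {
    for (int i = 0; i <= N; i++) {
        w[i][i] = q[i];                          /* empty tree: gap weight only */
        c[i][i] = 0;
        r[i][i] = 0;
    }
    for (int m = 1; m <= N; m++)                 /* m = j - i, the subtree size */
        for (int i = 0; i + m <= N; i++) {
            int j = i + m;
            w[i][j] = w[i][j-1] + p[j] + q[j];   /* w(i,j) = p(j)+q(j)+w(i,j-1) */
            c[i][j] = INT_MAX;
            for (int k = i + 1; k <= j; k++) {   /* try each root a_k */
                int cost = c[i][k-1] + c[k][j];
                if (cost < c[i][j]) { c[i][j] = cost; r[i][j] = k; }
            }
            c[i][j] += w[i][j];                  /* add the subtree weight once */
        }
}
```

Running it on the slide's data, p = (-, 3, 3, 1, 1) and q = (2, 3, 1, 1, 1), reproduces the table above, including c(0,4) = 32 with root r(0,4) = 2.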

31 Complexity of the OBST Algorithm

We compute c(i,j) for j-i = 1, 2, 3, ..., n, in that order. When j-i = m there are n-m+1 values c(i,j) to compute, each of which takes the minimum of m quantities. Hence we do m(n-m+1) = mn - m^2 + m operations at the m-th stage, so the total time is

    sum over m=1..n of (mn - m^2 + m) = O(n^3)

D. E. Knuth has shown that if you limit the search to the range r(i,j-1) <= k <= r(i+1,j), the computing time becomes O(n^2).

