Design and Analysis of Algorithms - Chapter 8: Dynamic Programming

Presentation transcript:

Dynamic Programming

Dynamic Programming is a general algorithm design technique, invented by American mathematician Richard Bellman in the 1950s to solve optimization problems. "Programming" here means "planning".

Main idea:
- solve several smaller (overlapping) subproblems
- record solutions in a table so that each subproblem is only solved once
- the final state of the table will be (or contain) the solution

Example: Fibonacci numbers

Recall the definition of the Fibonacci numbers:
f(0) = 0
f(1) = 1
f(n) = f(n-1) + f(n-2)

Computing the n-th Fibonacci number recursively (top-down) expands f(n) into f(n-1) + f(n-2), then f(n-1) into f(n-2) + f(n-3), f(n-2) into f(n-3) + f(n-4), and so on: the same subproblems appear over and over in the recursion tree and are recomputed each time.
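The overlap is easy to see by counting calls. A minimal Python sketch (the call counter is added here for illustration; it is not part of the slide's definition):

```python
def fib_calls(n, counter):
    """Naive top-down Fibonacci; counter[0] tracks the total number of calls."""
    counter[0] += 1
    if n < 2:
        return n
    # f(n-1) and f(n-2) both re-expand the same smaller subproblems
    return fib_calls(n - 1, counter) + fib_calls(n - 2, counter)

counter = [0]
print(fib_calls(10, counter))  # 55
print(counter[0])              # 177 calls just to compute f(10)
```

The call count grows exponentially in n, which is exactly what the table-based approaches below avoid.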

Example: Fibonacci numbers (2)

Computing the n-th Fibonacci number using bottom-up iteration:
f(0) = 0
f(1) = 1
f(2) = 0+1 = 1
f(3) = 1+1 = 2
f(4) = 1+2 = 3
...
f(n) = f(n-1) + f(n-2)
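The bottom-up scheme above translates directly into a loop that computes each value exactly once; since only the last two values are ever needed, the table can shrink to two variables. A sketch:

```python
def fib(n):
    """Bottom-up Fibonacci: each subproblem is solved exactly once."""
    if n < 2:
        return n
    prev, curr = 0, 1  # f(0), f(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib(10))  # 55
```

This is O(n) time and O(1) space, versus the exponential time of the naive recursion.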

Examples of Dynamic Programming Algorithms

- Computing binomial coefficients
- Optimal chain matrix multiplication
- Constructing an optimal binary search tree
- Warshall's algorithm for transitive closure
- Floyd's algorithm for all-pairs shortest paths
- Some instances of difficult discrete optimization problems:
  - travelling salesman
  - knapsack

Binomial coefficients

Algorithm based on the identity C(n,k) = C(n-1,k-1) + C(n-1,k), with C(n,0) = C(n,n) = 1.

Algorithm Binomial(n,k)
for i ← 0 to n do
  for j ← 0 to min(i,k) do
    if j=0 or j=i then C[i,j] ← 1
    else C[i,j] ← C[i-1,j-1] + C[i-1,j]
return C[n,k]

- Pascal's Triangle
- Example
- Space and time efficiency
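The pseudocode above maps to Python almost line for line; this sketch fills the same table C[i][j]:

```python
def binomial(n, k):
    """Bottom-up table of binomial coefficients via Pascal's identity."""
    C = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(min(i, k) + 1):
            if j == 0 or j == i:
                C[i][j] = 1  # edges of Pascal's Triangle
            else:
                C[i][j] = C[i - 1][j - 1] + C[i - 1][j]
    return C[n][k]

print(binomial(5, 2))  # 10
```

Both time and space are proportional to the table size, Θ(nk).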

Warshall's Algorithm: Transitive Closure

- Computes the transitive closure of a relation
- Alternatively: existence of paths between all pairs of vertices in a directed graph
- Example of transitive closure (figure omitted in transcript)

Warshall's Algorithm (2)

Main idea: a path exists between two vertices i and j iff
- there is an edge from i to j; or
- there is a path from i to j going through vertex 1; or
- there is a path from i to j going through vertex 1 and/or 2; or
- ...
- there is a path from i to j going through any of the other vertices

(Illustrated by the series of boolean matrices R(0), R(1), …; figure omitted in transcript.)

Warshall's Algorithm (3)

In the k-th stage, determine whether a path exists between vertices i and j using only vertices among 1,…,k:

R(k)[i,j] = R(k-1)[i,j]                        (a path using just 1,…,k-1)
            or (R(k-1)[i,k] and R(k-1)[k,j])   (a path from i to k and from k to j, each using just 1,…,k-1)

Warshall's Algorithm (4)

Algorithm Warshall(A[1..n,1..n])
R(0) ← A
for k ← 1 to n do
  for i ← 1 to n do
    for j ← 1 to n do
      R(k)[i,j] ← R(k-1)[i,j] or (R(k-1)[i,k] and R(k-1)[k,j])
return R(n)

- Space and time efficiency
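A compact Python version of the pseudocode, updating a single matrix in place (safe here, since entries only ever change from 0 to 1), run on a small hypothetical adjacency matrix:

```python
def warshall(adj):
    """Transitive closure of a directed graph given as a 0/1 adjacency matrix."""
    n = len(adj)
    r = [row[:] for row in adj]  # R(0) = A
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # path i->j avoiding k, or path i->k then k->j
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# hypothetical graph: 0 -> 1 -> 2
print(warshall([[0, 1, 0], [0, 0, 1], [0, 0, 0]]))
# [[0, 1, 1], [0, 0, 1], [0, 0, 0]]
```

Time is Θ(n³); reusing one matrix keeps the space at Θ(n²).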

Floyd's Algorithm: All-Pairs Shortest Paths

- In a weighted graph, find shortest paths between every pair of vertices
- Same idea: construct the solution through a series of matrices D(0), D(1), …, each allowing a larger initial subset {1,…,k} of the vertices as intermediaries
- Example (figure omitted in transcript)

Floyd's Algorithm (2)

Algorithm Floyd(W[1..n,1..n])
D ← W
for k ← 1 to n do
  for i ← 1 to n do
    for j ← 1 to n do
      D[i,j] ← min(D[i,j], D[i,k]+D[k,j])
return D

- Space and time efficiency
- When does it not work? (graphs with negative cycles)
- Principle of optimality
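The same triple loop in Python, using math.inf for "no edge"; the weight matrix below is hypothetical example data, not the slide's omitted figure:

```python
import math

def floyd(w):
    """All-pairs shortest path distances (Floyd's algorithm)."""
    n = len(w)
    d = [row[:] for row in w]  # D(0) = W
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # keep the better of: current path, or path through vertex k
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    return d

INF = math.inf
w = [[0, 3, INF],
     [INF, 0, 2],
     [1, INF, 0]]
print(floyd(w))  # [[0, 3, 5], [3, 0, 2], [1, 4, 0]]
```

As with Warshall's algorithm, updating one matrix in place suffices, giving Θ(n³) time and Θ(n²) space.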

Optimal Binary Search Trees

- Keys are not searched for with equal probability
- Suppose keys A, B, C, D with probabilities 0.1, 0.2, 0.3, 0.4
- What is the average search cost in each possible structure?
- How many different structures can be produced with n nodes?
- Catalan number: C(n) = comb(2n,n)/(n+1)

Optimal Binary Search Trees (2)

C[i,j] denotes the smallest average search cost of a tree containing items i through j. Based on the principle of optimality, the optimal search tree is constructed using the recurrence:

C[i,j] = min over i ≤ k ≤ j of { C[i,k-1] + C[k+1,j] } + Σ p_s (sum over s = i,…,j), for 1 ≤ i ≤ j ≤ n
C[i,i] = p_i

Optimal Binary Search Trees (3)

Algorithm OptimalBST(P[1..n])
for i ← 1 to n do
  C[i,i-1] ← 0; C[i,i] ← P[i]; R[i,i] ← i
C[n+1,n] ← 0
for d ← 1 to n-1 do
  for i ← 1 to n-d do
    j ← i+d; minval ← ∞
    for k ← i to j do
      if C[i,k-1]+C[k+1,j] < minval then
        minval ← C[i,k-1]+C[k+1,j]; kmin ← k
    R[i,j] ← kmin
    sum ← P[i]
    for s ← i+1 to j do sum ← sum+P[s]
    C[i,j] ← minval+sum
return C[1,n], R
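A Python transcription of this pseudocode (1-indexed tables, padded so the C[i,i-1] = 0 base cases fall out naturally); the usage example runs it on the probabilities 0.1, 0.2, 0.3, 0.4 from the earlier slide:

```python
import math

def optimal_bst(p):
    """C[i][j]: smallest average search cost for keys i..j (1-indexed);
    R[i][j]: index of the root of that optimal subtree."""
    n = len(p)
    C = [[0.0] * (n + 2) for _ in range(n + 2)]
    R = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        C[i][i] = p[i - 1]
        R[i][i] = i
    for d in range(1, n):              # d = j - i
        for i in range(1, n - d + 1):
            j = i + d
            minval, kmin = math.inf, i
            for k in range(i, j + 1):  # try each key k as the root of keys i..j
                val = C[i][k - 1] + C[k + 1][j]
                if val < minval:
                    minval, kmin = val, k
            C[i][j] = minval + sum(p[i - 1:j])  # add the probability mass of i..j
            R[i][j] = kmin
    return C[1][n], R[1][n]

cost, root = optimal_bst([0.1, 0.2, 0.3, 0.4])
print(round(cost, 6), root)  # 1.8 3  (the root is the 3rd key, C)
```

Worked by hand: the optimal tree roots at C (cost 0.8 for the two subtrees plus the total weight 1.0), giving an average cost of 1.8 comparisons.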

The Knapsack Problem

Problem: given n items with weights w_1, w_2, …, w_n, values v_1, v_2, …, v_n, and capacity W, find the most valuable subset that fits into the knapsack.

- Solution with exhaustive search
- Let V[i,j] denote the optimal solution value for the first i items and capacity j. The goal is to find V[n,W].
- Recurrence:
  V[i,j] = max{ V[i-1,j], v_i + V[i-1, j-w_i] }  if j-w_i ≥ 0
  V[i,j] = V[i-1,j]                              if j-w_i < 0
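A bottom-up Python version of this recurrence. The item data in the usage example is hypothetical, since the transcript's example table did not survive extraction:

```python
def knapsack(weights, values, W):
    """V[i][j]: best value using the first i items with capacity j."""
    n = len(weights)
    V = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(W + 1):
            if j >= weights[i - 1]:
                # skip item i, or take it and fill the remaining capacity
                V[i][j] = max(V[i - 1][j],
                              values[i - 1] + V[i - 1][j - weights[i - 1]])
            else:
                V[i][j] = V[i - 1][j]
    return V[n][W]

# hypothetical items: (weight, value) = (2,12), (1,10), (3,20), (2,15)
print(knapsack([2, 1, 3, 2], [12, 10, 20, 15], 5))  # 37
```

The table has (n+1)(W+1) entries and each costs O(1) to fill, so time and space are Θ(nW).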

The Knapsack Problem (2)

- Example with capacity W = 5 (the table of item weights and values did not survive in the transcript)
- Space and time efficiency

Memory Functions (1)

A disadvantage of the bottom-up dynamic programming approach is that it solves subproblems whose solutions are not ultimately needed. An alternative technique is to combine the top-down and bottom-up approaches: recursion plus a table for intermediate results.

Memory Functions (2)

Algorithm MFKnapsack(i,j)
// V is initialized to -1, except row 0 and column 0, which are 0
if V[i,j] < 0
  if j < Weights[i]
    value ← MFKnapsack(i-1, j)
  else
    value ← max{ MFKnapsack(i-1, j), Values[i] + MFKnapsack(i-1, j-Weights[i]) }
  V[i,j] ← value
return V[i,j]
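The same memory-function scheme in Python, with the initialization the pseudocode relies on made explicit; the item data in the usage example is again hypothetical:

```python
def mf_knapsack(weights, values, W):
    """Top-down knapsack with a memo table (a 'memory function')."""
    n = len(weights)
    # -1 marks 'not yet computed'; row 0 and column 0 are the base cases
    V = [[-1] * (W + 1) for _ in range(n + 1)]
    for j in range(W + 1):
        V[0][j] = 0
    for i in range(n + 1):
        V[i][0] = 0

    def mf(i, j):
        if V[i][j] < 0:  # compute a subproblem only the first time it is needed
            if j < weights[i - 1]:
                value = mf(i - 1, j)
            else:
                value = max(mf(i - 1, j),
                            values[i - 1] + mf(i - 1, j - weights[i - 1]))
            V[i][j] = value
        return V[i][j]

    return mf(n, W)

print(mf_knapsack([2, 1, 3, 2], [12, 10, 20, 15], 5))  # 37
```

It returns the same answer as the bottom-up version but only fills the table entries the recursion actually touches.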

Memory Functions (3)

- Example with capacity W = 5 (the table of item weights and values did not survive in the transcript)
- Space and time efficiency