Chapter 5 Dynamic Programming. May 24, 2001. Algorithm Lab, Chungbuk National University.

Dynamic Programming
Used when the solution of a problem is the result of a sequence of decisions.
Examples:
– Knapsack
– Shortest path
– Optimal merge patterns

5.3 All pairs shortest paths
G = (V, E): a directed graph with n vertices.
A^k(i,j): length of a shortest path from i to j going through no vertex of index greater than k.
A(i,j) = min{ min_{1<=k<=n} { A^{k-1}(i,k) + A^{k-1}(k,j) }, cost(i,j) }
If a shortest i-to-j path uses vertex k, its length is A^{k-1}(i,k) + A^{k-1}(k,j); otherwise it is A^{k-1}(i,j). Hence
A^k(i,j) = min{ A^{k-1}(i,j), A^{k-1}(i,k) + A^{k-1}(k,j) }, k >= 1
with A^0(i,j) = cost(i,j), where
cost(i,i) = 0 for 1 <= i <= n,
cost(i,j) = length of edge (i,j) if i != j and (i,j) ∈ E,
cost(i,j) = ∞ if i != j and (i,j) ∉ E.

5.3 All pairs shortest paths Figure 5.5 Graph with negative cycle

5.3 All pairs shortest paths Program 5.3 Function to compute lengths of shortest paths
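The function in Program 5.3 implements the recurrence above. The following is a minimal C++ sketch of the same computation, not the textbook's code; it assumes cost[i][i] = 0 and a large INF value for missing edges, and it updates A in place, which is safe because row k and column k do not change during the k-th pass when there are no negative-length cycles.

#include <vector>
#include <limits>

const int INF = std::numeric_limits<int>::max() / 2;  // "no edge"; halved so INF + INF cannot overflow

// All-pairs shortest path lengths. cost[i][j] is the edge length,
// INF if (i,j) is not an edge, and 0 on the diagonal.
std::vector<std::vector<int>> allPairs(const std::vector<std::vector<int>>& cost) {
    int n = static_cast<int>(cost.size());
    std::vector<std::vector<int>> A = cost;            // A^0 = cost
    for (int k = 0; k < n; ++k)                        // now allow vertex k as an intermediate
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                // A^k(i,j) = min{ A^{k-1}(i,j), A^{k-1}(i,k) + A^{k-1}(k,j) }
                if (A[i][k] + A[k][j] < A[i][j])
                    A[i][j] = A[i][k] + A[k][j];
    return A;
}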

5.3 All pairs shortest paths Figure 5.6 Directed Graph and associated matrices

5.4 Single-source shortest paths: General weights
dist^1[u] = cost[v][u]
dist^k[u] = length of a shortest path from the source vertex v to vertex u under the constraint that the path contains at most k edges.
Recurrence relation:
– dist^k[u] = min{ dist^{k-1}[u], min_i { dist^{k-1}[i] + cost[i][u] } }, 2 <= k <= n-1

5.4 Single-source shortest paths: General weights Figure 5.10 Shortest paths with negative edge lengths

5.4 Single-source shortest paths: General weights Program 5.4 Bellman and Ford algorithm to compute shortest paths
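Program 5.4 evaluates dist^{n-1}[·] using this recurrence. Below is a simple C++ sketch of the same idea with an assumed edge-list representation (not the textbook's code).

#include <vector>
#include <limits>

const int INF = std::numeric_limits<int>::max() / 2;

struct Edge { int from, to, cost; };

// Single-source shortest paths with general (possibly negative) edge lengths,
// assuming no negative-length cycle is reachable from the source.
std::vector<int> bellmanFord(int n, int source, const std::vector<Edge>& edges) {
    std::vector<int> dist(n, INF);
    dist[source] = 0;
    // The text seeds dist^1[u] = cost[v][u] and iterates k = 2..n-1; seeding only the
    // source and making n-1 in-place passes yields the same dist^{n-1} values.
    for (int pass = 1; pass <= n - 1; ++pass)
        for (const Edge& e : edges)          // dist^k[u] = min{ dist^{k-1}[u], dist^{k-1}[i] + cost[i][u] }
            if (dist[e.from] != INF && dist[e.from] + e.cost < dist[e.to])
                dist[e.to] = dist[e.from] + e.cost;
    return dist;
}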

5.5 Optimal binary search trees
Definition: a binary search tree T is a binary tree in which
– all identifiers in the left subtree of T are less than the identifier in the root,
– all identifiers in the right subtree of T are greater than the identifier in the root,
– the left and right subtrees of T are also binary search trees.
Assumptions:
– a_1 < a_2 < ... < a_n
– T_{i,j}: OBST for a_{i+1}, ..., a_j
– c(i,j): cost of T_{i,j}
– r(i,j): root of T_{i,j}
– weight of T_{i,j}: w(i,j) = q(i) + sum_{l=i+1..j} ( q(l) + p(l) )

5.5 Optimal binary search trees
p(i): probability that the search is for a(i) (a successful search, ending at an internal node).
q(i): probability that the search is for an x with a(i) < x < a(i+1) (an unsuccessful search, ending at an external node).
Expected cost of a search tree (5.9):
cost(T) = sum_{1<=i<=n} p(i) * level(a(i)) + sum_{0<=i<=n} q(i) * ( level(E(i)) - 1 )

Optimal Binary Search Tree
If a(k) is chosen as the root, the cost of the search tree is p(k) + cost(l) + cost(r) + w(0,k-1) + w(k,n), where l and r are its left and right subtrees.
Minimizing over the choice of root:
c(0,n) = min_{1<=k<=n} { c(0,k-1) + c(k,n) + p(k) + w(0,k-1) + w(k,n) }
and in general
c(i,j) = min_{i<k<=j} { c(i,k-1) + c(k,j) + p(k) + w(i,k-1) + w(k,j) }
Since w(i,j) = w(i,k-1) + p(k) + w(k,j), this simplifies to
c(i,j) = min_{i<k<=j} { c(i,k-1) + c(k,j) } + w(i,j)
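A C++ sketch of filling the c, w, and r tables directly from this recurrence (assumed code, not the textbook's program). p[1..n] and q[0..n] follow the notation above, with p[0] unused.

#include <vector>
#include <limits>

// Builds the cost table c(i,j), weight table w(i,j), and root table r(i,j).
// p has size n+1 (p[0] unused) and q has size n+1.
void optimalBST(const std::vector<double>& p, const std::vector<double>& q,
                std::vector<std::vector<double>>& c,
                std::vector<std::vector<double>>& w,
                std::vector<std::vector<int>>& r) {
    int n = static_cast<int>(q.size()) - 1;
    c.assign(n + 1, std::vector<double>(n + 1, 0.0));
    w.assign(n + 1, std::vector<double>(n + 1, 0.0));
    r.assign(n + 1, std::vector<int>(n + 1, 0));
    for (int i = 0; i <= n; ++i) w[i][i] = q[i];        // w(i,i) = q(i), c(i,i) = 0
    for (int len = 1; len <= n; ++len)                  // trees over a_{i+1}..a_j with j - i = len
        for (int i = 0; i + len <= n; ++i) {
            int j = i + len;
            w[i][j] = w[i][j - 1] + p[j] + q[j];        // w(i,j) = w(i,j-1) + p(j) + q(j)
            double best = std::numeric_limits<double>::max();
            for (int k = i + 1; k <= j; ++k)            // try every a_k as the root
                if (c[i][k - 1] + c[k][j] < best) { best = c[i][k - 1] + c[k][j]; r[i][j] = k; }
            c[i][j] = best + w[i][j];                   // c(i,j) = min_k{ c(i,k-1) + c(k,j) } + w(i,j)
        }
}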

5.5 Optimal binary search trees Figure 5.12 Two possible binary search trees

5.5 Optimal binary search trees Figure 5.13 Binary search trees of Figure 5.12 with external nodes added

5.5 Optimal binary search trees

Figure 5.16 Computation of c(0,4), w(0,4), and r(0,4)

5.8 Reliability design
Solve a problem with a multiplicative optimization function.
Several devices are connected in series.
Let r_i be the reliability of device D_i, i.e., the probability that it functions correctly.
Reliability of the entire system: the product r_1 * r_2 * ... * r_n.
Duplication: to improve reliability, multiple copies of the same device type are connected in parallel, using switching circuits.

5.8 Reliability design
Figure 5.19 n devices D_i, 1 <= i <= n, connected in series
Figure 5.20 Multiple devices connected in parallel in each stage

Multiple copies
If stage i contains m_i copies of device D_i, then
P(all m_i copies malfunction) = (1 - r_i)^{m_i}
Reliability of stage i = 1 - (1 - r_i)^{m_i}

5.8 Reliability design
Let c be the maximum allowable cost of the system and c_i the cost of each copy of device D_i.
Maximize  prod_{1<=i<=n} φ_i(m_i),  where φ_i(m_i) = 1 - (1 - r_i)^{m_i}
Subject to  sum_{1<=i<=n} c_i * m_i <= c,  m_i >= 1 and integer, 1 <= i <= n
Assume each c_i > 0. Then each m_i is bounded above by
u_i = floor( (c + c_i - sum_{1<=j<=n} c_j) / c_i )
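As a rough illustration of this formulation (not the tuple-based algorithm of the text), the sketch below maximizes the product of stage reliabilities by dynamic programming over an integer cost budget; the function names and the integer-cost assumption are mine.

#include <cmath>
#include <vector>

// Reliability of a stage with m parallel copies of a device of reliability r.
double stageReliability(double r, int m) { return 1.0 - std::pow(1.0 - r, m); }

// f[x] = best achievable product of stage reliabilities with total cost at most x
// over the stages processed so far (0.0 marks an infeasible budget).
double reliabilityDesign(const std::vector<double>& r, const std::vector<int>& cost, int budget) {
    int n = static_cast<int>(r.size());
    std::vector<double> f(budget + 1, 1.0);                 // empty product before any stage
    for (int i = 0; i < n; ++i) {
        std::vector<double> g(budget + 1, 0.0);
        for (int x = 0; x <= budget; ++x)
            for (int m = 1; m * cost[i] <= x; ++m) {        // m_i >= 1 copies of device D_i
                double val = stageReliability(r[i], m) * f[x - m * cost[i]];
                if (val > g[x]) g[x] = val;
            }
        f = g;
    }
    return f[budget];
}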

5.9 The traveling salesperson problem
Example: a mail carrier who must pick up mail at n different sites
– an (n+1)-vertex graph (the post office plus the n sites)
– edge <i,j> has a length equal to the distance from i to j
– we want a tour of minimum cost
TSP is a permutation problem, not a subset problem
– there are n! different permutations of n objects, while there are only 2^n different subsets of n objects, and n! grows much faster than 2^n

5.9 The traveling salesperson problem
A tour is a simple path that starts and ends at vertex 1.
Every tour consists of an edge <1,k> for some k ∈ V - {1} and a path from k to 1 that goes through each vertex in V - {1,k} exactly once.
If the tour is optimal, the path from vertex k to vertex 1 must be a shortest k-to-1 path going through all the vertices in V - {1,k}.
Let g(i,S) be the length of a shortest path starting at vertex i, going through all vertices in S, and terminating at vertex 1. Then
g(1, V - {1}) = min_{2<=k<=n} { c(1,k) + g(k, V - {1,k}) }
and, more generally,
g(i,S) = min_{j ∈ S} { c(i,j) + g(j, S - {j}) }

5.9 The traveling salesperson problem Figure 5.21 Directed graph and edge length matrix c

5.9 The traveling salesperson problem
Thus g(2, ∅) = c(2,1) = 5, g(3, ∅) = c(3,1) = 6, and g(4, ∅) = c(4,1) = 8.
For |S| = 1 we obtain
g(2,{3}) = c(2,3) + g(3, ∅) = 15     g(2,{4}) = 18
g(3,{2}) = 18                        g(3,{4}) = 20
g(4,{2}) = 13                        g(4,{3}) = 15
For |S| = 2,
g(2,{3,4}) = min{ c(2,3) + g(3,{4}), c(2,4) + g(4,{3}) } = 25
g(3,{2,4}) = min{ c(3,2) + g(2,{4}), c(3,4) + g(4,{2}) } = 25
g(4,{2,3}) = min{ c(4,2) + g(2,{3}), c(4,3) + g(3,{2}) } = 23
Finally,
g(1,{2,3,4}) = min{ c(1,2) + g(2,{3,4}), c(1,3) + g(3,{2,4}), c(1,4) + g(4,{2,3}) } = min{35, 40, 43} = 35
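A C++ sketch of the g(i,S) recursion with the set S encoded as a bitmask (an assumed implementation, not the textbook's program). With the edge lengths implied by the worked example above it returns 35.

#include <algorithm>
#include <vector>
#include <limits>

const int INF = std::numeric_limits<int>::max() / 2;

// Optimal tour length via g(i,S). Vertices are numbered 0..n-1, with vertex 0
// playing the role of vertex 1 in the text; subsets of {1,..,n-1} are bitmasks.
int tspLength(const std::vector<std::vector<int>>& c) {
    int n = static_cast<int>(c.size());
    int full = (1 << (n - 1)) - 1;                          // S = V - {1}
    // g[S][i] = length of a shortest path from vertex i+1 through all of S to vertex 0.
    std::vector<std::vector<int>> g(1 << (n - 1), std::vector<int>(n - 1, INF));
    for (int i = 0; i < n - 1; ++i) g[0][i] = c[i + 1][0];  // g(i, ∅) = c(i,1)
    for (int S = 1; S <= full; ++S)
        for (int i = 0; i < n - 1; ++i) {
            if (S & (1 << i)) continue;                     // i itself must not be in S
            for (int j = 0; j < n - 1; ++j)
                if (S & (1 << j))                           // g(i,S) = min_{j in S}{ c(i,j) + g(j, S - {j}) }
                    g[S][i] = std::min(g[S][i], c[i + 1][j + 1] + g[S ^ (1 << j)][j]);
        }
    int best = INF;
    for (int k = 0; k < n - 1; ++k)                         // g(1, V-{1}) = min_k{ c(1,k) + g(k, V - {1,k}) }
        best = std::min(best, c[0][k + 1] + g[full ^ (1 << k)][k]);
    return best;
}

// Edge lengths recovered from the computation above (rows/columns in vertex order 1..4):
//   tspLength({{0,10,15,20}, {5,0,9,10}, {6,13,0,12}, {8,8,9,0}}) == 35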

5.9 The traveling salesperson problem Let N be the number of g(i,s), that have to be computed before g(1,V-{1}) i, computed for each value of |s| n-1 choices of i The number of distinct sets of S of size k not including 1 and i