Dynamic Programming Part 1: intro and the assembly-line scheduling problem.



What is dynamic programming?
Dynamic programming is an algorithm design method that solves a problem by combining the solutions of subproblems. This sounds similar to divide-and-conquer, but there is a difference between the two:
– In divide-and-conquer, the subproblems don't overlap.
– In dynamic programming, the subproblems overlap: subproblems share sub-subproblems! (A small example follows.)
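To make the overlap concrete, here is a minimal sketch (Fibonacci is not from the slides; it is used only as the smallest familiar illustration): the naive recursion solves the same subproblems over and over, while the dynamic-programming version solves each subproblem once and reuses the saved answer.

```python
from functools import lru_cache

def fib_naive(n):
    # Overlapping subproblems: fib_naive(n - 1) and fib_naive(n - 2)
    # both recompute fib_naive(n - 3), fib_naive(n - 4), ... -> exponential time.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_dp(n):
    # Each subproblem is solved once and its answer is saved in a table
    # (here, the memoization cache), so the running time becomes linear.
    return n if n < 2 else fib_dp(n - 1) + fib_dp(n - 2)

print(fib_dp(30))  # 832040
```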

Dynamic programming may be good for these problems
Dynamic programming is typically applied to optimization problems. These optimization problems may have many possible solutions, and each solution has a value associated with it. We want a solution with the optimal (min or max) value.
– We say “a solution,” not “the solution,” because there may be more than one solution with the same, best value.

Sequence in the development of a dynamic programming algorithm
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution in a bottom-up fashion (smaller subproblems first).
4. Construct an optimal solution from the computed information.
Step 4 can be omitted if only the value of an optimal solution is required.

Assembly-line scheduling
There are 2 assembly lines at an auto assembly plant. Normally they operate independently and concurrently, but when there is a rush job the manager halts the assembly lines and uses stations from both lines in an optimal way, to be explained next. The rush-job chassis enters one of the lines, passes through one station at each position, and may be moved to the other line between positions at a transfer cost. (One way to represent such an instance is sketched below.)
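As a concrete illustration, here is one possible representation of an instance in the notation used on the following slides; the numbers are made up for illustration and do not come from the slides.

```python
# A made-up instance with n = 3 stations per line (illustrative only).
# Indexing is 0-based: line i in {0, 1}, station j in {0, ..., n - 1}.
e = [2, 4]          # e[i]: entry time onto line i
a = [[7, 9, 3],     # a[0][j]: processing time at station S(0, j)
     [8, 5, 6]]     # a[1][j]: processing time at station S(1, j)
t = [[2, 3],        # t[0][j]: transfer time after station j of line 0
     [2, 1]]        # t[1][j]: transfer time after station j of line 1
x = [3, 2]          # x[i]: exit time from line i
```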

Step 1: the structure of the fastest way
What is the fastest way through station S(1,j)? First suppose the fastest way through station S(1,j) is through S(1,j-1).
Key observation: the chassis must have taken a fastest way from the starting point through station S(1,j-1). Why? If there had been a faster way to get through station S(1,j-1), we could have substituted this faster way to yield a faster way through station S(1,j): a contradiction. (A symmetric argument applies if the fastest way through S(1,j) instead goes through S(2,j-1) and then transfers to line 1.)

Dynamic optimality, or the optimal-substructure property
An optimal solution to a problem (finding the fastest way through station S(i,j)) contains within it optimal solutions to subproblems (finding the fastest way through either S(1,j-1) or S(2,j-1)).

Step 2: a recursive solution
Let f_i[j] = the fastest possible time to get a chassis from the starting point through station S(i,j).
Ultimate goal: compute the fastest time to get a chassis all the way through the factory. Denote this time by f*. We have…

Equations for the first stations on each assembly line
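The equations on this and the following slides did not survive the transcript; the following is the standard formulation, assuming the usual notation (e_i: entry time onto line i, a_{i,j}: time at station S(i,j), t_{i,j}: transfer time after station j of line i, x_i: exit time from line i, n stations per line).

```latex
% Base cases: the first station on each line is reached directly from the entry.
f_1[1] = e_1 + a_{1,1} \qquad\qquad f_2[1] = e_2 + a_{2,1}
% For j >= 2, the chassis either stays on its line or transfers from the other one.
f_1[j] = \min\bigl(f_1[j-1] + a_{1,j},\; f_2[j-1] + t_{2,j-1} + a_{1,j}\bigr)
f_2[j] = \min\bigl(f_2[j-1] + a_{2,j},\; f_1[j-1] + t_{1,j-1} + a_{2,j}\bigr)
% The fastest time all the way through the factory.
f^{*} = \min\bigl(f_1[n] + x_1,\; f_2[n] + x_2\bigr)
```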

Step 3: computing the fastest times
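The pseudocode on this slide was not captured in the transcript. Below is a sketch, in Python, of the bottom-up computation under the recurrences above; the function and variable names (fastest_way, f, l) are my own, not from the slides. Besides f_i[j], it records in l[i][j] which line was used before station S(i,j), so that Step 4 can reconstruct the route.

```python
def fastest_way(a, t, e, x):
    """Bottom-up computation of the fastest way through the factory (0-based).

    a[i][j]: time at station S(i, j); t[i][j]: transfer time after station j of line i;
    e[i]: entry time onto line i; x[i]: exit time from line i.
    Returns (f_star, l_star, l), where l[i][j] is the line used before S(i, j).
    """
    n = len(a[0])
    f = [[0] * n for _ in range(2)]
    l = [[0] * n for _ in range(2)]   # predecessor-line table, used in Step 4

    # Base cases: the first station on each line is reached directly from the entry.
    f[0][0] = e[0] + a[0][0]
    f[1][0] = e[1] + a[1][0]

    # Each f[i][j] needs only the already-computed values f[0][j - 1] and f[1][j - 1].
    for j in range(1, n):
        for i in range(2):
            other = 1 - i
            stay = f[i][j - 1] + a[i][j]
            switch = f[other][j - 1] + t[other][j - 1] + a[i][j]
            if stay <= switch:
                f[i][j], l[i][j] = stay, i
            else:
                f[i][j], l[i][j] = switch, other

    # The fastest time through the whole factory, and the line it exits from.
    if f[0][n - 1] + x[0] <= f[1][n - 1] + x[1]:
        return f[0][n - 1] + x[0], 0, l
    return f[1][n - 1] + x[1], 1, l
```

For example, calling fastest_way(a, t, e, x) on the instance sketched earlier returns the minimum total time f*, the exit line, and the table needed to print the route. Each subproblem is solved exactly once, so the running time is Θ(n).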

Step 4: constructing the fastest way
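The pseudocode for this slide is also missing; here is a sketch of how the l table produced by the Step 3 function can be walked backwards to list the stations on a fastest way (again, the names are assumptions, not from the slides).

```python
def print_stations(l, l_star, n):
    """Walk the predecessor-line table backwards and print the stations used."""
    i = l_star
    route = [(i, n - 1)]            # (line, station) pairs, collected in reverse order
    for j in range(n - 1, 0, -1):
        i = l[i][j]                 # the line used just before station j
        route.append((i, j - 1))
    for line, station in reversed(route):
        print(f"line {line + 1}, station {station + 1}")   # report 1-based labels
```

Used together with the Step 3 sketch: f_star, l_star, l = fastest_way(a, t, e, x), then print_stations(l, l_star, len(a[0])).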