Algorithm Design Techniques: Dynamic Programming


Introduction

Dynamic Programming
– Divide-and-conquer solution to a problem with overlapping subproblems
– Similar to bottom-up recursion, where results are saved in a table as they are computed
– Typically applied to optimization problems
– Typically best used when the objects have a linear ordering and cannot be rearranged

Example: Fibonacci

Basic recursive solution: F(N) = F(N-1) + F(N-2)
In the recursion tree for F(4), F(1) is calculated three times.
Instead, calculate the numbers from 1 up to N, storing each value as it is computed – trading space for time.

[Figure: recursion tree for F(4), showing F(1) and F(0) recomputed in multiple subtrees]
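The store-as-you-go idea can be sketched in Python (a minimal sketch, using the common convention F(0)=0, F(1)=1; the function name is illustrative):

```python
def fib_bottom_up(n):
    """Compute F(0)..F(n) in order, storing each value in a table so
    nothing is recomputed -- space traded for time. Uses F(0)=0, F(1)=1."""
    if n < 2:
        return n
    table = [0] * (n + 1)   # table[i] will hold F(i)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_bottom_up(10))   # 55
```

Each F(i) is now computed exactly once, so the exponential recursion tree collapses to O(N) work.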

Steps (CLR pg. 323)
1. Characterize the structure of an optimal solution (optimal substructure)
2. Recursively define the value of an optimal solution
3. Compute the value of an optimal solution in a bottom-up fashion
4. Construct an optimal solution from computed information

Ordering Matrix Multiplications

Matrices: A (50x10), B (10x40), C (40x30), D (30x5)
Compute ABCD
– Matrix multiplication is associative (not commutative), so it fits the ordering property
– Decide how to parenthesize the product to achieve the fewest operations

Ordering Matrix Multiplications

A((BC)D):
– BC = 10·40·30 = 12,000 operations
– (BC)D = 12,000 + 10·30·5 = 13,500
– A((BC)D) = 13,500 + 50·10·5 = 16,000
A(B(CD)) = 10,500
((AB)(CD)) = 36,000
(((AB)C)D) = 87,500
((A(BC))D) = 34,500
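The totals for the chain ABCD can be checked by brute force – a sketch that enumerates every parenthesization (names and dimensions taken from the earlier slide; the function name is illustrative):

```python
def parenthesizations(names, dims):
    """Yield (expression, scalar-multiplication count, result shape) for
    every way to fully parenthesize the matrix chain.

    names: one label per matrix; dims: (rows, cols) per matrix.
    """
    if len(names) == 1:
        yield names[0], 0, dims[0]
        return
    for k in range(1, len(names)):           # split: left = first k matrices
        for le, lc, (p, q) in parenthesizations(names[:k], dims[:k]):
            for re_, rc, (_, r) in parenthesizations(names[k:], dims[k:]):
                # multiplying (p x q) by (q x r) costs p*q*r multiplications
                yield "(" + le + re_ + ")", lc + rc + p * q * r, (p, r)

dims = [(50, 10), (10, 40), (40, 30), (30, 5)]    # A, B, C, D
results = {expr: cost for expr, cost, _ in parenthesizations("ABCD", dims)}
for expr, cost in sorted(results.items(), key=lambda kv: kv[1]):
    print(expr, cost)
```

For four matrices there are only 5 orderings, but the count grows exponentially (the Catalan numbers), which is why the DP formulation matters.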

Ordering Matrix Multiplications

The number of orderings satisfies

T(N) = Σ_{i=1}^{N-1} T(i)·T(N-i)

so the basic recursive solution is exponential. Let M[Left, Right] be the fewest operations for the subchain, with c[k] the matrix dimensions:

M[Left, Right] = min over Left ≤ i < Right of { M[Left, i] + M[i+1, Right] + c[Left-1]·c[i]·c[Right] }

Ordering Matrix Multiplications

A = 50x10, B = 10x40, C = 40x30, D = 30x5

M values for each subchain (row = left end, column = right end):

        A       B       C       D
A       0  20,000  27,000  10,500
B               0  12,000   8,000
C                       0   6,000
D                               0

N²/2 values are calculated; O(N³) running time
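The O(N³) table-filling can be sketched as follows (the dimension list [50, 10, 40, 30, 5] encodes A, B, C, D; the function name and `M` are illustrative, mirroring the recurrence):

```python
def matrix_chain_order(c):
    """c[k] are the chain dimensions: matrix k has shape c[k] x c[k+1].

    Fills M[l][r] = fewest scalar multiplications for the product of
    matrices l..r, in order of increasing chain length, using the
    recurrence M[l][r] = min_i { M[l][i] + M[i+1][r] + c[l]*c[i+1]*c[r+1] }.
    """
    n = len(c) - 1
    M = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):            # subchain length
        for l in range(n - length + 1):
            r = l + length - 1
            M[l][r] = min(M[l][i] + M[i + 1][r] + c[l] * c[i + 1] * c[r + 1]
                          for i in range(l, r))
    return M

M = matrix_chain_order([50, 10, 40, 30, 5])   # A, B, C, D
print(M[0][3])   # best cost for the whole chain: 10,500
```

Filling by increasing length guarantees that both M[l][i] and M[i+1][r] are already known when M[l][r] is computed.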

Optimal Binary Search Tree

Create a binary search tree to minimize

Σ_{i=1}^{N} p_i·(1 + d_i)

Word   Probability
a      .22
am     .18
and    .20
egg    .05
if     .25
the    .02
two    .08

[Figure: a binary search tree over these words, with "if" at the root, "a" and "two" on the next level, then "and" and "the", then "am" and "egg"]

Optimal Binary Search Tree

The basic greedy strategy does not work. But the problem has the ordering property and optimal substructure:

C[Left, Right] = min over Left ≤ i ≤ Right of { p_i + C[Left, i-1] + C[i+1, Right] + Σ_{j=Left}^{i-1} p_j + Σ_{j=i+1}^{Right} p_j }

Optimal Binary Search Tree (pg. 431)

Each entry gives the word range, its optimal cost, and the root of its optimal subtree:

Length 1: a..a = .22 (a); am..am = .18 (am); and..and = .20 (and); egg..egg = .05 (egg); if..if = .25 (if); the..the = .02 (the); two..two = .08 (two)
Length 2: a..am = .58 (a); am..and = .56 (am); and..egg = .30 (and); egg..if = .35 (if); if..the = .29 (if); the..two = .12 (two)
Length 3: a..and = 1.02 (am); am..egg = .66 (and); and..if = .80 (if); egg..the = .39 (if); if..two = .46 (if)
Length 4: a..egg = 1.17 (am); am..if = 1.21 (and); and..the = .84 (if); egg..two = .57 (if)
Length 5: a..if = 1.83 (and); am..the = 1.27 (and); and..two = 1.02 (if)
Length 6: a..the = 1.89 (and); am..two = 1.53 (and)
Length 7: a..two = 2.15 (and)
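A sketch of the DP that produces this table, using the recurrence from the earlier slide (function and variable names are illustrative). Note that p_i plus the two leftover probability sums in the recurrence collapse to the total probability mass of the range:

```python
def optimal_bst(p):
    """p[i] = access probability of the i-th key, keys given in sorted order.

    C[l][r] = minimal expected cost sum p_i*(1 + depth_i) over keys l..r.
    Picking root i pushes both subtrees down one level, which adds the
    whole probability mass of the range to the cost.
    """
    n = len(p)
    C = [[0.0] * n for _ in range(n)]
    root = [[0] * n for _ in range(n)]
    for l in range(n):                         # single-key ranges
        C[l][l] = p[l]
        root[l][l] = l
    for length in range(2, n + 1):
        for l in range(n - length + 1):
            r = l + length - 1
            mass = sum(p[l:r + 1])
            best = float("inf")
            for i in range(l, r + 1):          # try each key as the root
                left = C[l][i - 1] if i > l else 0.0
                right = C[i + 1][r] if i < r else 0.0
                if left + right + mass < best:
                    best = left + right + mass
                    root[l][r] = i
            C[l][r] = best
    return C, root

words = ["a", "am", "and", "egg", "if", "the", "two"]
C, root = optimal_bst([.22, .18, .20, .05, .25, .02, .08])
print(round(C[0][6], 2), words[root[0][6]])   # 2.15 and
```

The `root` table is what lets step 4 of the CLR recipe reconstruct the actual tree, not just its cost.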

Memoization

Maintain top-down recursion, but memoize (keep "memos" of) results:

allocate array for memo
initialize memo[1] and memo[2] to 1

fib(n) {
    if memo[n] is not zero, return memo[n]
    memo[n] = fib(n-1) + fib(n-2)
    return memo[n]
}
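In runnable Python, the same idea might look like this (a sketch using a dictionary and the convention F(0)=0, F(1)=1, whereas the pseudocode above is 1-indexed and starts from 1):

```python
def fib_memo(n, memo=None):
    """Top-down recursion that records ("memoizes") each result so every
    subproblem is solved only once. Uses F(0)=0, F(1)=1."""
    if memo is None:
        memo = {0: 0, 1: 1}        # base cases seed the memo table
    if n not in memo:
        memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(10))   # 55
```

Memoization keeps the natural top-down structure of the recursion, while the bottom-up table version fills values in a fixed order; both do O(N) work here.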