Slide 1: Dynamic Programming (CS 332: Algorithms, David Luebke, 2/26/2016)

Slide 2: Review: Amortized Analysis
● To illustrate amortized analysis we examined dynamic tables:
  1. Init table size m = 1
  2. Insert elements until number n > m
  3. Generate new table of size 2m
  4. Reinsert old elements into new table
  5. (back to step 2)
● What is the worst-case cost of an insert?
● What is the amortized cost of an insert?
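
A minimal sketch of the doubling scheme described above (illustrative Python, not part of the original slides; the class and method names are my own):

```python
class DynamicTable:
    """Append-only table that doubles its capacity when full (illustrative sketch)."""

    def __init__(self):
        self.capacity = 1                      # step 1: initial table size m = 1
        self.size = 0                          # number of stored elements n
        self.slots = [None] * self.capacity

    def insert(self, item):
        if self.size == self.capacity:                 # table is full: n would exceed m
            new_slots = [None] * (2 * self.capacity)   # step 3: new table of size 2m
            for i in range(self.size):                 # step 4: reinsert old elements
                new_slots[i] = self.slots[i]
            self.slots = new_slots
            self.capacity *= 2
        self.slots[self.size] = item           # the insert itself costs 1
        self.size += 1


table = DynamicTable()
for k in range(9):
    table.insert(k)          # capacities visited: 1, 2, 4, 8, 16
```

Inserting n items triggers roughly lg n doublings; the copying work done inside those doublings is exactly the cost the next slides account for.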

Slide 3: Review: Analysis of Dynamic Tables
● Let c_i = cost of the ith insert
● c_i = i if i − 1 is an exact power of 2, and 1 otherwise
● Example:

  Operation    Table Size    Cost
  Insert(1)    1             1
  Insert(2)    2             1 + 1
  Insert(3)    4             1 + 2
  Insert(4)    4             1
  Insert(5)    8             1 + 4
  Insert(6)    8             1
  Insert(7)    8             1
  Insert(8)    8             1
  Insert(9)    16            1 + 8
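
As a quick check of the cost rule (again an illustrative sketch, not from the slides), the cost column above can be generated directly from the definition of c_i:

```python
def insert_cost(i):
    """Cost of the i-th insert under doubling: i if i-1 is an exact power of 2, else 1."""
    return i if i > 1 and (i - 1) & (i - 2) == 0 else 1

costs = [insert_cost(i) for i in range(1, 10)]
print(costs)         # [1, 2, 3, 1, 5, 1, 1, 1, 9]
print(sum(costs))    # 24, which is less than 3 * 9 = 27
```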

Slide 4: Review: Aggregate Analysis
● n Insert() operations cost Σ_{i=1}^{n} c_i ≤ n + Σ_{j=0}^{⌊lg n⌋} 2^j < n + 2n = 3n
● Average cost of an operation = (total cost)/(# operations) < 3
● Asymptotically, then, a dynamic table costs the same as a fixed-size table
  ■ Both O(1) per Insert operation
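
To make the cost bound concrete, here is the aggregate-analysis summation written out (a standard bound, given as a sketch rather than the slide's exact formula):

```latex
\sum_{i=1}^{n} c_i \;\le\; n + \sum_{j=0}^{\lfloor \lg n \rfloor} 2^{j}
               \;=\; n + \bigl(2^{\lfloor \lg n \rfloor + 1} - 1\bigr)
               \;<\; n + 2n \;=\; 3n
```

For the nine inserts tabulated on the previous slide this bounds the total cost by 9 + (1 + 2 + 4 + 8) = 24 < 27 = 3 · 9.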

Slide 5: Review: Accounting Analysis
● Charge each operation $3 amortized cost
  ■ Use $1 to perform the immediate Insert()
  ■ Store $2
● When the table doubles
  ■ $1 reinserts an old item, $1 reinserts another old item
  ■ We’ve paid these costs up front with the last n/2 Insert()s
● Upshot: O(1) amortized cost per operation
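
One way to see that the $3 charge really covers everything is to simulate the scheme and confirm the stored credit never goes negative (an illustrative sketch; the function name is my own):

```python
def accounting_simulation(n_inserts=100):
    """Simulate the $3-per-insert accounting scheme; the credit balance must stay >= 0."""
    capacity, size, credit = 1, 0, 0
    for i in range(1, n_inserts + 1):
        credit += 3                        # charge $3 amortized for this insert
        if size == capacity:               # table doubles: pay $1 to copy each old element
            credit -= size
            capacity *= 2
        credit -= 1                        # pay $1 for the actual insert
        assert credit >= 0, f"credit went negative at insert {i}"
        size += 1
    return credit

print(accounting_simulation())   # leftover credit is nonnegative
```

The assert would fire if some doubling cost more than the credit accumulated since the previous doubling; it never does.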

Slide 6: Review: Accounting Analysis
● Suppose we must support insert and delete, and the table should contract as well as expand
  ■ Table overflows → double it (as before)
  ■ Table < 1/4 full → halve it
  ■ Charge $3 for Insert (as before)
  ■ Charge $2 for Delete
    ○ Store the extra $1 in the emptied slot
    ○ Use it later to pay to copy the remaining items to the new table when shrinking
● What if we halve the size when the table is < 1/8 full?
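
A sketch of the expand-and-contract table described in this slide (illustrative Python; the names are my own, and only deletion of the last element is modeled):

```python
class ShrinkableTable:
    """Dynamic table that doubles when full and halves when under 1/4 full (sketch)."""

    def __init__(self):
        self.capacity, self.size = 1, 0
        self.slots = [None]

    def _resize(self, new_capacity):
        new_slots = [None] * new_capacity
        new_slots[:self.size] = self.slots[:self.size]   # copy the remaining items
        self.slots, self.capacity = new_slots, new_capacity

    def insert(self, item):
        if self.size == self.capacity:                   # overflow: double
            self._resize(2 * self.capacity)
        self.slots[self.size] = item
        self.size += 1

    def delete_last(self):
        if self.size == 0:
            raise IndexError("delete from empty table")
        self.size -= 1
        self.slots[self.size] = None
        if self.capacity > 1 and self.size < self.capacity // 4:   # under 1/4 full: halve
            self._resize(self.capacity // 2)
```

Halving only below 1/4 full, rather than at 1/2 full, is what keeps an alternating insert/delete sequence from triggering a resize on every operation.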

Slide 7: Dynamic Programming
● Another strategy for designing algorithms is dynamic programming
  ■ A metatechnique, not an algorithm (like divide & conquer)
  ■ The word “programming” is historical and predates computer programming
● Use when the problem breaks down into recurring small subproblems

Slide 8: Dynamic Programming Example: Longest Common Subsequence
● Longest common subsequence (LCS) problem:
  ■ Given two sequences x[1..m] and y[1..n], find the longest subsequence which occurs in both
  ■ Ex: x = {A B C B D A B}, y = {B D C A B A}
  ■ {B C} and {A A} are both subsequences of both
    ○ What is the LCS?
  ■ Brute-force algorithm: for every subsequence of x, check if it’s a subsequence of y
    ○ How many subsequences of x are there?
    ○ What will be the running time of the brute-force algorithm?
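
The brute-force idea can be written directly (an illustrative sketch, not from the slides): enumerate subsequences of x, longest first, and test each against y.

```python
from itertools import combinations

def is_subsequence(s, y):
    """True if sequence s appears in y in order (not necessarily contiguously)."""
    it = iter(y)
    return all(ch in it for ch in s)

def lcs_brute_force(x, y):
    """Try all 2^m subsequences of x, longest first; exponential time."""
    for length in range(len(x), -1, -1):
        for idx in combinations(range(len(x)), length):
            candidate = [x[i] for i in idx]
            if is_subsequence(candidate, y):
                return candidate
    return []

print(lcs_brute_force("ABCBDAB", "BDCABA"))   # ['B', 'C', 'B', 'A'], length 4
```

For m = 7 this examines at most 2^7 = 128 candidates, which makes the exponential running time discussed on the next slide concrete.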

Slide 9: LCS Algorithm
● Brute-force algorithm: 2^m subsequences of x to check against n elements of y: O(n · 2^m)
● We can do better: for now, let’s only worry about the problem of finding the length of the LCS
  ■ When finished we will see how to backtrack from this solution back to the actual LCS
● Notice the LCS problem has optimal substructure
  ■ Subproblems: LCS of pairs of prefixes of x and y

Slide 10: Finding LCS Length
● Define c[i,j] to be the length of the LCS of x[1..i] and y[1..j]
  ■ What is the length of the LCS of x and y?
● Theorem:
    c[i,j] = 0                              if i = 0 or j = 0
    c[i,j] = c[i−1, j−1] + 1                if x[i] = y[j]
    c[i,j] = max(c[i−1, j], c[i, j−1])      otherwise
● What is this really saying?
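
A direct bottom-up implementation of this recurrence (an illustrative sketch; the slides themselves stop at the recurrence):

```python
def lcs_length(x, y):
    """Fill the table c[i][j] = length of the LCS of x[1..i] and y[1..j], bottom-up."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # row 0 and column 0 stay 0 (base case)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:            # x[i] = y[j]: extend the LCS of both prefixes
                c[i][j] = c[i - 1][j - 1] + 1
            else:                               # otherwise take the better of the two subproblems
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))   # 4
```

Backtracking from the filled table to an actual LCS, mentioned on slide 9, works by retracing which case produced each entry c[i][j].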

Slide 11: The End