UMass Lowell Computer Science 91.503: Analysis of Algorithms
Prof. Karen Daniels, Fall 2002
Lecture 1 (Part 3), Tuesday, 9/3/02
Design Patterns for Optimization Problems: Dynamic Programming & Greedy Algorithms

Algorithmic Paradigm Context

Divide & Conquer:
- View the problem as a collection of subproblems of "recursive" nature
- Independent subproblems
- Number of subproblems depends on partitioning factors
- Characteristic running time: typically a log function of n

Dynamic Programming:
- View the problem as a collection of subproblems of "recursive" nature
- Overlapping subproblems
- Number of distinct subproblems is typically small
- May involve preprocessing
- Characteristic running time: depends on the number and difficulty of the subproblems
- Primarily for optimization problems
- Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems

Dynamic Programming

Example: Matrix Parenthesization
Definitions
- Given a "chain" of n matrices A1, A2, ..., An
- Compute the product A1 A2 ... An efficiently
- Minimize "cost" = number of scalar multiplications
- Multiplication order matters!
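To see why order matters, consider the three-matrix example from Cormen et al. (A1 is 10x100, A2 is 100x5, A3 is 5x50). The sketch below (not from the slides) compares the costs of the two possible parenthesizations:

```python
def mult_cost(p, q, r):
    """Scalar multiplications needed to multiply a (p x q) by a (q x r) matrix."""
    return p * q * r

# ((A1 A2) A3): first 10x100 times 100x5, then the 10x5 result times 5x50
cost_left = mult_cost(10, 100, 5) + mult_cost(10, 5, 50)     # 5000 + 2500 = 7500

# (A1 (A2 A3)): first 100x5 times 5x50, then 10x100 times the 100x50 result
cost_right = mult_cost(100, 5, 50) + mult_cost(10, 100, 50)  # 25000 + 50000 = 75000

print(cost_left, cost_right)  # 7500 vs. 75000: a factor-of-10 difference
```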

Example: Matrix Parenthesization
Step 1: Characterizing an Optimal Solution
Observation: Any parenthesization of Ai Ai+1 ... Aj must split it between Ak and Ak+1 for some k with i <= k < j.
THM (Optimal Matrix Parenthesization): If an optimal parenthesization of Ai Ai+1 ... Aj splits at k, then the parenthesization of the prefix Ai Ai+1 ... Ak must itself be an optimal parenthesization.
Why? If there existed a less costly way to parenthesize the prefix, then substituting that parenthesization would yield a less costly way to parenthesize Ai Ai+1 ... Aj, contradicting the optimality of the original parenthesization.

Example: Matrix Parenthesization
Step 2: A Recursive Solution
- Recursive definition of the minimum parenthesization cost:

  m[i, j] = 0                                                        if i = j
  m[i, j] = min over i <= k < j of { m[i, k] + m[k+1, j] + p_{i-1} p_k p_j }   if i < j

  How many distinct subproblems? Θ(n²): one for each pair (i, j) with 1 <= i <= j <= n.

Example: Matrix Parenthesization
Step 3: Computing Optimal Costs
[Figure: the m and s tables for the textbook's six-matrix example; entries such as m[2, 3] = 2,625, m[3, 5] = 2,500, and m[4, 5] = 1,000 appear in the m table. s[i, j] records the value of k that achieves the optimal cost in computing m[i, j].]
source: textbook Cormen, et al.
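A minimal bottom-up Python sketch of this computation, together with the Step 4 reconstruction below (a sketch, not the textbook's code); the dimension sequence p is the textbook's six-matrix example, so the table entries above appear in the resulting m:

```python
import math

def matrix_chain_order(p):
    """Bottom-up DP. Matrix A_i has dimensions p[i-1] x p[i].
    Returns (m, s): m[i][j] = min scalar multiplications for A_i..A_j,
    s[i][j] = the split index k achieving that minimum."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):           # chain length, shortest first
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = math.inf
            for k in range(i, j):            # try every split point
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j] = q
                    s[i][j] = k
    return m, s

def optimal_parens(s, i, j):
    """Step 4: build the optimal parenthesization string from the s table."""
    if i == j:
        return f"A{i}"
    k = s[i][j]
    return "(" + optimal_parens(s, i, k) + optimal_parens(s, k + 1, j) + ")"

p = [30, 35, 15, 5, 10, 20, 25]   # textbook example: A1 is 30x35, ..., A6 is 20x25
m, s = matrix_chain_order(p)
print(m[1][6])                    # 15125
print(optimal_parens(s, 1, 6))    # ((A1(A2A3))((A4A5)A6))
```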

Example: Matrix Parenthesization
Step 4: Constructing an Optimal Solution

PRINT-OPTIMAL-PARENS(s, i, j)
  if i = j
    then print "A"_i
    else print "("
         PRINT-OPTIMAL-PARENS(s, i, s[i, j])
         PRINT-OPTIMAL-PARENS(s, s[i, j] + 1, j)
         print ")"

source: textbook Cormen, et al.

Example: Matrix Parenthesization
Memoization
- Provides dynamic-programming efficiency, but with a top-down strategy
- Uses recursion; fills in the table "on demand"
- Example: a memoized version of RECURSIVE-MATRIX-CHAIN:

MEMOIZED-MATRIX-CHAIN(p)
1  n ← length[p] − 1
2  for i ← 1 to n
3    do for j ← i to n
4      do m[i, j] ← ∞
5  return LOOKUP-CHAIN(p, 1, n)

LOOKUP-CHAIN(p, i, j)
1  if m[i, j] < ∞
2    then return m[i, j]
3  if i = j
4    then m[i, j] ← 0
5    else for k ← i to j − 1
6      do q ← LOOKUP-CHAIN(p, i, k) + LOOKUP-CHAIN(p, k+1, j) + p_{i-1} p_k p_j
7        if q < m[i, j]
8          then m[i, j] ← q
9  return m[i, j]

source: textbook Cormen, et al.
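The same top-down idea in Python, sketched here with functools.lru_cache standing in for the m table (a convenience of this sketch, not the textbook's code):

```python
from functools import lru_cache

def memoized_matrix_chain(p):
    """Top-down DP with memoization: same O(n^3) time as the bottom-up
    version, but subproblem costs are filled in 'on demand'."""
    n = len(p) - 1

    @lru_cache(maxsize=None)          # the cache plays the role of the m table
    def lookup_chain(i, j):
        if i == j:
            return 0
        return min(lookup_chain(i, k) + lookup_chain(k + 1, j)
                   + p[i - 1] * p[k] * p[j]
                   for k in range(i, j))

    return lookup_chain(1, n)

print(memoized_matrix_chain([30, 35, 15, 5, 10, 20, 25]))  # 15125
```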

Example: Longest Common Subsequence (LCS): Motivation
- Strand of DNA: a string over the finite set {A, C, G, T}
  - each element of the set is a base: adenine, cytosine, guanine, or thymine
- Compare DNA similarities:
  - S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA
  - S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAA
- One measure of similarity: find the longest string S3 containing bases that also appear (not necessarily consecutively) in both S1 and S2
  - S3 = GTCGTCGGAAGCCGGCCGAA
source: textbook Cormen, et al.

Example: LCS Definitions
- A sequence Z = ⟨z_1, ..., z_k⟩ is a subsequence of X = ⟨x_1, ..., x_m⟩ if there is a strictly increasing sequence ⟨i_1, ..., i_k⟩ of indices of X such that x_{i_j} = z_j for j = 1, ..., k
  - example: ⟨B, C, D, B⟩ is a subsequence of ⟨A, B, C, B, D, A, B⟩ with index sequence ⟨2, 3, 5, 7⟩
- Z is a common subsequence of X and Y if Z is a subsequence of both X and Y
  - example, for X = ⟨A, B, C, B, D, A, B⟩ and Y = ⟨B, D, C, A, B, A⟩:
    - ⟨B, C, A⟩ is a common subsequence, but not a longest one
    - ⟨B, C, B, A⟩ is a common subsequence. Longest?
Longest Common Subsequence Problem: Given 2 sequences X, Y, find a maximum-length common subsequence Z.
source: textbook Cormen, et al.
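A small Python check of the subsequence definition (a sketch, not from the slides): greedily match each symbol of Z to a strictly increasing index of X.

```python
def is_subsequence(z, x):
    """z is a subsequence of x iff z's symbols can be matched to a
    strictly increasing sequence of indices of x."""
    i = 0  # next index of x available for matching
    for symbol in z:
        while i < len(x) and x[i] != symbol:
            i += 1
        if i == len(x):
            return False
        i += 1  # indices must be strictly increasing
    return True

print(is_subsequence("BCDB", "ABCBDAB"))  # True (1-based indices 2, 3, 5, 7)
print(is_subsequence("BCBA", "BDCABA"))   # True
```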

Example: LCS
Step 1: Characterize an LCS
THM 15.1 (Optimal LCS Substructure): Let X = ⟨x_1, ..., x_m⟩ and Y = ⟨y_1, ..., y_n⟩ be sequences, and let Z = ⟨z_1, ..., z_k⟩ be any LCS of X and Y. (Here X_{m-1} denotes the prefix ⟨x_1, ..., x_{m-1}⟩, and similarly for Y and Z.)
1. If x_m = y_n, then z_k = x_m = y_n and Z_{k-1} is an LCS of X_{m-1} and Y_{n-1}.
2. If x_m ≠ y_n, then z_k ≠ x_m implies Z is an LCS of X_{m-1} and Y.
3. If x_m ≠ y_n, then z_k ≠ y_n implies Z is an LCS of X and Y_{n-1}.
PROOF: based on producing contradictions.
1 a) Suppose z_k ≠ x_m = y_n. Appending x_m = y_n to Z yields a common subsequence of length k + 1, contradicting the longest nature of Z.
  b) To establish the longest nature of Z_{k-1}, suppose a common subsequence W of X_{m-1} and Y_{n-1} has length > k − 1. Appending x_m = y_n to W yields a common subsequence of length > k, a contradiction.
2 A common subsequence W of X_{m-1} and Y of length > k would also be a common subsequence of X and Y, contradicting the longest nature of Z.
3 Similar to the proof of (2).
source: textbook Cormen, et al.

Example: LCS
Step 2: A Recursive Solution
- Implications of Thm 15.1: is x_m = y_n?
  - yes: find an LCS of X_{m-1} and Y_{n-1}; then LCS(X, Y) = LCS(X_{m-1}, Y_{n-1}) + x_m
  - no: find an LCS of X_{m-1} and Y, and an LCS of X and Y_{n-1}; then LCS(X, Y) = max(LCS(X_{m-1}, Y), LCS(X, Y_{n-1}))

Example: LCS
Step 2: A Recursive Solution (continued)
- Overlapping subproblem structure: the conditions of the problem can exclude some subproblems!
- Recurrence for the length of an optimal solution:

  c[i, j] = 0                               if i = 0 or j = 0
  c[i, j] = c[i−1, j−1] + 1                 if i, j > 0 and x_i = y_j
  c[i, j] = max(c[i, j−1], c[i−1, j])       if i, j > 0 and x_i ≠ y_j

- Θ(mn) distinct subproblems
source: textbook Cormen, et al.
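A direct bottom-up Python sketch of this recurrence (not from the slides), run on the example sequences from the definitions slide:

```python
def lcs_length(x, y):
    """Bottom-up DP for the recurrence above.
    c[i][j] = length of an LCS of x[:i] and y[:j]."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])
    return c

c = lcs_length("ABCBDAB", "BDCABA")
print(c[7][6])  # 4, the length of e.g. "BCBA"
```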

Example: LCS
Step 3: Compute the Length of an LCS
[Figure: the c table for X = ⟨A, B, C, B, D, A, B⟩ and Y = ⟨B, D, C, A, B, A⟩; the arrows in the entries represent the b table, which records which subproblem produced each value.]
source: textbook Cormen, et al.

Example: LCS
Step 4: Construct an LCS
[Figure: the LCS is reconstructed by following the b-table arrows backwards from c[m, n].]
source: textbook Cormen, et al.
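A standalone Python sketch of the reconstruction (not from the slides). It walks the finished c table backwards instead of storing a separate b table, which previews the improvement on the next slide:

```python
def lcs(x, y):
    """Compute the c table, then reconstruct one LCS by walking it
    backwards from c[m][n] (the role the b-table arrows play)."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])

    # Backtrack: move diagonally on a match, else toward the larger neighbor.
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("ABCBDAB", "BDCABA"))  # "BCBA"
```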

Example: LCS
Improve the Code
- Can eliminate the b table
  - c[i, j] depends on only 3 other c-table entries: c[i−1, j−1], c[i−1, j], and c[i, j−1]
  - given the value of c[i, j], we can pick the right one in O(1) time
  - reconstruct an LCS in O(m + n) time, similar to PRINT-LCS
  - same Θ(mn) space, but Θ(mn) was needed anyway...
- Asymptotic space reduction
  - leverage: need only 2 rows of the c table at a time: the row being computed and the previous row
  - can even get by with roughly the space for 1 row of the c table
  - but that does not preserve the LCS reconstruction data
source: textbook Cormen, et al.
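A sketch of the two-row variant (an illustration, not the textbook's code); note it returns only the length, since the reconstruction data is gone:

```python
def lcs_length_two_rows(x, y):
    """Space-saving LCS length: keep only the previous row and the row
    being computed, O(n) space instead of O(mn)."""
    m, n = len(x), len(y)
    prev = [0] * (n + 1)   # row i-1 of the c table
    curr = [0] * (n + 1)   # row i, being computed
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(curr[j - 1], prev[j])
        prev, curr = curr, prev   # reuse storage; curr is overwritten next pass
    return prev[n]

print(lcs_length_two_rows("ABCBDAB", "BDCABA"))  # 4
```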

Algorithmic Paradigm Context

Greedy Algorithms

What is a Greedy Algorithm?
- Solves an optimization problem
- Optimal Substructure:
  - an optimal solution contains within it optimal solutions to subproblems
- Greedy Strategy:
  - At each decision point, do what looks best "locally"
  - The choice does not depend on evaluating potential future choices or solving subproblems
  - Top-down algorithmic structure: with each step, reduce the problem to a smaller problem
- Greedy Choice Property:
  - "locally best" = globally best

Examples
From:
- Minimum Spanning Tree
- Dijkstra's single-source shortest path
- Huffman Codes
- Fractional Knapsack
- Activity Selection

Example Optimization Problem: Activity Selection
- Problem Instance:
  - Set S = {1, 2, ..., n} of n activities
  - Each activity i has a start time s_i and a finish time f_i
  - Activities i and j are compatible iff they are non-overlapping: s_i ≥ f_j or s_j ≥ f_i
- Objective:
  - select a maximum-sized set of mutually compatible activities
source: textbook Cormen, et al.

Examples: Activity Selection
- Algorithm:
  - S' = presort activities in S by nondecreasing finish time, and renumber

GREEDY-ACTIVITY-SELECTOR(S')
  n ← length[S']
  A ← {1}
  j ← 1
  for i ← 2 to n
    do if s_i ≥ f_j
      then A ← A ∪ {i}
           j ← i
  return A

Running time? O(n lg n) for the presort, then Θ(n) for the selection loop.
source: textbook Cormen, et al.
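A runnable Python sketch of the same algorithm (not from the slides), using the textbook's 11-activity instance as input:

```python
def greedy_activity_selector(activities):
    """activities: list of (start, finish) pairs.
    Presort by nondecreasing finish time, then always take the next
    activity whose start is at or after the last chosen finish time."""
    if not activities:
        return []
    order = sorted(range(len(activities)), key=lambda i: activities[i][1])
    selected = [order[0]]                 # greedily take the earliest finisher
    last_finish = activities[order[0]][1]
    for i in order[1:]:
        start, finish = activities[i]
        if start >= last_finish:          # compatible with the last selection
            selected.append(i)
            last_finish = finish
    return selected                       # 0-based indices of the chosen set

# The textbook's 11-activity instance:
s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
print(greedy_activity_selector(list(zip(s, f))))  # [0, 3, 7, 10], i.e. activities 1, 4, 8, 11
```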

Another Use for a Greedy Algorithm
- If an optimization problem does not have the "greedy choice property", a greedy algorithm may still be useful for bounding the optimal solution
- Example: for a minimization problem, a greedy heuristic yields an upper bound on the optimal (unknown) value
[Figure: a number line of solution values showing Lower Bound ≤ Optimal (unknown value) ≤ Upper Bound (heuristic).]