Analysis of Algorithms CS 477/677 Instructor: Monica Nicolescu Lecture 11.

Interval Trees
Useful for representing a set of intervals
– E.g.: time intervals of various events
Each interval i has a low[i] endpoint and a high[i] endpoint

Interval Trees
Def.: Interval tree = a red-black tree that maintains a dynamic set of elements, each element x having an associated interval int[x].
Operations on interval trees:
– INTERVAL-INSERT(T, x)
– INTERVAL-DELETE(T, x)
– INTERVAL-SEARCH(T, i)

Interval Properties
Intervals i and j overlap iff: low[i] ≤ high[j] and low[j] ≤ high[i]
Intervals i and j do not overlap iff: high[i] < low[j] or high[j] < low[i]
[Figure: the three overlapping configurations of i and j, and the two disjoint ones]
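The overlap condition above can be sketched directly in Python (a minimal sketch; the tuple representation and function name are ours, not the slides'):

```python
def overlap(i, j):
    """Return True iff closed intervals i and j overlap.

    Each interval is a (low, high) tuple with low <= high.
    """
    low_i, high_i = i
    low_j, high_j = j
    # Two intervals overlap iff each one starts no later than the other ends.
    return low_i <= high_j and low_j <= high_i
```

Note that with closed intervals, intervals that merely touch at an endpoint, such as [1, 4] and [4, 9], count as overlapping.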

Interval Trichotomy
Any two intervals i and j satisfy the interval trichotomy: exactly one of the following three properties holds:
a) i and j overlap
b) i is to the left of j (high[i] < low[j])
c) i is to the right of j (high[j] < low[i])

Designing Interval Trees
1. Underlying data structure
– Red-black trees
– Each node x contains an interval int[x], and the key is low[int[x]]
– An inorder tree walk will list the intervals sorted by their low endpoint
2. Additional information
– max[x] = maximum endpoint value in the subtree rooted at x
3. Maintaining the information
– max[x] = max(high[int[x]], max[left[x]], max[right[x]])
– Constant work at each node, so insertion and deletion still take O(lg n) time

Designing Interval Trees
4. Develop new operations
INTERVAL-SEARCH(T, i):
– Returns a pointer to an element x in the interval tree T such that int[x] overlaps with i, or NIL otherwise
Idea: at each node x,
– Check whether int[x] overlaps with i
– If max[left[x]] ≥ low[i], go left
– Otherwise, go right
[Figure: an interval tree; each node shows its interval [low, high] and its max value]

Example
[Figure: the same interval tree; searching for i = [11, 14] follows a path of nodes x down to x = NIL (no overlapping interval exists), while searching for i = [22, 25] finds an overlapping interval]

INTERVAL-SEARCH(T, i)
1. x ← root[T]
2. while x ≠ nil[T] and i does not overlap int[x]
3.     do if left[x] ≠ nil[T] and max[left[x]] ≥ low[i]
4.         then x ← left[x]
5.         else x ← right[x]
6. return x
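The pseudocode above can be sketched in Python over a plain binary tree of intervals augmented with a max field. This is a hedged sketch: the Node class and its field names are our own, and the slides' underlying structure is a red-black tree, so rebalancing (and the max updates it requires) is omitted here.

```python
class Node:
    """One tree node holding interval [low, high], keyed on low."""
    def __init__(self, low, high, left=None, right=None):
        self.low, self.high = low, high        # int[x]
        self.left, self.right = left, right
        # max[x] = max(high[int[x]], max[left[x]], max[right[x]])
        self.max = max(high,
                       left.max if left else high,
                       right.max if right else high)

def interval_search(root, low, high):
    """Return a node whose interval overlaps [low, high], or None."""
    x = root
    while x is not None and not (x.low <= high and low <= x.high):
        if x.left is not None and x.left.max >= low:
            x = x.left
        else:
            x = x.right
    return x
```

Building a small tree keyed on low endpoints and searching it reproduces the behavior on the example slide: a query with no overlapping interval walks down to None, while an overlapping query stops at the first node it finds whose interval overlaps.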

Theorem
At each step of INTERVAL-SEARCH:
If the search goes right, then either:
– There is an overlap in the right subtree, or
– There is no overlap in either subtree
Similarly when the search goes left
⇒ It is safe to always proceed in only one direction

Theorem – Proof
If the search goes right:
– If there is an overlap in the right subtree, done
– If there is no overlap in the right subtree, show there is no overlap in the left subtree
– We went right because either:
  • left[x] = nil[T] ⇒ no overlap in the left subtree, or
  • max[left[x]] < low[i] ⇒ every interval in the left subtree ends before low[i], so none can overlap i

Theorem – Proof
If the search goes left:
– If there is an overlap in the left subtree, done
– If there is no overlap in the left subtree, show there is no overlap in the right subtree
– We went left because low[i] ≤ max[left[x]] = high[j] for some interval j in the left subtree
– Since j does not overlap i, and high[j] ≥ low[i] rules out high[j] < low[i], we must have high[i] < low[j]
– The tree is keyed on low endpoints, so every interval k in the right subtree has low[j] ≤ low[k]
– Hence high[i] < low[j] ≤ low[k] ⇒ i overlaps no interval k in the right subtree. No overlap!

Dynamic Programming
An algorithm design technique for optimization problems (similar to divide and conquer)
Divide and conquer:
– Partition the problem into independent subproblems
– Solve the subproblems recursively
– Combine the solutions to solve the original problem

Dynamic Programming
Used for optimization problems:
– Find a solution with the optimal value (minimum or maximum)
– A set of choices must be made to get an optimal solution
– There may be many solutions that achieve the optimal value: we want to find one of them

Dynamic Programming
Applicable when subproblems are not independent
– Subproblems share subsubproblems
E.g.: Fibonacci numbers:
– Recurrence: F(n) = F(n-1) + F(n-2)
– Boundary conditions: F(1) = 0, F(2) = 1
– Compute: F(3) = 1, F(4) = 2, F(5) = 3
A divide and conquer approach would repeatedly solve the common subproblems
Dynamic programming solves every subproblem just once and stores the answer in a table
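The contrast above can be sketched in Python, using the slide's boundary conditions F(1) = 0, F(2) = 1 (the function names are ours, for illustration only):

```python
def fib_naive(n):
    # Divide and conquer: recomputes the shared subproblems repeatedly,
    # so the running time is exponential in n.
    if n <= 2:
        return n - 1                  # F(1) = 0, F(2) = 1
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_dp(n):
    # Dynamic programming: each subproblem is solved exactly once and
    # its answer is stored in a table, giving linear running time.
    table = {1: 0, 2: 1}
    for k in range(3, n + 1):
        table[k] = table[k - 1] + table[k - 2]
    return table[n]
```

Both return the same values; only the amount of repeated work differs.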

Dynamic Programming Algorithm
1. Characterize the structure of an optimal solution
2. Recursively define the value of an optimal solution
3. Compute the value of an optimal solution in a bottom-up fashion
4. Construct an optimal solution from computed information

Elements of Dynamic Programming
Optimal substructure
– An optimal solution to a problem contains within it an optimal solution to subproblems
– An optimal solution to the entire problem is built in a bottom-up manner from optimal solutions to subproblems
Overlapping subproblems
– If a recursive algorithm revisits the same subproblems again and again ⇒ the problem has overlapping subproblems

Assembly Line Scheduling
Automobile factory with two assembly lines
– Each line has n stations: S1,1, ..., S1,n and S2,1, ..., S2,n
– Corresponding stations S1,j and S2,j perform the same function but can take different amounts of time a1,j and a2,j
– Entry times are e1 and e2; exit times are x1 and x2

Assembly Line Scheduling
After going through a station, the car can either:
– stay on the same line at no cost, or
– transfer to the other line: the cost after station Si,j is ti,j, for i = 1, 2 and j = 1, ..., n-1

Assembly Line Scheduling
Problem: which stations should be chosen from line 1 and which from line 2 in order to minimize the total time through the factory for one car?

One Solution
Brute force:
– Enumerate all possibilities of selecting stations
– Compute how long each takes and choose the best one
– Encode a choice as a bit vector of length n: bit j is 1 if the car uses line 1 at step j, 0 if it uses line 2
– There are 2^n possible ways to choose stations
– Infeasible when n is large
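The brute-force enumeration can be sketched as follows. This is a hedged sketch: the function name and the nested-list layout of a, t, e, x are assumptions (0-based indices, with index 0 standing for line 1), chosen to mirror the slides' notation.

```python
from itertools import product

def fastest_brute_force(a, t, e, x):
    """Try all 2^n line assignments; only practical for small n.

    a[i][j]: time at station j on line i; e[i], x[i]: entry/exit times;
    t[i][j]: transfer cost after station j on line i (j = 0 .. n-2).
    """
    n = len(a[0])
    best = float("inf")
    # lines[j] = 0 means line 1 at station j, 1 means line 2
    for lines in product((0, 1), repeat=n):
        total = e[lines[0]] + a[lines[0]][0]
        for j in range(1, n):
            if lines[j] != lines[j - 1]:        # pay the transfer cost
                total += t[lines[j - 1]][j - 1]
            total += a[lines[j]][j]
        total += x[lines[-1]]
        best = min(best, total)
    return best
```

Because every assignment is checked, the answer is exact; the point of the rest of the lecture is to get the same answer in linear time.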

1. Structure of the Optimal Solution
How do we compute the minimum time for going through the stations?

1. Structure of the Optimal Solution
Let's consider all possible ways to get from the starting point through station S1,j
– We have two choices of how to get to S1,j:
  • Through S1,j-1, then directly to S1,j
  • Through S2,j-1, then transfer over to S1,j
[Figure: the two ways into S1,j, with times a1,j-1, a2,j-1, t2,j-1, a1,j]

1. Structure of the Optimal Solution
Suppose that the fastest way through S1,j is through S1,j-1
– Then we must have taken the fastest way from the entry point through S1,j-1
– If there were a faster way through S1,j-1, we would use it instead, contradicting optimality
Similarly for S2,j-1

Optimal Substructure
Generalization: an optimal solution to the problem "find the fastest way through S1,j" contains within it an optimal solution to the subproblems "find the fastest way through S1,j-1 or S2,j-1"
This is referred to as the optimal substructure property
We use this property to construct an optimal solution to a problem from optimal solutions to subproblems

2. A Recursive Solution
Define the value of an optimal solution in terms of the optimal solutions to subproblems
Assembly line subproblems:
– Finding the fastest way through station j on each line i (i = 1, 2; j = 1, 2, ..., n)

2. A Recursive Solution
f* = the fastest time to get through the entire factory
fi[j] = the fastest time to get from the starting point through station Si,j
f* = min(f1[n] + x1, f2[n] + x2)

2. A Recursive Solution
fi[j] = the fastest time to get from the starting point through station Si,j
j = 1 (getting through station 1):
– f1[1] = e1 + a1,1
– f2[1] = e2 + a2,1

2. A Recursive Solution
Compute fi[j] for j = 2, 3, ..., n and i = 1, 2
The fastest way through S1,j is either:
– the way through S1,j-1, then directly through S1,j: f1[j-1] + a1,j, or
– the way through S2,j-1, a transfer from line 2 to line 1, then through S1,j: f2[j-1] + t2,j-1 + a1,j
f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j)

2. A Recursive Solution
f1[j] = e1 + a1,1                                        if j = 1
f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j)     if j ≥ 2
f2[j] = e2 + a2,1                                        if j = 1
f2[j] = min(f2[j-1] + a2,j, f1[j-1] + t1,j-1 + a2,j)     if j ≥ 2

3. Computing the Optimal Value
f* = min(f1[n] + x1, f2[n] + x2)
f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j)
Solving top-down by plain recursion would result in exponential running time
[Figure: the recursion tree rooted at f1(5), f2(5); the values f1(3), f2(3) are each computed 2 times, f1(2), f2(2) 4 times, and so on]

3. Computing the Optimal Value
For j ≥ 2, each value fi[j] depends only on the values of f1[j-1] and f2[j-1]
Compute the values of fi[j] in increasing order of j
Bottom-up approach:
– First find optimal solutions to subproblems
– Then find an optimal solution to the problem from the subproblems

4. Construct the Optimal Solution
We need the information about which line has been used at each station:
– li[j] = the line number (1 or 2) whose station j-1 has been used to get in fastest time through Si,j, for j = 2, 3, ..., n
– l* = the line number (1 or 2) whose station n has been used to get in fastest time through the exit point

FASTEST-WAY(a, t, e, x, n)
1.  f1[1] ← e1 + a1,1                                  ▹ compute initial values of f1 and f2
2.  f2[1] ← e2 + a2,1
3.  for j ← 2 to n
4.      do if f1[j-1] + a1,j ≤ f2[j-1] + t2,j-1 + a1,j
5.          then f1[j] ← f1[j-1] + a1,j                ▹ compute f1[j] and l1[j]
6.               l1[j] ← 1
7.          else f1[j] ← f2[j-1] + t2,j-1 + a1,j
8.               l1[j] ← 2
9.         if f2[j-1] + a2,j ≤ f1[j-1] + t1,j-1 + a2,j
10.          then f2[j] ← f2[j-1] + a2,j               ▹ compute f2[j] and l2[j]
11.               l2[j] ← 2
12.          else f2[j] ← f1[j-1] + t1,j-1 + a2,j
13.               l2[j] ← 1

FASTEST-WAY(a, t, e, x, n) (cont.)
14. if f1[n] + x1 ≤ f2[n] + x2                         ▹ compute the fastest time through the entire factory
15.     then f* ← f1[n] + x1
16.          l* ← 1
17.     else f* ← f2[n] + x2
18.          l* ← 2
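The pseudocode above can be transcribed into Python as a hedged sketch. The nested-list argument layout and 0-based indexing are our assumptions; the logic follows the recurrences on the slides.

```python
def fastest_way(a, t, e, x):
    """Bottom-up assembly-line scheduling.

    a[i][j]: time at station j on line i; e[i], x[i]: entry/exit times;
    t[i][j]: transfer cost after station j on line i (j = 0 .. n-2).
    Returns (f_star, l_star, l1, l2), where l1[j]/l2[j] record which line's
    station j-1 was used to reach station j fastest on lines 1 and 2.
    """
    n = len(a[0])
    f1 = [e[0] + a[0][0]]
    f2 = [e[1] + a[1][0]]
    l1, l2 = [None], [None]
    for j in range(1, n):
        stay1 = f1[j - 1] + a[0][j]
        switch1 = f2[j - 1] + t[1][j - 1] + a[0][j]
        if stay1 <= switch1:
            f1.append(stay1); l1.append(1)
        else:
            f1.append(switch1); l1.append(2)
        stay2 = f2[j - 1] + a[1][j]
        switch2 = f1[j - 1] + t[0][j - 1] + a[1][j]
        if stay2 <= switch2:
            f2.append(stay2); l2.append(2)
        else:
            f2.append(switch2); l2.append(1)
    if f1[-1] + x[0] <= f2[-1] + x[1]:
        return f1[-1] + x[0], 1, l1, l2
    return f2[-1] + x[1], 2, l1, l2
```

Each station is processed once, so the running time is Θ(n), compared with the Θ(2^n) brute-force enumeration.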

Example
f1[j] = e1 + a1,1 if j = 1
f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j) if j ≥ 2
[Table: the values f1[j] and f2[j] for stations j = 1, ..., 5 on both lines, each annotated with the line choice li[j] in brackets; the computation yields f* = 35 with l* = 1]

4. Construct an Optimal Solution
Alg.: PRINT-STATIONS(l, n)
    i ← l*
    print "line " i ", station " n
    for j ← n downto 2
        do i ← li[j]
           print "line " i ", station " j - 1
For the example, l* = 1 and the algorithm prints:
    line 1, station 5
    line 1, station 4
    line 1, station 3
    line 2, station 2
    line 1, station 1
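A hedged Python sketch of PRINT-STATIONS, assuming l1 and l2 are 1-indexed lists of line choices (as a FASTEST-WAY-style computation would produce) and l_star is the line used at the final station:

```python
def print_stations(l1, l2, l_star, n):
    """Print the chosen stations from last to first, as on the slide."""
    i = l_star
    print(f"line {i}, station {n}")
    for j in range(n, 1, -1):             # j = n downto 2
        # l1[j] / l2[j] tell which line's station j-1 led into station j
        i = l1[j] if i == 1 else l2[j]
        print(f"line {i}, station {j - 1}")
```

Note the stations come out in reverse order; printing them first-to-last would require recursion or storing the path.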

Dynamic Programming Algorithm
1. Characterize the structure of an optimal solution
– The fastest time through a station depends on the fastest times through the previous stations
2. Recursively define the value of an optimal solution
– f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j)
3. Compute the value of an optimal solution in a bottom-up fashion
– Fill in the fastest-time table in increasing order of j (station number)
4. Construct an optimal solution from computed information
– Use an additional table to help reconstruct the optimal solution

Readings
Chapters 14, 15