Analysis of Algorithms CS 477/677


Analysis of Algorithms CS 477/677 Instructor: Monica Nicolescu Lecture 15

Designing Interval Trees
Develop a new operation, INTERVAL-SEARCH(T, i): returns a pointer to an element x in the interval tree T such that int[x] overlaps i, or NIL otherwise.
Idea: starting at the root, check whether int[x] overlaps i. If max[left[x]] ≥ low[i], go left; otherwise, go right.
[Figure: interval tree containing [16, 21], [8, 9], [25, 30], [5, 8], [15, 23], [17, 19], [26, 26], [0, 3], [6, 10], [19, 20], each node annotated with its subtree max; query interval [22, 25].]
CS 477/677 - Lecture 15
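The search rule above can be sketched in code. This is a minimal illustration, not the full CLRS augmented red-black tree: the `Node` class, its field names (`low`, `high`, `max`, `left`, `right`), and the helper names are assumptions for this sketch, and rebalancing/maintenance of `max` on insert and delete is omitted.

```python
class Node:
    """A node of an interval tree keyed on the interval's low endpoint."""
    def __init__(self, low, high, left=None, right=None):
        self.low, self.high = low, high
        self.left, self.right = left, right
        # max endpoint over this node's whole subtree (the augmented field)
        self.max = max(high,
                       left.max if left else float("-inf"),
                       right.max if right else float("-inf"))

def overlaps(x, low, high):
    # closed intervals [x.low, x.high] and [low, high] overlap iff
    # each starts before the other ends
    return x.low <= high and low <= x.high

def interval_search(root, low, high):
    """Return some node whose interval overlaps [low, high], or None."""
    x = root
    while x is not None and not overlaps(x, low, high):
        if x.left is not None and x.left.max >= low:
            x = x.left      # per the theorem, any overlap must then be on the left
        else:
            x = x.right
    return x
```

The tree must be a binary search tree on the low endpoints for the one-direction rule (and the theorem that justifies it) to hold.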

Theorem
At each step of INTERVAL-SEARCH: if the search goes right, then either
- there is an overlap in the right subtree, or
- there is no overlap in either subtree.
The symmetric statement holds when the search goes left. Hence it is safe to always proceed in only one direction.

Theorem - Proof
If the search goes right:
- If there is an overlap in the right subtree, we are done.
- If there is no overlap in the right subtree, we show there is no overlap in the left subtree either. The search went right because either:
  - left[x] = nil[T] ⇒ no overlap in the left subtree, or
  - max[left[x]] < low[i] ⇒ every interval in the left subtree ends before i begins ⇒ no overlap in the left subtree.
[Figure: interval i lying entirely to the right of max[left[x]].]

Theorem - Proof
If the search goes left:
- If there is an overlap in the left subtree, we are done.
- If there is no overlap in the left subtree, we show there is no overlap in the right subtree either. The search went left because low[i] ≤ max[left[x]] = high[j] for some interval j in the left subtree. Since j does not overlap i, it must be that high[i] < low[j]. The tree is keyed on low endpoints, so for any interval k in the right subtree, low[j] ≤ low[k]; therefore high[i] < low[k], and k cannot overlap i. No overlap!
[Figure: intervals i, j, k on a line, with high[i] < low[j] ≤ low[k].]

Dynamic Programming
An algorithm design technique for optimization problems (similar to divide and conquer).
Divide and conquer:
- Partition the problem into independent subproblems
- Solve the subproblems recursively
- Combine the solutions to solve the original problem

Dynamic Programming
Used for optimization problems:
- Find a solution with the optimal value (minimum or maximum)
- A set of choices must be made to get an optimal solution
- There may be many solutions that achieve the optimal value: we want to find one of them

Dynamic Programming
Applicable when subproblems are not independent, i.e., subproblems share subsubproblems.
E.g., Fibonacci numbers:
- Recurrence: F(n) = F(n-1) + F(n-2)
- Boundary conditions: F(1) = 0, F(2) = 1
- Compute: F(3) = 1, F(4) = 2, F(5) = 3
A divide-and-conquer approach would repeatedly solve the common subproblems; dynamic programming solves every subproblem just once and stores the answer in a table.
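The Fibonacci recurrence above makes a compact sketch of the tabular idea: instead of re-solving shared subproblems recursively, fill a table F[1..n] left to right so each value is computed exactly once. Indexing follows the slide's boundary conditions (F(1) = 0, F(2) = 1); the function name is ours.

```python
def fib_table(n):
    """Bottom-up DP: fill the table F[1..n] once, left to right."""
    if n == 1:
        return 0
    F = [None, 0, 1]                 # F[1] = 0, F[2] = 1 (index 0 unused)
    for i in range(3, n + 1):
        F.append(F[i - 1] + F[i - 2])  # each F(i) computed exactly once
    return F[n]

# fib_table(5) → 3, matching the slide
```

This runs in O(n) time, whereas the naive recursion takes exponential time because it revisits the same subproblems over and over.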

Dynamic Programming Algorithm
1. Characterize the structure of an optimal solution
2. Recursively define the value of an optimal solution
3. Compute the value of an optimal solution in a bottom-up fashion
4. Construct an optimal solution from computed information

Elements of Dynamic Programming
Optimal substructure:
- An optimal solution to a problem contains within it optimal solutions to subproblems
- An optimal solution to the entire problem is built in a bottom-up manner from optimal solutions to subproblems
Overlapping subproblems:
- If a recursive algorithm revisits the same subproblems again and again ⇒ the problem has overlapping subproblems

Assembly Line Scheduling
- An automobile factory has two assembly lines
- Each line has n stations: S1,1, ..., S1,n and S2,1, ..., S2,n
- Corresponding stations S1,j and S2,j perform the same function but can take different amounts of time a1,j and a2,j
- Entry times are e1 and e2; exit times are x1 and x2

Assembly Line Scheduling
After going through a station, the car can either:
- stay on the same line, at no cost, or
- transfer to the other line: the cost after station Si,j is ti,j, for i = 1, 2 and j = 1, ..., n-1

Assembly Line Scheduling
Problem: Which stations should be chosen from line 1 and which from line 2 so as to minimize the total time through the factory for one car?

One Solution
Brute force:
- Enumerate all possibilities of selecting stations
- Compute how long each choice takes and pick the best one
- A choice can be encoded as a bit string of length n (1 = choose line 1 at step j, 0 = choose line 2 at step j), so there are 2^n possible ways to choose stations
- Infeasible when n is large
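In contrast to the 2^n brute force, the DP recurrence needs only O(n) time: the fastest way to reach station S_{i,j} either stays on line i or transfers from the other line after station j-1. A minimal sketch, assuming the slide's notation (entry times e, exit times x, station times a, transfer costs t), with 0-indexed arrays; the sample numbers in the test are made up for illustration, not taken from the lecture.

```python
def fastest_way(a, t, e, x):
    """O(n) DP for assembly-line scheduling.

    a[i][j]: time at station j on line i (i in {0, 1})
    t[i][j]: cost to transfer to the other line after station j on line i
    e[i], x[i]: entry and exit times for line i
    Returns the minimum total time through the factory.
    """
    n = len(a[0])
    # fastest times through the first station on each line
    f1 = e[0] + a[0][0]
    f2 = e[1] + a[1][0]
    for j in range(1, n):
        # stay on the same line, or pay the transfer cost from the other line
        new_f1 = a[0][j] + min(f1, f2 + t[1][j - 1])
        new_f2 = a[1][j] + min(f2, f1 + t[0][j - 1])
        f1, f2 = new_f1, new_f2
    return min(f1 + x[0], f2 + x[1])
```

Each station is processed once with O(1) work, giving O(n) time overall; recording which choice won each `min` would also recover the optimal path (step 4 of the DP algorithm).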

Readings
Chapters 14, 15