Prof. Swarat Chaudhuri COMP 482: Design and Analysis of Algorithms Spring 2012 Lecture 17.


Q1: Longest palindromic subsequence

Give an algorithm to find the longest subsequence of a given string A that is a palindrome.

Example input: "amantwocamelsacrazyplanacanalpanama"
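
This question is left as an exercise in the lecture; the following is one standard approach (not necessarily the intended solution): an interval DP, in the same spirit as the RNA dynamic program later in this lecture. The Python function name and 0-indexing are my own.

def longest_palindromic_subsequence(A):
    # L[i][j] = length of the longest palindromic subsequence of A[i..j]
    n = len(A)
    if n == 0:
        return 0
    L = [[0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1                                  # single characters
    for length in range(2, n + 1):                   # shortest intervals first
        for i in range(n - length + 1):
            j = i + length - 1
            if A[i] == A[j]:
                L[i][j] = (L[i + 1][j - 1] + 2) if length > 2 else 2
            else:
                L[i][j] = max(L[i + 1][j], L[i][j - 1])
    return L[0][n - 1]

print(longest_palindromic_subsequence("amantwocamelsacrazyplanacanalpanama"))

Running time and space are both O(n²).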

Q1-a: Palindromes (contd.)

Every string can be decomposed into a sequence of palindromes. Give an efficient algorithm to compute the smallest number of palindromes that make up a given string.
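
Again not part of the slides, a possible sketch: precompute which substrings are palindromes with the same kind of interval DP, then let best[j] be the fewest palindromes covering the first j characters. Names are illustrative.

def min_palindrome_decomposition(A):
    n = len(A)
    if n == 0:
        return 0
    # is_pal[i][j]: is A[i..j] a palindrome?
    is_pal = [[False] * n for _ in range(n)]
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            if A[i] == A[j] and (length <= 2 or is_pal[i + 1][j - 1]):
                is_pal[i][j] = True
    INF = float("inf")
    best = [INF] * (n + 1)      # best[j] = fewest palindromes covering A[0..j-1]
    best[0] = 0
    for j in range(1, n + 1):
        for i in range(j):
            if is_pal[i][j - 1] and best[i] + 1 < best[j]:
                best[j] = best[i] + 1
    return best[n]

Overall running time O(n²).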

6.5 RNA Secondary Structure

RNA Secondary Structure

RNA. A string B = b_1 b_2 … b_n over the alphabet { A, C, G, U }.

Secondary structure. RNA is single-stranded, so it tends to loop back and form base pairs with itself. This structure is essential for understanding the behavior of the molecule.

Ex. GUCGAUUGAGCGAAUGUAACAACGUGGCUACGGCGAGA (complementary base pairs: A-U, C-G). [Figure: the example molecule folded back on itself, with complementary bases paired.]

RNA Secondary Structure

Secondary structure. A set of pairs S = { (b_i, b_j) } that satisfies:
- [Watson-Crick.] S is a matching, and each pair in S is a Watson-Crick complement: A-U, U-A, C-G, or G-C.
- [No sharp turns.] The ends of each pair are separated by at least 4 intervening bases: if (b_i, b_j) ∈ S, then i < j - 4.
- [Non-crossing.] If (b_i, b_j) and (b_k, b_l) are two pairs in S, then we cannot have i < k < j < l.

Free energy. The usual hypothesis is that an RNA molecule will form the secondary structure with the optimum total free energy, approximated here by the number of base pairs.

Goal. Given an RNA molecule B = b_1 b_2 … b_n, find a secondary structure S that maximizes the number of base pairs.
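
Not in the original slides: a small Python checker that makes the three conditions concrete (0-indexed positions; names are illustrative).

WATSON_CRICK = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C")}

def is_secondary_structure(B, S):
    # S is a collection of index pairs (i, j), i < j, into the string B.
    ends = [x for pair in S for x in pair]
    if len(ends) != len(set(ends)):                  # matching: no base in two pairs
        return False
    for (i, j) in S:
        if (B[i], B[j]) not in WATSON_CRICK:         # Watson-Crick complements only
            return False
        if not i < j - 4:                            # no sharp turns
            return False
    for (i, j) in S:
        for (k, l) in S:
            if i < k < j < l:                        # non-crossing
                return False
    return True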

RNA Secondary Structure: Examples

[Figure: three candidate structures for short strings, annotated "ok", "sharp turn" (paired ends separated by ≤ 4 bases), and "crossing" (two base pairs that cross).]

RNA Secondary Structure: Subproblems

First attempt. OPT(j) = maximum number of base pairs in a secondary structure of the substring b_1 b_2 … b_j.

Difficulty. Suppose b_n is matched with b_t in the optimal structure. This results in two sub-problems:
- finding a secondary structure in b_1 b_2 … b_{t-1}, which is OPT(t-1);
- finding a secondary structure in b_{t+1} b_{t+2} … b_{n-1}, which is not of the form OPT(·) — we need more sub-problems.

Dynamic Programming Over Intervals

Notation. OPT(i, j) = maximum number of base pairs in a secondary structure of the substring b_i b_{i+1} … b_j.
- Case 1. If i ≥ j - 4: OPT(i, j) = 0 by the no-sharp-turns condition.
- Case 2. Base b_j is not involved in a pair: OPT(i, j) = OPT(i, j-1).
- Case 3. Base b_j pairs with b_t for some i ≤ t < j - 4: the non-crossing constraint decouples the resulting sub-problems, so OPT(i, j) = 1 + max_t { OPT(i, t-1) + OPT(t+1, j-1) }, where the max is taken over t such that i ≤ t < j - 4 and b_t, b_j are Watson-Crick complements.

Remark. The same core idea appears in the CKY algorithm for parsing context-free grammars.

Bottom-Up Dynamic Programming Over Intervals

Q. In what order should we solve the sub-problems?
A. Shortest intervals first.

RNA(b_1, …, b_n) {
  for k = 5, 6, …, n-1
    for i = 1, 2, …, n-k
      j = i + k
      compute M[i, j] using the recurrence
  return M[1, n]
}

Running time. O(n³).
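
Not part of the slides: a minimal runnable Python version of the table-filling loop above, using 0-indexed strings (so the condition i ≤ t < j - 4 becomes t ranging up to j - 5). The function name is illustrative.

def rna_secondary_structure(B):
    complement = {"A": "U", "U": "A", "C": "G", "G": "C"}
    n = len(B)
    if n == 0:
        return 0
    M = [[0] * n for _ in range(n)]        # intervals of length <= 4 stay 0
    for k in range(5, n):                   # k = j - i, shortest intervals first
        for i in range(n - k):
            j = i + k
            best = M[i][j - 1]              # case 2: b_j unpaired
            for t in range(i, j - 4):       # case 3: b_j pairs with b_t
                if complement[B[t]] == B[j]:
                    left = M[i][t - 1] if t > i else 0
                    best = max(best, 1 + left + M[t + 1][j - 1])
            M[i][j] = best
    return M[0][n - 1]

print(rna_secondary_structure("GUCGAUUGAGCGAAUGUAACAACGUGGCUACGGCGAGA"))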

6.8 Shortest Paths

Shortest Paths

Shortest-path problem. Given a directed graph G = (V, E) with edge weights c_vw (negative weights allowed), find the shortest path from node s to node t.

Ex. Nodes represent agents in a financial setting, and c_vw is the cost of a transaction in which we buy from agent v and sell immediately to w.

Shortest Paths: Failed Attempts

Dijkstra. Can fail if there are negative edge costs.

Re-weighting. Adding a constant to every edge weight to make all costs nonnegative can also fail: it penalizes paths with many edges, so the shortest path can change.

[Figure: small counterexample graphs for both attempts.]

Shortest Paths: Negative Cost Cycles

Negative cost cycle. A directed cycle W whose total cost c(W) is negative.

Observation. If some path from s to t contains a negative-cost cycle, there does not exist a shortest s-t path; otherwise, there exists a shortest s-t path that is simple.

Shortest Paths: Dynamic Programming

Def. OPT(i, v) = length of the shortest v-t path P using at most i edges.
- Case 1: P uses at most i-1 edges. Then OPT(i, v) = OPT(i-1, v).
- Case 2: P uses exactly i edges. If (v, w) is the first edge, then OPT uses (v, w) and then selects the best w-t path using at most i-1 edges.

Combining the two cases, for i > 0:
  OPT(i, v) = min { OPT(i-1, v), min over edges (v, w) ∈ E of [ OPT(i-1, w) + c_vw ] }

Remark. By the previous observation, if there are no negative cycles, then OPT(n-1, v) = length of the shortest v-t path.

Shortest Paths: Implementation

Shortest-Path(G, t) {
  foreach node v ∈ V
    M[0, v] ← ∞
  M[0, t] ← 0
  for i = 1 to n-1
    foreach node v ∈ V
      M[i, v] ← M[i-1, v]
      foreach edge (v, w) ∈ E
        M[i, v] ← min { M[i, v], M[i-1, w] + c_vw }
}

Analysis. Θ(mn) time, Θ(n²) space.

Finding the shortest paths themselves. Maintain a "successor" for each table entry.

Shortest Paths: Practical Improvements

Practical improvements.
- Maintain only one array, M[v] = length of the shortest v-t path found so far.
- No need to check edges of the form (v, w) unless M[w] changed in the previous iteration.

Theorem. Throughout the algorithm, M[v] is the length of some v-t path, and after i rounds of updates, the value M[v] is no larger than the length of the shortest v-t path using ≤ i edges.

Overall impact.
- Memory: O(m + n).
- Running time: O(mn) worst case, but substantially faster in practice.

Bellman-Ford: Efficient Implementation

Push-Based-Shortest-Path(G, s, t) {
  foreach node v ∈ V {
    M[v] ← ∞
    successor[v] ← ∅
  }
  M[t] = 0
  for i = 1 to n-1 {
    foreach node w ∈ V {
      if (M[w] has been updated in previous iteration) {
        foreach node v such that (v, w) ∈ E {
          if (M[v] > M[w] + c_vw) {
            M[v] ← M[w] + c_vw
            successor[v] ← w
          }
        }
      }
    }
    If no M[w] value changed in iteration i, stop.
  }
}
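
Not part of the slides: a minimal Python sketch of the one-array variant with the early-stopping rule (it relaxes every edge each round rather than tracking which M[w] changed). The graph representation and names are my own: edges[v] is a list of (w, cost) pairs for the edges (v, w).

def bellman_ford_to_target(nodes, edges, t):
    INF = float("inf")
    M = {v: INF for v in nodes}            # M[v] = best v-t length found so far
    successor = {v: None for v in nodes}
    M[t] = 0
    for _ in range(len(nodes) - 1):
        updated = False
        for v in nodes:
            for (w, cost) in edges.get(v, []):
                if M[w] + cost < M[v]:
                    M[v] = M[w] + cost
                    successor[v] = w
                    updated = True
        if not updated:                     # early termination rule
            break
    return M, successor

Following successor pointers from any node v with finite M[v] reconstructs a shortest v-t path (assuming no negative cycles).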

Dynamic Programming Summary

Recipe.
- Characterize the structure of the problem.
- Recursively define the value of an optimal solution.
- Compute the value of an optimal solution.
- Construct an optimal solution from the computed information.

Dynamic programming techniques.
- Binary choice: weighted interval scheduling.
- Multi-way choice: segmented least squares (tradeoff between parsimony and accuracy).
- Adding a new variable: knapsack.
- Dynamic programming over intervals: RNA secondary structure (the CKY parsing algorithm for context-free grammars has a similar structure).

Remarks. Top-down vs. bottom-up: different people have different intuitions. The Viterbi algorithm for HMMs also uses dynamic programming to compute a maximum-likelihood solution.

6.10 Negative Cycles in a Graph

Detecting Negative Cycles

Lemma. If OPT(n, v) = OPT(n-1, v) for all v, then there are no negative cycles.
Pf. Once the Bellman-Ford values stop changing, they never change in later iterations; a negative cycle on some v-t path would keep decreasing OPT(i, v) as i grows.

Lemma. If OPT(n, v) < OPT(n-1, v) for some node v, then (any) shortest v-t path P using at most n edges contains a cycle W; moreover, W has negative cost.
Pf. (by contradiction)
- Since OPT(n, v) < OPT(n-1, v), we know P has exactly n edges.
- By the pigeonhole principle, P must contain a directed cycle W.
- Deleting W yields a v-t path with fewer than n edges; its cost is c(P) - c(W) ≥ OPT(n-1, v) > c(P), so c(W) < 0.

Detecting Negative Cycles

Theorem. Can detect a negative-cost cycle in O(mn) time.
- Add a new node t and connect all nodes to t with a 0-cost edge.
- Check whether OPT(n, v) = OPT(n-1, v) for all nodes v:
  - if yes, then there are no negative cycles;
  - if no, then extract a cycle from the shortest path from v to t.

Detecting Negative Cycles: Summary

Bellman-Ford. O(mn) time, O(m + n) space.
- Run Bellman-Ford for n iterations (instead of n-1).
- Upon termination, the Bellman-Ford successor variables trace a negative cycle if one exists.
- See p. 288 of the textbook for an improved version and an early-termination rule.
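
Not part of the slides: a Python sketch of this detection scheme, under the same conventions as the earlier Bellman-Ford sketch (edges[v] is a list of (w, cost) pairs; names are illustrative). It adds the auxiliary sink, runs one extra round, and walks successor pointers until a node repeats.

def find_negative_cycle(nodes, edges):
    sink = object()                              # the auxiliary node t
    aug_nodes = list(nodes) + [sink]
    aug_edges = {v: list(edges.get(v, [])) + [(sink, 0)] for v in nodes}

    INF = float("inf")
    M = {v: INF for v in aug_nodes}
    successor = {v: None for v in aug_nodes}
    M[sink] = 0
    changed = None
    for _ in range(len(aug_nodes)):              # n iterations instead of n-1
        changed = None
        for v in aug_nodes:
            for (w, cost) in aug_edges.get(v, []):
                if M[w] + cost < M[v]:
                    M[v] = M[w] + cost
                    successor[v] = w
                    changed = v
        if changed is None:
            return None                          # values stabilized: no negative cycle

    # Some value still improved in the final round, so a negative cycle exists;
    # walk successor pointers from the last node updated until a node repeats.
    seen, order, v = {}, [], changed
    while v is not None and v not in seen:
        seen[v] = len(order)
        order.append(v)
        v = successor[v]
    return order[seen[v]:] if v is not None else None   # the chain must revisit a node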

Q2: Arbitrage

Arbitrage is the use of discrepancies in currency exchange rates to transform one unit of a currency into more than one unit of the same currency. For example, suppose that 1 US dollar buys 0.7 British pounds, 1 British pound buys 9.5 French francs, and 1 French franc buys 0.16 US dollars. Then, by converting currencies, a trader can start with 1 US dollar and end up with 0.7 × 9.5 × 0.16 = 1.064 US dollars, turning a profit of 6.4 percent.

Suppose that we are given n currencies c_1, …, c_n and an n × n table R of exchange rates, such that one unit of currency c_i buys R[i, j] units of currency c_j. Give an efficient algorithm to determine whether or not there exists a sequence of currencies (c_{i1}, …, c_{ik}) such that

  R[i_1, i_2] × R[i_2, i_3] × … × R[i_{k-1}, i_k] × R[i_k, i_1] > 1.

Give an efficient algorithm to print out such a sequence if one exists. Analyze the running time of your algorithm.
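
This is left as an exercise; one natural approach (not necessarily the intended solution) reuses the negative-cycle machinery from this lecture: with edge weights w(i, j) = -log R[i, j], the product condition above becomes a negative-cost directed cycle. The sketch below assumes the find_negative_cycle routine from the earlier sketch, and runs in O(mn) = O(n³) time since m = n(n-1).

import math

def find_arbitrage(R):
    n = len(R)
    nodes = list(range(n))
    edges = {i: [(j, -math.log(R[i][j])) for j in range(n) if j != i]
             for i in range(n)}
    return find_negative_cycle(nodes, edges)    # list of currency indices, or None

# Example from the problem statement (remaining rates filled in as reciprocals):
rates = [[1.0,     0.7,     6.25],   # from USD to USD, GBP, FRF
         [1 / 0.7, 1.0,     9.5],    # from GBP
         [0.16,    1 / 9.5, 1.0]]    # from FRF
print(find_arbitrage(rates))          # e.g. a cycle through USD -> GBP -> FRF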

Q3: Number of shortest paths

Suppose we have a directed graph G with costs on the edges. The costs may be positive or negative, but every cycle in the graph has a strictly positive cost. We are also given two nodes v, w. Give an efficient algorithm that computes the number of shortest v-w paths in G.
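
Also an exercise; one possible approach (names and graph representation are my own): since every cycle has strictly positive cost, shortest paths are well defined, and the "tight" edges (u, x) with dist[u] = c_ux + dist[x] form a DAG, on which the shortest paths can be counted by a second, memoized DP.

from functools import lru_cache

def count_shortest_paths(nodes, edges, v, w):
    # edges[u] = [(x, cost), ...]; dist[u] = shortest u-w distance via Bellman-Ford.
    INF = float("inf")
    dist = {u: INF for u in nodes}
    dist[w] = 0
    for _ in range(len(nodes) - 1):
        for u in nodes:
            for (x, cost) in edges.get(u, []):
                if dist[x] + cost < dist[u]:
                    dist[u] = dist[x] + cost

    @lru_cache(maxsize=None)
    def count(u):                                  # number of shortest u-w paths
        if u == w:
            return 1
        return sum(count(x)
                   for (x, cost) in edges.get(u, [])
                   if dist[x] + cost == dist[u])   # edge lies on some shortest path

    return 0 if dist[v] == INF else count(v)

Running time: O(mn) for the Bellman-Ford phase, plus O(m + n) for the counting phase.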