Richard Anderson Lecture 19 Longest Common Subsequence


CSE 421 Algorithms
Richard Anderson
Lecture 19: Longest Common Subsequence

Longest Common Subsequence
C = c1…cg is a subsequence of A = a1…am if C can be obtained by removing elements from A (but retaining order)
LCS(A, B): a maximum-length sequence that is a subsequence of both A and B
Examples: ocurranec / occurrence, attacggct / tacgacca

Determine the LCS of the following strings:
BARTHOLEMEWSIMPSON
KRUSTYTHECLOWN

String Alignment Problem
Align sequences with gaps
Charge dx if character x is unmatched
Charge gxy if character x is matched to character y
Example: CAT TGA AT / CAGAT AGGA
Note: the problem is often expressed as a minimization problem, with gxx = 0 and dx > 0

LCS Optimization
A = a1a2…am, B = b1b2…bn
Opt[j, k] is the length of LCS(a1a2…aj, b1b2…bk)

Optimization recurrence
If aj = bk, Opt[j, k] = 1 + Opt[j-1, k-1]
If aj != bk, Opt[j, k] = max(Opt[j-1, k], Opt[j, k-1])
(with base cases Opt[0, k] = Opt[j, 0] = 0)
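The recurrence translates directly into code. Below is a minimal memoized sketch in Python (mine, not from the slides), where opt(j, k) is the length of LCS(a1…aj, b1…bk); the iterative table version appears on the later slides.

    # Hedged sketch of the LCS recurrence via memoized recursion.
    # Fine for short strings; long inputs should use the iterative table
    # from the following slides (Python's recursion depth is limited).
    from functools import lru_cache

    def lcs_length(a: str, b: str) -> int:
        @lru_cache(maxsize=None)
        def opt(j: int, k: int) -> int:
            if j == 0 or k == 0:
                return 0                          # empty prefix: LCS length 0
            if a[j - 1] == b[k - 1]:              # aj = bk
                return 1 + opt(j - 1, k - 1)
            return max(opt(j - 1, k), opt(j, k - 1))
        return opt(len(a), len(b))

    print(lcs_length("ocurranec", "occurrence"))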

Give the optimization recurrence for the String Alignment Problem
Charge dx if character x is unmatched; charge gxy if character x is matched to character y.
Let aj = x and bk = y, and express the problem as a minimization:
Opt[j, k] = min(gxy + Opt[j-1, k-1], dx + Opt[j-1, k], dy + Opt[j, k-1])
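As a concrete illustration, here is the same recurrence in Python. The cost functions are illustrative choices of mine, not the lecture's: a unit charge for an unmatched character and a zero/one charge for matched characters depending on whether they agree. With these defaults the result is the familiar Levenshtein edit distance.

    # Hedged sketch of the alignment recurrence as a minimization.
    from functools import lru_cache

    def alignment_cost(a: str, b: str,
                       d=lambda x: 1,                       # gap cost d_x (assumed)
                       g=lambda x, y: 0 if x == y else 1):  # match cost g_xy (assumed)
        @lru_cache(maxsize=None)
        def opt(j: int, k: int) -> int:
            if j == 0:
                return sum(d(c) for c in b[:k])   # all of b1…bk unmatched
            if k == 0:
                return sum(d(c) for c in a[:j])   # all of a1…aj unmatched
            x, y = a[j - 1], b[k - 1]
            return min(g(x, y) + opt(j - 1, k - 1),   # match x with y
                       d(x) + opt(j - 1, k),          # leave x unmatched
                       d(y) + opt(j, k - 1))          # leave y unmatched
        return opt(len(a), len(b))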

Dynamic Programming Computation

Code to compute Opt[j,k]
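The code itself is not reproduced in this transcript; below is a sketch of the standard bottom-up computation in Python (my rendering, not the slide's exact code).

    def compute_opt(a: str, b: str):
        m, n = len(a), len(b)
        # opt[j][k] = length of LCS(a1…aj, b1…bk); row 0 and column 0 are the base cases.
        opt = [[0] * (n + 1) for _ in range(m + 1)]
        for j in range(1, m + 1):
            for k in range(1, n + 1):
                if a[j - 1] == b[k - 1]:
                    opt[j][k] = 1 + opt[j - 1][k - 1]
                else:
                    opt[j][k] = max(opt[j - 1][k], opt[j][k - 1])
        return opt            # opt[m][n] is the LCS length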

Storing the path information
A[1..m], B[1..n]

    for i := 1 to m
        Opt[i, 0] := 0;
    for j := 1 to n
        Opt[0, j] := 0;
    Opt[0, 0] := 0;

    for i := 1 to m
        for j := 1 to n
            if A[i] = B[j] {
                Opt[i, j] := 1 + Opt[i-1, j-1];
                Best[i, j] := Diag;
            } else if Opt[i-1, j] >= Opt[i, j-1] {
                Opt[i, j] := Opt[i-1, j];
                Best[i, j] := Left;
            } else {
                Opt[i, j] := Opt[i, j-1];
                Best[i, j] := Down;
            }

(Figure: the Opt/Best table, indexed by a1…am and b1…bn.)
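Given the Best array, the LCS itself is recovered by walking the stored directions back from (m, n). A hedged Python sketch, assuming Best is an (m+1) by (n+1) table holding the tags "Diag", "Left", and "Down" exactly as assigned in the pseudocode above:

    def traceback(a: str, b: str, best) -> str:
        # Follow the Best pointers from (m, n) back to the boundary,
        # collecting the matched characters.
        i, j = len(a), len(b)
        out = []
        while i > 0 and j > 0:
            if best[i][j] == "Diag":       # A[i] was matched with B[j]
                out.append(a[i - 1])
                i, j = i - 1, j - 1
            elif best[i][j] == "Left":     # value came from Opt[i-1, j]
                i -= 1
            else:                          # "Down": value came from Opt[i, j-1]
                j -= 1
        return "".join(reversed(out))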

How good is this algorithm?
Is it feasible to compute the LCS of two strings of length 100,000 on a standard desktop PC? Why or why not?

Observations about the Algorithm
The computation can be done in O(m+n) space if we only need one column of the Opt values or Best values
The algorithm can be run from either end of the strings
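A hedged sketch of the first observation: keep only the previous and current line of the Opt table when just the length is needed (rows of A here; keeping one column of B works symmetrically).

    def lcs_row(a: str, b: str):
        # Returns the last row of the Opt table in O(len(b)) space;
        # its final entry is the LCS length of a and b.
        prev = [0] * (len(b) + 1)
        for i in range(1, len(a) + 1):
            curr = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                if a[i - 1] == b[j - 1]:
                    curr[j] = 1 + prev[j - 1]
                else:
                    curr[j] = max(prev[j], curr[j - 1])
            prev = curr
        return prev

    # lcs_row(a, b)[-1] is the length of LCS(a, b)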

Computing the LCS in O(nm) time and O(n+m) space
Divide and conquer algorithm
Values are recomputed, rather than stored, to save space

Divide and Conquer Algorithm
Where does the best path cross the middle column?
For a fixed i and each j, compute the length of the best LCS that matches a1,…,ai within b1,…,bj (and the rest of A within the rest of B)

Constrained LCS
LCSi,j(A, B): the LCS such that a1,…,ai are paired with elements of b1,…,bj and ai+1,…,am are paired with elements of bj+1,…,bn
Example: LCS4,3(abbacbb, cbbaa)

A = RRSSRTTRTS, B = RTSRRSTST
Compute LCS5,0(A,B), LCS5,1(A,B), …, LCS5,9(A,B)

A = RRSSRTTRTS, B = RTSRRSTST
Compute LCS5,0(A,B), LCS5,1(A,B), …, LCS5,9(A,B)
(Table to fill in, with columns j, left, and right for j = 1,…,9.)

Computing the middle column
From the left, compute LCS(a1…am/2, b1…bj)
From the right, compute LCS(am/2+1…am, bj+1…bn)
Add the values for corresponding j's
Note: this is space efficient
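A hedged Python sketch of this step, reusing the same linear-space row computation as in the earlier space observation (lcs_row, repeated here so the sketch stands alone); the second call runs the DP "from the right" by reversing both strings:

    def lcs_row(a: str, b: str):
        # Last row of the LCS table for a vs b, in O(len(b)) space.
        prev = [0] * (len(b) + 1)
        for i in range(1, len(a) + 1):
            curr = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                if a[i - 1] == b[j - 1]:
                    curr[j] = 1 + prev[j - 1]
                else:
                    curr[j] = max(prev[j], curr[j - 1])
            prev = curr
        return prev

    def best_middle_split(a: str, b: str) -> int:
        mid, n = len(a) // 2, len(b)
        left = lcs_row(a[:mid], b)               # left[j]    = LCS(a1…am/2, b1…bj)
        right = lcs_row(a[mid:][::-1], b[::-1])  # right[n-j] = LCS(am/2+1…am, bj+1…bn)
        # Pick the j that maximizes the combined length.
        return max(range(n + 1), key=lambda j: left[j] + right[n - j])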

Divide and Conquer
A = a1,…,am, B = b1,…,bn
Find j such that LCS(a1…am/2, b1…bj) and LCS(am/2+1…am, bj+1…bn) yield an optimal solution
Recurse on both halves
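Putting the pieces together, a hedged sketch of the full recursion (building on lcs_row from the previous sketch; this is my rendering of the approach, not the lecture's code):

    def lcs(a: str, b: str) -> str:
        # Base cases: empty string, or a single remaining character of a.
        if not a or not b:
            return ""
        if len(a) == 1:
            return a if a in b else ""
        mid, n = len(a) // 2, len(b)
        left = lcs_row(a[:mid], b)
        right = lcs_row(a[mid:][::-1], b[::-1])
        j = max(range(n + 1), key=lambda k: left[k] + right[n - k])
        # Recurse on both halves and concatenate.
        return lcs(a[:mid], b[:j]) + lcs(a[mid:], b[j:])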

Algorithm Analysis
T(m, n) = T(m/2, j) + T(m/2, n-j) + cmn

Prove by induction that T(m,n) <= 2cmn
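A sketch of the induction step (assuming the base cases give T(m, n) <= 2cmn for small m, and that the bound holds for strings of length m/2):
T(m, n) = T(m/2, j) + T(m/2, n-j) + cmn
       <= 2c(m/2)j + 2c(m/2)(n-j) + cmn
       = cmj + cm(n-j) + cmn
       = cmn + cmn = 2cmn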

Memory Efficient LCS Summary We can afford O(nm) time, but we can’t afford O(nm) space If we only want to compute the length of the LCS, we can easily reduce space to O(n+m) Avoid storing the value by recomputing values Divide and conquer used to reduce problem sizes