Slide 1: Linear-Space Alignment

Slide 2: Linear-space alignment
Using two columns of space, we can compute, for k = 1…N, F(M/2, k) and F^r(M/2, N - k), plus the backpointers.
[Figure: the DP matrix split at column M/2; x_1…x_{M/2} is aligned against y_1…y_N in the forward direction, and x_{M/2+1}…x_M against y_1…y_N in reverse.]

Slide 3: Linear-space alignment
Now we can find the k* maximizing F(M/2, k) + F^r(M/2, N - k). We can also trace the path exiting column M/2 from k*.
[Figure: the optimal path exits column M/2 at row k* and enters column M/2 + 1 at row k* + 1; columns labeled 0, 1, …, M/2, M/2 + 1, …, M, M + 1.]

Slide 4: Linear-space alignment
Iterate this procedure to the left and right!
[Figure: recurse on the (M/2) x k* subproblem to the upper left and the (M/2) x (N - k*) subproblem to the lower right.]

Slide 5: Linear-space alignment
Hirschberg's linear-space algorithm (a Python sketch follows below):

MEMALIGN(l, l', r, r'):  (aligns x_l…x_{l'} with y_r…y_{r'})
1. Let h = (l + l')/2, the middle column of the subproblem
2. Find (in time O((l' - l)(r' - r)), space O(r' - r)) the optimal path L_h entering column h - 1 and exiting column h.
   Let k_1 = position at column h - 2 where L_h enters
       k_2 = position at column h + 1 where L_h exits
3. MEMALIGN(l, h - 2, r, k_1)
4. Output L_h
5. MEMALIGN(h + 1, l', k_2, r')

Top-level call: MEMALIGN(1, M, 1, N)

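A minimal Python sketch of the same divide-and-conquer idea, under a simple match/mismatch/gap scoring scheme (the scores, function names, and the single-character base case are illustrative choices, not from the slides). It splits x at its midpoint, finds the best crossing row k* with two linear-space scoring passes, and recurses on the two halves:

```python
def nw_score(x, y, match=1, mismatch=-1, gap=-1):
    """Last row of the global-alignment DP matrix, in O(len(y)) space."""
    prev = [j * gap for j in range(len(y) + 1)]
    for i in range(1, len(x) + 1):
        curr = [i * gap] + [0] * len(y)
        for j in range(1, len(y) + 1):
            s = match if x[i - 1] == y[j - 1] else mismatch
            curr[j] = max(prev[j - 1] + s, prev[j] + gap, curr[j - 1] + gap)
        prev = curr
    return prev

def hirschberg(x, y):
    """Optimal global alignment of x and y in linear space."""
    if not x:
        return "-" * len(y), y
    if not y:
        return x, "-" * len(x)
    if len(x) == 1:
        # Base case: place the single character opposite its best match in y.
        j = max(y.find(x), 0)
        return "-" * j + x + "-" * (len(y) - j - 1), y
    mid = len(x) // 2
    # Forward scores for the left half, reverse scores for the right half.
    left = nw_score(x[:mid], y)
    right = nw_score(x[mid:][::-1], y[::-1])[::-1]
    # k* = split point of y maximizing the combined score.
    k = max(range(len(y) + 1), key=lambda j: left[j] + right[j])
    ax1, ay1 = hirschberg(x[:mid], y[:k])
    ax2, ay2 = hirschberg(x[mid:], y[k:])
    return ax1 + ax2, ay1 + ay2

print(hirschberg("AGTACGCA", "TATGC"))
```
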
Slide 6: Linear-space alignment
Time and space analysis of Hirschberg's algorithm:

To compute the optimal path at the middle column, for a box of size M x N:
  Space: 2N
  Time: cMN, for some constant c
The left and right recursive calls then cost c((M/2) k* + (M/2)(N - k*)) = cMN/2.
Summing over all recursive calls:
  Total time: cMN + cMN/2 + cMN/4 + … = 2cMN = O(MN)
  Total space: O(N) for the computation, O(N + M) to store the optimal alignment

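Written out, the time bound sums a halving geometric series:

```latex
T(M,N) \le cMN + \frac{cMN}{2} + \frac{cMN}{4} + \cdots
       = cMN \sum_{i \ge 0} 2^{-i} = 2\,cMN = O(MN)
```
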
Slide 7: Heuristic Local Aligners
1. The basic indexing & extension technique
2. Indexing: techniques to improve sensitivity
   Pairs of words, patterns
3. Systems for local alignment

Slide 8: Indexing-based local alignment
Dictionary: all words of length k (k ~ 10)
Alignment initiated between words of alignment score >= T (typically T = k)
Alignment: ungapped extensions until the score drops below a statistical threshold
Output: all local alignments with score > statistical threshold
(a Python sketch of this seed-and-extend loop follows below)
[Figure: the query is scanned against a word dictionary built from the database.]

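A minimal Python sketch of this pipeline. The parameter values (k, the match/mismatch scores, the X-drop cutoff, the reporting threshold) are illustrative assumptions, and only rightward extension is shown; the leftward case is symmetric:

```python
# Seed-and-extend sketch: index all k-mers of the database, find exact
# seed matches in the query, and extend each seed without gaps until the
# running score falls too far below the best score seen (X-drop rule).
from collections import defaultdict

def build_index(db, k=10):
    """Dictionary: every length-k word of the database -> start positions."""
    index = defaultdict(list)
    for i in range(len(db) - k + 1):
        index[db[i:i + k]].append(i)
    return index

def extend_right(query, db, qi, di, score, match=1, mismatch=-2, xdrop=5):
    """Ungapped rightward extension with X-drop termination."""
    best, length, best_len = score, 0, 0
    while qi + length < len(query) and di + length < len(db):
        score += match if query[qi + length] == db[di + length] else mismatch
        length += 1
        if score > best:
            best, best_len = score, length
        elif best - score > xdrop:
            break
    return best, best_len

def local_hits(query, db, k=10, threshold=16, match=1):
    """All seeds whose ungapped extension scores above the threshold."""
    index = build_index(db, k)
    hits = []
    for qi in range(len(query) - k + 1):
        for di in index.get(query[qi:qi + k], ()):
            score, _ = extend_right(query, db, qi + k, di + k, k * match)
            if score >= threshold:
                hits.append((qi, di, score))
    return hits

print(local_hits("TTGTAAGGTCCAGTAA", "CCGTAAGGTCCAGTGG", k=8, threshold=12))
```
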
Slide 9: Indexing-based local alignment: Extensions
Gapped extensions until threshold:
extensions with gaps continue until the score falls more than C below the best score seen so far.
Output example:
GTAAGGTCCAGT
GTTAGGTC-AGT
[Figure: a gapped extension through the dot plot of ACGAAGTAAGGTCCAGT vs. CTGATCCTGGATTGCGA.]

Slide 10: Sensitivity-Speed Tradeoff
[Figure: sensitivity vs. speed, comparing long words (k = 15) with short words (k = 7); longer words are faster but less sensitive. Kent WJ, Genome Research 2002.]

Slide 11: Sensitivity-Speed Tradeoff
Methods to improve sensitivity/speed:
1. Using pairs of words
2. Using inexact words
3. Patterns: non-consecutive positions
[Figure: example sequence pairs with matching words highlighted, e.g. …ATAACGGACGACTGATTACACTGATTCTTAC… against …GGCACGGACCAGTGACTACTCTGATTCCCAG…, and a non-consecutive pattern such as TTTGATTACACAGAT matching through wildcard positions.]

Slide 12: Measured improvement
[Figure: measured sensitivity/speed improvement. Kent WJ, Genome Research 2002.]

Slide 13: Non-consecutive words: Patterns
Patterns increase the likelihood of at least one match within a long conserved region.
[Figure: consecutive positions sharing 3, 5, and 7 common letters vs. non-consecutive positions sharing 6 common letters.]
On a 100-long, 70% conserved region (a Monte Carlo sketch of these statistics follows below):

                          Consecutive   Non-consecutive
  Expected # hits:           1.07            0.97
  Prob[at least one hit]:    0.30            0.47

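Such probabilities can be estimated directly. Here is a Monte Carlo sketch under an i.i.d. 70%-conservation model; the two weight-11 seeds below (one consecutive, one spaced) are illustrative choices and need not be the exact seeds behind the slide's numbers:

```python
# Estimate E[#hits] and Prob[at least one hit] for a seed pattern on a
# 100-long region where each position is conserved independently w.p. 0.7.
import random

def seed_stats(pattern, region_len=100, p_match=0.7, trials=20_000, rng_seed=0):
    rng = random.Random(rng_seed)
    care = [i for i, c in enumerate(pattern) if c == "1"]  # required positions
    span = len(pattern)
    total_hits = regions_with_hit = 0
    for _ in range(trials):
        conserved = [rng.random() < p_match for _ in range(region_len)]
        hits = sum(all(conserved[s + i] for i in care)
                   for s in range(region_len - span + 1))
        total_hits += hits
        regions_with_hit += hits > 0
    return total_hits / trials, regions_with_hit / trials

consecutive = "1" * 11                # 11 consecutive match positions
spaced = "111010010100110111"         # a weight-11 spaced seed, 18 wide
for name, pat in [("consecutive", consecutive), ("spaced", spaced)]:
    exp_hits, p_any = seed_stats(pat)
    print(f"{name:11s}  E[#hits]={exp_hits:.2f}  P[>=1 hit]={p_any:.2f}")
```
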
Slide 14: Advantage of Patterns
[Figure: comparison of a pattern spanning 11 positions with a consecutive word of 10 positions.]

Slide 15: Multiple patterns
K patterns:
  Takes K times longer to scan
  Patterns can complement one another
Computational problem:
  Given: a model (probability distribution) for homology between two regions
  Find: the best set of K patterns that maximizes Prob(at least one match)
(Buhler et al., RECOMB 2003; Sun & Buhler, RECOMB 2004)
How long does it take to search the query?
[Figure: several complementary spaced patterns, e.g. TTTGATTACACAGAT with alternative wildcard placements.]

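A naive greedy sketch of this objective under the same i.i.d. conservation model: repeatedly add the candidate seed that rescues the most not-yet-covered regions. This illustrates the "patterns complement one another" idea only; it is not the algorithm of Buhler et al. or Sun & Buhler, and the candidate seed strings are made up:

```python
# Greedy selection of K complementary spaced seeds, maximizing the
# estimated Prob(at least one hit) over sampled homologous regions.
import random

def sample_regions(n=5000, length=64, p_match=0.7, rng_seed=0):
    rng = random.Random(rng_seed)
    return [[rng.random() < p_match for _ in range(length)] for _ in range(n)]

def hits(region, pattern):
    care = [i for i, c in enumerate(pattern) if c == "1"]
    span = len(pattern)
    return any(all(region[s + i] for i in care)
               for s in range(len(region) - span + 1))

def greedy_seed_set(candidates, regions, K=2):
    chosen, covered = [], [False] * len(regions)
    for _ in range(K):
        def gain(p):  # regions newly covered by pattern p
            return sum(not c and hits(r, p) for r, c in zip(regions, covered))
        best = max(candidates, key=gain)
        chosen.append(best)
        covered = [c or hits(r, best) for r, c in zip(regions, covered)]
    return chosen, sum(covered) / len(covered)

candidates = ["1" * 11, "111010010100110111", "110100110010101111"]
seeds, p_hit = greedy_seed_set(candidates, sample_regions(), K=2)
print(seeds, round(p_hit, 2))
```
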
Slide 16: Variants of BLAST
NCBI BLAST: search the universe. http://www.ncbi.nlm.nih.gov/BLAST/
MEGABLAST: http://genopole.toulouse.inra.fr/blast/megablast.html
  Optimized to align very similar sequences
  Works best when k = 4i >= 16
  Linear gap penalty
WU-BLAST (Wash U BLAST): http://blast.wustl.edu/
  Very good optimizations
  Good set of features & command-line arguments
BLAT: http://genome.ucsc.edu/cgi-bin/hgBlat
  Faster, less sensitive than BLAST
  Good for aligning huge numbers of queries
CHAOS: http://www.cs.berkeley.edu/~brudno/chaos
  Uses inexact k-mers; sensitive
PatternHunter: http://www.bioinformaticssolutions.com/products/ph/index.php
  Uses patterns instead of k-mers
BlastZ: http://www.psc.edu/general/software/packages/blastz/
  Uses patterns; good for finding genes
Typhon: http://typhon.stanford.edu
  Uses multiple alignments to improve the sensitivity/speed tradeoff

Slide 17: Hidden Markov Models
[Figure: HMM trellis; at each step one of the states 1…K is active and emits a symbol, producing x_1, x_2, x_3, ….]

Slide 18: Outline for our next topic
Hidden Markov models: the theory
Probabilistic interpretation of alignments using HMMs
Later in the course: applications of HMMs to biological sequence modeling and discovery of features such as genes

Slide 19: Example: The Dishonest Casino
A casino has two dice:
  Fair die: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
  Loaded die: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2
The casino player switches back and forth between the fair and loaded die about once every 20 turns.
Game:
1. You bet $1
2. You roll (always with a fair die)
3. Casino player rolls (maybe with the fair die, maybe with the loaded die)
4. Highest number wins $2

Slide 20: Question #1 - Evaluation
GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344
QUESTION: How likely is this sequence, given our model of how the casino works?
This is the EVALUATION problem in HMMs.
(Answer shown on the slide: Prob ~ 1.3 x 10^-35)

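A minimal sketch of the forward algorithm, which answers this question by summing over all state paths. The transition and emission tables follow the casino model slide further down; the 0.5/0.5 start distribution is an assumption, since the slides do not specify start probabilities:

```python
# Forward algorithm for the evaluation problem: P(x) summed over all
# state paths, computed in O(K^2 N) time.
trans = {"F": {"F": 0.95, "L": 0.05}, "L": {"F": 0.05, "L": 0.95}}
emit = {"F": {r: 1 / 6 for r in "123456"},
        "L": {**{r: 1 / 10 for r in "12345"}, "6": 1 / 2}}
start = {"F": 0.5, "L": 0.5}   # assumed uniform start

def forward(x):
    """Total probability of the observed rolls under the casino HMM."""
    f = {k: start[k] * emit[k][x[0]] for k in trans}
    for sym in x[1:]:
        f = {k: emit[k][sym] * sum(f[j] * trans[j][k] for j in trans)
             for k in trans}
    return sum(f.values())

rolls = "1245526462146146136136661664661636616366163616515615115146123562344"
print(forward(rolls))   # on the order of 10^-35, matching the slide
```
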
Slide 21: Question #2 - Decoding
GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344
QUESTION: What portion of the sequence was generated with the fair die, and what portion with the loaded die?
This is the DECODING question in HMMs.
(Answer shown on the slide: a segmentation FAIR, then LOADED, then FAIR)

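A minimal sketch of Viterbi decoding, which recovers the single most probable state path; log space avoids underflow. Model tables and the assumed uniform start are as in the forward sketch above:

```python
# Viterbi decoding: most probable state path through the casino HMM.
import math

trans = {"F": {"F": 0.95, "L": 0.05}, "L": {"F": 0.05, "L": 0.95}}
emit = {"F": {r: 1 / 6 for r in "123456"},
        "L": {**{r: 1 / 10 for r in "12345"}, "6": 1 / 2}}
start = {"F": 0.5, "L": 0.5}   # assumed uniform start

def viterbi(x):
    v = {k: math.log(start[k]) + math.log(emit[k][x[0]]) for k in trans}
    back = []
    for sym in x[1:]:
        ptr, nxt = {}, {}
        for k in trans:
            prev = max(trans, key=lambda j: v[j] + math.log(trans[j][k]))
            ptr[k] = prev
            nxt[k] = (v[prev] + math.log(trans[prev][k])
                      + math.log(emit[k][sym]))
        back.append(ptr)
        v = nxt
    # Trace back from the best final state.
    state = max(v, key=v.get)
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return "".join(reversed(path))

rolls = "1245526462146146136136661664661636616366163616515615115146123562344"
print(viterbi(rolls))   # e.g. FFF...LLL...FFF: fair, then loaded, then fair
```
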
Slide 22: Question #3 - Learning
GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344
QUESTION: How "loaded" is the loaded die? How "fair" is the fair die? How often does the casino player change from fair to loaded, and back?
This is the LEARNING question in HMMs.
(Answer shown on the slide: Prob(6) = 64%)

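When the state path is known, learning reduces to counting. A minimal sketch of that supervised case follows; the unsupervised case (rolls only) uses Baum-Welch/EM, and the labeled data below is invented purely for illustration:

```python
# Supervised learning: with rolls labeled by die, maximum-likelihood
# estimates are normalized counts. (Without labels, use Baum-Welch.)
from collections import Counter

rolls  = "6264666146366163661"
labels = "LLLLLLFFFFLLLLLLLLL"   # hypothetical: which die produced each roll

def estimate(rolls, labels):
    e = {s: Counter() for s in set(labels)}
    for r, s in zip(rolls, labels):
        e[s][r] += 1
    t = Counter(zip(labels, labels[1:]))
    emit = {s: {r: c / sum(cnt.values()) for r, c in cnt.items()}
            for s, cnt in e.items()}
    trans = {pair: c / labels[:-1].count(pair[0]) for pair, c in t.items()}
    return emit, trans

emit, trans = estimate(rolls, labels)
print(emit["L"].get("6", 0.0))   # estimated P(6 | loaded)
```
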
Slide 23: The dishonest casino model
[Figure: two states, FAIR and LOADED, with transition probability 0.05 between them and 0.95 self-loops.]
  P(1|F) = 1/6    P(1|L) = 1/10
  P(2|F) = 1/6    P(2|L) = 1/10
  P(3|F) = 1/6    P(3|L) = 1/10
  P(4|F) = 1/6    P(4|L) = 1/10
  P(5|F) = 1/6    P(5|L) = 1/10
  P(6|F) = 1/6    P(6|L) = 1/2

Slide 24: An HMM is memory-less
At each time step t, the only thing that affects future states is the current state π_t.
[Figure: a fully connected set of states 1, 2, …, K.]

Slide 25: Definition of a hidden Markov model
Definition: a hidden Markov model (HMM) consists of:
  Alphabet = {b_1, b_2, …, b_M}
  Set of states Q = {1, …, K}
  Transition probabilities between any two states:
    a_ij = transition probability from state i to state j
    a_i1 + … + a_iK = 1, for all states i = 1…K
  Start probabilities a_0i:
    a_01 + … + a_0K = 1
  Emission probabilities within each state:
    e_i(b) = P(x_t = b | π_t = i)
    e_i(b_1) + … + e_i(b_M) = 1, for all states i = 1…K
(End probabilities a_i0, used in Durbin, are not needed here.)

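As a data structure, this definition is just two stochastic tables plus a start distribution. A minimal sketch using the casino model (the variable names are illustrative, and the start distribution is again an assumption), with the normalization constraints from the definition checked explicitly:

```python
# An HMM as plain dictionaries: transitions, start, and emissions,
# with the row-sums-to-1 constraints asserted.
import math

states = ["F", "L"]                       # Q = {1, ..., K}
alphabet = list("123456")                 # {b_1, ..., b_M}
a = {"F": {"F": 0.95, "L": 0.05},         # a_ij
     "L": {"F": 0.05, "L": 0.95}}
a0 = {"F": 0.5, "L": 0.5}                 # start probabilities a_0i (assumed)
e = {"F": {b: 1 / 6 for b in alphabet},   # e_i(b)
     "L": {**{b: 1 / 10 for b in "12345"}, "6": 1 / 2}}

assert math.isclose(sum(a0.values()), 1.0)        # a_01 + ... + a_0K = 1
for i in states:
    assert math.isclose(sum(a[i].values()), 1.0)  # a_i1 + ... + a_iK = 1
    assert math.isclose(sum(e[i].values()), 1.0)  # e_i(b_1)+...+e_i(b_M) = 1
```
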
Slide 26: An HMM is memory-less
At each time step t, the only thing that affects future states is the current state π_t:
P(π_{t+1} = k | "whatever happened so far")
  = P(π_{t+1} = k | π_1, π_2, …, π_t, x_1, x_2, …, x_t)
  = P(π_{t+1} = k | π_t)
[Figure: a fully connected set of states 1, 2, …, K.]

Slide 27: An HMM is memory-less
At each time step t, the only thing that affects x_t is the current state π_t:
P(x_t = b | "whatever happened so far")
  = P(x_t = b | π_1, π_2, …, π_t, x_1, x_2, …, x_{t-1})
  = P(x_t = b | π_t)
[Figure: a fully connected set of states 1, 2, …, K.]

Slide 28: A parse of a sequence
Given a sequence x = x_1 … x_N, a parse of x is a sequence of states π = π_1, …, π_N.
[Figure: HMM trellis; one of the states 1…K is chosen at each of the N steps, emitting x_1, x_2, x_3, ….]

Slide 29: Generating a sequence by the model
Given an HMM, we can generate a sequence of length n as follows (a sampling sketch follows below):
1. Start at state π_1 according to probability a_{0 π_1}
2. Emit letter x_1 according to probability e_{π_1}(x_1)
3. Go to state π_2 according to probability a_{π_1 π_2}
4. … until emitting x_n
[Figure: HMM trellis for the generated sequence x_1, x_2, x_3, …, x_n, annotated with a_02 and e_2(x_1).]

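A minimal sampling sketch following these steps exactly, using the casino model; the uniform start distribution is an assumption, since the slides leave a_0i unspecified:

```python
# Sample a roll sequence and its hidden state path from the casino HMM:
# draw pi_1 from a_0, then alternate emit / transition, as on the slide.
import random

a = {"F": {"F": 0.95, "L": 0.05}, "L": {"F": 0.05, "L": 0.95}}
a0 = {"F": 0.5, "L": 0.5}   # assumed uniform start
e = {"F": {b: 1 / 6 for b in "123456"},
     "L": {**{b: 1 / 10 for b in "12345"}, "6": 1 / 2}}

def draw(dist):
    """Sample a key of dist with probability proportional to its value."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def generate(n):
    state = draw(a0)                    # step 1: pi_1 ~ a_0
    rolls, path = [], []
    for _ in range(n):
        rolls.append(draw(e[state]))    # step 2: x_t ~ e_{pi_t}
        path.append(state)
        state = draw(a[state])          # step 3: pi_{t+1} ~ a_{pi_t, .}
    return "".join(rolls), "".join(path)

print(generate(50))
```
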