1
Bioinformatics: Statistical methods for pattern searching
Ulf Schmitz (ulf.schmitz@informatik.uni-rostock.de)
Bioinformatics and Systems Biology Group
www.sbi.informatik.uni-rostock.de
2
Outline
1. Expectation Maximization algorithm
2. Markov models
3. Hidden Markov models
3
Expectation Maximization Algorithm
- an algorithm for locating a similar sequence pattern (motif) in a set of sequences
- the suspected pattern occurrences are aligned
- an expected scoring matrix representing the distribution of sequence characters in each column of the alignment is generated
- the pattern is matched to each sequence and the scoring matrix values are updated to maximize the fit of the matrix to the sequences
- this procedure is repeated until there is no further improvement
4
Expectation Maximization Algorithm

Example: ten sequences (seq1 ... seq10), each 100 nucleotides long; only part of the first four is shown:

seq1: ACATAGACAGTATAGAGAATCAGAATGCAGCATAGCAGCACATAGAGCAGCATAG ...
seq2: TAGACCATAGACCGATACGCGCATAGAGCATAGACACGATAGCATAGCATAGCAT ...
seq3: TACAGATCAGCAAGAGCCGACAGACAAAAAAATACGAGCAAAACGAGCATTATCG ...
seq4: TAGGGGACACAGATACAGACATAGCAGATACAGCATAGACATAGACAGATAGCAG ...

A preliminary local alignment of the suspected site (16 columns) in each sequence:

seq1: ... TCAGAATGCAGCATAG ...
seq2: ... CGCATAGAGCATAGAC ...
seq3: ... ACAGACAAAAAAATAC ...
seq4: ... CATAGCAGATACAGCA ...

- the preliminary local alignment provides initial estimates of the frequencies of nucleotides in each motif column
- columns not in the motif provide the background frequencies
5
Expectation Maximization Algorithm

seq1: ... TCAGAATGCAGCATAG ...
seq2: ... CGCATAGAGCATAGAC ...
seq3: ... ACAGACAAAAAAATAC ...
seq4: ... CATAGCAGATACAGCA ...
(motif columns 1 to 16)

        Background   Site column 1   Site column 2   ...
   G       0.27          0.4             0.1         ...
   C       0.25          0.4             0.1         ...
   A       0.25          0.2             0.1         ...
   T       0.23          0.2             0.7         ...
 Sum       1.00          1.0             1.0         ...

The first column gives the background frequencies in the flanking sequence; the 16 site columns (one per motif position) give the frequencies within the alignment, together a 4 x 17 matrix.
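The frequency table above can be derived directly from the preliminary alignment. Below is a minimal Python sketch of that computation; the function and variable names are illustrative rather than taken from the slides, and the background estimate assumes each motif occurs exactly once per sequence. A pseudocount argument is included because later steps multiply many of these frequencies together, and a single zero would wipe out a whole product.

```python
from collections import Counter

BASES = "GCAT"

def column_frequencies(motif_occurrences, pseudocount=0.0):
    """Per-column base frequencies of a set of aligned, equal-length motif occurrences."""
    width = len(motif_occurrences[0])
    columns = []
    for i in range(width):
        counts = Counter(m[i] for m in motif_occurrences)
        total = sum(counts[b] + pseudocount for b in BASES)
        columns.append({b: (counts[b] + pseudocount) / total for b in BASES})
    return columns

def background_frequencies(sequences, motif_occurrences):
    """Base frequencies of all positions outside the motif occurrence in each sequence."""
    flanking = Counter()
    for seq, motif in zip(sequences, motif_occurrences):
        start = seq.find(motif)
        flanking.update(seq[:start] + seq[start + len(motif):])
    total = sum(flanking[b] for b in BASES)
    return {b: flanking[b] / total for b in BASES}

# e.g. the four 16-mers from the preliminary alignment above
motifs = ["TCAGAATGCAGCATAG", "CGCATAGAGCATAGAC",
          "ACAGACAAAAAAATAC", "CATAGCAGATACAGCA"]
site_cols = column_frequencies(motifs)
print(site_cols[0])   # estimated frequencies for site column 1
```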
6
Expectation Maximization Algorithm

Each sequence is scanned over all possible locations of the site in order to find its most probable location.

The EM algorithm consists of two steps which are repeated consecutively.

Step 1: expectation step
- the column-by-column composition of the site found so far is used to estimate the probability of finding the site at any position in each of the sequences
- these probabilities are used to provide the expected base or amino acid distribution for each column of the site

(Figure: a window of the site's width is slid along sequence 1 one position at a time, placements A, B, C, ...; for each placement, the estimates of the residue frequencies for each column in the motif are used to calculate the probability of the motif at this position, multiplied by the background frequencies at all remaining positions.)
7
Expectation Maximization Algorithm

seq1: ACATAGACAGTATAGAGAATCAGAATGCAGCATAGCAGCACATAGAGCAGCATAG ...
(site columns 1 to 16 placed at the start of the sequence)

Using the table of column frequencies of each base (previous slide), the probability that the site starts at position 1 of seq1 is the product of
- the site-column frequencies of the first 16 bases: 0.2 (for A in site column 1), 0.1 (for C in site column 2), and so on for the next 14 positions in the site, and
- the background frequencies of all remaining bases: 0.27 (for G in flanking position 1), 0.25 (for A in flanking position 2), and so on for the next 82 flanking positions.
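A short sketch of that product, reusing site_cols and a background dictionary shaped like the ones in the previous snippet (again, the names are illustrative). In practice the calculation is done in log space, since multiplying roughly 100 probabilities quickly underflows.

```python
def site_probability(seq, start, site_cols, background):
    """P(sequence | site starts at position `start`, 0-based):
    site-column frequencies for the bases inside the window,
    background frequencies for every flanking base."""
    width = len(site_cols)
    p = 1.0
    for pos, base in enumerate(seq):
        if start <= pos < start + width:
            p *= site_cols[pos - start][base]   # motif column (pos - start + 1)
        else:
            p *= background[base]               # flanking position
    return p

# probability that the site starts at the first position of seq1:
# site_probability(seq1, 0, site_cols, background)
```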
8
Expectation Maximization Algorithm

(Figure: the 16-column site window is slid along seq1, ACATAGACAGTATAGAGAATCAGAATGCAGCATAGCAGCACATAGAGCAGCATAG ..., one position at a time; the same is done for seq2 ... seq10.)

The probability of the site being at a particular location in seq1, say at position k, is the site probability at k divided by the sum of the site probabilities over all possible positions:

P(site k in seq1) = P(site k, seq1) / [ P(site 1, seq1) + P(site 2, seq1) + ... + P(site 85, seq1) ]

The probability of the site location in each of the other sequences is then calculated in the same manner.
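Normalising those products over all possible start positions gives the probability of each location, as in the formula above. A sketch, building on site_probability from the previous snippet:

```python
def site_posteriors(seq, site_cols, background):
    """P(site starts at k | sequence) for every possible start position k.
    For a 100-nt sequence and a 16-column site there are 100 - 16 + 1 = 85 start
    positions, matching the sum P(site 1, seq1) + ... + P(site 85, seq1) above."""
    width = len(site_cols)
    likelihoods = [site_probability(seq, k, site_cols, background)
                   for k in range(len(seq) - width + 1)]
    total = sum(likelihoods)
    return [p / total for p in likelihoods]
```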
9
Expectation Maximization Algorithm

Step 2: maximization step
- the new counts of bases or amino acids for each position in the site found in step 1 are substituted for the previous set
- every possible site placement contributes to the new counts, weighted by its probability from the expectation step

Example: if P(site 1 in seq1) = 0.01 and P(site 2 in seq1) = 0.02, then the 16-mer at position 1, ACATAGACAGTATAGA, adds 0.01 to the counts of its bases in site columns 1 to 16, and the 16-mer at position 2, CATAGACAGTATAGAG, adds 0.02; the contributions of all placements in seq1 ... seq10 are summed.
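A sketch of the maximization step under the same assumptions as the previous snippets: every candidate window contributes its bases to the new counts, weighted by the probability computed in the expectation step (the +0.01 and +0.02 in the example above).

```python
def maximization(sequences, posteriors, width, pseudocount=0.0):
    """Re-estimate the site column frequencies: every possible placement of the site
    contributes the bases it covers, weighted by the posterior of that placement."""
    counts = [dict.fromkeys(BASES, pseudocount) for _ in range(width)]
    for seq, post in zip(sequences, posteriors):
        for k, weight in enumerate(post):          # k = candidate start position
            for i in range(width):
                counts[i][seq[k + i]] += weight    # e.g. +0.01 for the window at position 1
    return [{b: col[b] / sum(col.values()) for b in BASES} for col in counts]
```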
10
Expectation Maximization Algorithm

This procedure is repeated for all other site locations and all other sequences. A new version of the table of residue frequencies can then be built.

The expectation and maximization steps are repeated until the estimates of the base frequencies no longer change.

MEME (Multiple EM for Motif Elicitation) is a tool for finding motifs in sets of sequences with the EM method; see http://www.sdsc.edu/MEME/meme/website/meme.html
11
Expectation Maximization Algorithm

The EM algorithm consists of two steps which are repeated consecutively.

Step 1: expectation step
- the column-by-column composition of the current site is used to estimate the probability of finding the site at any position in each of the sequences
- these probabilities are used to provide the expected base or amino acid distribution for each column of the site

Step 2: maximization step
- the new counts of bases or amino acids for each position in the site found in step 1 are substituted for the previous set

Step 1 is then repeated, until the algorithm converges on a solution.
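Putting the pieces together, the whole procedure could look like the loop below. It reuses the helper functions from the earlier sketches, starts from an arbitrary preliminary alignment, adds pseudocounts to keep all frequencies non-zero, and keeps the background frequencies fixed across iterations, which is a simplification; tools such as MEME also handle multiple motif widths, multiple occurrences per sequence and smarter initialisation.

```python
def em_motif_search(sequences, width, max_iter=100, tol=1e-6):
    """Alternate expectation and maximization until the column frequencies stop changing."""
    # initialise from an arbitrary preliminary alignment: here simply the
    # first `width` bases of every sequence
    seeds = [s[:width] for s in sequences]
    site_cols = column_frequencies(seeds, pseudocount=1.0)
    background = background_frequencies(sequences, seeds)
    for _ in range(max_iter):
        posteriors = [site_posteriors(s, site_cols, background) for s in sequences]  # E step
        new_cols = maximization(sequences, posteriors, width, pseudocount=1.0)        # M step
        change = max(abs(new_cols[i][b] - site_cols[i][b])
                     for i in range(width) for b in BASES)
        site_cols = new_cols
        if change < tol:                            # frequencies no longer change
            break
    return site_cols
```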
12
Markov chain models

A Markov chain model is defined by:
- a set of states
  - some states emit symbols
  - other states (e.g. the begin state) are silent
- a set of transitions with associated probabilities
  - the transitions emanating from a given state define a distribution over the possible next states
13
Markov chain models

Given some sequence x of length L, we can ask how probable the sequence is under our model.

For any probabilistic model of sequences, we can write this probability as

P(x) = P(x_L | x_{L-1}, ..., x_1) * P(x_{L-1} | x_{L-2}, ..., x_1) * ... * P(x_1)

Key property of a (1st-order) Markov chain: the probability of each x_i depends only on x_{i-1}, so

P(x) = P(x_1) * P(x_2 | x_1) * ... * P(x_L | x_{L-1})
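As a concrete illustration, scoring a sequence under a first-order Markov chain is just a sum of log probabilities. The initial and transition dictionaries are placeholders to be estimated from data, and the names are illustrative.

```python
import math

def markov_chain_log_prob(seq, initial, transition):
    """log P(x) = log P(x_1) + sum over i of log P(x_i | x_{i-1})
    for a 1st-order Markov chain given as dictionaries of probabilities."""
    logp = math.log(initial[seq[0]])
    for prev, cur in zip(seq, seq[1:]):
        logp += math.log(transition[prev][cur])
    return logp

# illustrative (made-up) parameters over the DNA alphabet:
# initial = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
# transition = {"A": {"A": 0.3, "C": 0.2, "G": 0.3, "T": 0.2}, ...}
```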
14
Markov chain models

1st-order Markov chain (figure)
15
Markov chain models

Example application: CpG islands
- CG dinucleotides are rarer in eukaryotic genomes than expected from the independent probabilities of C and G
- but the regions upstream of genes are richer in CG dinucleotides than elsewhere; these are the CpG islands
- CpG islands are useful evidence for finding genes

CpG islands could be predicted with two Markov chains:
- one to represent CpG islands
- one to represent the rest of the genome
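One way this two-chain idea could be implemented, reusing markov_chain_log_prob from above: estimate one transition table from known CpG islands and one from the rest of the genome, then score a candidate window by the difference of the two log-probabilities (a log-odds score). Sharing a single initial distribution between the two chains is a simplification made here for brevity; the function names are illustrative.

```python
def estimate_transitions(training_seqs, alphabet="ACGT", pseudocount=1.0):
    """Maximum-likelihood transition probabilities from dinucleotide counts plus pseudocounts."""
    counts = {a: dict.fromkeys(alphabet, pseudocount) for a in alphabet}
    for seq in training_seqs:
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    return {a: {b: counts[a][b] / sum(counts[a].values()) for b in alphabet}
            for a in alphabet}

def cpg_log_odds(seq, island_transitions, background_transitions, initial):
    """Positive scores favour the CpG-island chain, negative scores the background chain."""
    return (markov_chain_log_prob(seq, initial, island_transitions)
            - markov_chain_log_prob(seq, initial, background_transitions))
```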
16
Markov chain models

(figure)
17
Markov chain models

Selecting the order of a Markov chain model
- higher-order models remember more "history"
- additional history can have predictive value
- example: predict the next word in the sentence fragment "... finish __" (up, it, first, last, ...?)
- now predict it given more history: "Fast guys finish __"
18
Hidden Markov Models (HMM)

Hidden state
- we distinguish between the observed parts of a problem and the hidden parts
- in the Markov models considered so far, it is clear which state accounts for each part of the observed sequence
- in a hidden Markov model, there are multiple states that could account for each part of the observed sequence
  - this is the hidden part of the problem
  - the states are decoupled from the sequence symbols
19
Hidden Markov models

Markov model: move from state to state according to the transition probability distribution of each state, and emit the states visited.

Hidden Markov model: move from state to state in the same way, but emit a symbol according to the emission probability distribution of the current state instead; the state sequence itself is not observed.
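The difference can be made concrete with a small generative sketch: the model walks from state to state exactly as a Markov chain does, but what we observe are the emitted symbols, not the states. Transition and emission tables are plain dictionaries here, and an explicit END state is omitted for brevity.

```python
import random

def generate(length, transition, emission, begin="BEG"):
    """Sample a hidden state path and an observed symbol sequence from an HMM.
    A plain Markov chain would output `path` itself; the HMM outputs only
    `symbols`, and the path stays hidden."""
    path, symbols = [], []
    state = begin
    for _ in range(length):
        states, t_weights = zip(*transition[state].items())
        state = random.choices(states, weights=t_weights)[0]          # next state
        syms, e_weights = zip(*emission[state].items())
        symbols.append(random.choices(syms, weights=e_weights)[0])    # emitted symbol
        path.append(state)
    return path, symbols
```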
20
Hidden Markov Model

(Figure: a profile HMM. Red squares are match states, green diamonds are insert states, blue circles are delete states; the arrows indicate the probability of transition from one state to the next.)
21
Hidden Markov Model

A. Sequence alignment:
   N * F L S
   N K Y L T
   Q * W - T

B. Hidden Markov model for sequence alignment: states BEG, M1-M4 (match), I0-I4 (insert), D1-D4 (delete), END.

Probability of the sequence N K Y L T along the path BEG -> M1 -> I1 -> M2 -> M3 -> M4 -> END:
0.33 * 0.05 * 0.33 * 0.05 * 0.33 * 0.05 * 0.33 * 0.05 * 0.33 * 0.05 * 0.5 = 6.1 x 10^-10
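A sketch of that calculation for an arbitrary path: the joint probability of the observed symbols and one particular state path is a product of one transition and one emission probability per emitted symbol, times the final transition into END. The slide does not label the individual factors, but reading the 0.05s as transition probabilities, the 0.33s as emission probabilities and the 0.5 as the transition into END is one consistent interpretation that reproduces the 6.1 x 10^-10 above.

```python
def path_probability(symbols, path, transition, emission, begin="BEG", end="END"):
    """P(symbols, path): one transition and one emission probability per step,
    times the final transition into the END state."""
    p, prev = 1.0, begin
    for state, sym in zip(path, symbols):
        p *= transition[prev][state] * emission[state][sym]
        prev = state
    return p * transition[prev][end]

# path_probability("NKYLT", ["M1", "I1", "M2", "M3", "M4"], transition, emission)
# with every transition on the path set to 0.05, every emission to 0.33 and the
# final transition into END to 0.5, this evaluates to about 6.1e-10
```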
22
Hidden Markov Models

Three important questions:
- How likely is a given sequence?
- What is the most probable "path" for generating a given sequence?
- How can we learn the HMM parameters given a set of sequences?
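These questions are usually answered with dynamic programming: the forward algorithm for the first, the Viterbi algorithm for the second (replace the max below with a sum to get the forward algorithm), and Baum-Welch, an EM procedure like the one in the first part of this talk, for the third. Below is a minimal sketch of Viterbi, assuming the same dictionary-based transition and emission tables as the earlier sketches; it is not the implementation used by HMMER or SAM.

```python
import math

def viterbi(symbols, states, transition, emission, begin="BEG", end="END"):
    """Most probable state path for an observed sequence, in log space to avoid
    underflow. Assumes all required transition/emission probabilities are non-zero."""
    # v[s] = best log-probability of any path ending in state s after the symbols seen so far
    v = {s: math.log(transition[begin][s]) + math.log(emission[s][symbols[0]])
         for s in states}
    backpointers = []
    for sym in symbols[1:]:
        new_v, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda r: v[r] + math.log(transition[r][s]))
            ptr[s] = best
            new_v[s] = (v[best] + math.log(transition[best][s])
                        + math.log(emission[s][sym]))
        v = new_v
        backpointers.append(ptr)
    last = max(states, key=lambda s: v[s] + math.log(transition[s][end]))
    path = [last]
    for ptr in reversed(backpointers):          # trace back to recover the full path
        path.append(ptr[path[-1]])
    path.reverse()
    return path, v[last] + math.log(transition[last][end])
```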
23
Hidden Markov Models

HMM-based homology searching
- formal probabilistic basis and consistent theory behind gap and insertion scores
- HMMs are good for profile searches, bad for alignment (due to the parametrisation of the models)
- HMMs are slow

Tools:
- HMMER - http://hmmer.wustl.edu/
- SAM - http://cse.ucsc.edu/research/comp/bio/sam.html
24
Outlook
- Machine learning
- Clustering
25
Thanks for your attention!