Data Analysis. David Shiuan, Department of Life Science, Institute of Biotechnology, Interdisciplinary Program of Bioinformatics, National Dong Hwa University.

Microsoft Excel Plot with Standard Deviation
1. Enter the data.
2. Select the data and compute the average.
3. Use Insert → Function → Statistics → STDEV to compute the standard deviation.
4. Select the averages and create the plot.
5. Double-click the plot, enable Y-axis error bars, select Both (upper/lower), choose Custom, assign the standard-deviation data to the + and − columns, and press Enter.
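The AVERAGE and STDEV values can be cross-checked outside the spreadsheet. A minimal Python sketch, using hypothetical data values for illustration only; note that Excel's STDEV is the sample standard deviation, which Python's statistics.stdev also computes:

```python
import statistics

# Hypothetical measurements (for illustration only)
data = [4.1, 3.8, 4.4, 4.0, 3.9]

mean = statistics.mean(data)   # Excel AVERAGE
sd = statistics.stdev(data)    # Excel STDEV (sample standard deviation, n - 1)

# Error-bar endpoints for a plot with plus/minus one standard deviation
upper, lower = mean + sd, mean - sd
```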

Microsoft Excel Plot with Regression
1. Enter the data.
2. Create the plot.
3. Add a regression (trendline) function.
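The regression Excel fits with a linear trendline is ordinary least squares. A minimal sketch, with hypothetical data, of the slope and intercept that Excel's SLOPE and INTERCEPT functions would return:

```python
def linear_fit(xs, ys):
    # Ordinary least squares for y = slope * x + intercept
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data lying exactly on y = 2x + 1
slope, intercept = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
```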

Hidden Markov Model A hidden Markov model (HMM) is a statistical model in which the system being modeled is assumed to be a Markov process with unknown parameters, and the challenge is to determine the hidden parameters from the observable ones. The extracted model parameters can then be used to perform further analysis, for example in pattern recognition applications.

Hidden Markov Model - A Machine Learning Method

Hidden Markov Model In a regular Markov model, the state is directly visible to the observer, and the state transition probabilities are the only parameters. In a hidden Markov model, the state is not directly visible, but variables influenced by the state are visible. Each state has a probability distribution over the possible output tokens. Therefore, the sequence of tokens generated by an HMM gives some information about the sequence of states. Hidden Markov models are especially known for their applications in temporal pattern recognition such as speech, handwriting, and gesture recognition, and in bioinformatics.
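The generative process described above can be sketched in a few lines of Python. This is a toy two-state model whose state names and probabilities are illustrative only: each step emits a token from the current state's output distribution, then moves to the next state using only the current state, and an observer sees the tokens but never the states.

```python
import random

# Toy two-state HMM; state names and probabilities are illustrative only.
states = ('Rainy', 'Sunny')
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

def draw(dist):
    # Sample one key from a {outcome: probability} dictionary.
    r, acc = random.random(), 0.0
    for outcome, p in dist.items():
        acc += p
        if r < acc:
            return outcome
    return outcome  # guard against floating-point round-off

def generate(length, state='Rainy'):
    # Emit a token from the current state, then transition using only that state.
    hidden, observed = [], []
    for _ in range(length):
        hidden.append(state)
        observed.append(draw(emit_p[state]))
        state = draw(trans_p[state])
    return hidden, observed
```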

History of Hidden Markov Model Hidden Markov models were first described in a series of statistical papers by Leonard E. Baum and other authors in the 1960s. One of the first applications of HMMs was speech recognition. In the second half of the 1980s, HMMs began to be applied to the analysis of biological sequences, in particular DNA. Since then, they have become ubiquitous in the field of bioinformatics.

Applications of hidden Markov models
speech recognition, gesture and body motion recognition, optical character recognition
machine translation
bioinformatics and genomics:
prediction of protein-coding regions in genome sequences
modelling families of related DNA or protein sequences
prediction of secondary structure elements from protein primary sequences

Architecture of a Hidden Markov Model The random variable x(t) is the value of the hidden variable at time t. The random variable y(t) is the value of the observed variable at time t. The arrows in the diagram denote conditional dependencies. From the diagram, it is clear that the value of the hidden variable x(t) (at time t) only depends on the value of the hidden variable x(t − 1) (at time t − 1). This is called the Markov property. Similarly, the value of the observed variable y(t) only depends on the value of the hidden variable x(t) (both at time t).

Probability of an observed sequence The probability of observing a sequence Y = y(0), y(1), ..., y(L − 1) of length L is given by

P(Y) = Σ_X P(Y | X) P(X),

where the sum runs over all possible hidden node sequences X = x(0), x(1), ..., x(L − 1). A brute-force calculation of P(Y) is intractable for realistic problems, as the number of possible hidden node sequences is typically extremely high. The calculation can, however, be sped up enormously using an algorithm called the forward-backward procedure.
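The forward part of that procedure computes P(Y) in time proportional to L·N² (N states) rather than summing over N^L paths, by keeping one running probability per state. A minimal sketch, re-declaring the two-state weather parameters used in the example later in this lecture so that it runs standalone:

```python
# Weather example parameters (same as in the concrete example below)
states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

def forward_prob(obs):
    # alpha[s] = P(y(0)..y(t), x(t) = s); updated once per observation
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for y in obs[1:]:
        alpha = {s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][y]
                 for s in states}
    return sum(alpha.values())
```

For the observations walk, shop, clean this gives P(Y) = 0.033612, the same value a brute-force sum over all 2³ hidden paths would produce.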

Using Hidden Markov Models
There are three canonical problems associated with HMMs:
1. Given the parameters of the model, compute the probability of a particular output sequence. This problem is solved by the forward-backward procedure.
2. Given the parameters of the model, find the most likely sequence of hidden states that could have generated a given output sequence. This problem is solved by the Viterbi algorithm.
3. Given an output sequence or a set of such sequences, find the most likely set of state transition and output probabilities; in other words, train the parameters of the HMM given a dataset of sequences. This problem is solved by the Baum-Welch algorithm.
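The third (training) problem can be sketched as a single Baum-Welch (EM) re-estimation step. This is an illustrative NumPy implementation, not the code of any particular toolkit: A is the state-transition matrix, B the emission matrix, pi the initial distribution, and obs a sequence of symbol indices.

```python
import numpy as np

def forward(A, B, pi, obs):
    # alpha[t, i] = P(y(0)..y(t), x(t) = i)
    alpha = np.zeros((len(obs), len(pi)))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, len(obs)):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

def backward(A, B, obs):
    # beta[t, i] = P(y(t+1)..y(L-1) | x(t) = i)
    beta = np.zeros((len(obs), A.shape[0]))
    beta[-1] = 1.0
    for t in range(len(obs) - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

def baum_welch_step(A, B, pi, obs):
    alpha, beta = forward(A, B, pi, obs), backward(A, B, obs)
    py = alpha[-1].sum()           # P(Y) under the current parameters
    gamma = alpha * beta / py      # gamma[t, i] = P(x(t) = i | Y)
    T, N = len(obs), len(pi)
    xi = np.zeros((T - 1, N, N))   # xi[t, i, j] = P(x(t) = i, x(t+1) = j | Y)
    for t in range(T - 1):
        xi[t] = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :] / py
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[np.array(obs) == k].sum(axis=0) / gamma.sum(axis=0)
    return new_A, new_B, new_pi
```

Each such step is guaranteed not to decrease P(Y); iterating to convergence is the full Baum-Welch training.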

A concrete example Your friend lives far away and you talk daily over the telephone about what he did that day. He is only interested in three activities: walking in the park, shopping, and cleaning his apartment. The choice of what to do is determined exclusively by the weather on a given day. You have no definite information about the weather where your friend lives, but you know general trends. Based on what he tells you he did each day, you try to guess what the weather must have been like.

You believe that the weather operates as a discrete Markov chain. There are two states, "Rainy" and "Sunny", but you cannot observe them directly; that is, they are hidden from you. On each day, there is a certain chance that your friend will perform one of the following activities, depending on the weather: "walk", "shop", or "clean". Since your friend tells you about his activities, those are the observations. The entire system is that of a hidden Markov model (HMM).

You know the general weather trends in the area, and what your friend likes to do on average. In other words, the parameters of the HMM are known.

You can write them down in the Python programming language:

states = ('Rainy', 'Sunny')
observations = ('walk', 'shop', 'clean')
start_probability = {'Rainy': 0.6, 'Sunny': 0.4}
transition_probability = {
   'Rainy' : {'Rainy': 0.7, 'Sunny': 0.3},
   'Sunny' : {'Rainy': 0.4, 'Sunny': 0.6},
   }
emission_probability = {
   'Rainy' : {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
   'Sunny' : {'walk': 0.6, 'shop': 0.3, 'clean': 0.1},
   }

In this piece of code, start_probability represents your uncertainty about which state the HMM is in when your friend first calls you (all you know is that it tends to be rainy on average). The particular probability distribution used here is not the equilibrium one, which (given the transition probabilities) is approximately {'Rainy': 0.571, 'Sunny': 0.429}. The transition_probability represents how the weather changes in the underlying Markov chain. In this example, there is only a 30% chance that tomorrow will be sunny if today is rainy. The emission_probability represents how likely your friend is to perform a certain activity on each day: if it is rainy, there is a 50% chance that he is cleaning his apartment; if it is sunny, there is a 60% chance that he is outside for a walk.
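With these parameters, the Viterbi algorithm mentioned earlier can recover the most likely weather sequence behind a sequence of reported activities. A minimal sketch, re-declaring the same example parameters so that it runs standalone:

```python
states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

def viterbi(obs):
    # V[s] = probability of the single best state path ending in state s
    V = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    path = {s: [s] for s in states}
    for y in obs[1:]:
        V_new, path_new = {}, {}
        for s in states:
            # Best predecessor state for s at this step
            prob, prev = max((V[r] * trans_p[r][s], r) for r in states)
            V_new[s] = prob * emit_p[s][y]
            path_new[s] = path[prev] + [s]
        V, path = V_new, path_new
    best = max(states, key=lambda s: V[s])
    return V[best], path[best]

prob, path = viterbi(('walk', 'shop', 'clean'))
```

For the observations walk, shop, clean this yields the path Sunny, Rainy, Rainy with probability 0.01344.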

SignalP: Hidden Markov Model
Three regions:
1. the N-terminal part
2. the hydrophobic region
3. the region around the cleavage site
For known signal peptides, the model can be used to assign objective boundaries between these three regions.
