Tutorial: Markov Chains

Tutorial: Markov Chains Steve Gu Feb 28, 2008

Outline: Markov chain; Applications (weather forecasting, enrollment assessment, sequence generation, rank the web page, life cycle analysis); Summary

History The origin of Markov chains is due to Andrey Markov, a Russian mathematician who in 1907 published, in the Imperial Academy of Sciences in St. Petersburg, a paper studying the statistical behavior of the letters in Eugene Onegin, a well-known poem by Pushkin.

A Markov Chain

Transition Probability Table

Example 1: Weather Forecasting [1]

Weather Forecasting Weather forecasting example: suppose tomorrow's weather depends on today's weather only. We call this a first-order Markov chain, as the transition function depends on the current state only. Given that today is sunny, what is the probability that the coming days are sunny, rainy, cloudy, cloudy, sunny? The answer is simply the product of the one-step transition probabilities: (0.5)(0.4)(0.3)(0.5)(0.2) = 0.006. [State diagram: sunny, rainy, cloudy, with the one-step transition probabilities on the arrows.]
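
A rough sketch of this computation: the snippet below multiplies one-step transition probabilities along a given path of weather states. The probability values are illustrative placeholders chosen to be consistent with the product above, not necessarily the exact numbers from the slide's diagram.

```python
# Probability of a specific weather sequence under a first-order Markov chain.
# The transition probabilities below are illustrative placeholders.
P = {
    "sunny":  {"sunny": 0.5, "rainy": 0.4, "cloudy": 0.1},
    "rainy":  {"sunny": 0.4, "rainy": 0.3, "cloudy": 0.3},
    "cloudy": {"sunny": 0.2, "rainy": 0.3, "cloudy": 0.5},
}

def sequence_probability(start, sequence, P):
    """Multiply the one-step transition probabilities along the path."""
    prob, current = 1.0, start
    for nxt in sequence:
        prob *= P[current][nxt]
        current = nxt
    return prob

# Given today is sunny: probability of sunny, rainy, cloudy, cloudy, sunny
print(sequence_probability("sunny",
                           ["sunny", "rainy", "cloudy", "cloudy", "sunny"], P))
# -> 0.5 * 0.4 * 0.3 * 0.5 * 0.2 = 0.006
```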

Weather Forecasting Weather forecasting example: given that today is sunny, what is the probability that it will be rainy 4 days later? We only know the start state, the final state, and the number of steps (4); there are many possible combinations of states in between.

Weather Forecasting Weather forecasting example: Chapman-Kolmogorov equation: P_{ij}^{(n+m)} = \sum_k P_{ik}^{(n)} P_{kj}^{(m)}. Transition matrix P, with rows and columns indexed by the states (s, r, c).

Weather Forecasting Weather forecasting example: Two days: P^{(2)} = P \cdot P; for example, the sunny-to-rainy entry is P_{01}^{(2)} = P_{00}P_{01} + P_{01}P_{11} + P_{02}P_{21}. Four days: P^{(4)} = P^{(2)} \cdot P^{(2)}.
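
A minimal sketch of the n-step computation: by the Chapman-Kolmogorov equation, the n-day transition probabilities are the n-th matrix power of the one-step matrix. The matrix entries are the same illustrative placeholders as above.

```python
import numpy as np

# One-step transition matrix (rows/columns ordered sunny, rainy, cloudy);
# the values are illustrative placeholders.
P = np.array([[0.5, 0.4, 0.1],
              [0.4, 0.3, 0.3],
              [0.2, 0.3, 0.5]])

P2 = P @ P                          # two-day transition matrix, P^(2)
P4 = np.linalg.matrix_power(P, 4)   # four-day transition matrix, P^(4)

SUNNY, RAINY, CLOUDY = 0, 1, 2
# P(rainy four days later | sunny today) = entry (sunny, rainy) of P^(4)
print(P4[SUNNY, RAINY])
```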

Weather Forecasting Weather forecasting example: what is the probability that today is cloudy? There are infinitely many days before today, so this is equivalent to asking for the probability after an infinite number of steps. We do not care about the start state, since its effect fades away over infinitely many steps. We call this the limiting probability, reached once the chain becomes steady.

Weather Forecasting Weather forecasting example: since the start state is a "don't care", instead of forming a 2-D matrix, the limiting probability is expressed as a single row vector \pi. Since the chain is steady, the limiting probability does not change even if it takes one more step.

Weather Forecasting Weather forecasting example: so the limiting probability can be computed by solving \pi = \pi P together with \pi_s + \pi_r + \pi_c = 1. Solving this system gives the probability that today is cloudy.
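
A small sketch of this calculation: solve \pi = \pi P together with the normalization constraint, here by least squares on the stacked linear system, using the same illustrative matrix as above.

```python
import numpy as np

# Illustrative one-step transition matrix (rows: sunny, rainy, cloudy).
P = np.array([[0.5, 0.4, 0.1],
              [0.4, 0.3, 0.3],
              [0.2, 0.3, 0.5]])

n = P.shape[0]
# Stationarity: (P^T - I) pi = 0, plus the normalization row: 1 . pi = 1
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

limiting = dict(zip(["sunny", "rainy", "cloudy"], pi))
print(limiting)              # limiting["cloudy"] answers the question
```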

Example 2: Enrollment Assessment [2]

Undergraduate Enrollment Model States: Freshmen, Sophomore, Junior, Senior, Stop Out, Graduate

State Transition Probabilities TP over the states Fr, So, Jr, Sr, S/O, Gr (rows = current state): Fr: .2 .65 .14 .01; So: .25 .6 .13 .02; Jr: .3 .55 .12 .03; Sr, S/O and Gr rows: .4 .05, 0.1 0.4 0.3, 1

Enrollment Assessment Given the transition probability table TP and an initial enrollment estimate, we can estimate the number of students in each state at each time point.
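
As a sketch of this estimation step, the code below pushes an initial enrollment vector through a transition matrix year by year. The Fr, So and Jr rows reuse the numbers shown above, but their column placement, the Sr and S/O rows, and the head counts are assumptions made purely for illustration.

```python
import numpy as np

# Transition matrix over the states Fr, So, Jr, Sr, S/O (stop out), Gr.
# The Sr and S/O rows are hypothetical; Gr is absorbing.
TP = np.array([
    # Fr    So    Jr    Sr    S/O   Gr
    [0.20, 0.65, 0.00, 0.00, 0.14, 0.01],  # Fr
    [0.00, 0.25, 0.60, 0.00, 0.13, 0.02],  # So
    [0.00, 0.00, 0.30, 0.55, 0.12, 0.03],  # Jr
    [0.00, 0.00, 0.00, 0.40, 0.05, 0.55],  # Sr (hypothetical)
    [0.10, 0.20, 0.20, 0.20, 0.30, 0.00],  # S/O (hypothetical)
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],  # Gr (absorbing)
])

# Hypothetical initial enrollment estimate (head counts per state).
enrollment = np.array([1000, 900, 850, 800, 100, 0], dtype=float)

for year in range(1, 5):
    enrollment = enrollment @ TP          # expected counts one year later
    print(f"year {year}:", np.round(enrollment))
```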

Example 3: Sequence Generation [3]

Sequence Generation

Markov Chains as Models of Sequence Generation 0th-order, 1st-order, 2nd-order
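
The sketch below generates a symbol sequence from a first-order chain by repeatedly sampling the next symbol given the current one; the DNA-style alphabet and the transition probabilities are invented purely for illustration.

```python
import random

# First-order transition probabilities over a toy 4-letter alphabet
# (all values invented for illustration; each row sums to 1).
P = {
    "A": {"A": 0.4, "C": 0.2, "G": 0.2, "T": 0.2},
    "C": {"A": 0.1, "C": 0.5, "G": 0.3, "T": 0.1},
    "G": {"A": 0.2, "C": 0.3, "G": 0.4, "T": 0.1},
    "T": {"A": 0.3, "C": 0.1, "G": 0.1, "T": 0.5},
}

def generate(start, length, P, seed=None):
    """Sample a sequence of the given length, starting from `start`."""
    rng = random.Random(seed)
    seq = [start]
    while len(seq) < length:
        symbols, weights = zip(*P[seq[-1]].items())
        seq.append(rng.choices(symbols, weights=weights, k=1)[0])
    return "".join(seq)

print(generate("A", 30, P, seed=0))
```

A k-th order chain would condition on the previous k symbols instead of just the last one, which leads to the fifth-order example on the next slide.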

A Fifth Order Markov Chain

Example 4: Rank the web page

PageRank How to rank the importance of web pages?

PageRank http://en.wikipedia.org/wiki/Image:PageRanks-Example.svg

PageRank: Markov Chain For N pages p_1, …, p_N, write the equation to compute PageRank as PR(p_i) = \sum_j l(i, j) PR(p_j), where l(i, j) is defined to be 1/L(p_j) if page p_j links to page p_i (with L(p_j) the number of outgoing links of p_j), and 0 otherwise.

PageRank: Markov Chain Written in matrix form: R = L R, where R is the column vector of PageRank values and L = [l(i, j)] is the column-stochastic link matrix, so R is the stationary distribution of the chain.
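
A compact sketch of this matrix form: build the link matrix L for a small, made-up 4-page graph and run power iteration R <- L R until it settles, using the simple formulation above (no damping factor).

```python
import numpy as np

# Made-up link structure: page -> pages it links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
N = len(links)

# L[i, j] = 1 / outdegree(j) if page j links to page i, else 0.
L = np.zeros((N, N))
for j, outlinks in links.items():
    for i in outlinks:
        L[i, j] = 1.0 / len(outlinks)

R = np.full(N, 1.0 / N)      # start from the uniform distribution
for _ in range(100):
    R = L @ R                # power iteration: R <- L R

print(R)                     # approximate PageRank of pages 0..3
```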

Example 5: Life Cycle Analysis [4]

How to model the life cycles of whales? http://www.specieslist.com/images/external/Humpback_Whale_underwater.jpg

Life cycle analysis In real applications, we need to specify or learn the transition probability table. States: calf, immature, mature, mom, post-mom, dead.
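
As an illustration of such a table, the sketch below builds a toy life-cycle chain over the stages named above, with invented transition probabilities, and uses the fundamental matrix of the absorbing chain to get the expected time until death (life expectancy) from each stage.

```python
import numpy as np

stages = ["calf", "immature", "mature", "mom", "post-mom", "dead"]

# Toy transition matrix (one time step = one year); all values are invented
# for illustration, not estimated right-whale parameters. "dead" is absorbing.
P = np.array([
    # calf  imm   mat   mom   post  dead
    [0.00, 0.90, 0.00, 0.00, 0.00, 0.10],  # calf
    [0.00, 0.75, 0.20, 0.00, 0.00, 0.05],  # immature
    [0.00, 0.00, 0.65, 0.30, 0.00, 0.05],  # mature
    [0.00, 0.00, 0.00, 0.00, 0.90, 0.10],  # mom
    [0.00, 0.00, 0.85, 0.00, 0.10, 0.05],  # post-mom
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],  # dead
])

# Q = transitions among the living stages; the fundamental matrix
# N = (I - Q)^-1 counts expected visits, and its row sums give the
# expected number of steps until absorption in "dead".
Q = P[:5, :5]
N = np.linalg.inv(np.eye(5) - Q)
life_expectancy = N.sum(axis=1)
print(dict(zip(stages[:5], np.round(life_expectancy, 1))))
```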

Application: The North Atlantic right whale (Eubalaena glacialis). This is the North Atlantic right whale, shown as a mother and calf. (Slides: Hal Caswell, Markov Anniversary Meeting, June 2006.)

Endangered, by any standard: N < 300 individuals; minimal recovery since 1935. Threats: ship strikes and entanglement with fishing gear. [Map: distribution of the right whale, with feeding and calving grounds.]

Mortality and serious injury due to entanglement and ship strikes: every year several whales are killed by ship strikes or entanglement in fishing gear. Examples: whale #1014 "Staccato" died April 1999 (ship strike); whale #2030 died October 1999 (entanglement).

[Figure: calf survival by year, 1980-1996, showing the time trend and the best-model estimate.]

[Figure: mother survival by year, 1980-1996, showing the time trend and the best-model estimate.]

[Figure: birth probability by year, 1980-1996, showing the time trend and the best-model estimate.]

[Figure: period life expectancy by year, 1980-1998.] Things don't look good for the right whale!

Summary Markov chains: a state-transition model. Some applications: natural language modeling, weather forecasting, enrollment assessment, sequence generation, ranking web pages, life cycle analysis, etc. (Hopefully you will find more!)

Thank you Q&A

References
[1] http://adammikeal.org/courses/compute/presentations/Markov_model.ppt
[2] http://uaps.ucf.edu/doc/AIR2006MarkovChain051806.ppt
[3] http://germain.umemat.maine.edu/faculty/khalil/courses/MAT500/JGraber/genes2007.ppt
[4] http://www.csc2.ncsu.edu/conferences/nsmc/MAM2006/caswell.ppt