Discrete-Time Markov Chain (Continuation)


Converting Some Non-Markov Chains to Markov Chains
Let us explain this with an example. Suppose we have a stochastic process $Y_t$ with $Y_t \in \{-100, 100\}$.

Converting Some Non-Markov Chains to Markov Chains
Assume that the probabilities are

$P(Y_{t+1} = -100 \mid Y_t = -100, Y_{t-1} = -100) = 0.1$
$P(Y_{t+1} = -100 \mid Y_t = 100, Y_{t-1} = -100) = 0.7$
$P(Y_{t+1} = -100 \mid Y_t = -100, Y_{t-1} = 100) = 0.8$
$P(Y_{t+1} = -100 \mid Y_t = 100, Y_{t-1} = 100) = 0.05$

Converting Some Non-Markov Chains to Markov Chains
Likewise, assume that

$P(Y_{t+1} = 100 \mid Y_t = -100, Y_{t-1} = -100) = 0.9$
$P(Y_{t+1} = 100 \mid Y_t = 100, Y_{t-1} = -100) = 0.3$
$P(Y_{t+1} = 100 \mid Y_t = -100, Y_{t-1} = 100) = 0.2$
$P(Y_{t+1} = 100 \mid Y_t = 100, Y_{t-1} = 100) = 0.95$

(For each conditioning pair, these are the complements of the probabilities on the previous slide, as they must be, since $Y_{t+1}$ can only equal $-100$ or $100$.)

Converting Some Non-Markov Chains to Markov Chains
Question: Is $Y_t$ a Markov chain? No: the distribution of $Y_{t+1}$ depends on $Y_{t-1}$ as well as on $Y_t$ (for example, $P(Y_{t+1} = -100 \mid Y_t = -100, Y_{t-1} = -100) = 0.1$, but $P(Y_{t+1} = -100 \mid Y_t = -100, Y_{t-1} = 100) = 0.8$), so the current value alone does not determine the transition probabilities.

Converting Some Non-Markov Chains to Markov Chains
Now suppose we have a stochastic process $X_t$ with $X_t \in \{1, 2, 3, 4\}$, defined as

$X_t = \begin{cases} 1 & \text{if } Y_t = 100 \text{ and } Y_{t-1} = -100 \\ 2 & \text{if } Y_t = -100 \text{ and } Y_{t-1} = 100 \\ 3 & \text{if } Y_t = -100 \text{ and } Y_{t-1} = -100 \\ 4 & \text{if } Y_t = 100 \text{ and } Y_{t-1} = 100 \end{cases}$
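This encoding is easy to state in code. Below is a minimal Python sketch (our own, not from the slides; the name encode_state is a hypothetical choice) that maps a pair of consecutive Y-values to the composite state:

```python
def encode_state(y_t, y_prev):
    """Map the pair (Y_t, Y_{t-1}) to the composite state X_t in {1, 2, 3, 4},
    following the definition on the slide above."""
    mapping = {
        (100, -100): 1,
        (-100, 100): 2,
        (-100, -100): 3,
        (100, 100): 4,
    }
    return mapping[(y_t, y_prev)]

# Example: Y_{t-1} = -100 followed by Y_t = 100 is state 1.
assert encode_state(100, -100) == 1
assert encode_state(-100, -100) == 3
```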

Converting Some Non-Markov Chains to Markov Chains
The transition probabilities into state 1, where $X_{t+1} = 1$ corresponds to $(Y_{t+1}, Y_t) = (100, -100)$, are

$P(X_{t+1} = 1 \mid X_t = 1) = 0$, since $X_{t+1} = 1$ requires $Y_t = -100$ while $X_t = 1$ means $Y_t = 100$
$P(X_{t+1} = 1 \mid X_t = 2) = P(Y_{t+1} = 100 \mid Y_t = -100, Y_{t-1} = 100) = 0.2$
$P(X_{t+1} = 1 \mid X_t = 3) = P(Y_{t+1} = 100 \mid Y_t = -100, Y_{t-1} = -100) = 0.9$
$P(X_{t+1} = 1 \mid X_t = 4) = 0$, since $X_t = 4$ means $Y_t = 100$

Converting Some Non-Markov Chains to Markov Chains
The transition probabilities into state 2, where $X_{t+1} = 2$ corresponds to $(Y_{t+1}, Y_t) = (-100, 100)$, are

$P(X_{t+1} = 2 \mid X_t = 1) = P(Y_{t+1} = -100 \mid Y_t = 100, Y_{t-1} = -100) = 0.7$
$P(X_{t+1} = 2 \mid X_t = 2) = 0$, since $X_{t+1} = 2$ requires $Y_t = 100$ while $X_t = 2$ means $Y_t = -100$
$P(X_{t+1} = 2 \mid X_t = 3) = 0$, since $X_t = 3$ means $Y_t = -100$
$P(X_{t+1} = 2 \mid X_t = 4) = P(Y_{t+1} = -100 \mid Y_t = 100, Y_{t-1} = 100) = 0.05$

Converting Some Non-Markov Chains to Markov Chains
The transition probabilities into state 3, where $X_{t+1} = 3$ corresponds to $(Y_{t+1}, Y_t) = (-100, -100)$, are left as an exercise (fill in the blanks):

$P(X_{t+1} = 3 \mid X_t = 1) = \_\_\_$, with $(Y_t, Y_{t-1}) = (100, -100)$
$P(X_{t+1} = 3 \mid X_t = 2) = \_\_\_$, with $(Y_t, Y_{t-1}) = (-100, 100)$
$P(X_{t+1} = 3 \mid X_t = 3) = \_\_\_$, with $(Y_t, Y_{t-1}) = (-100, -100)$
$P(X_{t+1} = 3 \mid X_t = 4) = \_\_\_$, with $(Y_t, Y_{t-1}) = (100, 100)$

Converting Some Non-Markov Chains to Markov Chains
Similarly, fill in the transition probabilities into state 4, where $X_{t+1} = 4$ corresponds to $(Y_{t+1}, Y_t) = (100, 100)$:

$P(X_{t+1} = 4 \mid X_t = 1) = \_\_\_$, with $(Y_t, Y_{t-1}) = (100, -100)$
$P(X_{t+1} = 4 \mid X_t = 2) = \_\_\_$, with $(Y_t, Y_{t-1}) = (-100, 100)$
$P(X_{t+1} = 4 \mid X_t = 3) = \_\_\_$, with $(Y_t, Y_{t-1}) = (-100, -100)$
$P(X_{t+1} = 4 \mid X_t = 4) = \_\_\_$, with $(Y_t, Y_{t-1}) = (100, 100)$

Converting Some Non-Markov Chains to Markov Chains
Question: Is $X_t$ a Markov chain? Yes: by construction the current state $X_t$ already encodes the pair $(Y_t, Y_{t-1})$, so the distribution of $X_{t+1}$ depends only on $X_t$.

Converting Some Non-Markov Chains to Markov Chains
The transition matrix of the four-state chain $X_t$, with rows indexed by the current state $X_t$ and columns by the next state $X_{t+1}$, is

$\mathbf{P} = \begin{pmatrix} 0 & 0.7 & \_\_ & \_\_ \\ 0.2 & 0 & \_\_ & \_\_ \\ 0.9 & 0 & \_\_ & \_\_ \\ 0 & 0.05 & \_\_ & \_\_ \end{pmatrix}$

The blanks in the last two columns are the exercise entries from the previous two slides. Also, draw the state transition diagram.
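As a check, here is a short Python sketch (again our own, not part of the slides) that rebuilds the full matrix $\mathbf{P}$ from the eight conditional probabilities of $Y_t$ and verifies that every row sums to 1; running it also recovers the entries left blank in the exercises above.

```python
# P(Y_{t+1} = y_next | Y_t = y, Y_{t-1} = y_prev), keyed by (y_next, y, y_prev);
# these are the eight probabilities given on the earlier slides.
p_y = {
    (-100, -100, -100): 0.10, (-100, 100, -100): 0.70,
    (-100, -100,  100): 0.80, (-100, 100,  100): 0.05,
    ( 100, -100, -100): 0.90, ( 100, 100, -100): 0.30,
    ( 100, -100,  100): 0.20, ( 100, 100,  100): 0.95,
}

# Composite state i <-> (Y_t, Y_{t-1}), exactly as defined on the slide.
state = {1: (100, -100), 2: (-100, 100), 3: (-100, -100), 4: (100, 100)}

P = [[0.0] * 4 for _ in range(4)]
for i, (y, y_prev) in state.items():          # current state X_t = i
    for j, (y_next, y_mid) in state.items():  # candidate next state X_{t+1} = j
        # A move i -> j is possible only when the histories overlap:
        # the Y_t component of state j must equal the Y_t component of state i.
        if y_mid == y:
            P[i - 1][j - 1] = p_y[(y_next, y, y_prev)]

for i, row in enumerate(P, start=1):
    print(f"row {i}: {row}  sum = {sum(row):.2f}")
# The rows come out as (0, 0.7, 0, 0.3), (0.2, 0, 0.8, 0),
# (0.9, 0, 0.1, 0), and (0, 0.05, 0, 0.95), each summing to 1.
```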

Converting Some Non-Markov Chains to Markov Chains
Reflection: a Markov chain can still incorporate information about the past, but doing so requires enlarging the state space. Here, encoding the pair $(Y_t, Y_{t-1})$ grew the state space from two states to four.