Solutions Markov Chains 2

3) Given the following one-step transition matrix of a Markov chain, determine the classes of the Markov chain and whether they are recurrent.

Drawing the transition diagram, we see that states 0 and 1 communicate. Once you leave state 2, there is a positive probability you will never return; therefore, state 2 is transient. States 3 and 4 communicate with each other but not with states 0, 1, or 2. Consequently,

{ 0, 1 } is recurrent
{ 2 } is transient
{ 3, 4 } is recurrent
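
This classification can also be checked mechanically. Below is a minimal Python sketch that finds the communicating classes from pairwise reachability and marks a class recurrent exactly when it is closed. The matrix on the original slide was an image and is not reproduced here, so the matrix below is an assumed stand-in chosen to be consistent with the stated classes.

import numpy as np

# Assumed 5-state matrix consistent with the stated answer:
# {0, 1} closed, {2} transient, {3, 4} closed.
P = np.array([
    [0.5, 0.5, 0.0, 0.0, 0.0],
    [0.7, 0.3, 0.0, 0.0, 0.0],
    [0.1, 0.1, 0.4, 0.2, 0.2],
    [0.0, 0.0, 0.0, 0.4, 0.6],
    [0.0, 0.0, 0.0, 0.8, 0.2],
])
n = len(P)

# reach[i, j] is True when state j is reachable from state i.
reach = (P > 0) | np.eye(n, dtype=bool)
for k in range(n):  # Warshall's transitive closure
    reach |= reach[:, [k]] & reach[[k], :]

# States i and j belong to the same class when each reaches the other.
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
    if cls not in classes:
        classes.append(cls)

# In a finite chain, a class is recurrent iff it is closed
# (no positive-probability transition leaves the class).
for cls in classes:
    closed = all(P[i, j] == 0 for i in cls for j in range(n) if j not in cls)
    print(sorted(cls), "recurrent" if closed else "transient")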

Solutions Markov Chains 6

4) A computer is inspected at the end of every hour. It is found to be either working (up) or failed (down). If the computer is found to be up, the probability of its remaining up for the next hour is some value p. If it is down, the computer is repaired, which may require more than one hour. Whenever the computer is down (regardless of how long it has been down), the probability of its still being down 1 hour later is some value q. (The numeric values of p and q on the original slide were not preserved, so the solution is carried through symbolically.)

a. Construct the one-step transition probability matrix.
b. Find the expected first passage time μij from i to j for all i, j.

Soln: a. Let
S = 0 if the computer is down
  = 1 if the computer is up

Then the one-step transition probability matrix is

          0     1
P = 0 [   q   1-q ]
    1 [ 1-p    p  ]

Solutions Markov Chains 7

4) (cont.) b. Find expected first passage times. Using μij = 1 + Σk≠j pik μkj:

μ01 = 1 + p00 μ01  ⇒  μ01 = 1/(1 - p00) = 1/(1 - q)
μ10 = 1 + p11 μ10  ⇒  μ10 = 1/(1 - p11) = 1/(1 - p)
μ00 = 1 + p01 μ10 = 1 + (1 - q)/(1 - p)
μ11 = 1 + p10 μ01 = 1 + (1 - p)/(1 - q)
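
These formulas can be verified symbolically. A minimal sketch using sympy, keeping p and q as symbols (an assumption, since the slide's numeric values are unavailable):

import sympy as sp

p, q = sp.symbols("p q", positive=True)
m00, m01, m10, m11 = sp.symbols("mu00 mu01 mu10 mu11")

# First passage time equations mu_ij = 1 + sum_{k != j} p_ik mu_kj
# for the two-state chain with p00 = q, p01 = 1-q, p10 = 1-p, p11 = p.
eqs = [
    sp.Eq(m00, 1 + (1 - q) * m10),
    sp.Eq(m11, 1 + (1 - p) * m01),
    sp.Eq(m01, 1 + q * m01),
    sp.Eq(m10, 1 + p * m10),
]
sol = sp.solve(eqs, [m00, m01, m10, m11], dict=True)[0]
for sym, expr in sol.items():
    print(sym, "=", sp.simplify(expr))
# The solution agrees with the hand derivation: mu01 = 1/(1-q),
# mu10 = 1/(1-p), mu00 = 1 + (1-q)/(1-p), mu11 = 1 + (1-p)/(1-q).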

Solutions Markov Chains 8

4) (cont.) Alternative solution to b. Write out the full set of first passage time equations:

(1) μ00 = 1 + p01 μ10
(2) μ11 = 1 + p10 μ01
(3) μ01 = 1 + p00 μ01
(4) μ10 = 1 + p11 μ10

Substituting (4) into (1) gives

μ00 = 1 + p01/(1 - p11) = 1 + (1 - q)/(1 - p)

And from (4),

μ10 = 1/(1 - p11) = 1/(1 - p)

Solutions Markov Chains 9

4) (cont.) Alternative solution to b. Similarly, (3) gives μ01 = 1/(1 - p00) = 1/(1 - q), and substituting this into (2) gives μ11 = 1 + p10/(1 - p00) = 1 + (1 - p)/(1 - q), in agreement with the direct solution.
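
With numbers in hand, the same system can be solved numerically. A minimal sketch, assuming illustrative values p = 0.9 and q = 0.5 (hypothetical, since the slide's values were lost):

import numpy as np

p, q = 0.9, 0.5  # assumed values; the original slide's numbers were lost
P = np.array([[q, 1 - q],        # state 0 = down
              [1 - p, p]])       # state 1 = up
n = len(P)

# For a fixed target j, the vector m with m[i] = mu_ij satisfies
# (I - Q) m = 1, where Q is P with column j zeroed out.
for j in range(n):
    Q = P.copy()
    Q[:, j] = 0.0
    m = np.linalg.solve(np.eye(n) - Q, np.ones(n))
    for i in range(n):
        print(f"mu_{i}{j} = {m[i]:.4f}")

With these assumed values, mu_10 = 1/(1 - 0.9) = 10 hours to failure and mu_01 = 1/(1 - 0.5) = 2 hours to repair, matching the symbolic formulas.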

Solutions Markov Chains 10

5) A manufacturer has a machine that, when operational at the beginning of a day, has a probability of 0.1 of breaking down sometime during the day. When this happens, the repair is done the next day and completed at the end of that day.

a. Formulate the evolution of the status of the machine as a 3-state Markov chain.
b. Find the expected first passage times μij from i to j.
c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?

Soln: a. Let
S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

Solutions Markov Chains 11

5) (cont.) a.
S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

From state 0 or state 2 the machine is running at the start of the next day, so it ends that day in state 0 with probability 0.9 and in state 1 with probability 0.1. From state 1 the machine is down at the start of the next day and the repair is completed by that day's end, so the chain moves to state 2 with probability 1. Continuing in this fashion gives the one-step transition probability matrix

          0    1    2
P = 0 [ 0.9  0.1   0 ]
    1 [  0    0    1 ]
    2 [ 0.9  0.1   0 ]

b. μ01 = 1 + p00 μ01  ⇒  μ01 = 1/0.1 = 10

Note: This makes intuitive sense. If the machine has a 10% chance of failing on any given day, then the expected number of days between failures is 10 (μ01 = 10).

Solutions Markov Chains 12

5) (cont.) b. The remaining first passage time equations, using μij = 1 + Σk≠j pik μkj:

μ12 = 1                       (state 1 moves to state 2 with probability 1)
μ02 = 1 + p00 μ02 + p01 μ12
μ22 = 1 + p20 μ02 + p21 μ12
μ10 = 1 + p12 μ20
μ20 = 1 + p21 μ10
μ00 = 1 + p01 μ10
μ21 = 1 + p20 μ01 = 1 + 0.9(10) = 10
μ11 = 1 + p12 μ21 = 1 + μ21 = 11

Solutions Markov Chains 13

5) (cont.) Back substituting for μ02, μ10, μ11:

μ02 = 1 + 0.9 μ02 + 0.1(1)  ⇒  0.1 μ02 = 1.1  ⇒  μ02 = 11
μ10 = 1 + μ20 and μ20 = 1 + 0.1 μ10  ⇒  μ10 = 20/9 ≈ 2.22, μ20 = 11/9 ≈ 1.22
μ00 = 1 + 0.1 μ10 = 11/9 ≈ 1.22
μ22 = 1 + 0.9(11) + 0.1(1) = 11

Solutions Markov Chains 14

5) (cont.) c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair? By the Markov property, the 20 breakdown-free days are irrelevant; only the current state matters.

If we read this as the expected number of days to breakdown since the last repair, we are asking for μ21, in which case μ21 = 10.

If we read this as the expected number of days to breakdown and subsequent repair since the last repair, we are asking for μ22 = 11.

Again, this should make intuitive sense. The machine has a 10% chance of breaking down on any given day, so the expected time between failures is 10 days. Since it takes 1 day to repair, the time from repair to repair is 10 + 1 = 11 days.
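
These values can be verified with the same numeric approach used for problem 4; a minimal sketch:

import numpy as np

# One-step transition matrix from part a (states: 0 = ran all day,
# 1 = broke down during the day, 2 = down at start, repaired by day's end).
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0]])
n = len(P)

# Solve (I - Q) m = 1 for each target state j, where Q is P with
# column j zeroed out; m[i] is then the first passage time mu_ij.
for j in range(n):
    Q = P.copy()
    Q[:, j] = 0.0
    m = np.linalg.solve(np.eye(n) - Q, np.ones(n))
    print(f"target {j}:", np.round(m, 4))

# The output includes mu_01 = 10, mu_21 = 10, and mu_22 = 11,
# matching the values derived above.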