Solutions Markov Chains 1

Presentation transcript:

Solutions Markov Chains 1

3) Given the following one-step transition matrix of a Markov chain, determine the classes of the Markov chain and whether they are recurrent.

[The transition matrix and its transition diagram for states 0-4 appeared as figures on the original slide.]

Drawing the transition diagram, we see that states 0 and 1 communicate. Once you leave state 2, there is a positive probability you will never return; therefore, state 2 is transient. States 3 and 4 communicate with each other but not with states 0, 1, or 2. Consequently,

{0, 1} is recurrent
{2} is transient
{3, 4} is recurrent
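Since the slide's matrix survives only as an image, here is a sketch of how the classification could be checked numerically. The matrix P below is a hypothetical stand-in chosen to match the stated answer; the logic itself is standard: group states into communicating classes by mutual reachability, and a finite class is recurrent exactly when it is closed.

```python
# Sketch: classify the states of a finite Markov chain.
# P is a hypothetical stand-in for the slide's matrix (lost image),
# chosen to reproduce the stated answer:
# {0,1} recurrent, {2} transient, {3,4} recurrent.

P = [
    [0.5, 0.5, 0.0, 0.0, 0.0],
    [0.6, 0.4, 0.0, 0.0, 0.0],
    [0.2, 0.2, 0.2, 0.2, 0.2],
    [0.0, 0.0, 0.0, 0.3, 0.7],
    [0.0, 0.0, 0.0, 0.9, 0.1],
]
n = len(P)

# reach[i][j] = True if j is reachable from i in zero or more steps.
reach = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
for k in range(n):                     # Warshall's transitive closure
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

# Communicating classes: i ~ j iff each is reachable from the other.
classes, assigned = [], set()
for i in range(n):
    if i in assigned:
        continue
    cls = {j for j in range(n) if reach[i][j] and reach[j][i]}
    classes.append(cls)
    assigned |= cls

# A finite class is recurrent iff it is closed (no one-step exit).
for cls in classes:
    closed = all(P[i][j] == 0 for i in cls for j in range(n) if j not in cls)
    print(sorted(cls), "recurrent" if closed else "transient")
```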

Solutions Markov Chains 2

4) A computer is inspected at the end of every hour. It is found to be either working (up) or failed (down). If the computer is found to be up, the probability of its remaining up for the next hour is 0.90. If it is down, the computer is repaired, which may require more than one hour. Whenever the computer is down (regardless of how long it has been down), the probability of its still being down 1 hour later is 0.35.

a. Construct the one-step transition probability matrix.
b. Find the expected first passage time from i to j for all i, j.

Soln: Let

S = 0 if computer is down
  = 1 if computer is up

Then, with rows and columns ordered 0 (down), 1 (up),

$$P = \begin{pmatrix} 0.35 & 0.65 \\ 0.10 & 0.90 \end{pmatrix}$$

b. Find expected first passage times.
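As a quick numerical check (not part of the original solution), the first-passage equations $m_{ij} = 1 + \sum_{k \neq j} p_{ik} m_{kj}$ can be solved for each target state j as a linear system obtained by zeroing column j of P; a minimal sketch:

```python
import numpy as np

# One-step transition matrix from part (a): state 0 = down, 1 = up.
P = np.array([[0.35, 0.65],
              [0.10, 0.90]])
n = P.shape[0]

# For target j, m = 1 + Q m, where Q is P with column j zeroed
# (re-entry to j is forbidden); solve (I - Q) m = 1 for all starts.
M = np.zeros((n, n))
for j in range(n):
    Q = P.copy()
    Q[:, j] = 0.0
    M[:, j] = np.linalg.solve(np.eye(n) - Q, np.ones(n))

print(M)
# Expected (hours):
#   m01 = 1/0.65 ≈ 1.54   (down -> up)
#   m10 = 1/0.10 = 10     (up -> down)
#   m00 = 7.5, m11 ≈ 1.15 (mean recurrence times)
```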

Solutions Markov Chains 3

4) (cont.) b. Find expected first passage times.

Using $m_{ij} = 1 + \sum_{k \neq j} p_{ik} m_{kj}$:

$m_{01} = 1 + p_{00} m_{01} \Rightarrow m_{01} = \frac{1}{0.65} \approx 1.54$ hours
$m_{10} = 1 + p_{11} m_{10} \Rightarrow m_{10} = \frac{1}{0.10} = 10$ hours
$m_{00} = 1 + p_{01} m_{10} = 1 + 0.65(10) = 7.5$ hours
$m_{11} = 1 + p_{10} m_{01} = 1 + 0.10(1.54) \approx 1.15$ hours

Solutions Markov Chains 4

4) (cont.) Alternative solution to b. Write the first-passage equations as a system:

$m_{00} = 1 + 0.65\, m_{10}$   (1)
$m_{01} = 1 + 0.35\, m_{01}$   (2)
$m_{11} = 1 + 0.10\, m_{01}$   (3)
$m_{10} = 1 + 0.90\, m_{10}$   (4)

Substituting (4) into (1) gives $m_{00} = 1 + 0.65(10) = 7.5$, and from (4), $0.10\, m_{10} = 1$, so $m_{10} = 10$. Equation (2) gives $m_{01} = 1/0.65 \approx 1.54$, and (3) then gives $m_{11} \approx 1.15$.
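A sketch of the same system solved symbolically (assuming sympy is available); this simply mirrors equations (1)-(4) above:

```python
from sympy import symbols, Eq, solve

m00, m01, m10, m11 = symbols("m00 m01 m10 m11")

# Equations (1)-(4) with p00 = 0.35, p01 = 0.65, p10 = 0.10, p11 = 0.90.
eqs = [
    Eq(m00, 1 + 0.65 * m10),   # (1)
    Eq(m01, 1 + 0.35 * m01),   # (2)
    Eq(m11, 1 + 0.10 * m01),   # (3)
    Eq(m10, 1 + 0.90 * m10),   # (4)
]

print(solve(eqs, [m00, m01, m10, m11]))
# {m00: 7.5, m01: 1.538..., m10: 10.0, m11: 1.153...}
```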

Solutions Markov Chains 5

4) (cont.) Alternative solution to b. The steady-state probabilities solve $p = pP$ with $p_0 + p_1 = 1$, giving

$p_0 = 0.13$, $p_1 = 0.87$

The mean recurrence times are their reciprocals: $m_{00} = 1/p_0 = 7.5$ and $m_{11} = 1/p_1 \approx 1.15$.
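A minimal sketch of this route in numpy: compute the steady-state distribution by replacing one balance equation with the normalization condition, then take reciprocals.

```python
import numpy as np

P = np.array([[0.35, 0.65],
              [0.10, 0.90]])
n = P.shape[0]

# Solve p P = p together with sum(p) = 1: drop one (redundant)
# balance equation and append the normalization row.
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
b = np.array([0.0] * (n - 1) + [1.0])
p = np.linalg.solve(A, b)

print(p)        # [0.1333..., 0.8666...] -> the slide's .13 and .87
print(1 / p)    # mean recurrence times: m00 = 7.5, m11 ≈ 1.15
```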

Solutions Markov Chains 6

5) A manufacturer has a machine that, when operational at the beginning of a day, has a probability of 0.1 of breaking down sometime during the day. When this happens, the repair is done the next day and completed at the end of that day.

a. Formulate the evolution of the status of the machine as a 3-state Markov chain.
b. Find the expected first passage times from i to j.
c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?

Soln: a. Let

S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

Solutions Markov Chains 7

5) (cont.) a.

S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

From state 0 the machine starts the next day running, so it ends that day down with probability 0.1 and up with probability 0.9. From state 1 the repair occupies the next day and is completed at its end, so the chain moves to state 2 with probability 1. From state 2 the machine again starts the next day running, so its transitions match those from state 0. Continuing in this fashion gives the one-step transition probabilities

$$P = \begin{pmatrix} 0.9 & 0.1 & 0 \\ 0 & 0 & 1 \\ 0.9 & 0.1 & 0 \end{pmatrix}$$

b. $m_{01} = 1 + p_{00} m_{01} \Rightarrow m_{01} = 1/0.1 = 10$ days.

Note: This makes intuitive sense. If the machine has a 10% chance of failing on any given day, then the expected number of days between failures is 10 ($m_{01} = 10$).
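A quick Monte Carlo sanity check (a sketch, not part of the original solution): simulate the chain day by day and estimate the mean time from state 0 to the first visit to state 1.

```python
import random

# Machine chain from part (a): rows/columns are states 0, 1, 2.
P = [[0.9, 0.1, 0.0],
     [0.0, 0.0, 1.0],
     [0.9, 0.1, 0.0]]

def first_passage_time(start, target, rng):
    """Simulate one first-passage time from start to target."""
    state, steps = start, 0
    while True:
        state = rng.choices([0, 1, 2], weights=P[state])[0]
        steps += 1
        if state == target:
            return steps

rng = random.Random(42)
trials = 100_000
avg = sum(first_passage_time(0, 1, rng) for _ in range(trials)) / trials
print(f"estimated m01 = {avg:.2f}   (exact value: 10)")
```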

Solutions Markov Chains 8

5) (cont.) b. Using $m_{ij} = 1 + \sum_{k \neq j} p_{ik} m_{kj}$:

First passage to state 0:
$m_{00} = 1 + p_{01} m_{10} = 1 + 0.1\, m_{10}$
$m_{10} = 1 + p_{12} m_{20} = 1 + m_{20}$
$m_{20} = 1 + p_{21} m_{10} = 1 + 0.1\, m_{10}$

First passage to state 1:
$m_{01} = 1 + p_{00} m_{01} = 1 + 0.9\, m_{01} \Rightarrow m_{01} = 10$
$m_{11} = 1 + p_{12} m_{21} = 1 + m_{21}$
$m_{21} = 1 + p_{20} m_{01} = 1 + 0.9\, m_{01}$

First passage to state 2:
$m_{02} = 1 + p_{00} m_{02} + p_{01} m_{12}$
$m_{12} = 1$  (since $p_{12} = 1$)
$m_{22} = 1 + p_{20} m_{02} + p_{21} m_{12}$

Solutions Markov Chains 9

5) (cont.) Back-substituting for $m_{02}$, $m_{10}$, and $m_{11}$:

$m_{02} = 1 + 0.9\, m_{02} + 0.1(1) \Rightarrow 0.1\, m_{02} = 1.1 \Rightarrow m_{02} = 11$
$m_{10} = 1 + m_{20} = 2 + 0.1\, m_{10} \Rightarrow m_{10} = 2/0.9 \approx 2.22$, so $m_{20} \approx 1.22$ and $m_{00} \approx 1.22$
$m_{21} = 1 + 0.9(10) = 10$, so $m_{11} = 1 + m_{21} = 11$
$m_{22} = 1 + 0.9(11) + 0.1(1) = 11$
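The same column-zeroing linear solve used for problem 4 reproduces all of these values at once; a sketch:

```python
import numpy as np

# Machine chain from part (a).
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0]])
n = P.shape[0]

M = np.zeros((n, n))
for j in range(n):
    Q = P.copy()
    Q[:, j] = 0.0                      # forbid re-entering target j
    M[:, j] = np.linalg.solve(np.eye(n) - Q, np.ones(n))

print(np.round(M, 2))
# Expected (days):
#   m00 ≈ 1.22, m01 = 10, m02 = 11
#   m10 ≈ 2.22, m11 = 11, m12 = 1
#   m20 ≈ 1.22, m21 = 10, m22 = 11
```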

Solutions Markov Chains 10

5) (cont.) c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair? By the Markov property, the 20 breakdown-free days are irrelevant: the chain is memoryless, so the answer is the same as on the day the repair was completed.

If we read this as the expected number of days to breakdown since the last repair, we are asking for $m_{21}$, in which case $m_{21} = 10$ days.

If we read this as the expected number of days to breakdown and subsequent repair since the last repair, we are asking for $m_{22} = 11$ days.

Again, this should make intuitive sense. The machine has a 10% chance of breaking down on any given day, so the expected time between failures is 10 days. Since it takes 1 day to repair, the time from repair to repair is 10 + 1 = 11 days.
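A sketch illustrating the memorylessness claim: the time to breakdown after a repair is geometric with p = 0.1, and conditioning on having survived 20 days leaves the expected remaining time at 10.

```python
import random

rng = random.Random(7)
trials, p_fail, remaining = 200_000, 0.1, []

for _ in range(trials):
    # Days until breakdown for a freshly repaired machine (geometric).
    days = 1
    while rng.random() >= p_fail:
        days += 1
    if days > 20:                       # survived 20 full days
        remaining.append(days - 20)     # days still left to breakdown

print(f"E[remaining days | 20 days up] ≈ "
      f"{sum(remaining) / len(remaining):.2f}   (exact: 10)")
```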

Solutions Markov Chains 11

A military maintenance depot overhauls tanks. There is room for 3 tanks in the facility and one tank in an overflow area. At most 4 tanks can be at the depot at one time. Every morning a tank arrives for an overhaul. If the depot is full, however, it is turned away; no new arrivals occur under these circumstances. On any given day, the following probabilities govern the number of overhauls completed:

No. tanks   0    1    2    3
Prob.      .2   .4   .3   .1

These values are independent of the number of tanks in the depot, but obviously no more tanks than are waiting in line at the start of the day can be completed. Develop a Markov chain model for this situation.

Soln: Let S = # tanks in the depot at the start of a day, just after a tank arrival, so S = 1, 2, 3, 4. (Note: since 1 arrives each day, we can never have S = 0.)

Start state   Event                    End state   Prob.
4             overhaul 0, 1 declined   4           .2
4             overhaul 1, 1 arrives    4           .4
4             overhaul 2, 1 arrives    3           .3
4             overhaul 3, 1 arrives    2           .1
3             overhaul 0, 1 arrives    4           .2
3             overhaul 1, 1 arrives    3           .4
3             overhaul 2, 1 arrives    2           .3
3             overhaul 3, 1 arrives    1           .1

Solutions Markov Chains 12

(cont.)

Start state   Event                      End state   Prob.
2             overhaul 0, 1 arrives      3           .2
2             overhaul 1, 1 arrives      2           .4
2             overhaul all, 1 arrives    1           .3 + .1
1             overhaul 0, 1 arrives      2           .2
1             overhaul all, 1 arrives    1           .4 + .3 + .1

Collecting these rows with those for states 4 and 3 on the previous slide gives the one-step transition matrix, with states ordered 1, 2, 3, 4:

$$P = \begin{pmatrix} 0.8 & 0.2 & 0 & 0 \\ 0.4 & 0.4 & 0.2 & 0 \\ 0.1 & 0.3 & 0.4 & 0.2 \\ 0 & 0.1 & 0.3 & 0.6 \end{pmatrix}$$
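A sketch that assembles this matrix directly from the dynamics in the table (completions capped at the tanks on hand; the next morning's arrival accepted only when the depot is not full):

```python
# Completion-count distribution for a day's overhauls.
q = {0: 0.2, 1: 0.4, 2: 0.3, 3: 0.1}

states = [1, 2, 3, 4]
P = {s: {t: 0.0 for t in states} for s in states}

for s in states:
    for k, prob in q.items():
        done = min(k, s)        # can't complete more than are on hand
        left = s - done         # tanks remaining at day's end
        # Next morning's arrival is declined only if the depot is full.
        nxt = left if left == 4 else left + 1
        P[s][nxt] += prob

for s in states:
    row = [round(P[s][t], 2) for t in states]
    print(s, row, "sum =", round(sum(row), 2))
# Rows match the table: e.g. from state 4, P(4->4) = .2 + .4 = .6,
# P(4->3) = .3, P(4->2) = .1.
```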