Solutions Markov Chains 1

Presentation transcript:

Solutions Markov Chains 1

1) Given the following one-step transition matrices of a Markov chain, determine the classes of the Markov chain and whether they are recurrent. [The two matrices, (a) and (b), and their state diagrams appear only as images on the original slide.]

a. All states communicate, so the chain forms a single class; since the chain is finite and irreducible, all states are recurrent.

b. The communicating classes are {0}, {1, 2}, and {3}. States 0, 1, and 2 are recurrent; state 3 is transient.
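Answers like these can be checked mechanically: states i and j communicate when each is reachable from the other, and in a finite chain a communicating class is recurrent exactly when it is closed. Below is a minimal Python sketch; the matrix P_b is a hypothetical stand-in (the slide's matrices are images and not recoverable), chosen only so that it reproduces the class structure stated in answer (b).

```python
def communicating_classes(P):
    """Group the states of a finite Markov chain into communicating
    classes and flag each class as recurrent (closed) or transient."""
    n = len(P)
    # reach[i][j] == True when state j is reachable from state i
    reach = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
    for k in range(n):                        # boolean transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    seen, classes = set(), []
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if reach[i][j] and reach[j][i]}
        seen |= cls
        # in a finite chain, a class is recurrent iff no mass leaves it
        closed = all(P[a][b] == 0 for a in cls for b in range(n) if b not in cls)
        classes.append((sorted(cls), "recurrent" if closed else "transient"))
    return classes

# Hypothetical 4-state matrix reproducing the stated answer (b):
# classes {0}, {1, 2}, {3}, with state 3 transient.
P_b = [[1.0, 0.0, 0.0, 0.0],
       [0.0, 0.3, 0.7, 0.0],
       [0.0, 0.6, 0.4, 0.0],
       [0.2, 0.3, 0.3, 0.2]]
print(communicating_classes(P_b))
# -> [([0], 'recurrent'), ([1, 2], 'recurrent'), ([3], 'transient')]
```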

Problems Markov Chains 2

2) Suppose we have 3 pints of blood on hand at the end of the day. At the start of the next day we receive another pint, making 4 pints in total. During the day there is demand for 0, 1, 2, or 3 pints of blood with the following distribution:

x      0    1    2    3
P(x)  .4   .3   .2   .1

Assume we receive 1 pint of blood at the start of each day. We can hold no more than 5 pints of blood at the end of any given day; excess blood over 5 pints is discarded. Construct a one-step transition probability matrix to model this process as a Markov chain.

Soln: Let S = the pints of blood on hand at the end of each day. Since we can hold no more than 5, S = 0, 1, 2, 3, 4, 5. Pints available at the start of a day = S + 1.

S (day's end)   Start of day (S+1)   Demand   Probability   End S   Note
5               6                    0        .4            5       one pint discarded
5               6                    1        .3            5
5               6                    2        .2            4
5               6                    3        .1            3
4               5                    0        .4            5
4               5                    1        .3            4
etc.
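The "etc." rows, and the full matrix, can be generated mechanically. Here is a minimal Python sketch under the assumptions stated in the problem (one pint arrives each morning, anything above 5 pints at day's end is discarded) plus one the slide leaves implicit: demand beyond the pints on hand is simply lost.

```python
# Build the 6x6 one-step transition matrix for the blood-inventory chain.
demand = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
states = range(6)                        # pints on hand at the end of a day

P = [[0.0] * 6 for _ in states]
for s in states:
    start = s + 1                        # one pint arrives each morning
    for d, prob in demand.items():
        end = min(max(start - d, 0), 5)  # lost sales below 0, discard above 5
        P[s][end] += prob                # several demands can reach the same
                                         # end state, so accumulate
for row in P:
    print([round(p, 2) for p in row])
```

Printed this way, each row matches the enumeration in the table; for example, row S = 5 is (0, 0, 0, .1, .2, .7), since demands of 0 and 1 both leave 5 pints at day's end.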

Solutions Markov Chains 4

Louise Ciccone, a dealer in luxury cars, faces the following weekly demand distribution:

Demand   0    1    2
Prob.   .1   .5   .4

She adopts the policy of placing an order for 3 cars whenever the inventory level drops to 2 or fewer cars at the end of a week. Assume the order is placed just after taking inventory. If a customer arrives and no car is available, the sale is lost. Show the transition matrix for the Markov chain that describes the inventory level at the end of each week if the order takes one week to arrive.

Soln: S = inventory level at week's end = 1, 2, 3, 4, 5.

Note: we can have at most 5 cars on the lot: 2 cars at week's end, followed by a week with no sales, followed by the arrival of the 3 ordered cars at the end of that week.

Note: we could start with 0 cars, but once an order of 3 cars arrives we can never return to 0, since we never sell more than 2 cars in a week and, whenever we drop to 2 or fewer at week t, 3 more arrive at the end of week t+1.

Solutions Markov Chains 4

Soln cont.

State                        Event         End State   Probability
5                            sell 0        5           .1
5                            sell 1        4           .5
5                            sell 2        3           .4
4                            sell 0        4           .1
4                            sell 1        3           .5
4                            sell 2        2           .4
3                            sell 0        3           .1
3                            sell 1        2           .5
3                            sell 2        1           .4
2 (3 arrive at week's end)   sell 0        5           .1
                             sell 1        4           .5
                             sell 2        3           .4
1 (3 arrive at week's end)   sell 0        4           .1
                             sell 1 or 2   3           .5 + .4 = .9

(.5 + .4 since a demand of either 1 or 2 depletes the single car on hand during the week.)
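The table determines the one-step matrix completely. Below is a minimal Python sketch under the same assumptions: states are the end-of-week inventory levels 1 through 5, an order of 3 placed at a level of 2 or fewer arrives at the end of the following week, and demand beyond the cars on hand is lost.

```python
# Build the 5x5 one-step transition matrix for the car-dealer chain.
demand = {0: 0.1, 1: 0.5, 2: 0.4}
states = [1, 2, 3, 4, 5]                 # cars on the lot at week's end

P = {s: {t: 0.0 for t in states} for s in states}
for s in states:
    arriving = 3 if s <= 2 else 0        # order placed when s <= 2 arrives
                                         # at the end of the coming week
    for d, prob in demand.items():
        sold = min(d, s)                 # can't sell cars we don't have
        P[s][s - sold + arriving] += prob
for s in states:
    print(s, [round(P[s][t], 2) for t in states])
```

Running this recovers the rows implied by the table; for example, row 1 is (0, 0, .9, .1, 0), matching the .5 + .4 entry above.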

Solutions Markov Chains 8

Soln cont. [The remainder of this solution appears only as an image on the original slide and is not recoverable from the transcript.]