Solutions Markov Chains 1

Presentation transcript:

Solutions Markov Chains 1

1) Given the following one-step transition matrices of a Markov chain, determine the classes of the Markov chain and whether they are recurrent. [The transition matrices and state diagrams for parts a and b appeared as figures on the slide.]

a. All states communicate, so the chain forms a single class, and all states are recurrent.

b. The classes are {0}, {1, 2}, and {3}; states 0, 1, and 2 are recurrent, and state 3 is transient.
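Class structure and recurrence can also be checked mechanically. The sketch below is illustrative only: the slide's actual matrices were images and are not in the transcript, so the matrix here is a hypothetical one constructed to match the answer to part b (classes {0}, {1, 2}, {3}, with state 3 transient). It finds communicating classes by mutual reachability and marks a class recurrent exactly when it is closed, which is the correct criterion for a finite chain.

```python
import numpy as np

# Hypothetical transition matrix consistent with the answer to part b:
# classes {0}, {1, 2}, {3}, with state 3 transient. The slide's actual
# matrices were images, so these numbers are stand-ins.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # state 0 is absorbing
    [0.0, 0.3, 0.7, 0.0],   # states 1 and 2 communicate with each other
    [0.0, 0.6, 0.4, 0.0],
    [0.2, 0.3, 0.3, 0.2],   # state 3 can leave its class but never return
])

def communicating_classes(P):
    """Return (class, 'recurrent' | 'transient') pairs for a finite chain."""
    n = len(P)
    # reach[i, j] = 1 iff state j is reachable from state i in >= 0 steps.
    reach = ((P > 0) | np.eye(n, dtype=bool)).astype(int)
    for _ in range(n):  # transitive closure by repeated squaring
        reach = ((reach @ reach) > 0).astype(int)
    out, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = sorted(j for j in range(n) if reach[i][j] and reach[j][i])
        seen.update(cls)
        # For a finite chain, a class is recurrent iff it is closed:
        # no one-step transition leads out of it.
        closed = all(P[j][k] == 0 for j in cls for k in range(n) if k not in cls)
        out.append((cls, "recurrent" if closed else "transient"))
    return out

for cls, kind in communicating_classes(P):
    print(cls, kind)   # [0] recurrent, [1, 2] recurrent, [3] transient
```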

Solutions Markov Chains 3

2) The leading brewery on the West Coast (A) has hired a TM specialist to analyze its market position. It is particularly concerned about its major competitor (B). The analyst believes that brand switching can be modeled as a Markov chain using 3 states, with states A and B representing customers drinking beer produced by the aforementioned breweries and state C representing all other brands. Data are taken monthly, and the analyst has constructed the following one-step transition probability matrix. [The 3 × 3 matrix over states A, B, C appeared as a figure on the slide.] What are the steady-state market shares for the two major breweries?

Soln: [The steady-state balance equations, labeled (1)–(3), the normalization condition, and the substitution from (1) yielding equation (4) appeared as images on the slide.]

Solutions Markov Chains 4

2) (cont.) [This slide carried the algebra forward: equations (1)–(4) were restated, and substitutions from (2) and (4) expressed p_B in terms of p_C. The equations appeared as images and are not recoverable from the transcript.]

Solutions Markov Chains 5

2) (cont.) [The final substitution into the normalization condition and the resulting steady-state market shares appeared as images on the slide.] Note: the answer could also be checked by computing P^n for some large n, since for a chain like this one the rows of P^n converge to the steady-state distribution.
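As a concrete version of that check, here is a minimal sketch in Python. The transition probabilities below are stand-ins, since the slide's actual matrix is not in the transcript; the method (solve pP = p together with the probabilities summing to 1, then compare against a row of P^n) is the point.

```python
import numpy as np

# Illustrative stand-in for the brewery matrix (rows/columns ordered A, B, C);
# the slide's actual values were shown only as an image.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.2, 0.75, 0.05],
    [0.1, 0.2, 0.7],
])

# Steady state: solve p P = p with sum(p) = 1. Rewrite as (P^T - I) p = 0,
# stack the normalization row, and least-squares the overdetermined system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady state:", p)

# Cross-check per the slide's note: rows of P^n approach p for large n.
print("row of P^50: ", np.linalg.matrix_power(P, 50)[0])
```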

Solutions Markov Chains 6

Louise Ciccone, a dealer in luxury cars, faces the following weekly demand distribution.

Demand    0    1    2
Prob.    .1   .5   .4

She adopts the policy of placing an order for 3 cars whenever the inventory level drops to 2 or fewer cars at the end of a week. Assume that the order is placed just after taking inventory. If a customer arrives and there is no car available, the sale is lost. Show the transition matrix for the Markov chain that describes the inventory level at the end of each week if the order takes one week to arrive. Compute the steady-state probabilities and the expected number of lost sales per week.

Soln: Let S = inventory level at week's end, S ∈ {1, 2, 3, 4, 5}.

Note: we can have at most 5 cars on the lot: 2 cars at week's end, followed by a week with no sales, followed by the arrival of the 3 ordered cars at the end of that week.

Note: we could start with 0 cars, but once an order of 3 cars arrives we can never return to 0, since we never sell more than 2 cars in a week and whenever the level drops to 2 or below at time t, 3 more cars arrive at t + 1.

Solutions Markov Chains 7

Soln cont. One-step transitions (weekly demand of 0, 1, or 2 with probabilities .1, .5, .4):

State   Event                                 End State   Probability
5       sell 0                                5           .1
5       sell 1                                4           .5
5       sell 2                                3           .4
4       sell 0                                4           .1
4       sell 1                                3           .5
4       sell 2                                2           .4
3       sell 0                                3           .1
3       sell 1                                2           .5
3       sell 2                                1           .4
2       sell 0, 3 arrive at week's end        5           .1
2       sell 1, 3 arrive at week's end        4           .5
2       sell 2, 3 arrive at week's end        3           .4
1       sell 0, 3 arrive at week's end        4           .1
1       sell 1 or 2, 3 arrive at week's end   3           .5 + .4 = .9

(From state 1, a demand of 1 or 2 depletes the inventory during the week, so either demand leaves 0 + 3 = 3 cars at week's end.)
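The table translates directly into code. A minimal sketch, assuming the timing above (sales come out of the current stock during the week; the 3 ordered cars are added at week's end):

```python
import numpy as np

# Build the 5x5 transition matrix from the demand distribution and the
# reorder policy: order 3 cars whenever the end-of-week level is <= 2.
demand_prob = {0: 0.1, 1: 0.5, 2: 0.4}
states = [1, 2, 3, 4, 5]           # possible end-of-week inventory levels

P = np.zeros((5, 5))
for s in states:
    arriving = 3 if s <= 2 else 0  # last week's order shows up this week
    for d, p in demand_prob.items():
        sold = min(d, s)           # can't sell cars that aren't on the lot
        nxt = s - sold + arriving  # next end-of-week inventory level
        P[s - 1, nxt - 1] += p

print(P)                # rows match the table above
print(P.sum(axis=1))    # sanity check: each row sums to 1
```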

Solutions Markov Chains 8

Soln cont. [The steady-state balance equations and the chain of substitutions appeared as images on the slide.] Continue the substitutions until every p_i is expressed in terms of one of them, then renormalize so that the probabilities sum to 1; the renormalized vector is the steady-state distribution. The expected number of lost sales per week then follows from p_1, since a sale is lost only from state 1, when demand is 2.
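To finish numerically rather than by hand substitution, the balance equations can be solved directly. A minimal sketch, with the transition matrix from the table hardcoded so the snippet stands alone; the lost-sales term uses the fact that a sale is lost only when the week starts with a single car on the lot and two customers show up:

```python
import numpy as np

# Transition matrix from the table above (states = end-of-week levels 1..5).
P = np.array([
    [0.0, 0.0, 0.9, 0.1, 0.0],   # from 1 (3 arrive at week's end)
    [0.0, 0.0, 0.4, 0.5, 0.1],   # from 2 (3 arrive at week's end)
    [0.4, 0.5, 0.1, 0.0, 0.0],   # from 3
    [0.0, 0.4, 0.5, 0.1, 0.0],   # from 4
    [0.0, 0.0, 0.4, 0.5, 0.1],   # from 5
])

# Solve p P = p together with sum(p) = 1: stack (P^T - I) with a row of
# ones and least-squares the overdetermined system.
A = np.vstack([P.T - np.eye(5), np.ones(5)])
b = np.concatenate([np.zeros(5), [1.0]])
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady state:", p)

# A sale is lost only from state 1: one car on the lot, demand of 2
# (probability .4) turns one customer away. Hence
# E[lost sales per week] = .4 * p_1.
print("expected lost sales per week:", 0.4 * p[0])
```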