Markov Chains
Markov Chains General Description
We want to describe the behavior of a system as it moves (makes transitions) probabilistically from "state" to "state". States may be qualitative or quantitative.
Basic Assumption: the future depends only on the present (current state) and not on the past. That is, the future depends on the state we are in, not on how we arrived at this state.
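As a small illustration of this assumption, here is a minimal Python sketch (not part of the original slides) in which the next state is drawn using only the current state's row of a transition matrix; the values used are the Coke/Pepsi probabilities from Example 1 below.

```python
import random

# Transition matrix from Example 1: each row gives the probabilities
# of the next state given the current state.
P = {
    "Coke":  {"Coke": 0.9, "Pepsi": 0.1},
    "Pepsi": {"Coke": 0.2, "Pepsi": 0.8},
}

def next_state(current):
    """Draw the next state using only the current state's row (the Markov property)."""
    states = list(P[current])
    weights = [P[current][s] for s in states]
    return random.choices(states, weights=weights, k=1)[0]

state = "Coke"
for week in range(1, 6):
    state = next_state(state)
    print(f"Week {week}: {state}")
```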
Example 1 - Brand loyalty or Market Share
For ease, assume that all cola buyers purchase either Coke or Pepsi in any given week. That is, there is a duopoly. Assume that if a customer purchases Coke in one week there is a 90% chance that the customer will purchase Coke the next week (and a 10% chance that the customer will purchase Pepsi). Similarly, 80% of Pepsi drinkers will repeat the purchase from week to week.
Example 1 - Developing the Markov Matrix
States
- State 1 - Coke was purchased
- State 2 - Pepsi was purchased
(note: states are qualitative)
Markov (transition or probability) matrix:
From\To   Coke   Pepsi
Coke      0.9    0.1
Pepsi     0.2    0.8
Example 1 – Understanding Movement
From\To   Coke   Pepsi
Coke      0.9    0.1
Pepsi     0.2    0.8
Quiz: If we start with 100 Coke purchasers and 100 Pepsi purchasers, how many Coke purchasers will there be after 1 week? (A worked answer follows.)
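A worked answer to the quiz, using the matrix above: 90% of the 100 Coke purchasers stay with Coke and 20% of the 100 Pepsi purchasers switch to Coke, so after one week there are 0.9(100) + 0.2(100) = 90 + 20 = 110 Coke purchasers (and 0.1(100) + 0.8(100) = 90 Pepsi purchasers).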
Graphical Description – 1: The States
[State diagram showing the two states, Coke and Pepsi, next to the transition matrix (From\To Coke, Pepsi: .9, .1 / .2, .8).]
Graphical Description – 2: Transitions from Coke
[State diagram showing the transitions out of Coke: .9 back to Coke and .1 to Pepsi.]
Graphical Description – 3: All Transitions
[State diagram showing all transitions: Coke→Coke .9, Coke→Pepsi .1, Pepsi→Coke .2, Pepsi→Pepsi .8.]
Example 1 - Starting Conditions
Percentages
- Identify the probability (percentage of shoppers) of starting in either state. (We will assume a 50/50 starting market share in the example that follows.)
- Or assume we start in one specific state (by setting one probability to 1 and the remaining probabilities to 0).
Counts (numbers)
- Identify the number of shoppers starting in either state.
Example 1
From\To   Coke   Pepsi
Coke      0.9    0.1
Pepsi     0.2    0.8
Starting probabilities: 50% (or 50 people) each
Questions:
- What will happen in the short run (next 3 periods)?
- What will happen in the long run?
- Do the starting probabilities influence the long run?
Graphical Solution After 1 Transition
Starting from 50 Coke and 50 Pepsi purchasers:
Coke:  .9(50) + .2(50) = 45 + 10 = 55
Pepsi: .1(50) + .8(50) = 5 + 40 = 45
Graphical Solution After 2 Transitions
Starting from 55 Coke and 45 Pepsi purchasers:
Coke:  .9(55) + .2(45) = 49.5 + 9 = 58.5
Pepsi: .1(55) + .8(45) = 5.5 + 36 = 41.5
Graphical Solution After 3 Transitions
Starting from 58.5 Coke and 41.5 Pepsi purchasers:
Coke:  .9(58.5) + .2(41.5) = 52.65 + 8.3 = 60.95
Pepsi: .1(58.5) + .8(41.5) = 5.85 + 33.2 = 39.05
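The same three periods can be reproduced numerically. This is a minimal sketch using NumPy (the slides themselves use QM for Windows, not Python) that multiplies the vector of counts by the transition matrix once per period:

```python
import numpy as np

P = np.array([[0.9, 0.1],    # from Coke  to [Coke, Pepsi]
              [0.2, 0.8]])   # from Pepsi to [Coke, Pepsi]

x = np.array([50.0, 50.0])   # starting counts: 50 Coke, 50 Pepsi
for period in range(1, 4):
    x = x @ P                # one transition: new counts = old counts times P
    print(f"After period {period}: Coke = {x[0]:.2f}, Pepsi = {x[1]:.2f}")
# Output matches the slides: (55, 45), (58.5, 41.5), (60.95, 39.05)
```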
Analyzing Markov Chains
Open QM for Windows
- Module: Markov Chains
- Number of states: 2
- Number of transitions: 3
Example 1 – After 3 Transitions: n-step Transition Probabilities
[Output: the 1-step, 2-step, and 3-step transition matrices (rows and columns: Coke, Pepsi), each followed by the ending probabilities at the end of periods 1, 2, and 3, given the initial probabilities.]
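As a sketch of where those tables come from (again in NumPy rather than QM for Windows), the n-step transition matrix is simply P raised to the nth power, and the end-of-period probabilities are the initial probabilities multiplied by it:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
start = np.array([0.5, 0.5])              # initial probabilities (50/50)

for n in (1, 2, 3):
    Pn = np.linalg.matrix_power(P, n)     # n-step transition matrix
    print(f"{n}-step transition matrix:\n{Pn}")
    print(f"End-of-period-{n} probabilities: {start @ Pn}\n")
```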
Example 1 - Results (3 transitions, start = .5, .5)
[3-step transition matrix, From\To: Coke, Pepsi]
- Ending probability – depends on the initial conditions
- Steady-state probability – independent of the initial conditions
Note: we end up alternating between Coke and Pepsi.
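The steady-state probabilities can be checked by hand: they must satisfy πP = π and sum to 1, which for this matrix gives 0.1·π(Coke) = 0.2·π(Pepsi), so π(Coke) = 2/3 ≈ 0.667 and π(Pepsi) = 1/3 ≈ 0.333, regardless of the starting split. A small NumPy check (not part of the original slides):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Solve pi P = pi together with pi summing to 1
# (pi is a left eigenvector of P for eigenvalue 1).
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)   # approximately [0.6667, 0.3333]
```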
Example 2 - Student Progression Through a University
States
- Freshman
- Sophomore
- Junior
- Senior
- Dropout
- Graduate
(note: again, states are qualitative)
Example 2 - Student Progression Through a University - States
[State-transition diagram for the six states: Freshman, Sophomore, Junior, Senior, Drop out, Graduate.]
Note that eventually you must end up in Graduate or Drop out.
Example 2 – Results (Lazarus paper data)
[Transition matrix from the Lazarus paper data, with rows and columns: First year, Sophomore, Junior, Senior, Graduate, Drop out, plus ending-probability and steady-state rows.]
From the paper: if there are equal numbers of freshmen, sophomores, juniors, and seniors at the beginning of an academic year, then the percentage of this mixed group of students who will graduate is ( )/4 = 91%.
Classification of states
Absorbing – states that, once entered, you never leave. (Graduate, Drop out)
Recurrent – states that you will always both leave and return to at some time. (Coke, Pepsi)
Transient – states to which you will eventually never return. (Freshman, Sophomore, Junior, Senior)
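For a chain with absorbing states, like the university example, the probability of eventually ending up in each absorbing state can be computed from the fundamental matrix N = (I − Q)⁻¹ and B = NR, where Q holds the transient-to-transient transition probabilities and R the transient-to-absorbing ones. The sketch below uses hypothetical transition probabilities for illustration only (the actual values from the Lazarus paper are not reproduced here):

```python
import numpy as np

# Transient states: Freshman, Sophomore, Junior, Senior
# Absorbing states: Graduate, Drop out
# Q = transient-to-transient probabilities, R = transient-to-absorbing.
# These numbers are hypothetical, chosen only so each full row sums to 1.
Q = np.array([[0.10, 0.80, 0.00, 0.00],
              [0.00, 0.10, 0.80, 0.00],
              [0.00, 0.00, 0.10, 0.85],
              [0.00, 0.00, 0.00, 0.10]])
R = np.array([[0.00, 0.10],
              [0.00, 0.10],
              [0.00, 0.05],
              [0.85, 0.05]])

N = np.linalg.inv(np.eye(4) - Q)   # fundamental matrix
B = N @ R                          # absorption probabilities
print("P(Graduate), P(Drop out) by starting class:")
print(B)
```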
State Classification Quiz
State Classification Article
"A non-recursive algorithm for classifying the states of a finite Markov chain," European Journal of Operational Research, Vol. 28, 1987.
Example 3 - Diseases
States
- no disease
- pre-clinical (no symptoms)
- clinical
- death
(note: again, states are qualitative)
Purpose: transition probabilities can be different for different testing or treatment protocols.
Example 4 - Customer Bill paying
States
- State 0: Bill is paid in full
- State i: Bill is in arrears for i months, i = 1, 2, …, 11
- State 12: Deadbeat
Example 5 - Oil Market
States
- State 0 - oil market is normal
- State 1 - oil market is mildly disrupted
- State 2 - oil market is severely disrupted
- State 3 - oil production is essentially shut down
Note: states are qualitative
Philadelphia Inquirer, 3/24/04, "Strategic oil reserve fill-up will continue"
Example 6 – HIV infections
Based on "Can Difficult-to-Reuse Syringes Reduce the Spread of HIV among Injection Drug Users," Caulkins et al., Interfaces, Vol. 28, No. 3, May-June 1998, pp. 23-33.
States
- State 0 – Syringe is uninfected
- State 1 – Syringe is infected
Notes:
- P(0, 1) = .14 (14% of drug users are infected with HIV)
- P(1, 0): 5% of the time the virus dies; 33% of the time it is killed by bleaching
Example 7 – Mental Health (Lazarus)
States
- depressed
- manic
- euthymic/remitted
- mortality
Example 8 - Baseball (Moneyball by Michael Lewis, p. 134)
States
- State 0 - no outs, bases empty
- State 1 - no outs, runner on first
- State 2 - no outs, runner on second
- State 3 - no outs, runner on third
- State 4 - no outs, runners on first and second
- State 5 - no outs, runners on first and third
- State 6 - no outs, runners on second and third
- State 7 - no outs, runners on first, second, and third
…repeat for 1 out and 2 outs, for a total of 24 states (see the sketch below).
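As a quick check that there really are 24 states, they can be enumerated programmatically; the numbering below is just one possible labeling, not necessarily the one used in Moneyball:

```python
from itertools import product

# 8 base-runner configurations x 3 out counts = 24 states
bases = ["empty", "1st", "2nd", "3rd", "1st+2nd", "1st+3rd", "2nd+3rd", "1st+2nd+3rd"]
outs = [0, 1, 2]

states = list(product(outs, bases))
for i, (o, b) in enumerate(states):
    print(f"State {i}: {o} out(s), bases: {b}")
print(len(states))   # 24
```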
Example 9 – Football Overtime Playoffs (no time limit)
States
- Team A has ball
- Team B has ball
- Team A scores (absorbing)
- Team B scores (absorbing)
"Win, Lose, or Draw: A Markov Chain Analysis of Overtime in the National Football League," Michael A. Jones, The College Mathematics Journal, Vol. 35, No. 5, November 2004, pp.
Additional References from Interfaces
- Trench, Margaret S.; Pederson, Shane P.; Lau, Edward T.; Lizhi Ma; Hui Wang; Nair, Suresh K., "Managing Credit Lines and Prices for Bank One Credit Cards," Interfaces, Sep/Oct 2003, Vol. 33, Issue 5, p. 4, 18 p.
- White, Douglas J., "Real Applications of Markov Decision Processes," Interfaces, Nov/Dec 1985, Vol. 15, Issue 6, p. 73, 11 p.
- White, D. J., "Further Real Applications of Markov Decision Processes," Interfaces, Sep/Oct 1988, Vol. 18, Issue 5, p. 55, 7 p.
- Flamholtz, Eric G.; Geis, George T.; Perle, Richard J., "A Markovian Model for the Valuation of Human Assets Acquired by an Organizational Purchase," Interfaces, Nov/Dec 1984, Vol. 14, Issue 6, p. 11, 5 p.
- Bessent, E. Wailand; Bessent, Authella M., "Student Flow in a University Department: Results of a Markov Analysis," Interfaces, 1980, Vol. 10, Issue 2, p. 52, 8 p.
Markov Chains – The End