CS433 Modeling and Simulation
Lecture 06 – Part 02: Discrete Markov Chains
Dr. Anis Koubâa
11 Nov 2008
Al-Imam Mohammad Ibn Saud University

Goals for Today
- A practical example of modeling a system with a Markov chain
- State holding times
- State probabilities and transient behavior

Example
- Learn how to build a model of a given system
- Learn how to extract the state space

Example: Two-Processor System
Consider a two-processor computer system in which time is divided into time slots and which operates as follows:
- At most one job can arrive during any time slot, and this happens with probability α.
- Jobs are served by whichever processor is available; if both are available, the job is given to processor 1.
- If both processors are busy, the arriving job is lost.
- When a processor is busy, it completes its job during any one time slot with probability β.
- If a job arrives during a slot in which both processors are busy but at least one processor completes a job, then the arriving job is accepted (departures occur before arrivals).
Q1. Describe the automaton that models this system (not included).
Q2. Describe the Markov chain that models this system.
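As a concrete illustration of these rules, here is a minimal Python sketch (not part of the original slides) that simulates the system slot by slot. The names simulate_slot and n_busy are illustrative; departures are applied before arrivals, as stated above.

```python
import random

def simulate_slot(n_busy, alpha, beta):
    """Advance the two-processor system by one time slot.

    n_busy : current number of busy processors (0, 1, or 2)
    alpha  : probability that a job arrives during the slot
    beta   : probability that a busy processor finishes its job during the slot
    Returns the number of busy processors at the end of the slot.
    Departures are processed before arrivals, as stated in the model.
    """
    # Each busy processor completes its job independently with probability beta.
    departures = sum(1 for _ in range(n_busy) if random.random() < beta)
    n_busy -= departures

    # At most one job arrives per slot, with probability alpha.
    # It is accepted only if a processor is free; otherwise it is lost.
    if random.random() < alpha and n_busy < 2:
        n_busy += 1
    return n_busy

# Example: simulate 10 slots starting from an empty system.
state = 0
for _ in range(10):
    state = simulate_slot(state, alpha=0.5, beta=0.7)
    print(state)
```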

Example: Automaton (not included)
Let the state be the number of jobs currently processed by the system; the state space is X = {0, 1, 2}.
Event set: a (job arrival), d (job departure).
Feasible event sets:
- If X = 0, then Γ(X) = {a}.
- If X = 1 or 2, then Γ(X) = {a, d}.
[State transition diagram over states 0, 1, 2, with arcs labeled by the feasible arrival/departure events.]

Example: Alternative Automaton (not included)
Let (X1, X2) indicate whether processor 1 and processor 2 are busy, with Xi ∈ {0, 1}.
Event set: a (job arrival), di (job departure from processor i).
Feasible event sets:
- If X = (0,0), then Γ(X) = {a}.
- If X = (0,1), then Γ(X) = {a, d2}.
- If X = (1,0), then Γ(X) = {a, d1}.
- If X = (1,1), then Γ(X) = {a, d1, d2}.
[State transition diagram over the four states, with arcs labeled by the feasible arrival/departure events.]

Example: Markov Chain
In the state transition diagram of the Markov chain, each transition is simply marked with its transition probability.
[State transition diagram over states 0, 1, 2, with arcs labeled p00, p01, p10, p11, p12, p20, p21, p22.]

Example: Markov Chain
Suppose that α = 0.5 and β = 0.7; then each transition probability pij can be evaluated numerically.
[State transition diagram over states 0, 1, 2, with the numeric values of p00, p01, p10, p11, p12, p20, p21, p22 on the arcs.]
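The numeric diagram itself is not reproduced here. The following sketch (not from the original slides) derives the entries of the transition matrix from the rules stated earlier, assuming departures occur before arrivals; the values printed for α = 0.5, β = 0.7 follow from this derivation rather than from the original figure.

```python
import numpy as np

def transition_matrix(alpha, beta):
    """Transition matrix of the two-processor system (state = number of busy processors).

    Derived from the stated rules, with departures processed before arrivals.
    """
    a, b = alpha, beta
    P = np.zeros((3, 3))

    # From state 0: no departures are possible; a job arrives w.p. alpha.
    P[0, 0] = 1 - a
    P[0, 1] = a

    # From state 1: the busy processor finishes w.p. beta, then a job may arrive.
    P[1, 0] = b * (1 - a)                 # departure, no arrival
    P[1, 1] = b * a + (1 - b) * (1 - a)   # departure and arrival, or neither
    P[1, 2] = (1 - b) * a                 # no departure, arrival

    # From state 2: each processor finishes independently w.p. beta (0, 1, or 2
    # departures); an arriving job is accepted only if a processor freed up.
    P[2, 0] = b**2 * (1 - a)
    P[2, 1] = b**2 * a + 2 * b * (1 - b) * (1 - a)
    P[2, 2] = 2 * b * (1 - b) * a + (1 - b)**2
    return P

P = transition_matrix(0.5, 0.7)
print(P)                 # each row should sum to 1
print(P.sum(axis=1))
```

Under this derivation, the rows come out to [0.5, 0.5, 0], [0.35, 0.5, 0.15], and [0.245, 0.455, 0.30].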

State Holding Time
How much time does the chain spend in a state before moving to another one?

State Holding Times
Suppose that at step k the Markov chain has transitioned into state Xk = i. An interesting question is how long it will stay in state i.
Let V(i) be the random variable representing the number of consecutive time slots during which the chain remains in state i.
We are interested in the quantity Pr{V(i) = n}.

State Holding Times
Since successive transitions are independent given the current state, Pr{V(i) = n} = p_ii^(n−1) (1 − p_ii), for n = 1, 2, …
This is the geometric distribution with parameter 1 − p_ii.
Clearly, V(i) has the memoryless property.
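A quick numerical check of this geometric law (not in the original slides), using the transition matrix obtained above for α = 0.5, β = 0.7: sample holding times of state 1 by simulation and compare them with p_11^(n−1)(1 − p_11).

```python
import numpy as np

# Transition matrix derived earlier for alpha = 0.5, beta = 0.7.
P = np.array([[0.5,   0.5,   0.0 ],
              [0.35,  0.5,   0.15],
              [0.245, 0.455, 0.30]])

rng = np.random.default_rng(0)
i = 1                 # state whose holding time we examine
p_ii = P[i, i]

def sample_holding_time(P, i, rng):
    """Number of consecutive slots the chain stays in state i after entering it."""
    n = 1
    while rng.choice(len(P), p=P[i]) == i:
        n += 1
    return n

samples = [sample_holding_time(P, i, rng) for _ in range(20_000)]
for n in range(1, 5):
    empirical = np.mean([s == n for s in samples])
    geometric = p_ii**(n - 1) * (1 - p_ii)
    print(n, round(empirical, 4), round(geometric, 4))
```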

State Probabilities
A quantity we are usually interested in is the probability of finding the chain in a given state at step k. We define
π_j(k) = Pr{X_k = j}.
For all possible states, we define the vector
π(k) = [π_0(k), π_1(k), π_2(k), …].
Using total probability we can write
π_j(k+1) = Σ_i π_i(k) p_ij(k).
In vector form, one can write
π(k+1) = π(k) P(k),
or, for a homogeneous Markov chain,
π(k+1) = π(k) P.

State Probabilities Example
Suppose that the initial state probability vector π(0) and the transition matrix P are given. Find π(k) for k = 1, 2, …: this is the transient behavior of the system.
In general, the transient behavior is obtained by solving the difference equation π(k+1) = π(k) P, starting from π(0).
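To make the transient computation concrete, here is a small sketch (not from the slides) that iterates π(k+1) = π(k) P for the α = 0.5, β = 0.7 matrix derived earlier. The starting distribution π(0) = [1, 0, 0] (system initially empty) is purely illustrative; the original slide's π(0) is not reproduced here.

```python
import numpy as np

# Transition matrix derived earlier for alpha = 0.5, beta = 0.7.
P = np.array([[0.5,   0.5,   0.0 ],
              [0.35,  0.5,   0.15],
              [0.245, 0.455, 0.30]])

# Illustrative initial distribution: the system starts empty (state 0).
pi = np.array([1.0, 0.0, 0.0])

# Transient behavior: iterate pi(k+1) = pi(k) @ P and print the evolution.
for k in range(1, 11):
    pi = pi @ P
    print(f"k={k:2d}  pi(k)={np.round(pi, 4)}")
```

Printing successive π(k) shows how the state probabilities evolve over time from the chosen π(0).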