10.5: Model Solution, 10.6: Model Interpretation, 10.7: Assumptions and Limitations


10.5: Model Solution 10.6: Model Interpretation 10.7: Assumptions and Limitations

10.5 Model Solution Three steps to create a Markov model: 1) Construct the state diagram by identifying all possible states in which the modeled system may find itself. 2) Identify the state connections (or transitions). 3) Parameterize the model by specifying the length of time spent in each state once it is entered (or the probability of transitioning from one state to another within the next time period).

10.5 Model Solution Definition of “model solution”: to find the long-term (i.e., “steady state”) probability of being in any particular state. This steady-state solution gives the overall probability of being in each system state. In general, there is one balance equation for each system state: flows in = flows out. Given N states, there are N unknowns. The N balance equations are linearly dependent, so one of them is replaced by the normalization condition (the probabilities must sum to 1), yielding N independent linear equations, a straightforward linear algebra problem.

Random walk through England: Model Solution

Random walk through England: Model Solution Let pi represent the probability of being in state i, so p1, p2, p3, p4 represent the four states respectively. The balance equations for this model are: 0.2*p2 + 0.1*p3 + 0.3*p4 = 0.6*p1; 0.6*p1 = p2; 0.2*p4 = 0.3*p3; 0.8*p2 + 0.2*p3 = 0.5*p4

Random walk through England: Model Solution The final set of equations (with one redundant balance equation replaced by the normalization condition) is: 0.2*p2 + 0.1*p3 + 0.3*p4 = 0.6*p1; 0.6*p1 = p2; 0.2*p4 = 0.3*p3; p1 + p2 + p3 + p4 = 1. The results are: p1 = 0.2644, p2 = 0.1586, p3 = 0.2308, p4 = 0.3462
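The four equations above form a small linear system. As a sketch (the slides show no code, so the use of Python with NumPy here is an assumption), the steady-state probabilities can be recovered with a standard linear solver:

```python
import numpy as np

# Coefficient matrix: one row per equation, columns are p1..p4.
# Rows 1-3 are balance equations rewritten as "flows in - flows out = 0";
# row 4 is the normalization condition p1 + p2 + p3 + p4 = 1.
A = np.array([
    [-0.6,  0.2,  0.1, 0.3],   # 0.2*p2 + 0.1*p3 + 0.3*p4 = 0.6*p1
    [ 0.6, -1.0,  0.0, 0.0],   # 0.6*p1 = p2
    [ 0.0,  0.0, -0.3, 0.2],   # 0.2*p4 = 0.3*p3
    [ 1.0,  1.0,  1.0, 1.0],   # p1 + p2 + p3 + p4 = 1
])
b = np.array([0.0, 0.0, 0.0, 1.0])

p = np.linalg.solve(A, b)
print(np.round(p, 4))  # approximately [0.2644, 0.1586, 0.2308, 0.3462]
```

The solver returns the same steady-state vector quoted on the slide, and the probabilities sum to 1 as required.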

Database Server Support: Model Solution

Database Server Support: Model Solution The balance equations for the six states are: 4*p(1,1,0) + 2*p(1,0,1) = 6*p(2,0,0); 3*p(2,0,0) + 4*p(0,2,0) + 2*p(0,1,1) = 10*p(1,1,0); 3*p(2,0,0) + 4*p(0,1,1) + 2*p(0,0,2) = 8*p(1,0,1); 3*p(1,1,0) + 3*p(1,0,1) = 6*p(0,1,1); 3*p(1,1,0) = 4*p(0,2,0); 3*p(1,0,1) = 2*p(0,0,2)

Database Server Support: Model Solution Replacing the (redundant) last balance equation with the normalization condition gives: 4*p(1,1,0) + 2*p(1,0,1) = 6*p(2,0,0); 3*p(2,0,0) + 4*p(0,2,0) + 2*p(0,1,1) = 10*p(1,1,0); 3*p(2,0,0) + 4*p(0,1,1) + 2*p(0,0,2) = 8*p(1,0,1); 3*p(1,1,0) + 3*p(1,0,1) = 6*p(0,1,1); 3*p(1,1,0) = 4*p(0,2,0); p(2,0,0) + p(1,1,0) + p(1,0,1) + p(0,2,0) + p(0,1,1) + p(0,0,2) = 1. Results: p(2,0,0) = 0.1391; p(1,1,0) = 0.1043; p(1,0,1) = 0.2087; p(0,2,0) = 0.0783; p(0,1,1) = 0.1565; p(0,0,2) = 0.3131
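The same approach scales to the six-state database model. A sketch in Python with NumPy (the state ordering and variable names here are my own choice, not from the slides), using five balance equations plus the normalization row:

```python
import numpy as np

# State order: (2,0,0), (1,1,0), (1,0,1), (0,2,0), (0,1,1), (0,0,2).
# Five balance equations ("flows in = flows out") plus normalization;
# the sixth balance equation is redundant and is replaced by sum(p) = 1.
A = np.array([
    [-6,   4,  2,  0,  0, 0],   # 4*p(1,1,0) + 2*p(1,0,1) = 6*p(2,0,0)
    [ 3, -10,  0,  4,  2, 0],   # 3*p(2,0,0) + 4*p(0,2,0) + 2*p(0,1,1) = 10*p(1,1,0)
    [ 3,   0, -8,  0,  4, 2],   # 3*p(2,0,0) + 4*p(0,1,1) + 2*p(0,0,2) = 8*p(1,0,1)
    [ 0,   3,  3,  0, -6, 0],   # 3*p(1,1,0) + 3*p(1,0,1) = 6*p(0,1,1)
    [ 0,   3,  0, -4,  0, 0],   # 3*p(1,1,0) = 4*p(0,2,0)
    [ 1,   1,  1,  1,  1, 1],   # normalization: probabilities sum to 1
], dtype=float)
b = np.array([0, 0, 0, 0, 0, 1], dtype=float)

p = np.linalg.solve(A, b)
print(np.round(p, 4))  # approximately the slide's results
```

Solving exactly gives p = (16, 12, 24, 9, 18, 36)/115, which matches the rounded probabilities quoted on the slide.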

10.6 Model Interpretation Example 1: Random Walk Through England Father’s question: What percentage of days is the son actually not drinking in Leeds? Answer: 74%. Interpretation: p1 = 0.2644, i.e., about 26% of days are spent drinking in Leeds, so the percentage of days the son is not drinking in Leeds is 1 - 26% = 74%.

Example 1: Random Walk Through England Lake District relative’s question: Once the son finishes a day of kayaking in the Lake District, how long will it typically be before he returns? Answer: 3.33 days. Interpretation: The mean time between entries into a particular state (i.e., the state’s “cycle time”) is the inverse of the steady-state probability of being in that state. p3 = 0.2308, so the cycle time is 1/0.2308 = 4.33 days. It takes 1 day for the lad to kayak, so the time from when he finishes a day of kayaking until he typically starts kayaking again is 4.33 - 1 = 3.33 days.

Example 1: Random Walk Through England Policeman’s question: How many days each month can the bobbies expect to see the son driving to London after drinking in Leeds? Answer: 4.76 days. Interpretation: p1 = 0.2644, so the number of days per month that find the lad drinking in Leeds is 0.2644*30 = 7.93. Since the probability that the lad goes to state 2 after state 1 is 0.6, the number of days the bobbies can expect to find him on the road to London is 7.93*0.6 = 4.76 days per month.

Example 1: Random Walk Through England Kayak renters’ question: How many times each month does the son typically visit their shop, and how long does he typically keep their kayak each visit? Answer: 2.08 visits per month, keeping the kayak an average of 3.33 days each visit. Interpretation: The only way to enter state 3 from another state is from state 4. The number of days spent in state 4 each month is 0.3462*30 = 10.39.

Example 1: Random Walk Through England Since the probability of going to state 3 after state 4 is 0.2, the lad typically starts a new visit to the Lake District 10.39*0.2 = 2.08 times each month. With p3 = 0.2308, the expected number of kayaking days each month is 30*0.2308 = 6.92, so the duration of each visit is 6.92/2.08 = 3.33 days.
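All four answers in Example 1 are simple arithmetic on the steady-state probabilities. A minimal sketch collecting them (Python; the variable names are hypothetical, chosen only for this illustration):

```python
# Steady-state probabilities from the model solution (30-day month assumed).
p1, p2, p3, p4 = 0.2644, 0.1586, 0.2308, 0.3462
DAYS = 30

# Father: fraction of days NOT drinking in Leeds (state 1).
not_drinking = 1 - p1                 # about 0.74

# Relative: cycle time of the kayaking state 3, minus the 1 day of kayaking.
cycle_time = 1 / p3                   # about 4.33 days between entries
gap = cycle_time - 1                  # about 3.33 days

# Policeman: days per month in state 1, times P(state 1 -> state 2) = 0.6.
drive_days = p1 * DAYS * 0.6          # about 4.76 days/month

# Kayak renters: entries to state 3 come only from state 4 (probability 0.2).
visits = p4 * DAYS * 0.2              # about 2.08 visits/month
kayak_days = p3 * DAYS                # about 6.92 kayaking days/month
duration = kayak_days / visits        # about 3.33 days per visit

print(round(not_drinking, 2), round(gap, 2), round(drive_days, 2),
      round(visits, 2), round(duration, 2))
```

Each line reproduces one of the slide answers directly from the steady-state vector, which is the whole point of the interpretation step.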

Database Server Support: Solution Interpretation

Example 2: Database Server Support User’s question: What response time can the typical user expect? Answer: 44.24 seconds (0.7373 minutes) per transaction. Interactive Response Time Law: R = M/X0 - Z, with Z = 0 here. X0, the throughput of the system measured at the CPU, is the product of the CPU’s utilization and its service rate. The CPU is utilized in states (2,0,0), (1,1,0), and (1,0,1), so its utilization is p(2,0,0) + p(1,1,0) + p(1,0,1) = 0.4521. The service rate of the CPU is 6 transactions/minute, so X0 = 0.4521*6 = 2.7126 transactions/minute. R = M/X0 - Z = 2/2.7126 = 0.7373 minutes/transaction = 44.24 seconds/transaction.
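A quick sketch of this calculation (Python; the variable names are illustrative, not from the slides):

```python
# Steady-state probabilities of the CPU-busy states from the model solution.
p200, p110, p101 = 0.1391, 0.1043, 0.2087

M = 2            # number of users
Z = 0.0          # think time (zero in this example)
mu_cpu = 6.0     # CPU service rate, transactions/minute

# CPU is busy in every state whose first component is nonzero.
u_cpu = p200 + p110 + p101      # about 0.4521
X0 = u_cpu * mu_cpu             # throughput, about 2.7126 tx/min

# Interactive Response Time Law: R = M / X0 - Z.
R_min = M / X0 - Z              # about 0.7373 minutes
R_sec = R_min * 60              # about 44.24 seconds
print(round(X0, 4), round(R_sec, 2))
```

Note the chain: steady-state probabilities give utilization, utilization times service rate gives throughput, and the Interactive Response Time Law converts throughput into response time.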

Example 2: Database Server Support System administrator’s question: How near capacity (utilization) is each of the system resources? Answer: Ucpu = 0.4521, Ufast = 0.3391, Uslow = 0.6783. These are found as direct sums of the relevant steady-state probabilities: Ucpu = p(2,0,0) + p(1,1,0) + p(1,0,1) = 0.4521; Ufast = p(1,1,0) + p(0,2,0) + p(0,1,1) = 0.3391; Uslow = p(1,0,1) + p(0,1,1) + p(0,0,2) = 0.6783

Example 2: Database Server Support Company president’s question: If I capture Company X’s clientele, the number of active users on my system will likely double. What new performance levels should I spin in my speech to the newly acquired customers? Answer: The throughput is predicted to go from 2.7126 to 3.4768 transactions/minute; the response time is predicted to go from 44.24 to 69.03 seconds. With 4 users, the model now has 15 states.

Example 2: Database Server Support Company pessimist’s question: Since I know that the fast disk is about to fail and all the files will need to be moved to the slow disk, what will the new response time be? Answer: 65.00 seconds/transaction. The model now has 2 devices and 3 states: (2,0), (1,1), (0,2). These examples demonstrate how to use the knowledge of the steady-state probabilities to arrive at more meaningful and more useful performance metrics.

10.7 Model Assumptions and Limitations Markov models are quite robust. However, there are key assumptions and resulting limitations. Memoryless Assumption: It is assumed that all the important system information is captured in the state descriptors of a Markov model. That is, simply knowing which state the system is in uniquely defines all relevant information. Knowing the current state alone is sufficient; this is the defining Markov characteristic. Any other information is unnecessary as far as the system’s future behavior is concerned: previous history can be forgotten.

10.7 Model Assumptions and Limitations Resulting Limitation: Because everything must be captured in the state descriptor, Markov models are susceptible to state space explosion. Large state spaces imply additional complexity and a potential loss of accuracy.

10.7 Model Assumptions and Limitations Exponential Assumption: Markov models assume that the time spent between relevant events, such as job arrival times and job service times, is exponentially distributed. The exponential distribution is the only continuous distribution that is memoryless. For example, suppose the average service time is 10 seconds. Knowing that a customer has already received 5 seconds of CPU time but has not yet finished (previous history, which is irrelevant under the Markov memoryless assumption), the average amount of CPU time still needed is again 10 seconds.

10.7 Model Assumptions and Limitations Resulting Limitation: To mitigate the limitation imposed by the exponential assumption, the concept of phases (or stages) can be introduced. For example, a 10-second service time can be partitioned into two sequential phases, each exponentially distributed with a mean of five seconds. However, the price is again a potential state space explosion, since the state descriptor must now contain this additional phase information.
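The effect of phases can be illustrated by a small simulation (a Monte Carlo sketch in Python, not from the slides): two exponential phases of mean 5 each form an Erlang-2 distribution, which keeps the 10-second mean of a single exponential but has lower variability.

```python
import random

random.seed(42)
N = 200_000

# Single exponential service time with mean 10 seconds.
one_phase = [random.expovariate(1 / 10) for _ in range(N)]
# Two sequential exponential phases, each with mean 5 seconds (Erlang-2).
two_phase = [random.expovariate(1 / 5) + random.expovariate(1 / 5)
             for _ in range(N)]

def cv2(xs):
    """Squared coefficient of variation: variance / mean^2."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return v / (m * m)

print(round(sum(two_phase) / N, 1))   # mean stays about 10
print(round(cv2(one_phase), 2))       # about 1.0 for the exponential
print(round(cv2(two_phase), 2))       # about 0.5 for the Erlang-2
```

This is exactly the trade the slide describes: phases let a Markov model represent less variable (non-exponential) times, at the cost of tracking the current phase in the state descriptor.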