

1 Analysis of Simulation Results Chapter 25

2 Overview  Analysis of Simulation Results  Model Verification Techniques  Model Validation Techniques  Transient Removal  Terminating Simulations  Stopping Criteria: Variance Estimation  Variance Reduction

3 Model Verification vs. Validation

4 Model Verification Techniques

5 Analysis of Simulation Results  Would like model output to be close to that of the real system  We made assumptions about the behavior of the real system  1st step: test whether the assumptions are reasonable  Validation, or representativeness of assumptions  2nd step: test whether the model implements those assumptions  Verification, or correctness  These are distinct checks; a model may pass one and fail the other  Ex: what was your project 1? “Always assume that your assumption is invalid.” – Robert F. Tatman

6 Top Down Modular Design

7 Antibugging  Antibugging consists of including additional checks and outputs in the program that will point out bugs, if any  For example, the model counts the number of packets sent by the source nodes as well as the number received by the destination nodes  Check that packets sent = packets received + packets dropped or still in transit  If the counts do not balance, halt or warn
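As a sketch, such a check can be a single invariant function called at the end of the run; the counter names here are illustrative, not from the chapter:

```python
def check_packet_conservation(sent, received, dropped, in_transit):
    """Antibugging check: every packet sent must be accounted for
    as received, dropped, or still in transit."""
    if sent != received + dropped + in_transit:
        raise RuntimeError(
            f"packet count mismatch: {sent} sent, but {received} received "
            f"+ {dropped} dropped + {in_transit} in transit")

# Passes silently when the counters balance:
check_packet_conservation(sent=100, received=95, dropped=3, in_transit=2)
```

Calling the check periodically during the run, not just at the end, localizes the bug to the interval in which the invariant first breaks.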

8 Structured Walk-Through  A structured walk-through consists of explaining the code to another person or a group  The code developer explains what each line of code does; the act of explaining often exposes the bug

9 Deterministic Models  The key problem in debugging simulation models is the randomness of variables  A deterministic program is easier to debug than a program with random variables  By specifying constant (deterministic) distributions, the user can easily compute the expected output values and thus debug the modules
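A minimal sketch of this idea, using a hypothetical interarrival-time routine: a flag swaps the random distribution for its constant mean while debugging.

```python
import random

def interarrival_time(rng, mean=2.0, deterministic=False):
    # Debug mode: replace the exponential distribution with its constant
    # mean so every downstream value can be checked by hand.
    if deterministic:
        return mean
    return rng.expovariate(1.0 / mean)

rng = random.Random(42)
debug_value = interarrival_time(rng, deterministic=True)   # always 2.0
```

Once the modules produce the hand-computable values, switch the flag off and rerun with the real distributions.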

10 Run Simplified Cases  The model may be run with simple cases  Of course, a model that works for simple cases is not guaranteed to work for more complex cases. Therefore, the test cases should be as complex as can be easily analyzed without simulation  Only one packet  Only one source  Only one intermediate node

11 Trace

12 On-Line Graphic Displays

13 Continuity tests  A slight change in input should yield only a slight change in output; otherwise there is a bug in the simulation  (Figure: throughput curves for the debugged vs. undebugged model)

14 Degeneracy tests  Try extreme configurations and workloads  Extremes (lowest and highest values) may reveal bugs  Ex: one CPU, zero disks

15 More Model Verification Techniques  Consistency tests – similar inputs should produce similar outputs  Ex: 2 sources at 50 pkts/sec should produce the same total as 1 source at 100 pkts/sec  Seed independence – the random number generator’s starting value should not affect the final conclusion (individual outputs may differ, but not the overall conclusion)
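A seed-independence check can be sketched as below, with a simple mean estimate standing in for a full simulation run (names and values are illustrative):

```python
import random

def estimate_mean(seed, n=10_000, true_mean=5.0):
    # Stand-in for a simulation run: estimate a mean response time
    # from n exponential samples generated with the given seed.
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0 / true_mean) for _ in range(n)) / n

# Individual estimates differ slightly across seeds, but every run should
# land near the true mean, so the overall conclusion is seed-independent.
estimates = [estimate_mean(seed) for seed in (1, 2, 3)]
```

If one seed produces an estimate far from the others, suspect the model (or the random number generator), not the seed.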

16 Model Validation Techniques  Ensure the assumptions used are reasonable  Want the final simulated system to behave like the real system  Unlike verification, the techniques used to validate one model may differ from those for another  Three key aspects to validate: 1. Assumptions 2. Input parameter values and distributions 3. Output values and conclusions  Compare the validity of each to one or more of: 1. Expert intuition 2. Real system measurements 3. Theoretical results  That gives 9 combinations – not all are always possible, however

17 Model Validation Techniques – Expert Intuition  Most practical, most common  “Brainstorm” with people knowledgeable in the area  Present measured results alongside simulated results (to see if the experts can tell the difference)  (Figure: throughput vs. packet loss probabilities 0.2, 0.4, 0.8 – which alternative looks invalid? Why?)

18 Model Validation Techniques – Real System Measurements  Most reliable and preferred  May be infeasible because the system does not exist or is too expensive to measure  That could be why we are simulating in the first place!  But even one or two measurements add an enormous amount to the validity of the simulation  Should compare input values, output values, and workload characterization  Use multiple traces for trace-driven simulations  Can use statistical techniques (confidence intervals) to determine whether the simulated values differ from the measured values
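One sketch of such a comparison, using a large-sample z-based confidence interval for the difference of means (a simplification; for small samples a t-based interval is more appropriate):

```python
from statistics import NormalDist, mean, stdev

def difference_ci(sim, meas, confidence=0.95):
    """Approximate confidence interval for (simulated mean - measured mean),
    reasonable when both samples are fairly large."""
    diff = mean(sim) - mean(meas)
    se = (stdev(sim) ** 2 / len(sim) + stdev(meas) ** 2 / len(meas)) ** 0.5
    z = NormalDist().inv_cdf(0.5 + confidence / 2)   # e.g., 1.96 for 95%
    return diff - z * se, diff + z * se
```

If the interval contains zero, the measurements give no evidence that the simulation differs from the real system at that confidence level.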

19 Measurement  Three-way validation – compare each pair:  Measurement and simulation  Analytical model and simulation  Analytical model and measurement

20 Theoretical Results  Agreement of analysis and simulation does not prove either correct – both may be invalid, since the model and the simulation can share the same wrong assumptions  At best, agreement shows that the model is not invalid  The comparison also serves to validate the analysis  Use theory in conjunction with experts’ intuition  E.g., use theory for a large configuration that cannot be measured

21 Exercise
Imagine that you have been called as an expert to review a simulation study. Which of the following simulation results would you consider non-intuitive and would want carefully validated?
1. The throughput of a system increases as its load increases.
2. The throughput of a system decreases as its load increases.
3. The response time increases as the load increases.
4. The response time of a system decreases as its load increases.
5. The loss rate of a system decreases as the load increases.

22 Outline  Introduction  Common Mistakes in Simulation  Terminology  Selecting a Simulation Language  Types of Simulations  Verification and Validation  Transient Removal  Termination

23 Transient Removal  Most simulations want only steady-state results  Remove the initial transient state  Trouble is, it is not possible to define exactly what constitutes the end of the transient state  Use heuristics:  Long runs  Proper initialization  Truncation  Initial data deletion  Moving average of independent replications  Batch means

24 Long Runs  Use very long runs  The effects of the transient state will be amortized  But … wastes resources  And it is tough to choose how long is “enough”  Recommendation … don’t use long runs alone

25 Proper Initialization  Start the simulation in a state close to the expected steady state  Ex: a CPU scheduler simulation may start with some jobs already in the queue  Determine starting conditions from previous simulations or simple analysis  May decrease the run length, but still may not provide confidence that the system is in a stable condition

26 Truncation (Example next)
Assume variability during steady state is less than during the transient state
Variability is measured in terms of the range – (min, max)
If the trajectory of the range stabilizes, assume the simulation is in a stable state
Method:
– Given n observations {x1, x2, …, xn}
– Ignore the first l observations
– Calculate the (min, max) of the remaining n − l
– Repeat for l = 1, …, n
– Stop when the (l+1)th observation is neither the min nor the max of the remaining observations

27 Truncation Example
Sequence: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9, …
Ignore the first (l=1): the range is (2, 11) and the 2nd observation (l+1) is the min
Ignore the first two (l=2): the range is (3, 11) and the 3rd observation is the min
…
Finally, at l=9 the range is (9, 11) and the 10th observation is neither the min nor the max
So, discard the first 9 observations – the transient interval
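The truncation rule can be sketched in a few lines; applied to the example sequence it discards the first 9 observations, as above:

```python
def truncation_transient(obs):
    """Smallest l such that the (l+1)th observation is neither the min
    nor the max of the remaining n - l observations."""
    n = len(obs)
    for l in range(1, n):
        rest = obs[l:]                      # remaining n - l observations
        if obs[l] != min(rest) and obs[l] != max(rest):
            return l                        # discard the first l observations
    return n                                # no steady state detected

seq = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9]
transient = truncation_transient(seq)       # 9, matching the example
```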

28 Truncation Example 2 (1 of 2)
Find the duration of the transient interval for: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10

29 Truncation Example 2 (2 of 2)
Find the duration of the transient interval for: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10
When l=3, the range is (5, 10) and the 4th observation (6) is neither the min nor the max
So, only 3 observations are discarded, even though the real transient lasts 6 – truncation can underestimate the transient interval
(Figure: “real” transient vs. the transient assumed by truncation)

30 Replication  Replications are repeated runs that differ only in the random number generator seed  Runs with the same seed produce identical output and are not replications  Average across several replications to smooth out random fluctuations

31 Initial Data Deletion  Study the average after some initial observations are deleted from the sample  If the average does not change much, the deleted observations must be from the steady state  However, since randomness can cause fluctuations during steady state, multiple runs (with different seeds) are needed  Given m replications of size n each, with x_ij the jth observation of the ith replication  Note: j varies along the time axis and i varies across replications

32 Initial Data Deletion (cont)


35 Initial Data Deletion  (Figure: four plots – an individual replication x_ij vs. j; the mean across replications x̄_j vs. j; the mean of the last n − l observations x̄_l vs. l; and the relative change (x̄_l − x̄)/x̄ vs. l, whose knee marks the transient interval)
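The deletion procedure can be sketched as follows, assuming the x_ij data is available as a list of replications; the knee of the returned curve estimates the transient interval:

```python
from statistics import mean

def relative_change_curve(reps):
    """reps[i][j] = jth observation of the ith replication.
    Returns the relative change in the overall mean after deleting the
    first l observations, for l = 0 .. n-2."""
    n = len(reps[0])
    xbar = [mean(rep[j] for rep in reps) for j in range(n)]  # mean across replications
    overall = mean(xbar)                                     # overall mean
    return [(mean(xbar[l:]) - overall) / overall for l in range(n - 1)]
```

On data with a clear transient, the curve rises while transient observations are being deleted and then flattens; the l at the knee is the number of observations to discard.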

36 Batch Means  Run a long simulation and divide it into equal-duration parts  Part = batch = sub-sample  Study the variance of the batch means as a function of the batch size  (Figure: responses vs. observation number, divided into batches at n, 2n, 3n, 4n, 5n)

37 Batch Means (cont)


39 Batch Means  Plot the variance of the batch means versus the batch size n  Ignore peaks followed by an upswing; the batch size at which the variance peaks corresponds to the transient interval  (Figure: variance of batch means vs. batch size n, with the transient interval marked)
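A sketch of the batch-size sweep, using i.i.d. data as a stand-in for steady-state simulation output (for independent observations the variance of batch means falls roughly as 1/n):

```python
from statistics import mean, variance
import random

def batch_mean_variances(obs, sizes):
    """Variance of the batch means for each candidate batch size n."""
    result = {}
    for n in sizes:
        m = len(obs) // n                   # number of complete batches
        if m < 2:
            continue                        # need >= 2 batches for a variance
        means = [mean(obs[i * n:(i + 1) * n]) for i in range(m)]
        result[n] = variance(means)
    return result

rng = random.Random(1)
obs = [rng.random() for _ in range(1000)]
vars_by_size = batch_mean_variances(obs, [2, 5, 10, 25, 50])
```

For real transient-laden output the variance would first rise with n, then fall; for this i.i.d. stand-in it only falls.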

