Analysis of Simulation Results Chapter 25

Overview
 Analysis of Simulation Results
 Model Verification Techniques
 Model Validation Techniques
 Transient Removal
 Terminating Simulations
 Stopping Criteria: Variance Estimation
 Variance Reduction

Model Verification vs. Validation

Model Verification Techniques

Analysis of Simulation Results
"Always assume that your assumption is invalid." – Robert F. Tatman
 Would like the model output to be close to that of the real system
 We made assumptions about the behavior of the real system
 1st step: test whether those assumptions are reasonable (validation, or representativeness of the assumptions)
 2nd step: test whether the model implements those assumptions (verification, or correctness)
 Not mutually exclusive: a model can be a correct implementation of invalid assumptions, or an incorrect implementation of valid ones
 Ex: what was your project 1?

Top Down Modular Design

Antibugging
 Antibugging consists of including additional checks and outputs in the program that will point out bugs, if any
 Ex: the model counts the number of packets sent by the source nodes as well as the number received by the destination nodes
 Check that packets sent = packets received + packets dropped or still in transit
 If the check fails, halt or print a warning
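A minimal sketch of such a check in Python; the counter names and the places they are incremented are assumptions about the model's structure:

```python
class AntibugCounters:
    """Redundant packet counters kept purely as an antibugging check."""

    def __init__(self):
        self.sent = 0      # incremented by every source node
        self.received = 0  # incremented by every destination node
        self.dropped = 0   # incremented whenever a packet is discarded

    def check(self):
        """Conservation check: every packet sent must be accounted for
        as received, dropped, or still in flight."""
        in_flight = self.sent - self.received - self.dropped
        assert in_flight >= 0, (
            f"bug: received ({self.received}) + dropped ({self.dropped}) "
            f"exceed sent ({self.sent})")
        return in_flight
```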

Structured Walk-Through
 A structured walk-through consists of explaining the code to another person or a group
 The code developer explains what each line of code does

Deterministic Models
 The key problem in debugging simulation models is the randomness of the variables
 A deterministic program is easier to debug than a program with random variables
 By specifying constant (deterministic) distributions, the user can easily predict the output variables and thus debug the modules
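For example, a sketch of switching a model between deterministic and random inter-arrival times for debugging; the interarrival function is a hypothetical hook, not part of any particular simulator:

```python
import random

DEBUG = True  # deterministic run while debugging

def interarrival(mean=1.0):
    """Inter-arrival time: constant while debugging, random otherwise."""
    if DEBUG:
        return mean                         # deterministic, output is predictable
    return random.expovariate(1.0 / mean)   # exponential for production runs
```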

Run Simplified Cases
 The model may be run with simple cases, for example:
 Only one packet
 Only one source
 Only one intermediate node
 Of course, a model that works for simple cases is not guaranteed to work for more complex ones; therefore, the test cases should be as complex as can still be analyzed easily without simulation

Trace

On-Line Graphic Displays

Continuity Tests
 A slight change in the input should yield a slight change in the output; otherwise there is a bug in the simulation
[Figure: throughput vs. input rate for the debugged and undebugged simulations]
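A sketch of an automated continuity test; simulate is a hypothetical function mapping an arrival rate to a mean throughput:

```python
def continuity_check(simulate, rate, delta=0.01, tol=0.10):
    """Perturb one input slightly and compare the outputs.

    A large relative jump in the output for a small change in the
    input points at a bug in the simulation.
    """
    t1 = simulate(rate)
    t2 = simulate(rate * (1 + delta))
    return abs(t2 - t1) / abs(t1) <= tol
```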

Degeneracy Tests
 Try extreme configurations and workloads: the lowest and highest values may reveal bugs
 Ex: one CPU, zero disks

More Model Verification Techniques
 Consistency tests: similar inputs should produce similar outputs
 Ex: 2 sources at 50 pkts/sec should produce the same total as 1 source at 100 pkts/sec
 Seed independence: the random number generator's starting value should not affect the final conclusion (individual outputs may differ, but not the overall conclusion)
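A sketch of a seed-independence check; simulate(seed) is a hypothetical entry point that runs the model and returns one summary output:

```python
import statistics

def seed_independence(simulate, seeds=(1, 7, 42, 1234)):
    """Run the model under several seeds and summarize the outputs.

    Individual outputs may differ, but the spread of the results
    should be small relative to their mean; a seed that changes the
    overall conclusion signals a problem.
    """
    results = [simulate(seed) for seed in seeds]
    return statistics.mean(results), statistics.stdev(results)
```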

Model Validation Techniques
 Ensure the assumptions used are reasonable: we want the final simulated system to behave like the real system
 Unlike verification, the techniques used to validate one simulation may differ from one model to another
 Three key aspects to validate:
1. Assumptions
2. Input parameter values and distributions
3. Output values and conclusions
 Compare the validity of each against one or more of:
1. Expert intuition
2. Real system measurements
3. Theoretical results
 This gives 9 combinations; not all are always possible, however

Model Validation Techniques: Expert Intuition
 Most practical, most common
 "Brainstorm" with people knowledgeable in the area
 Present measured results and compare them to simulated results (see whether the experts can tell the difference)
[Figure: throughput and packet loss probability plots for several alternatives. Which alternative looks invalid? Why?]

Model Validation Techniques: Real System Measurements
 Most reliable and preferred
 May be infeasible because the system does not exist or is too expensive to measure; that could be why we are simulating in the first place
 But even one or two measurements add an enormous amount to the validity of the simulation
 Should compare input values, output values, and workload characterization
 Use multiple traces for trace-driven simulations
 Can use statistical techniques (confidence intervals) to determine whether the simulated values differ from the measured values
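A sketch of such a confidence-interval comparison, using a normal approximation and assuming the two samples are independent; for small samples, replace z with the appropriate t quantile:

```python
import math
import statistics

def ci_of_difference(sim, meas, z=1.96):
    """Approximate 95% confidence interval for mean(sim) - mean(meas).

    If the interval contains 0, the simulated and measured values are
    not significantly different at this confidence level.
    """
    diff = statistics.mean(sim) - statistics.mean(meas)
    se = math.sqrt(statistics.variance(sim) / len(sim) +
                   statistics.variance(meas) / len(meas))
    return diff - z * se, diff + z * se
```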

Measurement
 Three pairwise comparisons are possible:
 Measurement and simulation
 Analytical model and simulation
 Analytical model and measurement

Theoretical Results
 Compare simulation results with those of an analytical model
 Agreement (analysis = simulation) does not prove correctness: the model and the simulation may both be wrong, e.g., if they share the same invalid assumptions
 The comparison is often used the other way as well, to validate the analysis
 Use theory in conjunction with experts' intuition, e.g., use theory for a large configuration that cannot be measured
 At best, this can show that the model is not invalid

Exercise
Imagine that you have been called as an expert to review a simulation study. Which of the following simulation results would you consider non-intuitive and want carefully validated?
1. The throughput of a system increases as its load increases.
2. The throughput of a system decreases as its load increases.
3. The response time increases as the load increases.
4. The response time of a system decreases as its load increases.
5. The loss rate of a system decreases as the load increases.

Outline
 Introduction
 Common Mistakes in Simulation
 Terminology
 Selecting a Simulation Language
 Types of Simulations
 Verification and Validation
 Transient Removal
 Termination

Transient Removal
 Most simulations are interested only in steady-state results
 Remove the initial transient state
 The trouble is, it is not possible to define exactly what constitutes the end of the transient state
 Use heuristics:
 Long runs
 Proper initialization
 Truncation
 Initial data deletion
 Moving average of replications
 Batch means

Long Runs
 Use very long runs, so the effects of the transient state are amortized
 But this wastes resources
 And it is tough to choose how long is "enough"
 Recommendation: do not use long runs alone

Proper Initialization
 Start the simulation in a state close to the expected steady state
 Ex: a CPU scheduler simulation may start with some jobs already in the queue
 Determine the starting conditions from previous simulations or simple analysis
 May reduce the run length, but still may not provide confidence that the system is in a stable condition

Truncation
 Assume variability during steady state is less than during the transient state
 Variability is measured in terms of the range (min, max)
 If the trajectory of the range stabilizes, assume the system is in a stable state
 Method: given n observations {x1, x2, …, xn}
 Ignore the first l observations
 Calculate the (min, max) of the remaining n - l
 Repeat for l = 1, …, n
 Stop when the (l+1)th observation is neither the min nor the max
(Example next)

Truncation Example
 Sequence: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9, …
 Ignore the first (l=1): the range is (2, 11) and the 2nd observation (l+1) is the min
 Ignore the first two (l=2): the range is (3, 11) and the 3rd observation is the min
 …
 Finally, at l=9 the range is (9, 11) and the 10th observation is neither min nor max
 So, discard the first 9 observations: the transient interval is 9
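A minimal sketch of the heuristic in Python; it reproduces the example above, returning 9 for the sequence shown:

```python
def truncation_point(x):
    """Truncation heuristic for transient removal.

    Ignore the first l observations and stop at the first l for which
    the (l+1)th observation is neither the min nor the max of the
    remaining n - l observations.
    """
    n = len(x)
    for l in range(1, n - 1):
        rest = x[l:]                  # observations l+1 .. n
        if min(rest) < x[l] < max(rest):
            return l                  # x[l] is neither min nor max
    return n                          # no steady state detected

seq = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9]
print(truncation_point(seq))          # 9: discard the first 9 observations
```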

Truncation Example 2 (1 of 2)
 Find the duration of the transient interval for: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10

Truncation Example 2 (2 of 2)
 Find the duration of the transient interval for: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10
 When l=3, the range is (5, 10) and the 4th observation (6) is neither min nor max
 So only the first 3 observations are discarded, even though the "real" transient lasts 6 observations
[Figure: the sequence with the assumed transient (3 observations) marked against the "real" transient (6 observations)]

Replication
 Use a different seed for each run to obtain independent replications (two runs with the same seed are identical, not replications)
 Average across several replications to smooth out random fluctuations

Initial Data Deletion
 Study the average after some initial observations are deleted from the sample
 If the average does not change much, the deleted observations must have come from the steady state
 However, since randomness can cause fluctuations even during steady state, multiple runs (with different seeds) are needed
 Given m replications of size n each, let x_ij denote the jth observation of the ith replication
 Note: j varies along the time axis and i varies across replications

Initial Data Deletion (cont.)
 Compute the mean across replications at each time step: x̄_j = (1/m) Σ_i x_ij
 Compute the overall mean: x̄ = (1/n) Σ_j x̄_j
 Delete the first l observations and compute the mean of the last n - l: x̄_l = (1/(n-l)) Σ_{j=l+1..n} x̄_j
 Compute the relative change (x̄_l - x̄) / x̄ and plot it versus l
 The knee of the relative-change curve marks the end of the transient interval
[Figure: an individual replication x_ij vs. j; the mean across replications x̄_j vs. j; the relative change vs. l, with the knee marking the transient interval]
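A sketch of the computation in Python; the knee is then found by inspecting or plotting the returned curve:

```python
import statistics

def relative_change_curve(reps):
    """Initial data deletion: reps is a list of m replications, each a
    list of n observations, so reps[i][j] is x_ij.

    Returns the relative change in the overall mean after deleting the
    first l observations, for l = 0 .. n-2.  The knee of this curve
    (where it flattens) estimates the transient interval.
    """
    n = len(reps[0])
    # mean across replications at each time step j
    xbar = [statistics.mean(rep[j] for rep in reps) for j in range(n)]
    overall = statistics.mean(xbar)
    return [(statistics.mean(xbar[l:]) - overall) / overall
            for l in range(n - 1)]
```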

Batch Means
 Run a long simulation and divide it into equal-duration parts
 Part = batch = sub-sample
 Study the variance of the batch means as a function of the batch size
[Figure: responses vs. observation number, divided into batches of size n at n, 2n, 3n, 4n, 5n]

Batch Means (cont.)
 For each candidate batch size n:
 Compute the mean of each batch
 Compute the variance of the batch means
 Plot the variance of the batch means versus the batch size n

Batch Means (cont.)
 The variance of the batch means peaks and then declines; the batch size at the peak indicates the transient interval
 Ignore peaks that are followed by an upswing
[Figure: variance of batch means vs. batch size n, with the peak marking the transient interval and a later peak labeled "(Ignore)"]
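A sketch of the variance computation for one batch size; sweeping n and plotting the result gives the curve described above:

```python
import statistics

def batch_mean_variance(x, n):
    """Variance of the batch means for batch size n.

    Split the observations into consecutive batches of size n (dropping
    any partial final batch), compute each batch's mean, and return the
    variance of those means.
    """
    means = [statistics.mean(x[i:i + n])
             for i in range(0, len(x) - n + 1, n)]
    return statistics.variance(means) if len(means) > 1 else 0.0
```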