ITEC 451 Network Design and Analysis

You will Learn (1)
- Specifying performance requirements
- Evaluating design alternatives
- Comparing two or more systems
- Determining the optimal value of a parameter (system tuning)
- Finding the performance bottleneck (bottleneck identification)

You will Learn (2)
- Characterizing the load on the system (workload characterization)
- Determining the number and sizes of components (capacity planning)
- Predicting the performance at future loads (forecasting)

Basic Terms (1)
- System: any collection of hardware, software, or both
- Model: a mathematical representation of a concept, phenomenon, or system
- Metrics: the criteria used to evaluate the performance of a system
- Workload: the requests made by the users of a system

Basic Terms (2)
- Parameters: system and workload characteristics that affect system performance
- Factors: parameters that are varied in a study, especially those that depend on users
- Outliers: values (in a set of measurement data) that are too high or too low compared to the majority
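To make the last term concrete, here is a minimal Python sketch (not from the slides) that flags outliers in a set of measured response times using the common 1.5 x IQR rule; the data values are hypothetical.

```python
# Flag suspected outliers in hypothetical response-time measurements (ms)
# using the interquartile-range (IQR) rule.
import statistics

samples_ms = [12.1, 11.8, 12.4, 12.0, 95.3, 11.9, 12.2, 12.3]  # hypothetical data

q1, _, q3 = statistics.quantiles(samples_ms, n=4)   # quartile cut points
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = [x for x in samples_ms if x < low or x > high]
print("suspected outliers:", outliers)               # 95.3 would be flagged here
```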

Common Mistakes in Performance Evaluation (1)
1. No Goals (Goals → Techniques, Metrics, Workload)
2. Biased Goals (e.g., setting out to show that OUR system is better than THEIRS)
3. Unsystematic Approach
4. Analysis Without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique

Common Mistakes in Performance Evaluation (2)
8. Overlook Important Parameters
9. Ignore Significant Factors
10. Inappropriate Experimental Design
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignore Errors in Input

Common Mistakes in Performance Evaluation (3)
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability
19. Too Complex Analysis
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions and Limitations

Checklist for Avoiding Common Mistakes (1)
- Is the system correctly defined and are the goals clearly stated?
- Are the goals stated in an unbiased manner?
- Have all the steps of the analysis been followed systematically?
- Is the problem clearly understood before analyzing it?
- Are the performance metrics relevant for this problem?
- Is the workload correct for this problem?

Checklist for Avoiding Common Mistakes (2)
- Is the evaluation technique appropriate?
- Is the list of parameters that affect performance complete?
- Have all parameters that affect performance been chosen as factors to be varied?
- Is the experimental design efficient in terms of time and results?
- Is the level of detail proper?
- Is the measured data presented with analysis and interpretation?

Checklist for Avoiding Common Mistakes (3)
- Is the analysis statistically correct?
- Has the sensitivity analysis been done?
- Would errors in the input cause an insignificant change in the results?
- Have the outliers in the input or output been treated properly?
- Have future changes in the system and workload been modeled?
- Has the variance of the input been taken into account?

Checklist for Avoiding Common Mistakes (4)
- Has the variance of the results been analyzed?
- Is the analysis easy to explain?
- Is the presentation style suitable for its audience?
- Have the results been presented graphically as much as possible?
- Are the assumptions and limitations of the analysis clearly documented?

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Technique(s)
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

State Goals and Define the System
- Identify the goal of the study
  - Not trivial, and it will affect every decision or choice you make down the road
- Clearly define the system
  - Where you draw the boundary will dictate the choice of model and affect the choice of metrics and workload

List Services and Outcomes
- Identify the services offered by the system
- For each service, identify all possible outcomes
- What's the point? Those will help in the selection of appropriate metrics

Select Appropriate Metrics (1)
- Metrics are the criteria for performance evaluation
- Desired properties: Specific, Measurable, Acceptable, Realizable
- Through examples?
- Prefer metrics that have low variability, are non-redundant, and are complete

Select Appropriate Metrics (2) - Examples
- Successful Service Rate – Throughput
- Frequency of Correct Results – Reliability
- Being Available When Needed – Availability
- Serving Users Fairly – Fairness
- Efficiency of Resource Usage – Utilization
Now, how to measure these? (A rough sketch follows.)
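As a rough illustration of how these example metrics might be computed from measurement data, here is a small Python sketch; all counter names and values are hypothetical.

```python
# Hypothetical counters collected over one measurement interval.
interval_s      = 600.0     # length of the observation window (s)
requests_done   = 48_000    # successfully served requests
correct_results = 47_900    # responses that passed a correctness check
uptime_s        = 594.0     # time the service answered probes (s)
busy_time_s     = 420.0     # time the bottleneck resource was busy (s)

throughput   = requests_done / interval_s        # requests per second
reliability  = correct_results / requests_done   # fraction of correct results
availability = uptime_s / interval_s             # fraction of time available
utilization  = busy_time_s / interval_s          # fraction of time busy

print(f"throughput={throughput:.1f} req/s, reliability={reliability:.4f}, "
      f"availability={availability:.3f}, utilization={utilization:.3f}")
```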

Select Appropriate Metrics (3) - A Classification
- Higher is Better (e.g., throughput)
- Lower is Better (e.g., response time)
- Nominal is the Best (e.g., utilization: neither too high nor too low)

Select Appropriate Metrics (4) - Criteria for Metric Set Selection
- Low variability
  - Helps reduce the number of runs needed
  - Advice: avoid ratios of two variables
- Non-redundancy
  - Helps make the results less confusing and reduces the effort
  - Try to find a relationship between metrics; if a simple relationship exists, keep only one
- Completeness
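The first two criteria can be checked with simple statistics. Below is a Python sketch, assuming hypothetical per-run values for two candidate metrics (statistics.correlation requires Python 3.10+).

```python
# Two quick checks on candidate metrics:
#  1) coefficient of variation as a low-variability check,
#  2) correlation between two metrics as a redundancy check.
import statistics

throughput  = [940, 955, 948, 960, 951, 945]          # hypothetical per-run values
utilization = [0.62, 0.64, 0.63, 0.65, 0.64, 0.63]

def coeff_of_variation(xs):
    return statistics.stdev(xs) / statistics.mean(xs)

print("CV(throughput):", round(coeff_of_variation(throughput), 3))

# If two metrics are almost perfectly correlated, keeping both adds little.
r = statistics.correlation(throughput, utilization)    # Python 3.10+
print("correlation(throughput, utilization):", round(r, 3))
```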

Select Appropriate Metrics (5) - Summary
- Metrics chosen should be:
  - Measurable (a numerical value can be assigned)
  - Acceptable
  - Easy to work with (i.e., can be measured easily)
- Avoid redundancy
- Pay attention to the units used
- Sanity check: examine the boundary conditions (e.g., best system, ideal workload) to see if the metric behaves sensibly

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Technique(s)
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

List the Parameters
- Identify all system and workload parameters
  - System parameters: characteristics of the system that affect system performance
  - Workload parameters: characteristics of usage (or workload) that affect system performance
- Categorize them according to their effects on system performance
- Determine the range of their variation or expected variation
- Decide on one, or at most a couple, to vary while keeping the others fixed

Select Evaluation Technique(s) (1)
Three techniques:
- Measurement
- Simulation
- Analytical Modeling

Select Evaluation Technique(s) (2) - Measurement, Simulation, or Analysis?
- Can be a combination of two or all three
- Use the goal of the study to guide your decision
- Remember, each of these techniques has its pros and cons

Select Evaluation Technique(s) (3) - Measurement
(+) Provides realistic data
(+) Can test the limits on load
(-) The system or a prototype must already be working
(-) The prototype may not represent the actual system
(-) Not that easy to correlate cause and effect
Challenges:
- Defining appropriate metrics
- Using an appropriate workload
- Using statistical tools to analyze the data
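For the last challenge, a common first step is to summarize repeated measurements with a mean and a confidence interval. A minimal Python sketch on hypothetical response-time samples (the t quantile 2.365 corresponds to a 95% interval with 7 degrees of freedom):

```python
# Mean and 95% confidence interval for repeated response-time measurements.
import statistics
from math import sqrt

response_ms = [101.2, 98.7, 103.4, 99.8, 100.5, 102.1, 97.9, 101.0]  # hypothetical

n = len(response_ms)
mean = statistics.mean(response_ms)
sd = statistics.stdev(response_ms)
t_975 = 2.365                      # t quantile for 95% CI, n - 1 = 7 degrees of freedom
half_width = t_975 * sd / sqrt(n)

print(f"mean = {mean:.1f} ms, "
      f"95% CI = ({mean - half_width:.1f}, {mean + half_width:.1f}) ms")
```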

Select Evaluation Technique(s) (4) - Simulation
(+) Less expensive than building a prototype
(+) Can test under more load scenarios
(-) Synthetic, since the model is not the actual system
(-) Cannot be used to make guarantees on expected performance
Challenges:
- Knowing when to use simulation
- Getting the model right
- Representing the results well (graphical tools)
- Learning the simulation tools
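As a tiny, self-contained example of the kind of model a simulation study might use, here is a sketch of a single-server queue with exponential interarrival and service times (an M/M/1 queue); the rates and number of customers are hypothetical choices, not from the slides.

```python
# Minimal single-server queue simulation (M/M/1): estimate mean response time.
import random

random.seed(1)
arrival_rate, service_rate = 0.8, 1.0      # requests/sec; utilization = 0.8
n_customers = 100_000

clock = 0.0                 # arrival time of the current customer
server_free_at = 0.0        # time the server finishes its previous job
total_response = 0.0

for _ in range(n_customers):
    clock += random.expovariate(arrival_rate)      # next arrival
    start = max(clock, server_free_at)             # wait if the server is busy
    service = random.expovariate(service_rate)
    server_free_at = start + service
    total_response += server_free_at - clock       # waiting time + service time

print("simulated mean response time:", total_response / n_customers)
# Queueing theory predicts 1 / (service_rate - arrival_rate) = 5.0 seconds here.
```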

Select Evaluation Technique(s) (5) - Analytical Modeling
(+) Can make strong guarantees about expected behavior
(+) Can provide insight into cause and effect
(+) Does not require building a prototype
(-) Performance predictions are only as good as the model
Challenges:
- Significant learning curve
- Mathematically involved
- Choosing the right model (this is where the art lies)
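For the same hypothetical parameters as the simulation sketch above, an analytical M/M/1 model gives closed-form answers, which is what makes it possible to prove properties and to validate the simulation:

```python
# Closed-form M/M/1 results for the same hypothetical parameters.
arrival_rate, service_rate = 0.8, 1.0

rho = arrival_rate / service_rate                   # utilization
mean_response = 1.0 / (service_rate - arrival_rate) # mean response time
mean_in_system = rho / (1.0 - rho)                  # Little's law: N = lambda * R

print(f"utilization={rho:.2f}, mean response={mean_response:.1f} s, "
      f"mean jobs in system={mean_in_system:.1f}")
```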

Select Evaluation Technique(s) (6) - Bottom Line
You can use measurement to demonstrate the feasibility of an approach. You can use measurement or simulation to show evidence that your algorithm or system performs better than competing approaches in certain situations. But if you would like to claim any general properties of your algorithm (or system), the only option is to use analysis and mathematically prove your claim.

Select Evaluation Technique(s) (7) - When to Use What?
It is good to be versed in all three.
1. Start with measurement or simulation to get a feel for the model or the expected behavior
2. Build a simple model first
3. Perform an analysis to predict the performance and prove some behavioral properties
4. Observe the actual performance to determine the validity of your model and your analysis (use simulation for this step if a working system is not available or feasible)
5. Go back and revise the model and the analysis if a significant inconsistency is observed, then repeat from Step 4
6. Finally, use simulation to verify your results for large-scale data or for scenarios that cannot be modeled with the available expertise and time

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Technique(s)
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

Select Workload
- What is a workload? How do you represent it?
- Range of values
  - What should the increment size be?
- Probability distribution
  - Need to find a good model that approximates reality; this may require measurement and statistical analysis
  - In simulation, use an appropriate random number generator to produce the values
- Trace from an actual system
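A workload for a simulation can therefore be either synthetic (drawn from a fitted distribution with a random number generator) or trace-driven. A short Python sketch, with a hypothetical mean interarrival time and a hypothetical trace file name:

```python
# Synthetic vs. trace-driven workload generation (values and file name are hypothetical).
import random

random.seed(7)

# Synthetic: exponential interarrival times with a chosen mean of 50 ms.
interarrivals_ms = [random.expovariate(1 / 50.0) for _ in range(1000)]

# Trace-driven: replay timestamps recorded on a real system (hypothetical file).
# with open("arrivals.trace") as f:
#     timestamps = [float(line) for line in f]

print("synthetic mean interarrival:", sum(interarrivals_ms) / len(interarrivals_ms))
```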

Design Experiment(s)
- Goal: provide maximum information with minimum effort
- Field experiments can take enormous preparation time
  - Attempt to get several experiments done in one setup
  - Explore whether you can use data collected by someone else
  - Also explore whether you can use remote labs
  - Finally, explore whether you can use simulation without losing significant validity (modifying simulation code can be time consuming as well)
- In both simulation and measurement, repeat the same experiment (for a fixed workload and fixed parameter values) a sufficient number of times for statistical validity
- Always keep the goal in mind
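One way to decide how many repetitions are "sufficient" is to run a few pilot replications and size the experiment from their variance. A Python sketch, assuming hypothetical pilot results and a target confidence-interval half-width of 1% of the mean:

```python
# Estimate how many replications are needed for a 95% CI within +/- 1% of the mean.
import statistics
from math import ceil, sqrt

pilot = [4.81, 4.95, 4.78, 5.02, 4.88]        # hypothetical pilot-run results
mean, sd = statistics.mean(pilot), statistics.stdev(pilot)

z = 1.96                                      # normal approximation for 95% confidence
target_half_width = 0.01 * mean
n_needed = ceil((z * sd / target_half_width) ** 2)
print("replications needed:", n_needed)
```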

Analyze and Interpret Data
- In analytical modeling: carry out the mathematical derivations that prove the expected system behavior
- In measurement: statistically analyze the collected data and summarize the results by computing statistical measures

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Technique(s)
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

Present the Results (1)
In analytical modeling:
- Clear statements of lemmas and theorems
- Description of an algorithm with a proof of its properties
- Numerical computation results, to show how to use the formulae and to show the effect of varying the parameters
- Simulation or measurement results that show the validity of the model and the analysis

Present the Results (2)
In simulation and measurement:
- A clear statement of the goals of the experiment
- A list of assumptions
- The experimental setup: platforms, tools, units, ranges of values for parameters
- Graphical presentation of results (the simpler the graphs are to understand, the better)
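As an example of a simple graphical presentation, the sketch below plots a metric against a factor with error bars using matplotlib; all data values are hypothetical.

```python
# Plot mean response time versus offered load with 95% CI error bars.
import matplotlib.pyplot as plt

load = [0.2, 0.4, 0.6, 0.8]            # offered load (the factor being varied)
mean_resp = [1.2, 1.7, 2.6, 5.1]       # mean response time at each level (s)
ci_half = [0.1, 0.1, 0.2, 0.4]         # 95% CI half-widths

plt.errorbar(load, mean_resp, yerr=ci_half, marker="o", capsize=3)
plt.xlabel("Offered load")
plt.ylabel("Mean response time (s)")
plt.title("Response time vs. load (hypothetical data)")
plt.savefig("response_vs_load.png")
```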

Present the Results (3)
In all three cases, after presenting the results:
- Discuss the implications for users and how a user can apply the results
- Mention any additional applications that can benefit from your experiment
- Present conclusions: what did you learn (e.g., surprises, new directions)?
- Discuss limitations and future work

Review: Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Technique(s)
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results