Network Performance and Quality of Service


Network Performance and Quality of Service 2. Performance Evaluation

Performance Evaluation
Performance is a key criterion in the design, procurement, and use of systems. The goal of engineers, scientists, analysts, and users is to get the highest performance for a given cost. To achieve this goal, one needs, at least, a basic knowledge of performance evaluation techniques.

Basic Terms
System: any collection of hardware, software, and firmware
Metrics: the criteria used to evaluate the performance of the system and its components
Workloads: the requests made by the users of the system

Considerations (1 of 6)
Select appropriate evaluation techniques, performance metrics, and workloads for a system.
Techniques: measurement, simulation, analytic modeling
Metrics: criteria used to study performance (e.g., response time)
Workloads: requests made by users/applications to the system
Example: What performance metrics should you use to compare two disk drives? Two transaction-processing systems? Two packet retransmission algorithms?

Considerations (2 of 6)
Conduct performance measurements correctly. Two tools are needed:
Load generator: a tool to load the system
Monitor: a tool to measure the results
Example: Which type of monitor would be suitable for measuring the following quantities? Response time from a Web server? Utilization on a LAN? Audio quality in a VoIP network?

Considerations (3 of 6)
Use proper statistical techniques to compare several alternatives. One run of a workload is often not sufficient, because many non-deterministic events affect performance. Comparing the averages of several runs may also not lead to correct results, especially if the variance is high.
Example: Packets lost on a link. Which link is better?

File Size   Link A   Link B
1000        5        10
1200        7        3
1300        3        0
50          0        1
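
As a minimal sketch (not part of the original slides), one proper statistical treatment of the table above is a paired comparison: work with the per-file differences in losses and build a confidence interval for their mean. The 95% t quantile for 3 degrees of freedom (3.182) is hard-coded as an assumed constant.

```python
# Minimal sketch: paired comparison of packet losses on two links.
from statistics import mean, stdev
from math import sqrt

loss_a = [5, 7, 3, 0]   # packets lost on Link A per file transfer
loss_b = [10, 3, 0, 1]  # packets lost on Link B for the same transfers

diffs = [a - b for a, b in zip(loss_a, loss_b)]
n = len(diffs)
d_bar = mean(diffs)
s = stdev(diffs)
t_975_df3 = 3.182                      # Student-t quantile, df = n - 1 = 3
half_width = t_975_df3 * s / sqrt(n)

print(f"mean difference (A - B): {d_bar:.2f}")
print(f"95% CI: [{d_bar - half_width:.2f}, {d_bar + half_width:.2f}]")
# If the interval contains zero, the data do not show either link to be
# better at this confidence level.
```

With these four samples the interval easily contains zero, which is exactly why a single run, or a bare comparison of averages, is not enough to declare a winner.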

Considerations (4 of 6)
Design measurement and simulation experiments to provide the most information with the least effort. There are often many factors that affect performance; separate out the effects that individually matter.
Example: The performance of a network system depends upon three factors:
A) type of routing technique: R1, R2, or none
B) type of workload (traffic): HTTP, VoIP, gaming
C) type of equipment: hubs, switches
How many experiments are needed? How can the effect of each factor be estimated?
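
As an illustrative sketch (not from the slides), a full-factorial design over these factors simply enumerates every combination; the factor names and levels below just restate the example.

```python
# Sketch: full-factorial design over the example factors above.
# 3 routing techniques x 3 workloads x 2 equipment types = 18 experiments.
from itertools import product

factors = {
    "routing":   ["R1", "R2", "none"],
    "workload":  ["HTTP", "VoIP", "gaming"],
    "equipment": ["hub", "switch"],
}

runs = list(product(*factors.values()))
print(f"full factorial needs {len(runs)} experiments")  # prints 18
for routing, workload, equipment in runs:
    print(routing, workload, equipment)
```

Estimating how much of the variation in the measured results is explained by each factor (for example with an analysis of variance) then answers the second question, and fractional designs can reduce the 18 runs further.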

Considerations (5 of 6)
Perform simulations correctly. Select the correct language/software, random-number seeds, simulation run length, and analysis method. Before all of that, the simulator may need to be validated.
Example: To compare the performance of two routing algorithms:
A) how long should the simulation be run?
B) what can be done to get the same accuracy with a shorter run?
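
A minimal sketch of one common practice, independent replications with distinct seeds: the spread across replications shows whether the run length gives the required accuracy, and reusing the same seeds for both algorithms (common random numbers) is one way to get the same accuracy from shorter runs. Here simulate_routing is a hypothetical stand-in for a real simulator.

```python
# Sketch: independent replications with different random seeds.
import random
from statistics import mean, stdev

def simulate_routing(algorithm, seed, sim_time=10_000):
    """Hypothetical placeholder that returns a mean-delay estimate."""
    rng = random.Random(seed)
    return rng.gauss(5.0 if algorithm == "A" else 5.3, 0.4)

seeds = range(10)  # one seed per replication
for algo in ("A", "B"):
    delays = [simulate_routing(algo, s) for s in seeds]
    print(algo, f"mean={mean(delays):.2f}", f"stdev={stdev(delays):.2f}")
# A wide spread across replications means the runs are too short (or too
# few) for the accuracy required.
```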

Considerations (6 of 6)
Use simple queuing models to analyze the performance of systems. Queuing models describe a system analytically in terms of its arrival rate and service rate, and can represent multiple servers and multiple queues.
Example: For a given Web request rate, is it more effective to have 2 single-processor Web servers or 4 single-processor Web servers?
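
As an illustrative sketch, assuming each server farm is modeled as an M/M/c queue and using made-up arrival and service rates (the slides give no numbers), the Erlang-C formula gives the mean response time for each configuration.

```python
# Sketch: mean response time of an M/M/c queue via the Erlang-C formula.
# Arrival and service rates are assumed for illustration only.
from math import factorial

def mmc_response_time(lam, mu, c):
    """Mean response time of an M/M/c queue; requires lam < c*mu."""
    rho = lam / (c * mu)
    assert rho < 1, "offered load must be below total capacity"
    a = lam / mu
    wait_term = a**c / (factorial(c) * (1 - rho))
    p_wait = wait_term / (sum(a**k / factorial(k) for k in range(c)) + wait_term)
    return 1 / mu + p_wait / (c * mu - lam)

lam, mu = 30.0, 20.0        # requests/s offered; requests/s per server
for c in (2, 4):
    print(f"{c} servers: W = {mmc_response_time(lam, mu, c) * 1000:.1f} ms")
# With these assumed rates, 2 servers give ~114 ms and 4 servers ~51 ms:
# almost all of the difference is queueing delay, not service time.
```

Whether the extra servers are "more effective" then becomes a question of whether the queueing-delay reduction is worth the cost at the given request rate.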

The Art of Performance Evaluation
When first presented to an analyst, most performance problems are expressed as an abstract feeling by the end user. Defining the real problem and converting it into a form in which established tools and techniques can be used, and in which time and other constraints can be met, is a major part of the analyst's "art".

The Art of Performance Evaluation
An evaluation cannot be produced mechanically. It requires intimate knowledge of the system and careful selection of methodology, workload, and tools. There is no single correct answer, as two performance analysts may choose different metrics or workloads. Like art, there are techniques; one must learn how to use them and when to apply them.

Example: Comparing Two Systems
Two systems, two workloads; measure transactions per second. Which is better?

System   Workload 1   Workload 2
A        20           10
B        10           20

Example: Comparing Two Systems
Two systems, two workloads; measure transactions per second. They are equally good! ... but is A better than B?

System   Workload 1   Workload 2   Average
A        20           10           15
B        10           20           15

The Ratio Game
Take system B as the base: A is better! ... but is B better than A?

System   Workload 1   Workload 2   Average
A        2            0.5          1.25
B        1            1            1

The Ratio Game
Take system A as the base: B is better!?

System   Workload 1   Workload 2   Average
A        1            1            1
B        0.5          2            1.25
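
A minimal sketch of the ratio game above, showing that the same two measurements lead to opposite conclusions depending on which system is chosen as the base:

```python
# Sketch: averaging normalized ratios flips the verdict with the base.
systems = {"A": [20, 10], "B": [10, 20]}  # transactions/s on workloads 1 and 2

def avg_ratio(sys_name, base):
    ratios = [s / b for s, b in zip(systems[sys_name], systems[base])]
    return sum(ratios) / len(ratios)

for base in ("B", "A"):
    scores = {name: avg_ratio(name, base) for name in systems}
    print(f"base {base}: {scores}")
# base B: A scores 1.25, B scores 1.0  -> "A is better"
# base A: B scores 1.25, A scores 1.0  -> "B is better"
```

The unweighted ratio average hides the fact that both systems deliver the same total throughput; the moral is to report the raw numbers and pick a summary that matches the stated goal.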

Common Mistakes in Performance Evaluation
Most of the mistakes discussed here are not intentional. Rather, they happen due to simple oversights, misconceptions, and lack of knowledge about performance evaluation techniques.

Common Mistakes (1 of 4)
Undefined Goals: there is no such thing as a general model. Describe the goals first and then design the experiments (don't shoot and then draw the target).
Biased Goals: don't set out to show that YOUR system is better than HERS (performance analysis is like a jury).

Common Mistakes (2 of 4)
Unrepresentative Workload: the workload should be representative of how the system will be used "in the wild". Ex: if real traffic mixes large and small packets, don't test with only large or only small ones.
Wrong Evaluation Technique: use the most appropriate of measurement, simulation, and analytic modeling (don't have a hammer and see everything as a nail).

Common Mistakes (3 of 4)
Inappropriate Level of Detail: you can have too much (Ex: a detailed disk model) or too little (Ex: a simple analytic model for a congested router).
No Sensitivity Analysis: analysis is evidence, not fact. You need to determine how sensitive the results are to the chosen settings.

Common Mistakes (4 of 4)
Improper Presentation of Results: what counts is not the number of graphs, but the number of graphs that help make decisions.
Omitting Assumptions and Limitations: Ex: assuming most traffic is TCP when some links carry significant UDP traffic. This may lead to applying the results where the assumptions do not hold.

Checklist for Avoiding Common Mistakes in Performance Evaluation
Is the system correctly defined and are the goals clearly stated?
Are the goals stated in an unbiased manner?
Have all the steps of the analysis been followed systematically?
Is the problem clearly understood before analyzing it?
Are the performance metrics relevant for this problem?
Is the workload correct for this problem?
Is the evaluation technique appropriate?
Is the list of parameters that affect performance complete?
Have all parameters that affect performance been chosen as factors to be varied?
Is the experimental design efficient in terms of time and results?
Is the level of detail proper?
Is the measured data presented with analysis and interpretation?
Is the analysis statistically correct?
Has the sensitivity analysis been done?
Would errors in the input cause an insignificant change in the results?
Have the outliers in the input or the output been treated properly?
Have future changes in the system and workload been modeled?
Has the variance of the input been taken into account?
Has the variance of the results been analyzed?
Is the analysis easy to explain?
Is the presentation style suitable for its audience?
Have the results been presented graphically as much as possible?
Are the assumptions and limitations of the analysis clearly documented?

A Systematic Approach
State goals and define the system boundaries
Select performance metrics
List system and workload parameters
Select factors and their values
Select the evaluation technique
Select the workload
Design the experiments
Analyze and interpret the data
Present the results. Repeat.

State Goals and Define Boundaries
Just "measuring performance" or "seeing how it works" is too broad. Ex: the goal is to decide which ISP provides better throughput.
The definition of the system depends upon the goals.
E.g., if measuring CPU instruction speed, the system may include CPU + cache.
E.g., if measuring response time, the system may include CPU + memory + ... + OS + workload.

Select Metrics
Metrics are the criteria used to compare performance. In general, they relate to the speed, accuracy, and/or availability of system services.
Ex: network performance. Speed: throughput and delay. Accuracy: error rate. Availability: whether the data packets sent actually arrive.
Ex: processor performance. Speed: time to execute instructions.
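
As a small illustrative sketch (not from the slides), the three metric families above can be computed from a hypothetical packet log; the records and field layout are made up for the example.

```python
# Sketch: speed, accuracy, and availability metrics from a packet log.
# Each record is (send_time_s, recv_time_s or None if lost, size_bytes).
packets = [(0.00, 0.12, 1500), (0.01, 0.14, 1500),
           (0.02, None, 1500), (0.03, 0.15, 500)]

delivered = [p for p in packets if p[1] is not None]
duration = max(p[1] for p in delivered) - min(p[0] for p in packets)

throughput_bps = 8 * sum(p[2] for p in delivered) / duration          # speed
mean_delay_s = sum(p[1] - p[0] for p in delivered) / len(delivered)   # speed
loss_rate = 1 - len(delivered) / len(packets)                         # accuracy/availability

print(f"throughput={throughput_bps:.0f} bit/s  "
      f"delay={mean_delay_s * 1000:.0f} ms  loss={loss_rate:.0%}")
```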

List Parameters
List all parameters that affect performance.
System parameters (hardware and software). Ex: CPU type, OS type, ...
Workload parameters. Ex: number of users, type of requests.
The list may not be complete initially, so keep a working list and let it grow as the study progresses.

Select Factors to Study
Divide the parameters into those that are to be studied and those that are not.
Ex: may vary the CPU type but fix the OS type.
Ex: may fix the packet size but vary the number of connections.
Select appropriate levels for each factor: typical values and ones with potentially high impact. For workloads, a smaller (1/2 or 1/10th) and a larger (2x or 10x) range are often used.
Start small, or the number of experiments can quickly overwhelm the available resources!

Select Evaluation Technique
The choice depends upon time, resources, and the desired level of accuracy.
Analytic modeling: quick, less accurate.
Simulation: medium effort, medium accuracy.
Measurement: typically the most effort, most accurate.
Note: these are the typical trade-offs, but they can be reversed!

Select Workload
The workload is the set of service requests made to the system. Its form depends upon the evaluation technique:
An analytic model may use probabilities of the various request types.
A simulation may use a trace of requests from a real system.
A measurement may use scripts that impose transactions on the system.
In all cases it should be representative of real life.

Design Experiments
Want to maximize the information obtained with minimal effort.
Phase 1: many factors, few levels; see which factors matter.
Phase 2: few factors, more levels; see what the range of impact of those factors is.
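
A small sketch of this two-phase idea, with made-up factor names and levels; the point is only the bookkeeping of how many runs each phase needs.

```python
# Sketch: two-phase experiment design. Phase 1 screens many factors at
# 2 levels each; phase 2 studies the few that mattered at more levels.
from itertools import product

phase1 = {  # many factors, few levels (2 each)
    "routing": ["R1", "R2"], "load": ["low", "high"],
    "link": ["10M", "100M"], "buffer": ["small", "large"],
}
phase2 = {  # few factors, more levels
    "load": ["low", "medium", "high", "peak"],
    "buffer": ["16KB", "64KB", "256KB", "1MB"],
}

runs1 = len(list(product(*phase1.values())))   # 2**4 = 16 screening runs
runs2 = len(list(product(*phase2.values())))   # 4*4 = 16 detailed runs
print(runs1 + runs2, "runs in total, versus", 4**4,
      "for a single full design over all four factors at four levels")
```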

Analyze and Interpret Data
Compare the alternatives, taking into account the variability of the results (use statistical techniques).
Interpret the results: analysis only produces results, not conclusions. The results provide the basis on which analysts or decision makers can draw conclusions, and different analysts may come to different conclusions.

Present Results
The final step of all performance projects is to communicate the results to others. It is important that the results be presented in a manner that is easily understood, e.g. in graphic form and without statistical jargon. Disseminate (the entire methodology!)

"The job of a scientist is not merely to see: it is to see, understand, and communicate. Leave out any of these phases, and you're not doing science. If you don't see, but you do understand and communicate, you're a prophet, not a scientist. If you don't understand, but you do see and communicate, you're a reporter, not a scientist. If you don't communicate, but you do see and understand, you're a mystic, not a scientist." RQ2012