
Network Performance and Quality of Service


1 Network Performance and Quality of Service
2. Performance Evaluation

2 Performance Evaluation
Performance is a key criterion in the design, procurement, and use of systems. The goal of engineers, scientists, analysts, and users is to get the highest performance for a given cost. To achieve this goal, one needs at least a basic knowledge of performance evaluation techniques.

3 Basic Terms
System: any collection of hardware, software, and firmware
Metrics: the criteria used to evaluate the performance of the system components
Workloads: the requests made by the users of the system

4 Considerations (1 of 6) Select appropriate evaluation techniques, performance metrics, and workloads for a system.
Techniques: measurement, simulation, analytic modeling
Metrics: criteria used to study performance (e.g., response time)
Workloads: requests made by users/applications to the system
Example: What performance metrics should you use to compare the performance of …
Two disk drives?
Two transaction processing systems?
Two packet retransmission algorithms?

5 Considerations (2 of 6) Conduct performance measurements correctly.
Two tools are needed:
Load generator – a tool to load the system
Monitor – a tool to measure the results
Example: Which type of monitor would be suitable for measuring the following quantities?
Response time from a Web server
Utilization on a LAN
Audio quality in a VoIP network
(A minimal load-generation sketch follows.)
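To make the two tools concrete, here is a minimal sketch that plays both roles for the first example: a loop of HTTP requests acts as the load generator, and timing each request acts as the monitor. The target URL and request count are placeholders for illustration, not part of the original slides.

```python
import time
import urllib.request

URL = "http://example.com/"   # placeholder target, not a real test server
N_REQUESTS = 20               # assumed small load, for illustration only

def measure_response_times(url, n):
    """Load generator + monitor: issue n requests and record each response time."""
    times = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()                            # consume the full response
        times.append(time.perf_counter() - start)
    return times

samples = measure_response_times(URL, N_REQUESTS)
print(f"mean response time: {sum(samples) / len(samples):.4f} s")
```

A real load generator would issue requests concurrently and control the arrival rate; this serial loop only illustrates the division of roles.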

6 Considerations (3 of 6) Use proper statistical techniques to compare several alternatives.
One run of a workload is often not sufficient: many non-deterministic events affect performance
Comparing the averages of several runs may also not lead to correct results, especially if the variance is high
Example: Packets lost on a link, tabulated by file size for Link A and Link B (columns: File Size, Link A, Link B). Which link is better? (A paired-comparison sketch follows.)
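A hedged sketch of the kind of statistical technique meant here: a paired confidence interval on the per-run difference in lost packets between the two links. The loss counts below are invented for illustration; the slide's original table data is not reproduced here.

```python
import math
import statistics

# Hypothetical packet-loss counts for paired runs on each link
link_a = [5, 7, 3, 10, 1, 2]
link_b = [4, 8, 2, 12, 0, 3]

diffs = [a - b for a, b in zip(link_a, link_b)]
mean_d = statistics.mean(diffs)
# 95% confidence interval; 2.571 is the t value for 5 degrees of freedom
half_width = 2.571 * statistics.stdev(diffs) / math.sqrt(len(diffs))
print(f"mean difference: {mean_d:.2f}")
print(f"95% CI: ({mean_d - half_width:.2f}, {mean_d + half_width:.2f})")
# If the interval contains 0, neither link can be called better at this level.
```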

7 Considerations (4 of 6) Design measurement and simulation experiments to provide the most information with the least effort.
Often many factors affect performance; separate out the effects that individually matter.
Example: The performance of a network system depends upon three factors:
A) type of routing technique: R1, R2, or none
B) type of workload (traffic): http, VoIP, Gaming
C) type of equipment: hubs, switches
How many experiments are needed? How can the effect of each factor be estimated? (A full-factorial count is sketched below.)
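For the first question, the naive answer is a full-factorial design: one experiment per combination of factor levels, 3 × 3 × 2 = 18 in this example. A quick enumeration, written for this note:

```python
import itertools

routing   = ["R1", "R2", "none"]        # factor A
workload  = ["http", "VoIP", "Gaming"]  # factor B
equipment = ["hub", "switch"]           # factor C

experiments = list(itertools.product(routing, workload, equipment))
print(f"full factorial: {len(experiments)} experiments")  # 3 * 3 * 2 = 18
for combo in experiments[:3]:           # show the first few combinations
    print(combo)
```

Fractional-factorial designs reduce this count when interactions can be assumed small; each factor's effect is then estimated by averaging over the runs in which its level varies.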

8 Considerations (5 of 6) Perform simulations correctly.
Select the correct language/software, seeds for random numbers, length of the simulation run, and analysis
Before all of that, the simulator may need to be validated
Example: To compare the performance of two routing algorithms:
A) how long should the simulation be run?
B) what can be done to get the same accuracy with a shorter run? (One common trick is sketched below.)
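One standard answer to (B) is common random numbers: drive both alternatives with the same seeds, so the random noise cancels when the two are compared and shorter runs give the same accuracy. The toy "routing algorithms" below are stand-ins invented for this sketch, not models from the slides.

```python
import random

def simulated_mean_delay(seed, overhead):
    """Toy stand-in for a routing simulation: mean packet delay over one run."""
    rng = random.Random(seed)          # explicit seed makes the run reproducible
    delays = [rng.expovariate(1.0) + overhead for _ in range(1000)]
    return sum(delays) / len(delays)

# Common random numbers: use the SAME seed for both alternatives per replication
for seed in range(3):
    d1 = simulated_mean_delay(seed, overhead=0.10)  # hypothetical algorithm 1
    d2 = simulated_mean_delay(seed, overhead=0.12)  # hypothetical algorithm 2
    # The shared random draws cancel, so the difference is almost noise-free
    print(f"seed {seed}: estimated difference = {d2 - d1:.4f}")
```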

9 Considerations (6 of 6) Use simple queuing models to analyze the performance of systems.
Queuing models support analytical modeling in terms of the service rate and the arrival rate of the load
Multiple servers
Multiple queues
Example: For a given Web request rate, is it more effective to have 2 single-processor Web servers or 4 single-processor Web servers? (A queuing sketch follows.)
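A hedged sketch of the server-count question using an M/M/c queuing model; the slide does not name a model, so M/M/c and the rates below are assumptions. To keep total capacity equal, it compares 2 servers of rate 40 req/s against 4 servers of rate 20 req/s at a load of 60 req/s.

```python
from math import factorial

def mmc_response_time(lam, mu, c):
    """Mean response time of an M/M/c queue, via the Erlang C formula."""
    a = lam / mu                                   # offered load in Erlangs
    assert a < c, "stability requires lambda < c * mu"
    tail = (a**c / factorial(c)) * (c / (c - a))
    p_wait = tail / (sum(a**k / factorial(k) for k in range(c)) + tail)
    return p_wait / (c * mu - lam) + 1 / mu        # mean wait + mean service

lam = 60.0  # assumed total request rate (requests/second)
print("2 servers @ 40 req/s:", round(mmc_response_time(lam, 40.0, 2), 4))
print("4 servers @ 20 req/s:", round(mmc_response_time(lam, 20.0, 4), 4))
```

With these numbers the two fast servers win (about 0.057 s versus 0.076 s), because each job is served faster once it reaches a server; the answer can change at other loads, which is exactly what the model lets you explore.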

10 The Art of Performance Evaluation
When first presented to an analyst, most performance problems are expressed as an abstract feeling by the end user. Defining the real problem, and converting it into a form in which established tools and techniques can be used while time and other constraints are met, is a major part of the analyst's “art”.

11 The Art of Performance Evaluation
Evaluation cannot be produced mechanically
Requires intimate knowledge of the system
Careful selection of methodology, workload, and tools
There is no one correct answer, as two performance analysts may choose different metrics or workloads
Like art, there are techniques; one must learn how to use them and when to apply them

12 Example: Comparing Two Systems
Two systems, two workloads; measure transactions per second. Which is better?

System    Workload 1    Workload 2
A         20            10
B         10            20

13 Example: Comparing Two Systems
Two systems, two workloads; measure transactions per second.

System    Workload 1    Workload 2    Average
A         20            10            15
B         10            20            15

They are equally good! … but is A better than B?

14 The Ratio Game Take system B as the base:

System    Workload 1    Workload 2    Average
A         2             0.5           1.25
B         1             1             1

A is better! … but is B better than A?

15 The Ratio Game Take system A as the base:

System    Workload 1    Workload 2    Average
A         1             1             1
B         0.5           2             1.25

B is better!? (The sketch below reproduces this flip.)
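The ratio game is easy to reproduce: normalize each workload's throughput by a chosen base system and average the ratios; whichever system is picked as the base comes out worse. A small Python check, using the numbers from the tables above:

```python
# Transactions/sec per workload, from the comparison tables above
throughput = {"A": [20, 10], "B": [10, 20]}

def average_ratio(system, base):
    """Average of the per-workload ratios of `system` relative to `base`."""
    ratios = [s / b for s, b in zip(throughput[system], throughput[base])]
    return sum(ratios) / len(ratios)

print("with B as base, A scores", average_ratio("A", "B"))  # 1.25 -> "A is better"
print("with A as base, B scores", average_ratio("B", "A"))  # 1.25 -> "B is better"
```

The moral: averaging ratios is base-dependent. Comparing raw measurements, or using a base-independent summary such as the geometric mean of the ratios, avoids this trap.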

16 Common Mistakes in Performance Evaluation
Most of the mistakes discussed here are not intentional. Rather, they happen due to simple oversights, misconceptions, and lack of knowledge about performance evaluation techniques.

17 Common Mistakes (1 of 4)
Undefined Goals
There is no such thing as a general-purpose model
Describe the goals first, then design the experiments (don't shoot and then draw the target)
Biased Goals
Don't set out to show that YOUR system is better than HERS (performance analysis should be like a jury)

18 Common Mistakes (2 of 4)
Unrepresentative Workload
Should be representative of how the system will work “in the wild”
Ex: large and small packets? Don't test with only large or only small ones
Wrong Evaluation Technique
Use the most appropriate of measurement, simulation, and analytical modeling (don't have a hammer and see everything as a nail)

19 Common Mistakes (3 of 4)
Inappropriate Level of Detail
Can have too much! Ex: modeling the disk
Can have too little! Ex: an analytic model for a congested router
No Sensitivity Analysis
Analysis provides evidence, not fact
Need to determine how sensitive the results are to the settings

20 Common Mistakes (4 of 4)
Improper Presentation of Results
What matters is not the number of graphs, but the number of graphs that help make decisions
Omitting Assumptions and Limitations
Ex: assuming most traffic is TCP, whereas some links may carry significant UDP traffic
May lead to applying results where the assumptions do not hold

21 Checklist for Avoiding Common Mistakes in Performance Evaluation
Is the system correctly defined and are the goals clearly stated?
Are the goals stated in an unbiased manner?
Have all the steps of the analysis been followed systematically?
Is the problem clearly understood before analyzing it?
Are the performance metrics relevant for this problem?
Is the workload correct for this problem?
Is the evaluation technique appropriate?
Is the list of parameters that affect performance complete?
Have all parameters that affect performance been chosen as factors to be varied?
Is the experimental design efficient in terms of time and results?
Is the level of detail proper?
Is the measured data presented with analysis and interpretation?
Is the analysis statistically correct?
Has the sensitivity analysis been done?
Would errors in the input cause an insignificant change in the results?
Have the outliers in the input or the output been treated properly?
Have the future changes in the system and workload been modeled?
Has the variance of the input been taken into account?
Has the variance of the results been analyzed?
Is the analysis easy to explain?
Is the presentation style suitable for its audience?
Have the results been presented graphically as much as possible?
Are the assumptions and limitations of the analysis clearly documented?

22 A Systematic Approach
State goals and define boundaries
Select performance metrics
List system and workload parameters
Select factors and values
Select evaluation techniques
Select workload
Design experiments
Analyze and interpret the data
Present the results. Repeat.

23 State Goals and Define Boundaries
Just “measuring performance” or “seeing how it works” is too broad
Ex: the goal is to decide which ISP provides better throughput
The definition of the system depends upon the goals
E.g.: if measuring CPU instruction speed, the system may include CPU + cache
E.g.: if measuring response time, the system may include CPU + memory + … + OS + workload

24 Select Metrics
Criteria to compare performance
In general, related to the speed, accuracy, and/or availability of system services
Ex: network performance
Speed: throughput and delay
Accuracy: error rate
Availability: whether data packets sent actually arrive
Ex: processor performance
Speed: time to execute instructions

25 List Parameters
List all parameters that affect performance
System parameters (hardware and software)
Ex: CPU type, OS type, …
Workload parameters
Ex: number of users, type of requests
The list may not be complete initially, so keep a working list and let it grow as the study progresses

26 Select Factors to Study
Divide parameters into those that are to be studied and those that are not
Ex: may vary CPU type but fix OS type
Ex: may fix packet size but vary the number of connections
Select appropriate levels for each factor
Want typical values plus ones with potentially high impact
For workloads, often a smaller (1/2 or 1/10th) and a larger (2x or 10x) range
Start small, or the number of experiments can quickly overwhelm the available resources!

27 Select Evaluation Technique
Depends upon time, resources, and the desired level of accuracy
Analytic modeling: quick, less accurate
Simulation: medium effort, medium accuracy
Measurement: typically the most effort, most accurate
Note: the above are typical trade-offs, but they can be reversed!

28 Select Workload
The set of service requests to the system, which depends upon the evaluation technique:
An analytic model may use probabilities of the various requests
A simulation may use a trace of requests from a real system
A measurement may use scripts that impose transactions
Should be representative of real life

29 Design Experiments
Want to maximize results with minimal effort
Phase 1: many factors, few levels, to see which factors matter
Phase 2: few factors, more levels, to see where the range of impact of those factors lies

30 Analyze and Interpret Data
Compare alternatives, taking into account the variability of the results (statistical techniques)
Interpret the results: analysis only produces results, not conclusions
Results provide the basis on which analysts or decision makers can draw conclusions
Different analysts may come to different conclusions

31 Present Results The final step of every performance project is to communicate the results to others. It is important that the results be presented in a manner that is easily understood, e.g., in graphic form and without statistical jargon. Disseminate the entire methodology, not just the results!

32 "The job of a scientist is not merely to see: it is to see, understand, and communicate. Leave out any of these phases, and you're not doing science. If you don't see, but you do understand and communicate, you're a prophet, not a scientist. If you don't understand, but you do see and communicate, you're a reporter, not a scientist. If you don't communicate, but you do see and understand, you're a mystic, not a scientist." RQ2012

