CPSC 531: System Modeling and Simulation

CPSC 531: System Modeling and Simulation
Carey Williamson
Department of Computer Science
University of Calgary
Fall 2017

Performance Evaluation

Often in Computer Science you need to:
- demonstrate that a new concept, technique, or algorithm is feasible
- demonstrate that a new method is better than an existing method
- understand the impact of various factors and parameters on the performance, scalability, or robustness of a system

Performance Evaluation Methodology

Performance evaluation can be done using experimental, simulation, or analytical approaches, or a combination thereof. The design of a performance study requires careful thought about the experimental design and methodology. In particular, you need to identify:
- the experimental factors to be tested
- the levels (settings) for these factors
- the performance metrics to be used
- the experimental design to be used

Factors

Factors are the main “components” that are varied in an experiment, in a controlled way, in order to understand their impact on system performance.
Examples: request rate, file size, read/write ratio, number of concurrent clients.
Factors must be chosen carefully, since the number of factors affects the size of the performance study.

Levels

Levels are the precise settings of the factors to be used in an experiment.
Examples:
- file size S = 1 KB, 10 KB, 1 MB
- number of clients C = 10, 20, 30, 40, 50
- protocol: http (unencrypted), https (encrypted)
- sorting algorithm: selection sort, merge sort, quicksort
Levels must be chosen realistically, and should cover the useful portion of the design space (see the sketch below).
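To make this concrete, a study's factors and levels can be written down as a simple table in code. The minimal Python sketch below uses hypothetical factor names and settings loosely based on the examples above; it is an illustration, not a prescribed design.

```python
# A minimal sketch of factors and levels for a hypothetical web server study.
# Factor names and settings are illustrative assumptions, not a real design.
factors = {
    "file_size_kb": [1, 10, 1024],         # file size S = 1 KB, 10 KB, 1 MB
    "num_clients":  [10, 20, 30, 40, 50],  # concurrent clients C
    "protocol":     ["http", "https"],     # unencrypted vs. encrypted
}

# The design space grows multiplicatively with each added factor,
# which is why factor selection affects the size of the study.
num_combinations = 1
for levels in factors.values():
    num_combinations *= len(levels)
print(f"Full design space: {num_combinations} combinations")  # 3 * 5 * 2 = 30
```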

Performance Metrics

Performance metrics specify what you want to measure in your performance study.
Examples: response time, throughput, packet loss.
Choose your metrics carefully, and instrument your experiment accordingly.
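As a small illustration of these metrics, the Python sketch below computes mean response time, throughput, and packet loss from a handful of sample measurements; all of the input values are made up purely for demonstration.

```python
# A minimal sketch of computing common metrics from per-request measurements.
# All sample values below are made up for illustration only.
response_times = [0.12, 0.08, 0.15, 0.11, 0.09]  # seconds per request
experiment_duration = 10.0                        # total run time, seconds
packets_sent, packets_received = 1000, 992

mean_response_time = sum(response_times) / len(response_times)
throughput = len(response_times) / experiment_duration   # requests per second
packet_loss = (packets_sent - packets_received) / packets_sent

print(f"Mean response time: {mean_response_time:.3f} s")
print(f"Throughput:         {throughput:.2f} req/s")
print(f"Packet loss:        {packet_loss:.1%}")
```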

Experimental Design

Experimental design refers to the organizational structure of your performance study. You need to work methodically through the factors and levels to obtain the full range of experimental results desired. There are several “classical” approaches to experimental design.

Examples of Experimental Design

One factor at a time:
- vary only one factor through its levels to see its impact on performance
- all other factors are kept fixed at their default settings

Two factors at a time:
- vary two factors to see not only their individual effects, but also their interaction effects, if any

Full factorial:
- try every possible combination of factors and levels to see the full range of performance results

(A code sketch contrasting these designs follows the list.)
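The difference between these designs is easy to see in code. This minimal Python sketch reuses the hypothetical factors dict from earlier and assumes a set of default settings (also hypothetical) to enumerate the runs each design would require.

```python
# A minimal sketch contrasting one-factor-at-a-time with full factorial design.
# The factors and default settings are hypothetical assumptions.
from itertools import product

factors = {
    "file_size_kb": [1, 10, 1024],
    "num_clients":  [10, 20, 30, 40, 50],
    "protocol":     ["http", "https"],
}
defaults = {"file_size_kb": 10, "num_clients": 10, "protocol": "http"}

def one_factor_at_a_time(factors, defaults):
    """Vary one factor through its levels; hold all others at their defaults."""
    for name, levels in factors.items():
        for level in levels:
            yield {**defaults, name: level}

def full_factorial(factors):
    """Try every possible combination of factors and levels."""
    names = list(factors)
    for combo in product(*factors.values()):
        yield dict(zip(names, combo))

print(len(list(one_factor_at_a_time(factors, defaults))))  # 3 + 5 + 2 = 10 runs
print(len(list(full_factorial(factors))))                  # 3 * 5 * 2 = 30 runs
```

Note that the one-factor-at-a-time enumeration revisits the all-defaults baseline once per factor; a real experiment harness would deduplicate those repeated runs. The gap between 10 and 30 runs also shows why full factorial designs quickly become expensive as factors and levels are added.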

Summary

Computer systems performance evaluation provides well-known methods for designing and conducting performance studies. Great care must be taken in experimental design and methodology if the experiment is to achieve its goals, and if the results are to be fully understood.