1 PERFORMANCE EVALUATION
• Often one needs to design and conduct an experiment in order to:
  – demonstrate that a new technique or concept is feasible
  – demonstrate that a new method is better than an existing method
  – understand the impact of various factors and parameters on the overall system performance

2 PERFORMANCE EVALUATION
• There is a whole field of computer science, called computer systems performance evaluation, that does exactly this
• One of the best books is Raj Jain's "The Art of Computer Systems Performance Analysis", Wiley & Sons, 1991 (listed in the bibliography)
• Much of what is outlined in this presentation is described in more detail in [Jain 1991]

3 PERF EVAL 101: THE BASICS
• There are three main methods used in the design of performance studies
• Experimental approaches
  – measurement and use of a real system
• Analytic approaches
  – the use of mathematics, queueing theory, Petri nets, abstract models, etc.
• Simulation approaches
  – design and use of computer simulations and simplified models to assess performance
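To make the contrast between the analytic and simulation approaches concrete, here is a minimal Python sketch comparing the two for a single queue. The choice of an M/M/1 queue, the arrival and service rates, and all variable names are illustrative assumptions, not taken from the slides.

    # Hypothetical example: analytic vs. simulated mean queueing delay for an
    # M/M/1 queue with assumed arrival rate lam and service rate mu.
    import random

    lam, mu = 0.8, 1.0

    # Analytic approach: closed-form M/M/1 mean waiting time in queue
    wq_analytic = (lam / mu) / (mu - lam)

    # Simulation approach: Lindley's recurrence over many simulated customers
    random.seed(1)
    wq, total, n = 0.0, 0.0, 100_000
    for _ in range(n):
        interarrival = random.expovariate(lam)
        service = random.expovariate(mu)
        wq = max(0.0, wq + service - interarrival)  # waiting time of the next customer
        total += wq

    print(f"analytic Wq = {wq_analytic:.2f}, simulated Wq = {total / n:.2f}")

Both numbers should agree closely here; the point of the contrast is that the closed form is only available for simple models, while the simulation approach extends to systems with no tractable analysis.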

4 EXPERIMENTAL DESIGN AND METHODOLOGY
• The design of a performance study requires great care in experimental design and methodology
• Need to identify:
  – the experimental factors to be tested
  – the levels (settings) for these factors
  – the performance metrics to be used
  – the experimental design to be used

5 FACTORS
• Factors are the main "components" or "things" that are varied in an experiment, because we want to understand their impact on performance
• Examples: switch size, network load, number of buffers at output ports
• Need to choose factors carefully, since the number of factors affects the size of the study

6 LEVELS
• Levels are the precise settings of the factors that are to be used in an experiment
• Example: switch size N = 2, 4, 8, 16
• Example: buffer size B = 100, 200, 400, 800
• Need to choose levels realistically
• Need to cover a reasonable portion of the design space
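As a quick illustration of how factors and levels together determine the size of a study, here is a small Python sketch. The switch-size and buffer-size levels reuse the examples above; the network-load levels and the dictionary layout are assumptions for illustration.

    # Hypothetical sketch: the number of runs in a full factorial study is the
    # product of the number of levels of each factor.
    from math import prod

    factors = {
        "switch_size":  [2, 4, 8, 16],          # levels from the slide
        "buffer_size":  [100, 200, 400, 800],   # levels from the slide
        "network_load": [0.5, 0.7, 0.9],        # assumed levels, for illustration
    }

    runs = prod(len(levels) for levels in factors.values())
    print(f"full factorial design space: {runs} experiments")  # 4 * 4 * 3 = 48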

7 PERFORMANCE METRICS
• Performance metrics specify what you want to measure in your performance study
• Examples: cell loss, cell delay
• Must choose your metrics properly and instrument your experiment accordingly
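One possible way to "instrument your experiment accordingly" is sketched below for the two example metrics; the function name, the counters, and the global-state style are all hypothetical simplifications.

    # Hypothetical instrumentation for the two example metrics.
    cells_sent, cells_lost, delays = 0, 0, []

    def record_cell(delay, dropped):
        """Call once per cell: count it, and record its delay if it was delivered."""
        global cells_sent, cells_lost
        cells_sent += 1
        if dropped:
            cells_lost += 1
        else:
            delays.append(delay)

    # After the run, the metrics fall out of the recorded data:
    # cell_loss_ratio = cells_lost / cells_sent
    # mean_cell_delay = sum(delays) / len(delays)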

8 EXPERIMENTAL DESIGN
• Experimental design refers to the organizational structure of your experiment
• Need to methodically go through the factors and levels to get the full range of experimental results desired
• There are several "classical" approaches to experimental design

9 EXAMPLES
• One factor at a time
  – vary only one factor through its levels to see what its impact is on performance
• Two factors at a time
  – vary two factors to see not only their individual effects, but also their interaction effects, if any
• Full factorial
  – try every possible combination of factors and levels to see the full range of performance results
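A hedged sketch of the first and third designs follows, reusing the hypothetical factors dictionary from the earlier sketch; the choice of baseline (the first level of each factor) is an assumption made here for illustration.

    # Hypothetical enumeration of two experimental designs over `factors`.
    from itertools import product

    baseline = {name: levels[0] for name, levels in factors.items()}

    # One factor at a time: vary each factor alone, holding the others at baseline
    one_at_a_time = [dict(baseline, **{name: level})
                     for name, levels in factors.items()
                     for level in levels]

    # Full factorial: every combination of every factor's levels
    full_factorial = [dict(zip(factors, combo))
                      for combo in product(*factors.values())]

    print(len(one_at_a_time), "one-factor-at-a-time runs vs.",
          len(full_factorial), "full-factorial runs")   # 11 vs. 48

The trade-off is visible in the run counts: one factor at a time is cheap but cannot reveal interaction effects, while full factorial captures interactions at the cost of a design space that grows multiplicatively with each added factor.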

10 OTHER ISSUES
• Simulation run length
  – choosing a run time long enough to reach equilibrium and get statistically meaningful results
• Simulation start-up effects and end effects
  – deciding how much to "chop off" at the start and end of a simulation to get proper results
• Replications
  – ensure repeatability of results, and gain greater statistical confidence in the reported results
• Presentation of results
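The run-length, start-up, and replication issues are often handled together, as in the minimal sketch below; the warm-up length, the number of replications, and the stand-in run_simulation function are all assumed values and names, not prescribed by the slides.

    # Hypothetical sketch: discard a warm-up prefix from each replication, then
    # build a confidence interval for the mean from the replication means.
    import random, statistics

    def run_simulation(seed, n_obs=10_000):
        """Stand-in for one simulation run; returns per-customer delays."""
        rng = random.Random(seed)
        w, obs = 0.0, []
        for _ in range(n_obs):
            w = max(0.0, w + rng.expovariate(1.0) - rng.expovariate(0.8))
            obs.append(w)
        return obs

    WARMUP = 1_000                                 # start-up observations to chop off
    rep_means = [statistics.fmean(run_simulation(seed=r)[WARMUP:])
                 for r in range(10)]               # 10 independent replications

    mean = statistics.fmean(rep_means)
    half_width = 1.833 * statistics.stdev(rep_means) / len(rep_means) ** 0.5  # t(9), 90%
    print(f"mean delay = {mean:.2f} +/- {half_width:.2f} (90% confidence interval)")

Because each replication uses a different seed, the replication means are independent, which is what justifies the simple t-based confidence interval across them.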

11 SUMMARY
• Great care must be taken in experimental design and methodology if the experiment is to achieve its goal, and if the results are to be fully understood
• Computer systems performance evaluation defines standard methods for designing and conducting performance studies
• Please follow these guidelines (where applicable) when doing assignments and course projects