Architecture & System Performance

Performance is NOT always critical to a project
- At least 80% of the slow code in the world is NOT worth speeding up
- But for the other 20%, performance is:
  - Hard to manage
  - Really hard to fix after the fact!
- The architect is the first person to impact performance!

If you remember only one thing
- You cannot control what you cannot measure.
- You cannot manage what you cannot quantify.

Performance in the life of an architecture
- Early (inception)
- Requirements & architecture (elaboration)
- Development & testing (construction)
- Beta testing & deployment (transition)
- Maintenance & enhancement (later releases)

Inception
- Focus: get the basic parameters
  - Size
  - Speed
  - Cost
  - System boundary

Elaboration
- Focus: validate the architecture with respect to performance, capacity, and hardware cost
- Define performance & capacity related quality attribute scenarios
- Establish engineering parameters, including:
  - Safety margins
  - Utilization limits
- Begin analytical modeling (see the sketch below)
  - Spreadsheet models work best at this stage
- Establish resource budgets
- Measure early and often:
  - Hardware characteristics
  - Performance of prototypes for major risk items
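
A spreadsheet-style analytical model at this stage can be as simple as multiplying expected scenario rates by assumed service demands and comparing the total against the utilization limit. A minimal sketch in Python; the scenario names, rates, and demands are illustrative assumptions, not numbers from the deck:

```python
# Spreadsheet-style utilization model: per-resource utilization is the sum
# over scenarios of (arrival rate x service demand per execution).
# All numbers below are illustrative assumptions.

UTILIZATION_LIMIT = 0.60  # engineering parameter: stay under 60% busy

# scenario -> (arrivals per second, CPU-seconds per execution)
scenarios = {
    "browse_catalog": (50.0, 0.004),
    "place_order":    (5.0,  0.030),
    "admin_report":   (0.1,  0.800),
}

cpu_utilization = sum(rate * demand for rate, demand in scenarios.values())

print(f"Projected CPU utilization: {cpu_utilization:.1%}")
if cpu_utilization > UTILIZATION_LIMIT:
    print("Over the limit: revisit the architecture or the hardware budget.")
else:
    print(f"Safety margin remaining: {UTILIZATION_LIMIT - cpu_utilization:.1%}")
```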

Resource budgets
- Architecture team translates system-wide numbers into targets that make sense to developers and testers working on modules or sub-systems
- Critically dependent on scenario frequencies
- Good example of an allocation view
- Resources that can be budgeted include:
  - CPU time
  - Elapsed time
  - Disk accesses
  - Network traffic
  - Memory utilization

More on resource budgets
- Start at a very high level, for example:
  - Communication gets 8% of CPU
  - Servlets get 10%
  - Session beans get 15%
  - Entity beans get 20%
  - Logging gets 5%
  - Monitoring gets 2%
- Respect your engineering parameters
  - e.g. engineering for 60% CPU utilization (checked in the sketch below)
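
Those example allocations sum to exactly the 60% engineering target, which is the point: the budget partitions the allowed utilization, not the whole machine. A small sanity check of that invariant, using the percentages from the slide:

```python
# High-level CPU budget from the slide (percent of total CPU).
budget = {
    "communication": 8,
    "servlets": 10,
    "session_beans": 15,
    "entity_beans": 20,
    "logging": 5,
    "monitoring": 2,
}

CPU_UTILIZATION_TARGET = 60  # engineering parameter from the slide

allocated = sum(budget.values())
assert allocated <= CPU_UTILIZATION_TARGET, (
    f"budget over-allocated: {allocated}% > {CPU_UTILIZATION_TARGET}%"
)
print(f"Allocated {allocated}% of CPU; "
      f"{CPU_UTILIZATION_TARGET - allocated}% headroom within the target.")
```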

Resource budgets - 3
- Refine the budget as you learn more:
  - About expected resource consumption, by subsystem and by scenario
  - About scenario frequencies
  - About platform capacity:
    - Hardware
    - Database
    - Middleware
- The goal: answer the developers' and testers' question, "how fast is fast enough?" (a worked example follows)
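
Combining a budget share with a scenario frequency turns that question into a concrete per-execution number. A hypothetical worked example, assuming a 4-core machine, the 10% servlet share from the previous slide, and a made-up peak request rate:

```python
# "How fast is fast enough?" Turn a CPU share into a per-request target.
# Core count and request rate are illustrative assumptions.

cores = 4                   # assumed hardware
servlet_share = 0.10        # servlets' slice of total CPU (previous slide)
peak_requests_per_sec = 80  # assumed scenario frequency at peak

# CPU-seconds per second available to servlet code across all cores:
servlet_cpu_per_sec = cores * servlet_share

# Budgeted CPU demand per request, in milliseconds:
per_request_ms = servlet_cpu_per_sec / peak_requests_per_sec * 1000
print(f"Servlet code must average <= {per_request_ms:.1f} ms CPU per request.")
```

Here the budget works out to 5 ms of CPU per request, the kind of target a developer or tester can check directly.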

Construction
- Focus: monitor as-built performance & capacity vs. as-designed
- Measure, measure, and measure some more
- Replace assumed parameters with measurements as they become available (sketched below)
- Refine models
  - As the system matures, queuing models improve in accuracy and usefulness
- Adjust budgets as needed
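
Replacing assumed parameters with measurements can be as mechanical as overlaying the measured values on the design assumptions and re-running the model. A sketch with hypothetical numbers:

```python
# Design-time assumptions vs. as-built measurements (CPU-seconds per execution).
# All values are hypothetical.
assumed  = {"place_order": 0.030, "browse_catalog": 0.004, "admin_report": 0.800}
measured = {"place_order": 0.047, "browse_catalog": 0.003}  # measured so far

# The model always prefers a measurement when one exists.
current = {**assumed, **measured}

for scenario, demand in current.items():
    if scenario in measured:
        drift = (measured[scenario] / assumed[scenario] - 1) * 100
        note = f"measured, {drift:+.0f}% vs. design"
    else:
        note = "still assumed"
    print(f"{scenario}: {demand * 1000:.1f} ms ({note})")
```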

Transition
- Focus: improvement of models
- Keep on measuring
- Stress test
- Identify and deal with problems

Maintenance and Enhancement
- Focus: predict the impact of potential changes
- Spreadsheet models for forecasting effects on throughput
- Queuing models for forecasting effects on response time (see the sketch below)
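
The simplest queuing model that shows why response time degrades nonlinearly with load is the single-queue M/M/1 approximation, where response time is R = S / (1 - U) for service time S and utilization U. A forecasting sketch with hypothetical numbers:

```python
# M/M/1 approximation: response time R = S / (1 - U),
# where S is service time and U is utilization (0 <= U < 1).
# The 50 ms service time is a hypothetical figure.

def response_time(service_time_s: float, utilization: float) -> float:
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service_time_s / (1.0 - utilization)

S = 0.050  # 50 ms of service demand per request

for u in (0.30, 0.60, 0.80, 0.90, 0.95):
    print(f"U = {u:.0%}: R = {response_time(S, u) * 1000:6.1f} ms")
```

Going from 30% to 60% utilization costs well under a factor of two in response time, while going from 90% to 95% doubles it again; this nonlinearity is one reason the earlier slides engineer for 60% utilization rather than 90%.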

Measuring Performance
- State your goals
  - Clear (what parameter or attribute are you studying?)
  - Quantifiable
- Define the system
  - The boundary is VERY important here
- Identify scenario(s) to be measured
  - Also known as "outcomes"
- Identify other workload parameters
- Define the metrics
  - They should make sense, given your goals
  - For example, don't report response time if you're studying CPU utilization
- Develop the test harness(es) and driver(s) (a minimal driver sketch follows)
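
A test driver can start very small: run the scenario repeatedly, record elapsed time per execution, and report only the metrics the goals call for. A minimal sketch, where run_scenario() is a hypothetical stand-in for the scenario under test:

```python
import statistics
import time

def run_scenario():
    """Hypothetical stand-in for the scenario under test."""
    time.sleep(0.01)

def drive(iterations: int = 100) -> None:
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_scenario()
        samples.append(time.perf_counter() - start)
    samples.sort()
    p95 = samples[int(len(samples) * 0.95)]
    print(f"n={len(samples)}  "
          f"median={statistics.median(samples) * 1000:.1f} ms  "
          f"p95={p95 * 1000:.1f} ms")

drive()
```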

Services and outcomes
- Services are the activities of the system
- Outcomes are the results of a service execution
  - Positive
  - Negative
- Some outcomes are more expensive than others (see the sketch below)
- Normally, services are use-case related
  - Major administrative services should be modeled, even if not captured in use cases
- Often, a use case requires the execution of multiple services
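
Because outcomes of the same service can differ widely in cost, a workload model needs a cost per (service, outcome) pair, not just per service. A small sketch of that bookkeeping; the services, outcome mix, and costs are all made up:

```python
# Cost per (service, outcome) pair; outcomes of one service can differ widely.
# All services, costs, and mix fractions are made up for illustration.
costs_ms = {
    ("login", "success"): 12,
    ("login", "bad_password"): 30,  # negative outcome: extra hashing, audit log
    ("search", "hit"): 40,
    ("search", "miss"): 95,         # negative outcome: exhaustive scan first
}

# Expected mix of outcomes in the forecast workload (fractions sum to 1):
mix = {
    ("login", "success"): 0.45,
    ("login", "bad_password"): 0.05,
    ("search", "hit"): 0.40,
    ("search", "miss"): 0.10,
}

expected = sum(costs_ms[k] * share for k, share in mix.items())
print(f"Expected cost per request in this mix: {expected:.1f} ms")
```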

Tools and techniques
- Measurement, including:
  - UNIX: ps, vmstat, sar, prof, glance, truss, Purify
  - Windows: perfmon, sysinternals
  - Network: netstat, ping, tracert
  - Database: DBA tools, explain plan
  - Programming-language-specific profilers (example below)
- Evaluation
  - Analytical modeling:
    - Spreadsheet models
    - Queuing models
  - Simulation
  - Real system measurement
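
As one example of a language-specific profiler, Python's built-in cProfile shows where CPU time goes at function granularity. A minimal sketch profiling a hypothetical hot function:

```python
import cProfile
import pstats

def hot_function():
    """Hypothetical workload worth profiling."""
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
profiler.enable()
for _ in range(50):
    hot_function()
profiler.disable()

# Report the 10 most expensive entries by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```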

One more thing to remember …
- Calibrate your tools
  - Simulated users don't always match real users
  - Test harnesses and drivers are software, which means they will have bugs
- Cross-check your measurements (see the sketch below)
  - If the system is 80% utilized but your per-process measurements add up to 43%, find out what you're missing (transient processes?)
- In short, be paranoid
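
The 80%-versus-43% cross-check can be automated. A sketch using the third-party psutil package (an assumption; any per-process accounting would do), comparing system-wide CPU utilization against the sum of per-process readings over the same interval:

```python
import psutil  # third-party: pip install psutil

# Prime per-process CPU counters; the first cpu_percent() call returns 0.0.
procs = list(psutil.process_iter())
for p in procs:
    try:
        p.cpu_percent()
    except psutil.Error:
        pass

# Blocks for the interval and returns system-wide utilization.
system_pct = psutil.cpu_percent(interval=5.0)

per_process_pct = 0.0
for p in procs:
    try:
        # Process.cpu_percent() is per-core, so normalize by core count.
        per_process_pct += p.cpu_percent() / psutil.cpu_count()
    except psutil.Error:
        pass  # process exited mid-interval: exactly the transient case

print(f"System-wide: {system_pct:.1f}%  Sum of processes: {per_process_pct:.1f}%")
if per_process_pct < 0.8 * system_pct:
    print("Large gap: look for transient processes your snapshot missed.")
```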