Topics to be discussed: Introduction, Performance Factors, Methodology, Test Process, Tools, Conclusion. Presented by Abu Bakr Siddiq.


Why Performance Test? What is Performance Testing? How to Performance Test? When to Performance Test? Performance Vocabulary

Why performance test the application? To improve the speed, scalability, and stability of the system, and to build confidence that it performs as desired under different loads. Examples: US shopping sites in December; Indian Railways reservations.
What is performance testing? Testing the performance factors of the system against an acceptable or recommended configuration.
How to performance test the application? Using a tool, following a performance methodology and a performance process.
When to performance test the application? When performance requirements are available, the build is stable, resources are available, a performance plan and methodology are defined, and entry and exit criteria are set.

The different types of performance testing:
Capacity Testing: determining the server's failure point.
Component Testing: testing the architectural components of the application, such as servers, databases, networks, firewalls, and storage devices.
Endurance Testing: testing performance characteristics over an extended period, with the workload models anticipated in production.
Load Testing: subjecting the system to a statistically representative load.
Smoke Testing: an initial run to check system performance under normal load.
Spike Testing: testing performance characteristics with workloads that repeatedly surge beyond anticipated production levels.
Stress Testing: evaluating the application beyond peak load conditions.
Validation Testing: testing against the expectations that have been set for the system.
Volume Testing: subjecting the system to varying amounts of data and testing its performance.
[Figure: throughput vs. user load, showing the saturation point beyond which load testing becomes stress testing]

Response Time, Latency, Tuning, Benchmarking, Capacity Planning

Request path: Client → Internet → Web Server → DB Server. Each request crosses network hops N1, N2, N3, N4 and is processed by application components A1, A2, A3.
Network Latency = N1 + N2 + N3 + N4
Product Latency = A1 + A2 + A3
Actual Response Time = Network Latency + Product Latency
Latency = Actual Response Time + O1
Throughput: the number of requests/business transactions processed by the system in a specified time duration.
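The latency arithmetic above can be sketched in a few lines. The hop and stage timings below are illustrative assumptions, not measurements:

```python
# Sketch of the slide's latency arithmetic. All millisecond values here are
# made-up assumptions for illustration, not real measurements.

def response_time(network_hops, app_times):
    """Actual Response Time = Network Latency + Product Latency."""
    network_latency = sum(network_hops)   # N1 + N2 + N3 + N4
    product_latency = sum(app_times)      # A1 + A2 + A3
    return network_latency + product_latency

# Hypothetical timings for one request: four network hops, three app stages.
hops = [12.0, 8.0, 8.0, 12.0]    # N1..N4 in ms
stages = [25.0, 40.0, 15.0]      # A1..A3 in ms

rt = response_time(hops, stages)
print(rt)  # 120.0 ms total: 40 ms on the network, 80 ms in the product
```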

Tuning: the procedure by which system performance is enhanced by setting different values for the parameters (variables) of the product, the OS, and other components. Ex: tuning search parameters.
Benchmarking: comparing the throughput and response time of the system with those of competing products. Ex: comparing OpenOffice with MS Office.
Capacity Planning: the exercise of finding out what resources and configurations are needed. Ex: recommending the ideal software, hardware, and other components for the customer's system.

Collecting Requirements, Writing Test Cases, Automating Test Cases, Executing Test Cases, Analyzing Results, Tuning, Benchmarking, Capacity Planning.

Performance requirement types: generic and specific.
Performance requirement characteristics: testable, clear, and specific.
Sources for gathering requirements:
Performance compared to the previous release of the same product
Performance compared to competitive product(s)
Performance compared to absolute numbers derived from actual need
Performance numbers derived from architecture and design
Graceful performance degradation

A performance test case should include the following details:
List of operations or business transactions to be tested
Steps for executing those operations/transactions
List of product and OS parameters that impact performance testing, and their values
Loading pattern
Resources and their configuration (network, hardware, and software configurations)
The expected results
The product version or competitive products to be compared with, and related information such as their corresponding fields
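The checklist above can be captured as a structured record so every test case carries the details needed to reproduce it. The field names below are illustrative assumptions, not a standard schema:

```python
# A sketch of the test-case checklist as a structured record. Field names are
# illustrative assumptions, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class PerformanceTestCase:
    operations: list          # business transactions to be tested
    steps: list               # how to execute those operations
    parameters: dict          # product/OS parameters that impact performance
    loading_pattern: str      # e.g. "ramp-up", "steady", "spike"
    resources: dict           # network/hardware/software configuration
    expected_results: dict    # e.g. {"p95_ms": 200, "throughput_rps": 50}
    compare_against: list = field(default_factory=list)  # versions/competitors

tc = PerformanceTestCase(
    operations=["login", "search"],
    steps=["open home page", "log in", "run search"],
    parameters={"db_pool_size": 20},
    loading_pattern="ramp-up to 100 users",
    resources={"app_server": "2 CPU / 4 GB"},
    expected_results={"p95_ms": 200},
)
print(tc.loading_pattern)
```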

Performance testing naturally lends itself to automation, for several reasons:
Performance testing is repetitive.
Performance test cases cannot be effective without automation.
Performance results need to be accurate; calculating them manually may introduce inaccuracy.
Performance testing involves too many permutations and combinations to manage manually.
Extensive analysis of performance results and failures is required, which is very difficult to do manually.
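A minimal automation sketch: several concurrent "virtual users" repeatedly invoke an operation while each response time is recorded. `do_request` is a stand-in assumption here (it only sleeps); a real run would call the system under test:

```python
# Minimal load-test automation sketch: N concurrent virtual users time each
# call to the operation under test. do_request is a stand-in (it sleeps);
# replace it with a real call to the system under test.
import time
import threading

def do_request():
    time.sleep(0.01)  # simulated work; replace with a real request

def virtual_user(iterations, timings):
    for _ in range(iterations):
        start = time.perf_counter()
        do_request()
        timings.append(time.perf_counter() - start)  # list.append is atomic in CPython

def run_load_test(users=5, iterations=10):
    timings = []
    threads = [threading.Thread(target=virtual_user, args=(iterations, timings))
               for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return timings

timings = run_load_test()
print(f"{len(timings)} samples, avg {sum(timings) / len(timings) * 1000:.1f} ms")
```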

The following aspects need to be recorded while executing performance tests:
Start and end time of execution
Log and trace/audit files of the product and OS (for future debugging and repeatability)
Utilization of resources (CPU, memory, disk, network, and so on) on a periodic basis
Configuration of all environmental factors (hardware, software, and other components)
The performance factors listed in the test case documentation, at regular intervals

Performance test results are analyzed to conclude:
Whether the performance of the product is consistent when tests are executed multiple times
What performance can be expected for what type of configuration
Which parameters have an impact, and how they can be set for better performance
The effect of scenarios involving a mix of performance factors
The effect of product technologies such as caching
What performance numbers are acceptable at which loads
The optimum throughput and response time of the product for a given set of factors
Which performance requirements are met, and how the performance compares with the previous version or with expectations
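A small sketch of the analysis step: summarizing collected response times as an average, a 95th percentile, and throughput. The sample numbers and test duration are illustrative assumptions:

```python
# Sketch of analyzing collected response times: average, nearest-rank
# percentile, and throughput. Sample values and duration are assumptions.
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical response times (ms) from one test run.
samples = [80, 85, 90, 95, 100, 110, 120, 150, 300, 90]

avg = sum(samples) / len(samples)
p95 = percentile(samples, 95)
duration_s = 10.0                        # assumed test duration
throughput = len(samples) / duration_s   # requests per second

print(f"avg={avg:.0f} ms, p95={p95} ms, throughput={throughput:.1f} req/s")
```

Reporting a high percentile alongside the average matters: here one 300 ms outlier barely moves the mean but dominates the p95, which is what users at the tail experience.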

Two ways to get optimum mileage: tuning the product parameters, and tuning the operating-system parameters.
Product parameters:
Repeat the performance tests for different values of each parameter that has an impact
When one parameter's value is changed, it may require changes to other parameters
Repeat the tests with default values for all parameters ("factory settings" tests)
Repeat the performance tests for low and high values of each parameter, and for combinations
Operating-system parameters:
File-system parameters
Disk-management parameters
Memory-management parameters
Processor-management parameters
Network-management parameters
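Repeating the tests over combinations of parameter values can be sketched as a simple sweep. The parameter names, values, and the scoring stand-in below are all assumptions for illustration; a real run would reconfigure the product/OS and execute an actual performance test per combination:

```python
# Sketch of a tuning sweep: run the performance test for every combination of
# parameter values and keep the best. Parameter names/values and the scoring
# function are made-up stand-ins for a real reconfigure-and-test cycle.
from itertools import product

params = {
    "cache_size_mb": [64, 256],
    "worker_threads": [4, 8, 16],
}

def run_perf_test(config):
    # Stand-in for executing a real performance test; returns a throughput-like score.
    return config["cache_size_mb"] * 0.1 + config["worker_threads"] * 2

def sweep(params, test):
    best_config, best_score = None, float("-inf")
    for values in product(*params.values()):
        config = dict(zip(params.keys(), values))
        score = test(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = sweep(params, run_perf_test)
print(best, score)
```

In practice the sweep grows combinatorially, which is exactly why the slide insists on automation and on testing defaults, extremes, and combinations systematically.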

The steps involved in performance benchmarking are as follows:
Identify the transactions/scenarios and the test configuration
Compare the performance of the different products
Tune the parameters of the products being compared, fairly, so that each delivers its best performance
Publish the results of the performance benchmarking
Capacity planning is identifying the right configuration, of which there are three types: minimum required configuration, typical configuration, and special configuration.

Resource Requirements Test Lab Setup Responsibilities Setting up Traces & Audits Entry & Exit Criteria

Obtain measurable, testable requirements
Create a performance test plan
Design test cases
Automate test cases
Evaluate entry criteria
Execute and analyze performance test cases
Evaluate exit criteria

Resource requirements: all the resources are specifically needed, and so must be planned for and obtained. Resources should be dedicated exclusively to the current system, without frequently interchanging roles and responsibilities.
Test lab setup: a test lab with all required equipment must be set up prior to execution. The lab has to be configured carefully, as a single mistake can force the tests to be rerun.
Responsibilities: performance defects may cause changes to architecture, design, and code. The customer-facing team communicates the performance requirements. Multiple teams are involved in performance testing the system, so a matrix describing each team's responsibilities is part of the test plan.

Setting up product traces and audits: performance test results need to be associated with traces and audit trails in order to analyze the results. Audits and traces must be planned in advance, or they may themselves start to affect the performance results.
Entry and exit criteria: performance tests require a stable product because of their complexity and the accuracy needed. Changes to the product mean the tests must be repeated. Hence performance testing starts only after the product meets a set of entry criteria.

Commercial tools:
LoadRunner – HP
QA Partner – Compuware
SilkTest – Segue
MS Stress Tool – Microsoft
Open-source tools:
WebLOAD
JMeter
OpenSTA
Challenges. Conclusion.