1
Performance Testing Design, by Omri Lapidot, Symantec Corporation. Email: olapidot@gmail.com, Mobile: 0544-497179. Presented at the SIGiST Israel Meeting, November 2007
2
Agenda Why Test Performance? How not to test Performance Testing Phases –Designing a Usage Model –Tests and environments creation –Load and tune The Politics of Performance Tests Use Case: Symantec I3 Summary
3
System under test assumptions User Interface which multiple users manipulate concurrently Core applications Core database with data loading mechanism Dynamic components customers choose and install
4
–Without performance testing, functionality is likely to suffer under increased load Most QA is done on under-loaded environments and unrealistic configurations designed for locating specific functional problems –Performance tuning is a repetitive, cyclic process Unlike functional problems, performance problems cannot be addressed via the normal patching process –Determine realistic configuration recommendations Customers and field personnel need to know the recommended hardware and software configurations –Let the field people know what they are up against If we release the product with known performance issues, field personnel should know what problems to expect and how to work around them Why test for performance?
5
–“Just load the system with users” Set up a system Blast it with a gazillion virtual users Check off “Performance Tests” –The problems of overloading Chasing ghosts wastes dev resources Reduces QA accountability –The problems of underloading We won’t find the real performance issues Reduces QA accountability How not to test
6
–Usage Model design –Tests and environments creation –Load and Tune Performance testing phases
7
Load is generated by three factors: –User activity User actions on user interface Typically: select operations, configuration changes –External automated activity Data flows into the system Typically: insert operations –Internal automated activity Data manipulation Typically: aggregation and data purge activity, internal processes Performance testing phases Designing a Usage Model
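The three load factors above can be sketched as a weighted workload mix. This is a hypothetical illustration: the operation names and weights are assumptions for the sketch, not figures from the talk.

```python
import random

# Hypothetical workload mix: relative weight of each load factor.
# User activity, external automated activity, internal automated activity.
WORKLOAD_MIX = {
    "user_select": 50,        # user activity: select operations on the UI
    "user_config_change": 5,  # user activity: configuration changes
    "external_insert": 35,    # external automated activity: data flowing in
    "internal_aggregate": 8,  # internal automated activity: aggregation
    "internal_purge": 2,      # internal automated activity: data purge
}

def next_operation(rng=random):
    """Pick the next operation to fire, proportionally to its weight."""
    ops = list(WORKLOAD_MIX)
    weights = [WORKLOAD_MIX[op] for op in ops]
    return rng.choices(ops, weights=weights, k=1)[0]
```

A load generator built on such a mix reproduces the *shape* of production load, not just its volume, which is what distinguishes a usage model from blind overloading.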
8
Mapping the loading metrics of the system we want to test –What causes the load on the system (Loading Parameters)? –Settle for a finite number of metrics Obtaining the loading metrics –Objective sources Customer logs Customer databases Customer support calls –Subjective sources Field personnel Support personnel Product marketing Selected customers Performance testing phases Designing a Usage Model
9
For each customer size: –Users How many concurrent users? What is the user activity distribution? How long does a typical session last? –Hardware What is the estimated hardware? How is it configured? –Software How many Alerts are defined? How many are set off each second? How many SLAs? etc. Performance testing phases Designing a Usage Model
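The per-size questions above boil down to a small table of loading parameters. A minimal sketch of such a table, with every number an illustrative assumption rather than a measured figure:

```python
# Hypothetical usage-model profiles per customer size.
# Every value here is an illustrative assumption, not real customer data.
USAGE_MODEL = {
    "small":  {"concurrent_users": 5,   "session_minutes": 10,
               "alerts_defined": 50,    "alerts_per_sec": 0.1,  "slas": 5},
    "medium": {"concurrent_users": 25,  "session_minutes": 20,
               "alerts_defined": 500,   "alerts_per_sec": 1.0,  "slas": 25},
    "large":  {"concurrent_users": 100, "session_minutes": 30,
               "alerts_defined": 5000,  "alerts_per_sec": 10.0, "slas": 100},
}

def profile(size):
    """Return the loading parameters for a given customer size."""
    return USAGE_MODEL[size]
```

Keeping the model in one structure makes it easy to run the same test scenario against all three customer sizes.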
10
–Hardware –Monitors –Data emulation –User emulation Performance testing phases Tests and environments creation
11
–(Optional) Run a baseline run –Load and Tune: run the load, then ask whether the system handles it. If not, tune the code and rerun. If it does, ask whether we have reached the expected thresholds. If not, increase the load and rerun; if so, we are finished. Performance testing phases Load and Tune
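The load-and-tune cycle can be sketched as a driver loop. The five callables are placeholders the caller supplies; this is a sketch of the flowchart's control flow, not a real harness:

```python
def load_and_tune(run_load, system_handles_load, reached_thresholds,
                  tune_code, increase_load, max_iterations=100):
    """Drive the load-and-tune cycle.

    run_load() executes one load run; system_handles_load() and
    reached_thresholds() inspect its results; tune_code() and
    increase_load() adjust the code or the load level respectively.
    Returns True once the expected thresholds are reached.
    """
    for _ in range(max_iterations):
        run_load()
        if not system_handles_load():
            tune_code()          # system buckled: fix and rerun same load
        elif reached_thresholds():
            return True          # finished: expected thresholds met
        else:
            increase_load()      # handled the load: push it higher
    return False                 # gave up before reaching thresholds
```

The `max_iterations` cap is a practical addition so the cycle cannot spin forever when tuning stalls.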
12
–The three phases of QA-Dev interaction Cooperation Shock Retaliation –Back yourself up Involve dev personnel in test design Involve field personnel in test design Know how each metric you use is relevant to real life The Politics Of Performance Tests
13
–A Symantec I3 system consists of the following components: Customer’s monitored applications and servers Data collectors Data Loaders Database (PW) GUI Use case: Symantec I3
15
–Who are the large, medium and small customers? –Support file analysis The PW team had a size estimation for each monitored technology Support files cover hundreds of customers, each with the number of instances of each monitored technology When the analysis was through, we knew the average configuration for small, medium and large customers Use case: Symantec I3
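The support-file analysis amounts to grouping customer records by size and averaging the instance counts per monitored technology. A minimal sketch, with made-up records and technology names standing in for real support-file data:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical support-file records: instance counts per monitored technology.
customers = [
    {"size": "small", "oracle": 2,  "j2ee": 1},
    {"size": "small", "oracle": 4,  "j2ee": 3},
    {"size": "large", "oracle": 40, "j2ee": 25},
]

def average_configuration(records):
    """Average instance count per technology, grouped by customer size."""
    buckets = defaultdict(lambda: defaultdict(list))
    for rec in records:
        for tech, count in rec.items():
            if tech != "size":
                buckets[rec["size"]][tech].append(count)
    return {size: {tech: mean(vals) for tech, vals in techs.items()}
            for size, techs in buckets.items()}
```

The output of such an aggregation is exactly the "average configuration for small, medium and large customers" the slide describes.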
16
–Field interaction Product experts feedback Subjective data source –Performance Test Plan Formalize our goals, tests, tools and schedule Use case: Symantec I3
17
–Test preparation Synthetic vs. real data User activity patterns Monitoring –No thresholds Unable to open performance bugs No measurable software standards –Testing Set up a system Transmit data Generate virtual users Measure user experience, internal processes, hardware resource usage Use case: Symantec I3
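The "generate virtual users and measure user experience" step can be sketched with threads, each emulating one user who repeats an action and records its response time. The action is a placeholder the caller supplies; real tools add ramp-up, think-time distributions, and percentile reporting on top of this idea.

```python
import threading
import time

def virtual_user(action, results, iterations=5, think_time=0.01):
    """Repeatedly perform a user action and record each response time."""
    for _ in range(iterations):
        start = time.perf_counter()
        action()
        results.append(time.perf_counter() - start)
        time.sleep(think_time)  # emulate user think time between actions

def run_virtual_users(action, n_users=10, **kwargs):
    """Launch n_users concurrent virtual users; return all timings."""
    results = []  # list.append is atomic in CPython, so sharing it is safe
    threads = [threading.Thread(target=virtual_user,
                                args=(action, results), kwargs=kwargs)
               for _ in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Aggregating `results` (mean, percentiles) gives the measured user experience; hardware and internal-process metrics come from separate monitors running alongside.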
18
–Summary Our reports pressured dev to improve GUI response time by 40% We issued a table of hardware recommendations based on the expected load The two most problematic applications within the system were identified and will be dealt with We changed the way load is measured in our organization Dev and support teams now contact us with performance problems; performance tests are now an integral part of QA tests Use case: Symantec I3
19
Agenda Why Test Performance? How not to test Performance Testing Phases –Designing a Usage Model –Tests and environments creation –Load and tune The Politics of Performance Tests Use Case: Symantec I3 Summary
20
–Releasing software without performance testing is like releasing new running shoes without running in them –Always design a usage model What are the loading parameters? Retrieve data from objective sources Verify data with subjective sources
21
Thank You Omri Lapidot olapidot@gmail.com