1
AUTOMATING WEB APPLICATION PERFORMANCE TESTING – CHALLENGES AND APPROACHES
Bishnu Priya Nanda, Tata Consultancy Services Ltd.
Seji Thomas, Tata Consultancy Services Ltd.
Mohan Jayaramappa, Tata Consultancy Services Ltd.
2
Contents
Introduction to Automated Performance Testing
Approach to Automate Performance Testing
Executing Automated Performance Testing using Jenkins
Challenges
Benefits
Limitations
Conclusion
3
Introduction to Automated Performance Testing
Reduces the effort spent on test environment setup and on collecting and analyzing metrics.
Fits the adoption of agile methods in the software development cycle.
Gives continuous feedback on software performance so that it can be optimized early.
Ensures none of the manual performance analysis steps are missed.
Helps in finding performance differences between application versions.
4
Approach to automate Performance Testing
Three approaches are currently used to conclude on application performance:
Screen/page response times captured by JMeter load testing.
CPU and memory utilization of the servers involved in performance testing.
Garbage Collection (GC) log analysis of the JVMs.
5
Automating Application Transaction Response Times
Automating the load test, obtaining the response time report, analyzing it and concluding Pass/Fail follows the approach below (a minimal script sketch follows this list):
Prepare the JMeter script as per the workload (manual step).
Prepare a response time limit file containing the SLAs to compare the load testing results against (manual step).
Automatically trigger the load test using the JMeter command line option.
Automatically collect the JMeter load testing result.
Automatically execute a Python script that aggregates the collected JMeter report, compares it with the SLAs set for the transactions in the limit file, and creates an output file marking the response time of each transaction as Pass/Fail.
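As an illustration of the automated steps, a minimal Python sketch is given below: it triggers JMeter in non-GUI mode, aggregates the CSV/JTL result per transaction label, and compares the 90th percentile with the SLA from the limit file. The file names (workload.jmx, rt_limits.csv) and the limit-file layout are assumptions made for the example, not the exact scripts used in this work.

```python
# Minimal sketch (assumed file names and CSV layouts): trigger a JMeter test in
# non-GUI mode, aggregate the JTL result per request label, and compare the 90th
# percentile response time against the SLA in a limit file.
import csv
import subprocess
from collections import defaultdict
from statistics import quantiles

JMX = "workload.jmx"          # assumed JMeter test plan
JTL = "result.jtl"            # assumed JMeter CSV result file
LIMIT_FILE = "rt_limits.csv"  # assumed format: transaction,sla_ms

# Trigger the load test from the command line (non-GUI mode).
subprocess.run(["jmeter", "-n", "-t", JMX, "-l", JTL], check=True)

# Collect elapsed times per transaction label from the JTL file.
elapsed = defaultdict(list)
with open(JTL, newline="") as f:
    for row in csv.DictReader(f):
        elapsed[row["label"]].append(int(row["elapsed"]))

# Read the SLA limit file prepared manually for the workload.
with open(LIMIT_FILE, newline="") as f:
    sla = {row["transaction"]: int(row["sla_ms"]) for row in csv.DictReader(f)}

# Compare the 90th percentile of each transaction with its SLA.
overall_pass = True
with open("response_time_result.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["transaction", "samples", "p90_ms", "sla_ms", "result"])
    for label, times in sorted(elapsed.items()):
        p90 = quantiles(times, n=10)[-1]          # 90th percentile
        verdict = "Pass" if label not in sla or p90 <= sla[label] else "Fail"
        overall_pass &= verdict == "Pass"
        writer.writerow([label, len(times), round(p90), sla.get(label, ""), verdict])

print("Overall response time result:", "Pass" if overall_pass else "Fail")
```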
6
Automating Application Transaction Response Times Output
The output file from the response time automation process looks like the following.
Note: The requests for each transaction are sorted in descending order of 90th percentile response time, and that worst request is compared with the SLA given in the limit file for the transaction to conclude Pass/Fail. The overall response time result is considered Pass only if all the transactions pass.

Transaction Name | Request Name  | SLA (in ms) | Concurrent Users | Samples | Min | Avg  | 90th Percentile | Max  | Pass/Fail
Abc              | Abc:/request1 | 2000        | 10               | 150     | 980 | 1700 | 1900            | 2100 | Pass
Xyz              | Xyz:/request5 | 2500        |                  | 125     | 850 | 2400 | 2600            | 2900 | Fail
7
Automating Server CPU and Memory Utilization
Automating the server utilization report, analyzing it and concluding Pass/Fail follows the approach below (a minimal script sketch follows this list):
Create a server utilization limit file for each server involved in performance testing (manual step).
Automatically trigger the Linux vmstat command to collect the CPU and memory utilization of the servers.
Automatically execute a Python script that compares the average CPU and memory utilization against the SLAs mentioned in the limit file for each server and creates an output file marking the result as Pass/Fail.
An additional statistical parameter, the Coefficient of Variance (C.V.), is also considered; it expresses, as a percentage, how far the utilization varies around the average (standard deviation divided by average).
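A minimal Python sketch of the utilization check is given below, assuming vmstat output was captured to a log file (e.g. with vmstat 5) and a simple CSV limit file; it derives CPU utilization as 100 minus the idle column, computes the average and the C.V., and compares them with the limits. RAM utilization would be handled analogously from the memory columns and the total memory of the server. The file names and limit-file columns are assumptions for the example.

```python
# Minimal sketch (assumed vmstat capture, e.g. `vmstat 5 > vmstat_app.log`):
# derive CPU utilization as (100 - idle), compute the average and the
# coefficient of variance (C.V. = stddev / average), and compare them with
# an assumed limit file.
import csv
from statistics import mean, pstdev

LOG = "vmstat_app.log"           # assumed vmstat capture for one server
LIMIT_FILE = "util_limits.csv"   # assumed format: measure,limit_pct,cv_limit_pct

cpu_busy = []
with open(LOG) as f:
    for line in f:
        fields = line.split()
        # Skip the two vmstat header lines; data lines start with a number.
        if not fields or not fields[0].isdigit():
            continue
        idle = int(fields[14])           # "id" column of vmstat
        cpu_busy.append(100 - idle)

avg, maximum = mean(cpu_busy), max(cpu_busy)
cv = pstdev(cpu_busy) / avg * 100        # coefficient of variance in %

with open(LIMIT_FILE, newline="") as f:
    limits = {row["measure"]: row for row in csv.DictReader(f)}

cpu_limit = float(limits["CPU"]["limit_pct"])
cv_limit = float(limits["CPU"]["cv_limit_pct"])
verdict = "Pass" if avg <= cpu_limit and cv <= cv_limit else "Fail"
print(f"CPU: avg={avg:.2f}% max={maximum}% C.V.={cv:.2f}% -> {verdict}")
```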
8
Automating Server CPU and Memory Utilization Output
The output file from the server utilization automation process looks like the following.
Note: The average server utilization (CPU and RAM) and the C.V. are compared with the SLAs given in the limit file to decide Pass or Fail. If either the CPU or the RAM result fails, the overall utilization result of the server is considered failed.

Measure | Limit (in %) | Min   | Avg   | Max   | StdDev | C.V.  | Pass/Fail
CPU     | 70           | 26    | 35.74 | 96    | 20.64  | 57.75 | Fail
RAM     |              | 97.12 | 97.54 | 97.61 | 0.1    |       |
9
Automating Garbage Collection Log analysis
Automating Garbage Collection (GC) log capture, analysis and the Pass/Fail conclusion follows the approach below (a minimal script sketch follows this list):
Create a GC limit file for each application server involved in performance testing (manual step).
Automatically collect the GC log from the server just after the load test and parse the raw GC log into a proper format for analysis using GCViewer, an open source tool.
Automatically execute a Python script that compares the parsed log from GCViewer with the SLAs mentioned in the GC limit file and creates an output file marking each GC metric as Pass or Fail.
To detect memory leaks and other potential issues, one additional parameter is considered in the GC statistics: MaxSlopeAfterFullGC, calculated as the maximum slope of used memory between two Full GCs.
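A minimal Python sketch of the GC check is given below, assuming the GCViewer output has been exported to a CSV with one row per collection (timestamp, collection type, used heap after the collection); the column names and the GC limit-file layout are assumptions for the example. It computes MaxSlopeAfterFullGC as the steepest rise of post-Full-GC used memory between consecutive Full GCs and compares it, along with the Full GC rate, against the limits.

```python
# Minimal sketch, assuming GCViewer has already parsed the raw GC log into a CSV
# with one row per collection: fields "timestamp" (seconds), "type" ("GC" or
# "Full GC") and "used_after_kb" (heap used after the collection). These column
# names and the GC limit-file layout are illustrative assumptions.
import csv

GC_CSV = "gc_parsed.csv"       # assumed GCViewer export
LIMIT_FILE = "gc_limits.csv"   # assumed format: measure,limit

# Collect the points (time, used memory) left after each Full GC.
full_gc_points = []
with open(GC_CSV, newline="") as f:
    for row in csv.DictReader(f):
        if row["type"] == "Full GC":
            full_gc_points.append((float(row["timestamp"]),
                                   float(row["used_after_kb"]) * 1024))

# MaxSlopeAfterFullGC: steepest rise of post-Full-GC used memory between two
# consecutive Full GCs, in bytes per second. A persistently high slope hints
# at a memory leak even when every individual Full GC still succeeds.
slopes = [(m2 - m1) / (t2 - t1)
          for (t1, m1), (t2, m2) in zip(full_gc_points, full_gc_points[1:])
          if t2 > t1]
max_slope = max(slopes, default=0.0)

full_gc_per_hour = 0.0
if len(full_gc_points) >= 2:
    duration = full_gc_points[-1][0] - full_gc_points[0][0]
    full_gc_per_hour = len(full_gc_points) / (duration / 3600)

with open(LIMIT_FILE, newline="") as f:
    limits = {row["measure"]: float(row["limit"]) for row in csv.DictReader(f)}

for measure, actual in [("MaxSlopeAfterFullGC", max_slope),
                        ("MaxFullGCCountPerHour", full_gc_per_hour)]:
    verdict = "Pass" if actual <= limits[measure] else "Fail"
    print(f"{measure}: actual={actual:.1f} limit={limits[measure]} -> {verdict}")
```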
10
Automating Garbage Collection Log analysis Output
The output file from the GC log analysis automation looks like the following.

Measure               | Limit | Actual | Result | Overall Result | Remarks
MinTotalTime          | 3600  | 57506  | Pass   | Fail           | Seconds
MinFullGCCount        |       |        |        |                | Minimum number of Full GCs that must be present in the entire test
MaxFullGCCountPerHour | 10    | 12     |        |                | Maximum number of Full GCs per hour
MinThroughput         | 95    | 94.9   |        |                | In %
AvgMemAfterFullGC     | 5     | 2      |        |                | % of total memory. If it is below this limit, the overall result is always made Pass.
MaxSlopeAfterFullGC   | 30000 |        |        |                | Bytes/sec increase in used memory after Full GCs (i.e. the slope of the minimum used memory after Full GCs)
11
Deployment of Performance Testing Automation in Jenkins
12
Performance Settings in Jenkins
13
Executing Performance Testing Automation in Jenkins
A new performance testing job is created by the Jenkins admin user.
The PT servers are linked with the Jenkins server by the Jenkins admin.
14
Executing Performance Testing Automation in Jenkins contd…
Configure the PT job for execution.
First, reboot and restart the PT servers.
15
Executing Performance Testing Automation in Jenkins contd..
Run the vmstat command on each of the servers (a sketch follows).
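As a sketch of this step, vmstat can be started on each PT server from the Jenkins node over password-less SSH; the host names, user and output paths below are illustrative assumptions, not the actual environment.

```python
# Minimal sketch, assuming key-based (password-less) SSH from the Jenkins node
# to the PT servers: start vmstat on each server for the duration of the test.
import subprocess

PT_SERVERS = ["app-server", "web-server", "db-server"]   # assumed hosts
INTERVAL_S, SAMPLES = 5, 720                              # e.g. one hour at 5 s

for host in PT_SERVERS:
    # nohup keeps vmstat running after the SSH session returns.
    cmd = f"nohup vmstat {INTERVAL_S} {SAMPLES} > /tmp/vmstat_{host}.log 2>&1 &"
    subprocess.run(["ssh", f"perfuser@{host}", cmd], check=True)
    print(f"vmstat started on {host}")
```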
16
Executing Performance Testing Automation in Jenkins contd..
Trigger the load test through JMeter (command line option) and transfer the load testing result to the Jenkins server.
17
Executing Performance Testing Automation in Jenkins contd..
Transfer the vmstat output and GC output from all servers involved in PT (e.g. app/DB/web) to the Jenkins server after the load test.
Parse the JMeter result, vmstat output and GC output, comparing them with the respective SLAs set in the limit files, to get the final output showing Pass/Fail.
The scripts for these steps are configured in the Jenkins job.
18
Executing Performance Testing Automation in Jenkins contd..
A final Python script is executed that reads the Jenkins log file containing all the outputs and creates a final output file consolidating all the performance testing results. If all the individual results are Pass, the overall performance testing result is declared Pass; otherwise it is Fail. The final performance result, produced in HTML format, is displayed in the application's dashboard in Jenkins. A minimal sketch of such a consolidation script is shown below.
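The sketch below assumes each earlier stage wrote its verdict into the Jenkins console log as a marker line such as RESPONSE_TIME_RESULT: Pass; the marker names and file paths are assumptions for the example, not the exact log format used here.

```python
# Minimal sketch of the final consolidation step: read the build log, pick up
# the per-check verdicts, derive the overall result and write an HTML summary.
import re

JENKINS_LOG = "jenkins_console.log"     # assumed copy of the build log
MARKERS = ["RESPONSE_TIME_RESULT", "SERVER_UTILIZATION_RESULT", "GC_RESULT"]

results = {}
with open(JENKINS_LOG) as f:
    for line in f:
        for marker in MARKERS:
            m = re.search(rf"{marker}:\s*(Pass|Fail)", line)
            if m:
                results[marker] = m.group(1)

# The overall verdict is Pass only if every individual check passed.
overall = "Pass" if results and all(v == "Pass" for v in results.values()) else "Fail"

rows = "\n".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in results.items())
html = (f"<html><body><h2>Performance Test Result: {overall}</h2>"
        f"<table border='1'><tr><th>Check</th><th>Result</th></tr>{rows}</table>"
        f"</body></html>")
with open("performance_summary.html", "w") as out:
    out.write(html)
print("Overall performance testing result:", overall)
```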
20
Challenges
Zero percentage error has to be considered.
Password-less authentication is needed between the servers to transfer files.
Finding memory leaks is difficult because a visual GC graph is not possible; the new metric MaxSlopeAfterFullGC was identified for this.
Graphical analysis of the server utilization results is not possible for making a judgment, so the statistical measure Coefficient of Variance (C.V.) was identified as a good indicator of the overall utilization. C.V. is the standard deviation divided by the average.
21
Benefits
Avoids the time spent on manual analysis and on PT environment setup.
Performance issues are analyzed and mitigated before the system derails at the last moment.
Automated performance testing ensures users get new features, not new performance issues.
For applications adopting an agile methodology, automated performance testing helps developers optimize performance issues arising from the addition of new features.
22
Limitations
Password-less SSH authentication is needed between the servers.
The limit files have to be created manually and modified manually whenever the workload changes.
JMeter scripts have to be created and edited manually.
In case of a build change, the JMeter scripts have to be validated manually.
23
Conclusion
Automated performance testing increases application quality and productivity.
The process enables everyone on the development team to share test scenarios and test results.
It produces reports that everyone in the team can understand.
The goal of load testing at the speed of Agile is to deliver as much value as possible to the users of an application through evolving features and functionality, while ensuring performance no matter how many users are on the application at any one time.
24
Acknowledgement
The authors would like to acknowledge the encouragement, valuable guidance and review comments provided by Mohan Jayaramappa, Senior Consultant, Tata Consultancy Services Ltd.
A note of thanks for helping with this experiment and concept to:
Debiprasad Swain, Principal Consultant, Tata Consultancy Services Ltd.
Pitabasa Sa, Senior Consultant, Tata Consultancy Services Ltd.