AUTOMATING WEB APPLICATION PERFORMANCE TESTING – CHALLENGES AND APPROACHES
Bishnu Priya Nanda, Tata Consultancy Services Ltd.
Seji Thomas, Tata Consultancy Services Ltd.
Mohan Jayaramappa, Tata Consultancy Services Ltd.
Copyright © 2016 Tata Consultancy Services Limited
Contents
- Introduction to Automated Performance Testing
- Approach to Automate Performance Testing
- Executing Automated Performance Testing Using Jenkins
- Challenges
- Benefits
- Limitations
- Conclusion
Introduction to Automated Performance Testing
- Reduces the effort spent on test environment setup and on collecting and analyzing metrics.
- Suits the adoption of agile methodology in the software development cycle.
- Provides continuous feedback on software performance, helping to optimize it.
- Ensures that none of the manual steps of performance analysis is missed.
- Helps in finding performance differences between application versions.
Approach to Automate Performance Testing
Three approaches are currently used to conclude on application performance:
- Screen/page response times captured by JMeter load testing.
- CPU and memory utilization of the servers involved in performance testing.
- Garbage collection (GC) log analysis of the JVMs.
Automating Application Transaction Response Times
Automating load testing to obtain the response-time report, analyze it, and conclude Pass/Fail follows the approach below:
- Prepare the JMeter script as per the workload (manual).
- Prepare a response-time limit file containing the SLAs to compare against the load-test results (manual).
- Automatically trigger load testing using the JMeter command-line option.
- Automatically collect the JMeter load-test results.
- Automatically execute a Python script that aggregates the collected JMeter report, compares it against the SLAs set in the limit file for each transaction, and creates an output file marking the response time as Pass/Fail.
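The aggregation-and-comparison step above can be sketched as follows. This is an illustrative sketch, not the authors' actual script: the function and field names are assumptions, and the 90th percentile is computed with the simple nearest-rank method.

```python
# Sketch: aggregate per-transaction response times from JMeter-style samples
# and mark each transaction Pass/Fail against its SLA from the limit file.
import math
from collections import defaultdict

def percentile_90(values):
    """90th percentile using the nearest-rank method."""
    ordered = sorted(values)
    rank = math.ceil(0.9 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

def judge(samples, limits):
    """samples: list of (transaction, elapsed_ms); limits: {transaction: sla_ms}."""
    by_txn = defaultdict(list)
    for txn, elapsed in samples:
        by_txn[txn].append(elapsed)
    results = {}
    for txn, values in by_txn.items():
        p90 = percentile_90(values)
        results[txn] = ("Pass" if p90 <= limits[txn] else "Fail", p90)
    # Overall result is Pass only if every transaction passes.
    overall = "Pass" if all(r[0] == "Pass" for r in results.values()) else "Fail"
    return results, overall
```

A real implementation would read the JMeter `.jtl` result file (CSV) to build the sample list; the judging logic would stay the same.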
Automating Application Transaction Response Times – Output
The output file from the response-time automation process will look like:

Transaction Name | Request Name  | SLA (ms) | Concurrent Users | Samples | Min | Avg  | 90th Percentile | Max  | Pass/Fail
Abc              | Abc:/request1 | 2000     | 10               | 150     | 980 | 1700 | 1900            | 2100 | Pass
Xyz              | Xyz:/request5 | 2500     |                  | 125     | 850 | 2400 | 2600            | 2900 | Fail

Note: The requests for each transaction are sorted in descending order of 90th-percentile response time, and that request is compared with the SLA given in the limit file for the transaction to conclude Pass/Fail. The overall response-time result is considered Pass only if all transactions pass.
Automating Server CPU and Memory Utilization
Automating the server-utilization report, analyzing it, and concluding Pass/Fail follows the approach below:
- Create a server-utilization limit file for each server involved in performance testing (manual).
- Automatically trigger the Linux vmstat command to collect the CPU and memory utilization of the servers.
- Automatically execute a Python script to compare the average CPU and memory utilization against the SLAs mentioned in the limit file for the servers, creating an output file marking the result as Pass/Fail.
- A further statistical parameter is considered, the coefficient of variance (C.V.), which captures the percentage variation of utilization relative to the average.
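The comparison step above can be sketched as follows. This is an illustrative sketch, not the authors' script: the function name and output keys are assumptions, and the input is taken to be a list of utilization samples (in %) already parsed from the vmstat output.

```python
# Sketch: summarize utilization samples and judge them against SLA limits.
from statistics import mean, pstdev

def utilization_result(samples_pct, avg_limit_pct, cv_limit_pct):
    """samples_pct: utilization samples in %, e.g. parsed from vmstat output."""
    avg = mean(samples_pct)
    sd = pstdev(samples_pct)
    cv = sd / avg * 100  # C.V. = Standard Deviation / Average, in %
    verdict = "Pass" if avg <= avg_limit_pct and cv <= cv_limit_pct else "Fail"
    return {"Min": min(samples_pct), "Avg": round(avg, 2),
            "Max": max(samples_pct), "StdDev": round(sd, 2),
            "C.V.": round(cv, 2), "Result": verdict}
```

The same summary would be produced once for CPU and once for RAM per server; if either fails, the server's overall utilization result fails.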
Automating Server CPU and Memory Utilization – Output
The output file from the server-utilization automation process will look like:

Measure | Limit (%) | Min   | Avg   | Max   | StdDev | C.V.  | Pass/Fail
CPU     | 70        | 26    | 35.74 | 96    | 20.64  | 57.75 | Fail
RAM     |           | 97.12 | 97.54 | 97.61 | 0.1    |       |

Note: The average server utilization (CPU and RAM) and the C.V. are compared with the SLAs given in the limit file to decide Pass or Fail. If either the CPU or the RAM result fails, the overall utilization result of the server is considered failed.
Automating Garbage Collection Log Analysis
Automating garbage collection (GC) log capture, analysis, and the Pass/Fail conclusion follows the approach below:
- Create a GC limit file for each application server involved in performance testing (manual).
- Automatically collect the GC log from the server just after the load test and parse the raw GC log into a proper format for better analysis using GCViewer, an open-source tool.
- Automatically execute a Python script to compare the parsed log from GCViewer with the SLAs mentioned in the GC limit file, creating an output file marking each GC metric as Pass or Fail.
- To detect leaks and potential issues, one additional GC statistic is considered: MaxSlopeAfterFullGC, calculated as the maximum slope of used memory between two full GCs.
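The MaxSlopeAfterFullGC idea above can be sketched as follows. This is an illustrative sketch with assumed names and input format, not the authors' script: the input is taken to be time-ordered (timestamp, used-memory-after-full-GC) points, so the slope between consecutive full GCs approximates the rate at which unreclaimable memory grows.

```python
# Sketch: a steadily rising floor of used memory after full GCs hints at a leak.
def max_slope_after_full_gc(points):
    """points: [(t_sec, used_bytes_after_full_gc), ...] in time order."""
    slopes = [(m2 - m1) / (t2 - t1)
              for (t1, m1), (t2, m2) in zip(points, points[1:])]
    return max(slopes) if slopes else 0.0

def gc_leak_verdict(points, limit_bytes_per_sec):
    # Pass as long as the worst-case growth rate stays within the GC limit file's SLA.
    return "Pass" if max_slope_after_full_gc(points) <= limit_bytes_per_sec else "Fail"
```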
Automating Garbage Collection Log Analysis – Output
The output file from the GC log analysis automation will look like:

Measure               | Limit | Actual | Result | Overall Result | Remarks
MinTotalTime          | 3600  | 57506  | Pass   | Fail           | Seconds
MinFullGCCount        |       |        |        |                | Min Full GCs that must be present in the entire test
MaxFullGCCountPerHour | 10    | 12     |        |                | Max number of Full GCs per hour
MinThroughput         | 95    | 94.9   |        |                | In percentage
AvgMemAfterFullGC     | 5     | 2      |        |                | % of total memory. If it is below this limit, the overall result is always made Pass
MaxSlopeAfterFullGC   | 30000 |        |        |                | Bytes/sec increase in used memory after Full GCs (i.e., the slope of the minimum used memory after Full GCs)
Deployment of Performance Testing Automation in Jenkins
Performance Settings in Jenkins
Executing Performance Testing Automation in Jenkins
- The Jenkins admin user creates a new performance testing job.
- The Jenkins admin links the PT servers to the Jenkins server.
Executing Performance Testing Automation in Jenkins (contd.)
- Configure the PT job for execution.
- First, reboot and restart the PT servers.
Executing Performance Testing Automation in Jenkins (contd.)
- Run the vmstat command on each of the servers.
Executing Performance Testing Automation in Jenkins (contd.)
- Trigger load testing through JMeter (command-line option) and transfer the load-test results to the Jenkins server.
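Triggering a non-GUI JMeter run from a script can be sketched as below. The file names and paths are placeholders, not from the deck; the JMeter flags shown (`-n` for non-GUI, `-t` for the test plan, `-l` for the result log) are standard JMeter command-line options.

```python
# Sketch: launch a headless JMeter load test from an automation script.
import subprocess

def build_jmeter_cmd(test_plan, result_file):
    # -n = non-GUI mode, -t = test plan (.jmx), -l = result log (.jtl)
    return ["jmeter", "-n", "-t", test_plan, "-l", result_file]

def run_load_test(test_plan, result_file):
    # subprocess.run blocks until the load test completes; check=True
    # raises if JMeter exits with a non-zero status.
    return subprocess.run(build_jmeter_cmd(test_plan, result_file), check=True)
```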
Executing Performance Testing Automation in Jenkins (contd.)
- After the load test, transfer the vmstat output and GC output from all servers involved in PT (e.g., app/DB/web) to the Jenkins server.
- Parse the JMeter results, vmstat output, and GC output, comparing them with the respective SLAs set in the limit files, to get the final output showing Pass/Fail. The scripts configured in Jenkins for this are:
Executing Performance Testing Automation in Jenkins (contd.)
- A final Python script is executed, which reads the Jenkins log file containing all the outputs and creates a final output file consolidating all performance-test results. If all results pass, the performance-test result is declared Pass; otherwise, Fail.
- The final performance result is displayed in the application's dashboard in Jenkins.
- The script configured in Jenkins for this is:
- The final output in HTML format is:
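The final consolidation step can be sketched as below. This is an illustrative sketch with an assumed layout, not the authors' script: each analysis step (response times, utilization, GC) is taken to have written its verdict to its own result, and the reader is passed in as a callable so the same logic works for files or for the Jenkins log.

```python
# Sketch: consolidate per-step Pass/Fail verdicts into one overall result.
def consolidate(result_names, read_text):
    """result_names: step identifiers; read_text: callable returning a verdict string."""
    verdicts = {name: read_text(name).strip() for name in result_names}
    # The overall performance-test result is Pass only if every step passed.
    overall = "Pass" if all(v == "Pass" for v in verdicts.values()) else "Fail"
    return verdicts, overall
```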
Challenges
- A zero error percentage has to be ensured in the load test for the results to be valid.
- Passwordless authentication is needed between servers to transfer files.
- Memory leaks are difficult to find, as a visual GC graph is not possible; hence the new metric MaxSlopeAfterFullGC was identified.
- Graphical analysis of server-utilization results is not possible for making a judgment, so the statistical measure coefficient of variance (C.V.) was identified as a good indicator of the overall utilization. C.V. = Standard Deviation / Average.
Benefits
- Avoids the time consumed by manual analysis and PT environment setup.
- Performance issues are analyzed and mitigated before the system derails at the last moment.
- Automated performance testing helps ensure users get new features, not new performance issues.
- For applications adopting agile methodology, automated performance testing helps developers optimize performance issues arising from the addition of new features.
Limitations
- Passwordless SSH authentication is required between servers.
- The limit files must be created, and modified on workload changes, manually.
- JMeter scripts are created and edited manually.
- In case of a build change, JMeter scripts must be validated manually.
Conclusion
- Automated performance testing increases application quality and productivity.
- The process enables everyone on the development team to share test scenarios and test results.
- It produces reports that everyone on the team can understand.
- The goal of load testing at the speed of agile is to deliver as much value as possible to an application's users through evolving features and functionality, while ensuring performance no matter how many users are on the app at any one time.
Acknowledgement
The authors would like to acknowledge the encouragement, valuable guidance, and review comments provided by Mohan Jayaramappa, Senior Consultant, Tata Consultancy Services Ltd.
A note of thanks for helping with this experiment and concept to:
- Debiprasad Swain, Principal Consultant, Tata Consultancy Services Ltd.
- Pitabasa Sa, Senior Consultant, Tata Consultancy Services Ltd.