Load Test Results
Bureau of Information Systems
Copyright © 2010 by the Commonwealth of Pennsylvania. All Rights Reserved.
2 Contents
- Project Overview
- Release Application Changes
- Release Performance Tuning
- Load Test Approach
- Load Test Methodology
- Load Testing Scenario Enhancements
- Load Test Comparison
- SQL Execution Comparison
3 Project Overview
4 Release Application Changes
5 Release Performance Tuning
6 Load Test Approach
Integrated load tests were conducted with: ---
7 Load Test Methodology
8 Load Testing Scenario Changes
9 Load Test Comparison Overview
10 Load Test Comparison
Production metrics compared across columns I-IV (production metrics average, Baseline I, Baseline II):
- Test Volume
- # of Virtual Users
- Total Passed Transactions
- Total Failed Transactions
- % Processor Time (Web App Server)
- % CPU Utilization (Database Server)
- Average Throughput (bytes/second)
- Average Response Time
*Sightline did not successfully capture CPU information for DPWL and SOA20 on 10/12; an estimated CPU value was taken from Oracle graphs.
11 Load Test Comparison
Production metrics compared across columns I-IV (production metrics average, Baseline I, Baseline II), by business area:
- Test Volume
- Average Response Time (Resource and Referral)
- Business Metrics (Resource and Referral)
- Average Response Time (Correspondence)
- Business Metrics (Correspondence)
- Average Response Time (Reports)
- Business Metrics (Reports)
- Average Response Time (Enrollments)
- Business Metrics (Enrollments)
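The Average Response Time and Average Throughput (bytes/second) figures on the two comparison slides above are summary statistics computed over each test window. The sketch below shows one way such summaries can be derived from per-transaction load test records; it is an illustration only, not the project's actual tooling, and the input file and field names (status, response_time_s, bytes, script) are hypothetical.

```python
# Minimal sketch (assumed, not the deck's actual tooling): derive the
# comparison metrics from per-transaction load test records.
import csv
from collections import defaultdict


def summarize(results_csv, window_seconds):
    """Compute passed/failed counts, average response time, and
    average throughput (bytes/second) for one test window."""
    passed = failed = 0
    total_resp = 0.0
    total_bytes = 0
    by_script = defaultdict(list)  # e.g. Resource and Referral, Reports

    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            if row["status"] == "Pass":
                passed += 1
                total_resp += float(row["response_time_s"])
                total_bytes += int(row["bytes"])
                by_script[row["script"]].append(float(row["response_time_s"]))
            else:
                failed += 1

    return {
        "total_passed": passed,
        "total_failed": failed,
        "avg_response_time_s": total_resp / passed if passed else None,
        "avg_throughput_bytes_per_s": total_bytes / window_seconds,
        "avg_response_time_by_script": {
            s: sum(v) / len(v) for s, v in by_script.items()
        },
    }


# Example usage for a two-hour run (7,200 s), comparing two runs:
# baseline = summarize("baseline_I.csv", 7200)
# current = summarize("load_test_10_13.csv", 7200)
```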
12 PELICAN SQL Execution Statistics for Production
Columns: SQL_ID, Prorated Value for 2 hrs, Load Test 10/13 Execution Count, Load Test 10/12 Execution Count, Load Test 9/28 Execution Count, Load Test 8/02 Execution Count, Gap in Executions
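The "Prorated Value for 2 hrs" column suggests that production execution counts are scaled to a two-hour window so they can be compared directly against the two-hour load tests, with "Gap in Executions" showing how far each test falls short of that target. The arithmetic below is a sketch of that assumed calculation; the deck does not state the exact method, and the example numbers are hypothetical.

```python
# Sketch of the assumed proration arithmetic behind the table above.
# Production executions are scaled to the load test's 2-hour window,
# then each load test's execution count is compared against that target.

def prorate(production_executions, production_hours, window_hours=2):
    """Scale a production execution count to an equivalent 2-hour value."""
    return production_executions * window_hours / production_hours


def execution_gap(prorated_target, load_test_executions):
    """Positive gap = the load test exercised the statement fewer times
    than the prorated production workload would."""
    return prorated_target - load_test_executions


# Hypothetical example: a SQL_ID executed 600,000 times over a 24-hour
# production day would be expected about 50,000 times in a 2-hour test.
target = prorate(600_000, production_hours=24)            # -> 50000.0
gap = execution_gap(target, load_test_executions=42_000)  # -> 8000.0
```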
13 SQL Execution Comparison
Primary reasons for the coverage difference: