Copyright © 2010 by the Commonwealth of Pennsylvania. All Rights Reserved.
Load Test Results
<Project Name & Date>
<Presenter Name>
Bureau of Information Systems
Contents
Project Overview
Release Application Changes
Release Performance Tuning
Load Test Approach
Load Test Methodology
Load Testing Scenario Changes
Load Test Comparison
SQL Execution Comparison
Project Overview
Release Application Changes
Release Performance Tuning
Load Test Approach
Integrated load tests were conducted with:
---
Load Test Methodology
Load Testing Scenario Changes
Load Test Comparison Overview
Load Test Comparison
Production Metrics (I)                    | Average (II) | Baseline I (III) | Baseline II (IV)
Test Volume                               | -            | -                | -
# of Virtual Users                        | -            | -                | -
Total Passed Transactions                 | -            | -                | -
Total Failed Transactions                 | -            | -                | -
% Processor Time (Web App Server)         | -            | -                | -
% CPU Utilization (Database Server)       | -            | -                | -
Average Throughput (bytes/second)         | -            | -                | -
Average Response Time                     | -            | -                | -
*Sightline did not successfully capture CPU information for DPWL and SOA20 on 10/12; estimated CPU utilization was taken from Oracle graphs instead.
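The table above compares each production metric against two baseline runs. A minimal sketch of that comparison is below; the metric names and values are illustrative placeholders, not figures from an actual PELICAN load test.

```python
# Compare one load-test metric against baseline runs, as in the
# comparison table. All inputs here are hypothetical examples.

def percent_change(current, baseline):
    """Percent change of a load-test metric versus a baseline value."""
    return (current - baseline) / baseline * 100.0

# Hypothetical results for one metric row (total passed transactions).
current_run = 150
baseline_i = 100
baseline_ii = 120

for name, baseline in [("Baseline I", baseline_i), ("Baseline II", baseline_ii)]:
    delta = percent_change(current_run, baseline)
    print(f"vs {name}: {delta:+.1f}%")
```

A signed percent change per baseline column makes regressions and improvements visible at a glance when the dashes above are filled in with measured values.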
Load Test Comparison
Production Metrics (I)                        | Average (II) | Baseline I (III) | Baseline II (IV)
Test Volume                                   | -            | -                | -
Average Response Time (Resource and Referral) | -            | -                | -
Business Metrics (Resource and Referral)      | -            | -                | -
Average Response Time (Correspondence)        | -            | -                | -
Business Metrics (Correspondence)             | -            | -                | -
Average Response Time (Reports)               | -            | -                | -
Business Metrics (Reports)                    | -            | -                | -
Average Response Time (Enrollments)           | -            | -                | -
Business Metrics (Enrollments)                | -            | -                | -
PELICAN SQL Execution Statistics for Production

SQL_ID | Prorated Value for 2 hrs | Load Test 10/13 Execution Count | Load Test 10/12 Execution Count | Load Test 9/28 Execution Count | Load Test 8/02 Execution Count | Gap in Executions
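The column headers above suggest how the comparison works: a production execution count is prorated down to the load-test window (2 hrs), then subtracted from the count observed during the test to yield the gap. A hedged sketch, assuming a 24-hour production collection window and hypothetical SQL_IDs and counts:

```python
# Sketch of the "Prorated Value for 2 hrs" and "Gap in Executions"
# columns. The production window, SQL_IDs, and all counts below are
# illustrative assumptions, not PELICAN production data.

PRODUCTION_WINDOW_HOURS = 24  # assumed collection window
TEST_WINDOW_HOURS = 2

def prorate(production_count, window_hours=PRODUCTION_WINDOW_HOURS):
    """Scale a production execution count down to the 2-hour test window."""
    return production_count * TEST_WINDOW_HOURS / window_hours

production_counts = {"sql_id_a": 120_000, "sql_id_b": 6_000}  # hypothetical
load_test_counts = {"sql_id_a": 9_500, "sql_id_b": 400}       # hypothetical

for sql_id, prod in production_counts.items():
    expected = prorate(prod)
    gap = load_test_counts[sql_id] - expected
    print(f"{sql_id}: prorated {expected:.0f}, "
          f"test {load_test_counts[sql_id]}, gap {gap:+.0f}")
```

A negative gap would indicate a statement exercised less often in the load test than production volume would predict, which is one way a coverage difference like the one discussed below can surface.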
SQL Execution Comparison
Primary reasons for the coverage difference: