ALL RIGHTS RESERVED 2013
Agenda:
• Introduction to Performance Testing
• Why Performance Testing?
• Performance Testing Jargon
• Performance Test Metrics and Process
• Limitations of Manual Performance Testing
• Benefits of Performance Test Automation
• Tools Used for Performance Testing
Performance Testing

Wikipedia says: "In the computer industry, software performance testing is used to determine the speed or effectiveness of a computer, network, software program or device. This process can involve quantitative tests done in a lab, such as measuring the response time or the number of MIPS (millions of instructions per second) at which a system functions. Qualitative attributes such as reliability, scalability and interoperability may also be evaluated. Performance testing is often done in conjunction with stress testing."

Our simplified definition: Performance Testing = how fast and how stable is the system?
Why Performance Testing?
Oh, it’s 10 o’clock! Got to book my rail ticket…
Poor Performance Causes:

• Loss of revenue
Amazon states that every 100 ms of latency costs them 1% of sales. Google's experiments show statistically that slowing its search results by 400 ms costs millions of dollars per year in lost advertising revenue. Industry experts estimate that $3 billion in revenue was lost in 2012 because of slow site performance.

• Loss of customers
Google says it could lose 8 million searches per day if its search results were half a second slower. Flipkart says: "57% of shoppers will abandon a site after waiting 3 seconds for a page to load, and 80% of them never return to the same site."
Continued…

• Loss of productivity
Reddit states that its performance focus is on page-load speed: "If we can get 10% more performance, we immediately see 10% more traffic." A better-performing website (speed improvements) increased productivity by 7-12%.

• Backlog of work
On average, 1.2 million people try to book tickets in the first ten minutes of the Tatkal window. Of those, only 50,000 come away with tickets, a success rate of just 4.16%, due to the service backlog during the peak.

• Media attention and damage to your brand
The Times of India ranked IRCTC first among the 8 'worst' Indian government websites in 2012.
Why Performance Testing?

• Identifies problems early, before they become costly to resolve.
• Produces better-quality, more scalable code.
• Prevents revenue and credibility loss due to poor web site performance.
• Enables intelligent planning for future expansion.
• Ensures that the system meets performance expectations such as response time and throughput.
Performance Testing Jargon

Before going into the details, we should understand the terms that govern performance testing:

Business Transaction, Think Time, Concurrent User Load, Simultaneous User Load, Throughput, Response Time, Tuning, Baseline Test, Benchmarking, User Abandonment, Performance Bottleneck, Load Testing, Stress Testing, Spike Testing, Volume Testing
Business Transaction

• A sequence of request-response exchanges between the client and the server that accomplishes a business requirement.
• For example, an online banking customer transferring money from one account to another is a business transaction.
Think Time

• The time a user spends thinking, reading, or clicking links and buttons while navigating through the web site.
• Think time is a very important parameter to set when scripting scenarios with performance testing tools.
• The business analysts of the application, the web site management team, or sometimes even an end-user survey can give a realistic picture of the think-time requirements of a transaction.
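Think time is straightforward to sketch in code. The following fragment is not from the deck; it is a minimal Python illustration in which `think` and the commented scenario steps are hypothetical names. It pauses a scripted virtual user for a randomized interval, the way load tools inject think time between recorded steps:

```python
import random
import time

def think(avg_seconds, variation=0.5):
    """Pause the virtual user for a randomized think time.

    Sleeps for avg_seconds +/- variation * avg_seconds, mimicking the
    natural variability of real users reading a page before clicking.
    Returns the actual pause used, so it can be logged.
    """
    pause = random.uniform(avg_seconds * (1 - variation),
                           avg_seconds * (1 + variation))
    time.sleep(pause)
    return pause

# A scripted scenario with think time between steps; login/view_report
# stand in for real recorded transaction calls.
def scenario():
    # login(...)
    think(0.01)   # short values here only to keep the demo fast
    # view_report(...)
    think(0.01)

scenario()
```

Randomizing rather than fixing the pause keeps virtual users from hitting the server in artificial lock-step.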
User Abandonment

• The situation in which end users exit the web site because of its slow performance.
• The abandonment rate varies from site to site: users abandon a low-priority web site far more readily than, say, a payroll site they must use. The site's abandonment rate should be analyzed before drawing conclusions about the load on the site.
Simultaneous User Load

• Simultaneous users all have active sessions on the server at the same time, but each user may be executing a different transaction.
• For example, with a 100-simultaneous-user load there are 100 active sessions open on the server, with each user performing a different set of transactions: one logging in, another viewing reports, another navigating to the next page, and so on.
• The simultaneous user load of a web site is always greater than (or equal to) its concurrent user load.
Concurrent User Load

• Concurrent users connect to the server and perform the same operation at the same point in time.
• For example, with a 100-concurrent-user load, all 100 users log in at the same moment, view reports at the same moment, and so on. An online banking web site might carry a simultaneous load of 10,000-20,000 users but a concurrent load of only 1,000-1,500.
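The "same operation at the same instant" idea behind concurrent load can be sketched with a thread barrier. This is a minimal Python illustration (not from the deck); the real request is replaced by a timestamp capture, and the barrier guarantees that no user starts until all of them are ready:

```python
import threading
import time

N_USERS = 5
start_barrier = threading.Barrier(N_USERS)   # releases all threads at once
results = []
lock = threading.Lock()

def user(user_id):
    # Each thread waits here until all N_USERS have arrived, so the
    # "operation" below fires at (almost) the same instant for everyone.
    start_barrier.wait()
    t = time.perf_counter()          # stand-in for the actual request
    with lock:
        results.append((user_id, t))

threads = [threading.Thread(target=user, args=(i,)) for i in range(N_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All start timestamps should be tightly clustered.
spread = max(ts for _, ts in results) - min(ts for _, ts in results)
print(f"{len(results)} concurrent users, start spread = {spread*1000:.2f} ms")
```

Load tools such as JMeter provide the same rendezvous behaviour (e.g. a synchronizing timer) without hand-written threading.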
Baseline Test

• Makes you familiar with the operational behavior of each application/server.
• Tells you about the performance of the system under normal conditions.
• A test conducted to measure application performance under a limited virtual-user load.
• The baseline test is often run to collect metrics about system performance at loads of 1, 5, 50, … users.
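The stepped 1, 5, 50 … pattern can be sketched as a loop over load levels. This Python fragment is an illustration only (the deck shows no code); `transaction` is a dummy stand-in for a real request:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stand-in for a real request; returns its response time in seconds."""
    start = time.perf_counter()
    sum(range(10_000))               # dummy work instead of an HTTP call
    return time.perf_counter() - start

def baseline(user_loads=(1, 5, 50)):
    """Run the same transaction at each load level and record the mean
    response time, giving a per-level performance baseline."""
    report = {}
    for users in user_loads:
        with ThreadPoolExecutor(max_workers=users) as pool:
            times = list(pool.map(lambda _: transaction(), range(users)))
        report[users] = statistics.mean(times)
    return report

for users, avg in baseline().items():
    print(f"{users:>3} users -> mean response {avg*1000:.2f} ms")
```

Comparing the per-level means later is what turns this run into a baseline: regressions show up as a shift against these recorded numbers.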
Benchmarking

• Benchmarks are baselines at known, defined levels of load.
• The objective of a benchmark test is to validate the correctness of the test scripts and to check the readiness of the system before subjecting it to a high load.
• For a benchmark test, 15-20% of the target load is typically used.
Response Time

• It is equally important to find out how much time each transaction takes to complete.
• Response time is defined as the delay between the point of request and the first response from the product.
• Response time typically increases with user load.
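Capturing per-transaction response times is a simple timing wrapper. A minimal Python sketch (not from the deck; `timed` and `view_report` are hypothetical names); in a real test the dummy work would be an actual request:

```python
import time
from functools import wraps

def timed(fn):
    """Wrap a transaction function so each call records its response time."""
    samples = []
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        samples.append(time.perf_counter() - start)
        return result
    wrapper.samples = samples        # response times, in seconds
    return wrapper

@timed
def view_report():
    sum(range(50_000))               # placeholder for the real request

for _ in range(10):
    view_report()

samples = sorted(view_report.samples)
print(f"min={samples[0]*1000:.3f} ms  "
      f"p90={samples[int(0.9 * len(samples)) - 1]*1000:.3f} ms")
```

Reporting a high percentile (p90/p95) alongside the average is the usual practice, since averages hide the slow outliers users actually notice.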
Throughput

• The capability of a product to handle multiple transactions in a given period.
• Throughput represents the number of requests or business transactions processed by the product in a specified time duration.
• As the number of concurrent users increases, throughput grows almost linearly with the number of requests, as long as there is little congestion in the application server's queues.
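Throughput is simply completed transactions divided by elapsed time. A small Python sketch (illustrative only; `measure_throughput` is a hypothetical helper, and the lambda stands in for a real transaction):

```python
import time

def measure_throughput(transaction, duration_s=0.2):
    """Run `transaction` in a tight loop for `duration_s` seconds and
    return the observed throughput in transactions per second."""
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        transaction()
        count += 1
    elapsed = time.perf_counter() - start
    return count / elapsed

tps = measure_throughput(lambda: sum(range(1_000)))
print(f"throughput ~ {tps:.0f} transactions/sec")
```

In a real test the same division is applied to the transaction count a load tool records over the measurement window.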
Tuning

• Tuning is the procedure by which product performance is enhanced by adjusting the parameters of the product, the operating system, and other components.
• Tuning improves product performance without touching the product's source code.
Performance Bottleneck

• A slow spot whose effects are widely felt: a situation or area that keeps the application from performing to its ideal specification.
• For example, response times rising under a 100-virtual-user load because of an improperly set HTTP-connections parameter in the IIS server, or CPU utilization reaching 95% under that same load, are typical performance bottlenecks. A bottleneck can lead to outright failure if no mitigating action is taken.
Load Testing

• The process of exercising the system under test by feeding it the largest tasks it can operate with.
• The load on the system is steadily increased via automated tools, using virtual users to simulate real-world scenarios.
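The "constantly increasing load" pattern can be sketched as a stepped ramp of virtual users. This Python fragment is illustrative only (the deck shows no code; `run_step` and `ramp_load` are hypothetical names, and the lambda stands in for a real transaction):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_step(users, transaction):
    """Fire `users` virtual users at once and return the mean response time."""
    def one(_):
        start = time.perf_counter()
        transaction()
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=users) as pool:
        times = list(pool.map(one, range(users)))
    return sum(times) / len(times)

def ramp_load(transaction, steps=(10, 20, 40, 80)):
    """Step the virtual-user count upward, reporting the mean response
    time at each level: the shape of a basic load test."""
    return {u: run_step(u, transaction) for u in steps}

for users, avg in ramp_load(lambda: sum(range(5_000))).items():
    print(f"{users:>3} users: mean response {avg*1000:.2f} ms")
```

The level at which response time stops degrading gracefully and starts climbing sharply is where the interesting tuning work begins.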
Stress Testing
• Trying to break the system under test by overwhelming its resources or by taking resources away from it.
• Purpose is to make sure that the system fails and recovers gracefully.
Spike Testing

• A test that subjects the system to a short burst of concurrent load. It can be essential when performance-testing an auction site, where a sudden surge of load is expected.
• The goal is to determine whether performance will suffer, the system will fail, or it will handle the dramatic change in load.
Volume Testing

• Tests designed to measure the throughput of the system, typically in a batch-processing or messaging environment. The objective is to identify the processing capacity.
• For example, if you want to volume-test your application against a specific database size, you expand the database to that size and then test the application's performance on it.
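The "expand the database, then measure" step can be sketched with the standard library. A minimal Python illustration (not from the deck; the table, row count, and query are invented for the example):

```python
import sqlite3
import time

# Build an in-memory database padded to the target row volume, then time
# a representative query against it.
ROWS = 50_000                         # the "specific database size" under test

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                 ((i * 0.01,) for i in range(ROWS)))
conn.commit()

start = time.perf_counter()
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
elapsed = time.perf_counter() - start

print(f"query over {ROWS} rows took {elapsed*1000:.2f} ms (sum={total:.2f})")
conn.close()
```

Repeating the measurement at several row volumes shows whether query time scales acceptably as data grows.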
Metrics to be Captured

Transaction-Based Metrics:
• Throughput (per sec): transactions processed per second
• Response Times / Elapsed Times: time taken to process a transaction
• Types of Errors: totals for each type of error in a particular test
• Count of Errors: total errors for a particular test
• Transaction Count: total transactions processed in the time period

Server-Based Metrics:
• CPU Usage: System %, Idle %
• Memory Usage: Used %, used in GB, memory available
• Disk I/O: disk read KB/sec, disk write KB/sec, IO/sec
• Network Activity: MB/sec, packets/sec, size of packets, bandwidth used
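The transaction-based metrics above can be collected with a small accumulator. This Python sketch is illustrative only (`TestMetrics` and its fields are invented names, and the sample values are made up for the demo):

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class TestMetrics:
    """Accumulates transaction-based metrics: transaction count,
    error totals by type, and response times."""
    response_times: list = field(default_factory=list)   # seconds
    errors: Counter = field(default_factory=Counter)

    def record(self, elapsed_s, error=None):
        """Log one transaction; `error` names the failure type, if any."""
        self.response_times.append(elapsed_s)
        if error:
            self.errors[error] += 1

    def summary(self, duration_s):
        """Reduce the raw samples to the metrics listed in the table."""
        n = len(self.response_times)
        return {
            "transaction_count": n,
            "throughput_per_sec": n / duration_s,
            "avg_response_ms": 1000 * sum(self.response_times) / n,
            "error_count": sum(self.errors.values()),
            "error_types": dict(self.errors),
        }

m = TestMetrics()
m.record(0.120)
m.record(0.250, error="HTTP 500")
m.record(0.180)
print(m.summary(duration_s=10.0))
```

Server-based metrics (CPU, memory, disk, network) come from OS-level monitors rather than the test script, so they are collected separately and correlated by timestamp.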
Performance Testing Process
Performance Testing: Manual or Automated?
Manual Performance Testing Limitations

[Diagram: a coordinator ("All of you, click the GO button again!") directs a room of testers on client machines, generating load against the system under test (web server and database server).]

• Do you have the testing resources? (testing personnel, client machines)
• How do you coordinate and synchronize users?
• How do you collect and analyze results?
• How do you achieve test repeatability?
Continued…

Manual Testing Limitations:
• Expensive, requiring large amounts of both personnel and machinery.
• Complicated, especially coordinating and synchronising multiple testers.
• Involves a high degree of organization, especially to record and analyse results meaningfully.
• Repeatability of the manual tests is limited.
Benefits of Automation

[Diagram: a Controller drives Vuser hosts generating load against the system under test (web server and database server), with an Analysis component collecting the results.]

Solves the resource limitations:
• Runs many Vusers on a few machines
• Controller manages the virtual users
• Analyze results with graphs and reports
Continued…

Using Automated Tools:
• Reduces personnel requirements by replacing human users with virtual users, or Vusers, which emulate the behaviour of real users.
• Because numerous Vusers can run on a single computer, the tool reduces the amount of hardware required for testing.
• Monitors application performance online, enabling you to fine-tune your system during test execution.
• Automatically records the performance of the application during a test; you can choose from a wide variety of graphs and reports to view the performance data.
• Because the tests are fully automated, you can easily repeat them as often as you need.
Some Common Tools Used for Performance Testing

Open Source:
• OpenSTA
• DieselTest
• TestMaker
• Grinder
• LoadSim
• JMeter
• RUBiS

Commercial:
• LoadRunner
• Silk Performer
• QEngine
• Empirix e-Load