Tuesday 22 September 2015

Load Testing in LoadRunner

Load tests are end-to-end performance tests under anticipated production load. The objective of such tests is to determine the response times for various time-critical transactions and business processes, and to ensure that they stay within documented expectations (or Service Level Agreements, SLAs). Load tests also measure the capability of an application to function correctly under load, by measuring transaction pass/fail/error rates. An important variation of the load test is the Network Sensitivity Test, which incorporates WAN segments into a load test, since most applications are deployed beyond a single LAN.

Load tests are major undertakings, requiring substantial input from the business so that anticipated activity can be accurately simulated in a test environment. If the project has a pilot in production, logs from the pilot can be used to generate 'usage profiles' that feed into the testing process, and can even drive large portions of the load test.

Load testing must be executed on "today's" production-size database, and optionally against a "projected" database. If some database tables will be much larger in a few months' time, then load testing should also be conducted against a projected database. It is important that such tests are repeatable and give the same results for identical runs. They may need to be executed several times in the first year of wide-scale deployment, to ensure that new releases and growth in database size do not push response times beyond the prescribed SLAs.

What is the purpose of a Load Test?


The purpose of any load test should be clearly understood and documented. A load test usually fits into one of the following categories:
Quantification of risk. Determine, through formal testing, the likelihood that system performance will meet the formally stated performance expectations of stakeholders, such as response-time requirements under given levels of load. This is a traditional Quality Assurance (QA) type of test. Note that load testing does not mitigate risk directly; rather, by identifying and quantifying risk, it presents tuning opportunities and an impetus for remediation that will mitigate risk.
Determination of minimum configuration. Determine, through formal testing, the minimum configuration that will allow the system to meet the formally stated performance expectations of stakeholders, so that extraneous hardware, software, and the associated cost of ownership can be minimized. This is a Business Technology Optimization (BTO) type of test.


What functions or business processes should be tested?


The following table describes the criteria for determining the business functions or processes to be included in a test; each basis for inclusion is paired with a comment.

High frequency transactions: The most frequently used transactions have the potential to impact the performance of all the other transactions if they are not efficient.

Mission critical transactions: The transactions that facilitate the core objectives of the system should be included, as failure of these transactions under load has, by definition, the greatest impact.

Read transactions: At least one read-only transaction should be included, so that the performance of such transactions can be differentiated from that of more complex transactions.

Update transactions: At least one update transaction should be included, so that the performance of such transactions can be differentiated from that of read-only transactions (both kinds are illustrated in the sketch after this table).
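To make the read/update distinction concrete, here is a minimal VUGen-style script sketch in C. The URL, transaction names, and form field are hypothetical placeholders; a real script would be recorded against the actual application.

    Action()
    {
        // Read-only transaction: fetch an order details page
        // (hypothetical URL and transaction names, for illustration).
        lr_start_transaction("view_order_details");
        web_url("order_details",
            "URL=http://myapp.example.com/orders/12345",
            "Resource=0",
            "Mode=HTML",
            LAST);
        lr_end_transaction("view_order_details", LR_AUTO);

        lr_think_time(5);  // simulate the user reading the page

        // Update transaction: submit a change to the order.
        lr_start_transaction("update_order");
        web_submit_data("update_order_form",
            "Action=http://myapp.example.com/orders/12345/update",
            "Method=POST",
            ITEMDATA,
            "Name=quantity", "Value=2", ENDITEM,
            LAST);
        lr_end_transaction("update_order", LR_AUTO);

        return 0;
    }

Keeping the read and update steps in separately named transactions is what allows their response times to be reported and compared independently in the Analysis results.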

Example of Load Test Configuration for a web system

The following diagram shows how a thorough load test could be set up using LoadRunner. 

The important thing to understand when executing such a load test is that the load is generated at the protocol level by the load generators, which run scripts developed with the VUGen tool. Transaction times derived from VUGen scripts do not include processing time on the client PC, such as rendering (drawing parts of the screen) or execution of client-side scripts such as JavaScript. The WinRunner PC(s) are used to measure end-user response times. Most load tests would not employ a WinRunner PC to measure actual response times from the client perspective, but one is highly recommended where complex and variable processing is performed on the desktop after data has been delivered to the client.
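Where end-user times are captured outside the protocol layer (for example, by the WinRunner PC), they can be folded back into the LoadRunner results so that GUI-level and protocol-level times appear side by side. A hedged sketch, assuming the lr_set_transaction function is available in your LoadRunner version, and with a hypothetical measured value:

    // Record an externally measured end-user response time as a
    // LoadRunner transaction. The 6.2-second value is a hypothetical
    // measurement handed over from the GUI-level tool.
    double end_user_time = 6.2;  /* seconds */

    lr_set_transaction("order_details_end_user", end_user_time, LR_PASS);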

The LoadRunner Controller can display real-time graphs of response times, as well as other measures such as CPU utilization on each of the components behind the firewall. Internal measures from products such as Oracle and WebSphere are also available for monitoring during test execution.

After completion of a test, the Analysis engine can generate a number of graphs and correlations to help locate any performance bottlenecks.

Simplified Load Test Configuration for a web system
In this simplified load test, the Controller communicates directly with a load generator, which communicates directly with the load balancer. No WinRunner PC is used to measure actual user experience. The collection of statistics from the various components is simplified, as there is no firewall between the Controller and the web components being measured.
Reporting on Response Time at various levels of load

Expected output from a load test often includes a series of response-time measures at various levels of load, e.g. 500 users, 750 users, and 1,000 users. When determining the response time at any particular level of load, it is important that the system has run in a stable manner for a significant amount of time before measurements are taken.

For example, a ramp-up to 500 users may take ten minutes, but another ten minutes may be required for system activity to stabilize. Taking measurements over the next ten minutes would then give a meaningful result. The next measurement can be taken after ramping up to the next level, waiting a further ten minutes for stabilization, and then measuring for ten minutes, and so on for each level of load requiring detailed response-time measures.
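This arithmetic is easy to get wrong once several load levels are involved. The small, self-contained C sketch below (illustrative only, using the ten-minute ramp, stabilization, and measurement durations assumed in the example above) prints the resulting measurement timeline:

    #include <stdio.h>

    /* Prints the measurement window for each load level, assuming fixed
     * ramp-up, stabilization, and measurement durations (ten minutes
     * each, as in the example above). */
    int main(void)
    {
        int levels[] = {500, 750, 1000};
        int n = sizeof(levels) / sizeof(levels[0]);
        int ramp = 10, stabilize = 10, measure = 10;  /* minutes */
        int t = 0;

        for (int i = 0; i < n; i++) {
            t += ramp + stabilize;            /* reach the level, settle */
            printf("%4d users: measure from minute %d to minute %d\n",
                   levels[i], t, t + measure);
            t += measure;
        }
        return 0;
    }

Running this prints measurement windows of minutes 20-30 for 500 users, 50-60 for 750 users, and 80-90 for 1,000 users, matching the schedule described above.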

Stating Response Time Requirements

Traditionally, response time is often defined as the interval from when a user initiates a request to the instant at which the first part of the response is received by the application. However, such a definition is not usually appropriate within a performance-related application requirement specification. The definition of response time must incorporate the behavior, design, and architecture of the system under test. While understanding the concept of response time is critical in all load and performance tests, it is probably most crucial to Load Testing, Performance Testing, and Network Sensitivity Testing.

Response-time measuring points must be carefully considered because, in client-server applications as well as web systems, the first characters returned to the application often do not contribute to the rendering of the screen with the anticipated response, and so do not represent the user's impression of response time.

For example, response time in a web-based booking system that contains a banner advertising mechanism may or may not include the time taken to download and display banner ads, depending on your interest in the project. A marketing firm would be very interested in banner ad display time, but if you are primarily interested in the booking component, then banner ads are of much less concern.
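If banner traffic should be excluded from the measured transactions, one way to do this in a VUGen script is with a download filter. A minimal sketch, assuming the ad content is served from a separate (hypothetical) host:

    // Exclude downloads from the hypothetical ad server, so banner ads
    // do not contribute to measured transaction times.
    web_add_auto_filter("Action=Exclude",
                        "Host=ads.example.com",
                        LAST);

Conversely, leaving the filter out keeps ad download time inside the transactions, which is the behavior a marketing-focused test would want.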

Also, response-time measurements are typically defined at the communications layer, which is very convenient for LoadRunner/VUGen-based tests, but may be quite different from what a user experiences on screen. A user sees what is drawn on the screen and does not see the data transmitted down the communications line. The display is updated only after the computations for rendering the screen have been performed, and those computations may be very sophisticated and take a considerable amount of time. For response-time requirements that are stated in terms of what the user sees on the screen, WinRunner should be used, unless there is a reliable mathematical calculation to translate communications-based response times into screen-based response times.

It is important that response time is clearly defined, and the response time requirements (or expectations) are stated in such a way to ensure that unacceptable performance is flagged in the load and performance testing process.

One simple suggestion is to state an average and a 90th-percentile response time for each group of transactions that are time-critical. In a set of 100 values sorted from best to worst, the 90th percentile is simply the 90th value in the list. An example specification follows:
Time to display order details
Average time to display order details: less than 5.0 seconds.
90th percentile time to display order details: less than 7.0 seconds.

The above specification, or response time service level agreement, is a reasonably tight specification that is easy to validate against.

For example, suppose a customer 'display order details' transaction was executed 20 times under similar conditions, with response times in seconds, sorted from best to worst, as follows:

2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 4, 10, 10, 10, 20

Average = 4.45 seconds; 90th percentile = 10 seconds (the 18th of the 20 sorted values).

This test would fail against the stated criteria, as too many transactions were slower than seven seconds, even though the average was less than five seconds.

If the performance requirement were simply "the average must be less than five seconds", then the test would pass, even though every fifth transaction was ten seconds or slower.

This simple approach can easily be extended to include the 99th percentile and other percentiles, as required for even tighter response-time service level agreement specifications.
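As a check on the arithmetic above, here is a small, self-contained C sketch (illustrative only, not part of any LoadRunner script) that computes the average and the 90th percentile for the 20 sample response times:

    #include <stdio.h>
    #include <stdlib.h>

    /* Comparison function for qsort: ascending order (best to worst). */
    static int cmp_double(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        double times[] = {2,2,2,2,2, 2,2,2,2,2,
                          3,3,3,3,3, 4,10,10,10,20};
        int n = sizeof(times) / sizeof(times[0]);
        double sum = 0.0;

        for (int i = 0; i < n; i++)
            sum += times[i];

        qsort(times, n, sizeof(double), cmp_double);

        /* 90th percentile as defined above: the value at the 90%
         * position of the sorted list (the 18th of 20 values). */
        int idx = (int)(0.9 * n) - 1;   /* zero-based index 17 */

        printf("Average = %.2f seconds\n", sum / n);           /* 4.45 */
        printf("90th percentile = %g seconds\n", times[idx]);  /* 10   */
        return 0;
    }

Running this prints an average of 4.45 seconds and a 90th percentile of 10 seconds, confirming the pass/fail reasoning above.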

