Volume Testing
A volume test checks whether any problems occur when the system under test is run with realistic amounts of data, or even the maximum amount or more. Volume testing is necessary because ordinary functional testing normally does not use large amounts of data; rather the opposite.
A special task is to check the real maximum amounts of data that are possible in extreme situations, for example on days with extremely large amounts of processing to be done (New Year, campaigns, tax deadlines, disasters, etc.).
Typical problems are full or nearly full disks, databases, files, and buffers, as well as counters that may overflow. Maximal data volumes in communications may also be a concern.
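As a minimal illustration of the counter problem, the sketch below shows how a fixed-width counter silently wraps around once the volume exceeds its range; the 16-bit width is purely an assumption for the example.

    import ctypes

    # A fixed-width 16-bit counter (the width is an assumed example) silently
    # wraps around when more events arrive than it can represent.
    counter = ctypes.c_uint16(0)
    for _ in range(70_000):      # more increments than the counter can hold
        counter.value += 1

    print(counter.value)         # prints 4464, not 70000 -- the overflow is silent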
Part of the test is to run the system over a certain period with a lot of data, in order to check what happens to temporary buffers and whether long access times cause timeouts. One variant of this test uses especially low volumes, such as empty databases or files, empty mails, no links, etc. Some programs cannot handle this either.
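A hedged sketch of the low-volume variant, written as pytest-style tests; generate_report is a hypothetical stand-in for the real entry point of the system under test and must be replaced accordingly.

    # generate_report is a hypothetical placeholder for the function under test.
    def generate_report(records):
        """Placeholder: count records and sum their amounts."""
        return {"count": len(records), "total": sum(r["amount"] for r in records)}

    def test_empty_record_list():
        # An empty database, file, or mail must not crash the system.
        assert generate_report([]) == {"count": 0, "total": 0}

    def test_record_without_links():
        # A record with no linked objects must also be handled.
        report = generate_report([{"amount": 0.0, "links": []}])
        assert report["count"] == 1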
A last variant is measuring how much space a program needs. This is important if a program shares resources with other programs: all programs taken together must not use more resources than are available.
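A rough sketch of such a space measurement in Python: it tracks the peak memory of a placeholder workload with tracemalloc and reads the free disk space with shutil. Note that tracemalloc only covers Python-level allocations, so OS-level accounting may still be needed.

    import shutil
    import tracemalloc

    def load_data(n):
        """Placeholder workload: build an in-memory table of n records."""
        return [{"id": i, "name": f"record-{i}"} for i in range(n)]

    # Peak memory used by the Python-level allocations of the workload.
    tracemalloc.start()
    table = load_data(100_000)
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    print(f"current: {current / 1e6:.1f} MB, peak: {peak / 1e6:.1f} MB")

    # Free disk space before and after writing test data shows the disk footprint.
    total, used, free = shutil.disk_usage(".")
    print(f"free disk space: {free / 1e9:.1f} GB")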
So we can summarize volume testing as follows:
Objective:
Find problems with maximum amounts of data.
System performance or usability often degrades when large amounts of data must be searched, sorted, etc., as the small timing sketch below illustrates.
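A purely illustrative timing sketch of that degradation: it sorts randomly generated data at growing sizes and prints the elapsed time.

    import random
    import time

    # Illustration only: sorting time grows noticeably with the data volume.
    for n in (10_000, 100_000, 1_000_000):
        data = [random.random() for _ in range(n)]
        start = time.perf_counter()
        sorted(data)
        elapsed = time.perf_counter() - start
        print(f"{n:>9} items sorted in {elapsed:.3f} s")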
Test Procedure:
The system is run with maximum amounts of data.
Internal tables, databases, files, disks, etc. are loaded with a maximum of data.
Maximal length of external input (see the sketch after this list).
Important functions where data volume may lead to trouble.
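A minimal sketch of the maximal-length input step; MAX_FIELD_LENGTH and process_order are hypothetical and stand in for the real field limits and entry point of the system under test.

    # MAX_FIELD_LENGTH and process_order are hypothetical placeholders.
    MAX_FIELD_LENGTH = 255

    def process_order(customer_name, comment):
        """Placeholder for the real input-processing function."""
        return {"name": customer_name, "comment": comment}

    longest_name = "X" * MAX_FIELD_LENGTH
    longest_comment = "Y" * MAX_FIELD_LENGTH
    result = process_order(longest_name, longest_comment)
    assert len(result["name"]) == MAX_FIELD_LENGTH   # input must not be truncated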
Result wanted:
No problems, no significant performance degradation, and no lost data.
Considerations:
Data generation may need analysis of a usage profile and may not be trivial.
(Same as in stress testing.)
Copy of production data or random generation.
Use data generation or extraction tools (a small generation sketch follows this list).
Data variation is important!
Memory fragmentation is important!
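A small data-generation sketch, assuming a simple record layout (id, name, amount, date); in practice the fields and value ranges should come from a usage profile or a copy of production data. It writes varied random records to a CSV file.

    import csv
    import random
    import string
    from datetime import date, timedelta

    # The field names and value ranges are assumptions; derive the real ones
    # from a usage profile or from production data. Variation is deliberate.
    def random_record(i):
        name = "".join(random.choices(string.ascii_uppercase, k=random.randint(3, 40)))
        amount = round(random.uniform(0.01, 100_000), 2)
        day = date(2020, 1, 1) + timedelta(days=random.randint(0, 3650))
        return [i, name, amount, day.isoformat()]

    with open("volume_test_data.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name", "amount", "date"])
        for i in range(1_000_000):   # scale up to the maximum realistic volume
            writer.writerow(random_record(i))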
Examples:
Online system: Feed input fast, but not necessarily as fast as possible, from different input channels. Do this for some time in order to check whether temporary buffers tend to overflow or fill up and whether execution times increase. Use a blend of create, update, read, and delete operations, as in the load-driver sketch below.
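A rough sketch of such a load driver: it sends a weighted blend of create, update, read, and delete operations for a fixed period and tracks the worst response time, so that a creeping increase (filling buffers, slower access) becomes visible. The four operation functions and the weights are assumptions and stand in for the real client calls and usage profile.

    import random
    import time

    # The four operation functions are placeholders for the real client calls.
    def create(i): pass
    def update(i): pass
    def read(i):   pass
    def delete(i): pass

    OPERATIONS = [create, update, read, delete]
    WEIGHTS = [0.3, 0.3, 0.3, 0.1]   # assumed blend; derive from the usage profile
    DURATION_S = 3600                # run long enough for temporary buffers to fill

    count = 0
    worst = 0.0
    deadline = time.monotonic() + DURATION_S
    while time.monotonic() < deadline:
        op = random.choices(OPERATIONS, weights=WEIGHTS)[0]
        start = time.perf_counter()
        op(count)
        worst = max(worst, time.perf_counter() - start)
        count += 1

    # A worst response time that keeps growing during the run points to
    # filling buffers or growing access times.
    print(f"operations: {count}, worst response time: {worst:.4f} s")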
Database system: The database should be very large. Every object occurs with the maximum number of instances. Batch jobs are run with large numbers of transactions, for example jobs where something must be done for ALL objects in the database. Run complex searches with sorting across many tables. Many or all objects are linked to other objects, and to the maximum number of such objects. Use large or the largest possible numbers in sum fields. A population sketch for such a database follows below.