Recycling tests
There are two distinct ways to recycle a Polygraph test.
Continuing an
earlier test is possible using the Persistent Working Set feature. With
this feature, you can, for example, start a test with a cache already
full of old URLs and have Polygraph request those URLs as if
they were cached during the current test. A persistent working set is
useful when you want to continue an earlier test instead of
starting from scratch.
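As a rough sketch, continuing a test could look like the two runs below.
The binary name, option names, and file name are assumptions based on common
Polygraph client options, not a verified recipe; the Continuing Tests page
documents the exact interface.

    # first run: build a working set and save it to a file
    # (binary, option, and file names are assumptions)
    polygraph-client --config workload.pg --store_working_set wset.dat

    # second run: load the saved working set so that "old" URLs can be
    # requested again and treated as potential cache hits
    polygraph-client --config workload.pg --load_working_set wset.dat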
Repeating an earlier
test is possible by forcing Polygraph to use exactly the same random
number generator seeds. The same seeds allow Polygraph to generate nearly
identical URL/request sequences. Unlike the Persistent Working Set feature
above, this kludge mode does not provide Polygraph with information
about previously requested URLs, so Polygraph will not, for example,
expect a cache hit when requesting an "old" URL for the first time in
the second test. Using the same seeds is useful when you want to
repeat an earlier test from scratch.
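For comparison, repeating a test from scratch is sketched below as two
identical command lines with fixed seeds. The option names and values are
assumptions; the Repeating Tests page has the authoritative details.

    # run the same workload twice with identical seeds so that nearly
    # identical URL/request sequences are generated
    # (option names and values are assumptions)
    polygraph-client --config workload.pg --global_rng_seed 1 --unique_world no
    polygraph-client --config workload.pg --global_rng_seed 1 --unique_world no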
This page covers the common principles behind recycling tests and URLs.
Please see the Continuing Tests and
Repeating Tests pages for
documentation specific to those two subjects.
1. Do not do it
In most cases, you do not want to force Polygraph to reuse
URLs. By default, Polygraph uses a unique set of URLs for every run.
Unique URL sets are convenient because they reduce side effects that one
test might have on another. For example, one does not have to flush
(empty) the cache between two tests to get the configured hit ratio because
the second test will not (by default) reuse old URLs.
If you are reading this page out of curiosity, stop now and come back only
when you really need two tests to use the same URL set.
2. What is [not] reproducible
Workload features explicitly controlled by PGL objects can be reproduced
with good precision. For example, two tests configured with the same object
size distribution are likely to yield very similar measured object size
distributions. This is the default Polygraph behavior, and no special effort
is required to enable it. In other words, Polygraph tests are meant to be
reproducible by default.
Reproducibility in this context does not imply reusing things like URL strings
or request sequences; it implies reusing things like object size distributions
or request interarrival distributions.
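For instance, a PGL Content object like the sketch below fixes an object
size distribution that two runs should both reproduce, even though the
individual URLs and per-object sizes will differ. The object name and
field values are illustrative, not taken from a standard workload.

    // a hypothetical Content object whose size distribution is reproduced
    // with good precision across runs (name and values are illustrative)
    Content cntSimple = {
        size = exp(13KB);   // exponential size distribution with a 13KB mean
        cachable = 80%;     // 80% of objects may be cached
    };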
It is possible to force Polygraph to use the same URL set or request
sequence in two tests by using one of the two techniques described above.
In general, it is not possible to repeat exact request submission times or
exact network packet sizes. Polygraph does not have much control over those
things. Kernel scheduler decisions or TCP stack behavior cannot be
reproduced in a general environment. Fortunately, such reproduction is
usually unnecessary.