Heisenbug 2016 Moscow (10.12.2016)

Performance testing pitfalls


The talk will highlight typical JMeter pitfalls and provide hints on how to deal with them.

Running a load test is simple enough. But a run without analysis is just a waste of time. Analysis often reveals something that forces you to repeat the measurement. For example, response times looked good, but detailed analysis showed that every page was returning a 404 error. Or times are fine at the beginning of the test and then fall apart completely. Or even this: JMeter reports that "everything is fine", while in reality no load has been applied for half an hour. Sometimes everything really is fine overall, yet there are unpleasant dips. How do you analyze the causes of those dips? How do you figure out on which data they appear?

The talk covers typical pitfalls that come up when load-testing enterprise applications, and solutions to those problems. It is built around JMeter, but many of the approaches apply just as well to other tools. Vladimir will explain how the average differs from the 90% line, how coordinated omission prevents you from measuring response times correctly, and will show ways to work around the typical problems encountered when measuring performance.
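As a taste of the first point, here is a minimal sketch (not code from the talk; the numbers are invented) of why an average can look healthy while the 90% line exposes a slow tail:

```python
# Illustrative sketch: average vs. 90% line on a latency distribution
# with a slow tail. All numbers are invented for demonstration.
samples = [100] * 89 + [2000] * 11  # 11% of requests are 20x slower, in ms

average = sum(samples) / len(samples)

# Nearest-rank estimate of the 90th percentile (JMeter's "90% Line"):
# the value below which 90% of all samples fall.
ranked = sorted(samples)
p90 = ranked[int(0.9 * len(ranked)) - 1]

print(f"average  = {average:.0f} ms")  # 309 ms - looks tolerable
print(f"90% line = {p90} ms")          # 2000 ms - the slow tail is visible
```

The average is dragged up only modestly by the tail, while the 90% line lands squarely on the slow requests, which is why percentiles are the more honest summary for user-facing latency.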
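And a hypothetical sketch of coordinated omission itself (numbers invented): a closed-loop load generator sends the next request only after the previous response arrives, so a server stall silently suppresses the requests that should have been sent during the stall, and the recorded statistics look far better than what a real user would experience.

```python
# Hypothetical sketch of coordinated omission. All numbers are invented.
import statistics

INTERVAL_MS = 100  # intended pace: one request every 100 ms
# 100 recorded samples: fast responses plus one 5-second stall.
recorded = [10] * 50 + [5000] + [10] * 49

naive_mean = statistics.mean(recorded)  # what a closed-loop tester reports

# Correction in the spirit of HdrHistogram: a request issued t ms into
# the stall would have waited roughly (stall - t) ms, so add those
# missing samples back at the intended sending interval.
corrected = list(recorded)
missing = 5000 - INTERVAL_MS
while missing > 0:
    corrected.append(missing)
    missing -= INTERVAL_MS

corrected_mean = statistics.mean(corrected)
print(f"naive mean     = {naive_mean:.1f} ms")      # ~60 ms
print(f"corrected mean = {corrected_mean:.1f} ms")  # ~862 ms
```

The naive mean hides the stall almost entirely; once the suppressed requests are accounted for, the mean is more than an order of magnitude worse.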