Speaker: Daniel Mitterdorfer
Benchmarking is a tricky business. Pitfalls lurk around every corner: What is your workload, and how do you model it correctly? Which hardware do you choose? Can you trust your load generator? How do you avoid accidentally introducing bottlenecks?
In this talk we will discuss rules and guidelines for seven common "gotchas" in benchmarking that will help you evaluate performance correctly. While we use Elasticsearch as an example, these rules apply to benchmarks of Java applications in general.