
I am currently investigating some strange behaviour in one of our performance tests: the test executes a pretty complex calculation several times and shows a ramp-up pattern. The first run is the most expensive; for roughly the next 100 runs execution time is 1000-5000 ns, and after that it drops to 10-40 ns. Since the test uses randomly generated values, I modified it to generate the data once and then execute every run with exactly the same data.
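For reference, the measurement loop looks roughly like this (a minimal sketch; the workload below is a stand-in for our actual calculation, and all names are made up for illustration):

```java
// WarmupDemo: times repeated runs of a fixed, deterministic workload so
// that per-run timing differences (e.g. JIT warm-up) become visible.
public class WarmupDemo {

    // Stand-in workload: a deterministic linear-congruential-style loop.
    static long workload(long seed) {
        long x = seed;
        for (int i = 0; i < 1000; i++) {
            x = x * 6364136223846793005L + 1442695040888963407L;
        }
        return x;
    }

    public static void main(String[] args) {
        long sink = 0; // accumulate results so the loop cannot be optimized away
        for (int run = 0; run < 10_000; run++) {
            long start = System.nanoTime();
            sink += workload(42L); // same input data for every run
            long elapsed = System.nanoTime() - start;
            if (run % 1000 == 0) {
                System.out.println("run " + run + ": " + elapsed + " ns");
            }
        }
        System.out.println("sink=" + sink);
    }
}
```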

In order to eliminate class loading and other start-up issues, I even executed the calculation once beforehand and added a sleep of one second to give any background work a chance to finish loading classes and the like.

I would have expected a similar execution time across all runs, and I could understand a decrease in performance once garbage collection kicks in ... but seeing the runs get faster and faster seems rather odd.

I can't really explain this behaviour. Are there any effects in the VM that could be performing some sort of optimization?

Chris


1 Answer


These effects are caused by HotSpot, the JVM's just-in-time (JIT) compiler.

HotSpot is responsible for executing your bytecode at runtime.

At first it interprets the bytecode, which is comparatively slow.

If HotSpot detects that your code is executed regularly, it compiles that code to native code for the machine it is running on. This gives the first improvement in execution speed.

After that, HotSpot keeps analysing your code. If a method is called very often (if it is a "hot spot" of your program), HotSpot optimizes it further. The more often the code runs and the more time it takes, the more aggressive the optimization becomes.

So code that is executed often and takes a long time is optimized more and more; as long as it remains optimizable, HotSpot will keep applying further optimizations.
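You can watch this happening yourself. These are standard HotSpot diagnostic flags (the exact log format varies by JVM version, so treat the comments as a rough guide):

```
-XX:+PrintCompilation     # log each method as HotSpot compiles it to native code;
                          # your calculation method should appear early in the run
-XX:+TieredCompilation    # compile in tiers (default on modern JVMs); recompilations
                          # at higher tiers correspond to the increasingly
                          # aggressive optimization described above
```

Run your test with these flags and you should see compilation log lines appear at roughly the points where your per-run times drop.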

A white paper for this can be found here.

answered 2013-06-21T13:04:37.040