Remon van Vliet
Hello,
I've run into an odd performance difference between the client and the
server VM. I made a few classes for 3D math and such, and here are two
versions of a scale method:
public static final RTVector scale(RTVector v, double s) {
    /* Scale vector and return result */
    return new RTVector(v.x * s, v.y * s, v.z * s);
}

public static final RTVector scale(RTVector r, RTVector v, double s) {
    /* Scale vector, storing the result in r */
    r.set(v.x * s, v.y * s, v.z * s);
    /* Return result vector */
    return r;
}
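For context, the methods above only touch the fields x, y, z and a set(...) method, so a minimal RTVector supporting them might look like this (a sketch only; the real class isn't shown in the post):

```java
// Minimal sketch of the RTVector class the scale methods assume
// (only the members referenced above; everything else is omitted).
class RTVector {
    double x, y, z;

    RTVector(double x, double y, double z) {
        this.x = x; this.y = y; this.z = z;
    }

    /* Overwrite this vector's components in place. */
    void set(double x, double y, double z) {
        this.x = x; this.y = y; this.z = z;
    }
}
```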
As you can see, one creates a new vector and returns the result; the other
sets the result in a third vector that's passed to the method. The latter
version should be faster since it doesn't create a new object (note that I
made sure the test isn't creating a new object each iteration either). Now,
for the server VM (-server) all works as expected, for 10,000,000 runs:
option1 : 0.188s
option2 : 0.032s
The client VM however :
option1 : 0.579s
option2 : 9.547s
As you can see, the server VM is much faster here, which is expected
behavior. What is odd to me is that on the client VM the option that creates
no new objects is actually about 16 times slower (9.547s vs 0.579s). Does
anyone have an explanation for this? Note that the only difference between
these tests is the VM command-line argument, -client/-server.
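For reference, the timing loop I'm describing looks roughly like this. It's a sketch, not my exact harness: the loop count matches the numbers above, but the timing calls, the checksum trick to keep the JIT from discarding the loops, and the inlined RTVector (repeated so the snippet compiles on its own) are assumptions:

```java
// Sketch of a benchmark comparing the two scale overloads.
// Loop count is from the post; warm-up, timing, and checksum are assumptions.
class RTVector {
    double x, y, z;
    RTVector(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    void set(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

public class ScaleBench {
    static final int RUNS = 10_000_000;

    static RTVector scale(RTVector v, double s) {             // option1: allocates
        return new RTVector(v.x * s, v.y * s, v.z * s);
    }

    static RTVector scale(RTVector r, RTVector v, double s) { // option2: reuses r
        r.set(v.x * s, v.y * s, v.z * s);
        return r;
    }

    public static void main(String[] args) {
        RTVector v = new RTVector(1.0, 2.0, 3.0);
        RTVector r = new RTVector(0.0, 0.0, 0.0); // allocated once, outside the loop
        double sink = 0.0; // consume results so the JIT can't drop the loops

        long t0 = System.nanoTime();
        for (int i = 0; i < RUNS; i++) sink += scale(v, 2.0).x;
        long t1 = System.nanoTime();
        for (int i = 0; i < RUNS; i++) sink += scale(r, v, 2.0).x;
        long t2 = System.nanoTime();

        System.out.printf("option1 : %.3fs%n", (t1 - t0) / 1e9);
        System.out.printf("option2 : %.3fs%n", (t2 - t1) / 1e9);
        System.out.println("checksum: " + sink);
    }
}
```

Run with java -client ScaleBench and java -server ScaleBench to reproduce the comparison.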
Hope someone can shed some light on this,
Remon van Vliet