Hi,
I have been trying to measure the performance of SQL queries in Java
applications using JDBC and SQLJ, to figure out which method is best for
our environment. I intended to use the DB2SystemMonitor interface as
described in IBM's documentation. It all sounds great on paper and the API
is very simple to use, but in practice I'm seeing unexpected results, and
I couldn't find anything on IBM's website or elsewhere on the internet
about this kind of issue. Resources on this are extremely limited, so I'm
hoping someone here can help me.
The issue is that the returned values don't always make sense. I can
measure four things with this interface (definitions from IBM's website,
followed by a sketch of how I collect them):
Server time: The sum of all reported DB2 server elapsed times that were
collected while system monitoring was enabled, in microseconds.
Network I/O time: The sum of elapsed network I/O times that were
collected while system monitoring was enabled, in microseconds.
Core driver time: The sum of elapsed monitored API times that were
collected while system monitoring was enabled, in microseconds. In
general, only APIs that might result in network I/O or DB2 server
interaction are monitored.
Application time: The sum of the application, JDBC driver, network I/O,
and DB2 server elapsed times, in milliseconds.
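For reference, here is roughly how I collect these values. This is a
minimal sketch: the connection URL, credentials, and query are
placeholders, and I'm assuming the standard DB2SystemMonitor methods
(getServerTimeMicros, getNetworkIOTimeMicros, getCoreDriverTimeMicros,
getApplicationTimeMillis) from the IBM JCC driver:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import com.ibm.db2.jcc.DB2Connection;
import com.ibm.db2.jcc.DB2SystemMonitor;

public class MonitorTest {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- not my real environment values.
        Connection con = DriverManager.getConnection(
                "jdbc:db2://host:446/LOCATION", "user", "password");

        DB2SystemMonitor monitor = ((DB2Connection) con).getDB2SystemMonitor();
        monitor.enable(true);

        // RESET_TIMES clears any previously accumulated values for this run.
        monitor.start(DB2SystemMonitor.RESET_TIMES);

        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT 1 FROM SYSIBM.SYSDUMMY1");
        while (rs.next()) {
            rs.getInt(1);
        }
        rs.close();
        stmt.close();

        monitor.stop();

        System.out.println("ServerTime:    " + monitor.getServerTimeMicros());
        System.out.println("NetworkIOTime: " + monitor.getNetworkIOTimeMicros());
        System.out.println("DriverTime:    " + monitor.getCoreDriverTimeMicros());
        System.out.println("AppTime:       " + monitor.getApplicationTimeMillis());

        con.close();
    }
}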
Other pages on IBM's website also state that Network I/O Time includes
Server Time, and that Core Driver Time includes Network I/O Time plus
Server Time. According to this, the values should satisfy:
Application Time > Core Driver Time > Network I/O Time > Server Time
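To test that ordering I convert the application time to microseconds
first, since it's the only value reported in milliseconds. This is a small
sketch of the check (the class and method names are mine; the monitor
getters are assumed as in the snippet above):

import com.ibm.db2.jcc.DB2SystemMonitor;
import java.sql.SQLException;

public class OrderingCheck {
    // Returns true if a single measurement satisfies the documented ordering.
    public static boolean holds(DB2SystemMonitor monitor) throws SQLException {
        long serverMicros  = monitor.getServerTimeMicros();
        long networkMicros = monitor.getNetworkIOTimeMicros();
        long driverMicros  = monitor.getCoreDriverTimeMicros();
        // Application time is the only value reported in milliseconds,
        // so convert it to microseconds before comparing.
        long appMicros     = monitor.getApplicationTimeMillis() * 1000L;

        return appMicros >= driverMicros
            && driverMicros >= networkMicros
            && networkMicros >= serverMicros;
    }
}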
But the results don't always satisfy this ordering. For example (real
test values):
ServerTime: 5978 NetworkIOTime: 7565 DriverTime: 40345 AppTime: 30
Here AppTime is in milliseconds; converted to microseconds it's 30000.
So DriverTime is larger than ApplicationTime, which should be impossible
by definition. This happens in roughly 30% of my test runs.
In addition, in about 10% of the test runs, ServerTime comes back larger
than NetworkIOTime, which by definition should also be impossible.
I am really stuck and can't find any information anywhere. I'd
really appreciate any help on this issue.
Thanks.
PS: DB2 runs on OS/390 7.1.2. I am running tests on Windows XP, using
DB2 Application Development Client FixPak 13.