JDBC, SQLJ and DB2: Interesting issue w/DB2SystemMonitor

Captain

Hi,

I have been trying to measure the performance of SQL queries in Java
applications using JDBC and SQLJ to figure out which method suits our
environment best. I intended to use the DB2SystemMonitor interface as
described in IBM's documentation. The API looks straightforward and is
very simple to use, but in practice I see unexpected results, and I
couldn't find anything about this kind of issue on IBM's website or
anywhere else on the internet; the resources are incredibly limited. I
am hoping someone here can help me with this.

The issue I am having is that the returned values don't always make
sense. The interface exposes four timers (definitions from IBM's
website; a minimal sketch of how I read them follows the list):

Server time: The sum of all reported DB2 server elapsed times that were
collected while system monitoring was enabled, in microseconds.

Network I/O time: The sum of elapsed network I/O times that were
collected while system monitoring was enabled, in microseconds.

Core driver time: The sum of elapsed monitored API times that were
collected while system monitoring was enabled, in microseconds. In
general, only APIs that might result in network I/O or DB2 server
interaction are monitored.

Application time: The sum of the application, JDBC driver, network I/O,
and DB2 server elapsed times, in milliseconds.
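
For reference, this is roughly how I am reading the four timers. It's a
minimal sketch; the driver URL, credentials, and query are placeholders
rather than my actual setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import com.ibm.db2.jcc.DB2Connection;
import com.ibm.db2.jcc.DB2SystemMonitor;

public class MonitorTest {
    public static void main(String[] args) throws Exception {
        // Placeholder driver/URL/credentials; substitute your own data source.
        Class.forName("com.ibm.db2.jcc.DB2Driver");
        Connection con = DriverManager.getConnection(
                "jdbc:db2://myhost:446/MYDB", "user", "password");

        // The monitor hangs off the JCC connection object.
        DB2SystemMonitor monitor = ((DB2Connection) con).getDB2SystemMonitor();
        monitor.enable(true);                         // enable monitoring on this connection
        monitor.start(DB2SystemMonitor.RESET_TIMES);  // zero the accumulated timers

        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT 1 FROM SYSIBM.SYSDUMMY1");
        while (rs.next()) {
            rs.getInt(1);
        }
        rs.close();
        stmt.close();

        monitor.stop();                               // read the timers after stop()

        System.out.println("ServerTime: " + monitor.getServerTimeMicros()
                + " NetworkIOTime: " + monitor.getNetworkIOTimeMicros()
                + " DriverTime: " + monitor.getCoreDriverTimeMicros()
                + " AppTime: " + monitor.getApplicationTimeMillis());

        con.close();
    }
}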

Other pages on IBM's website also state that network I/O time includes
server time, and that core driver time includes network I/O time plus
server time. According to this, the ordering should be:

Application Time > Core Driver Time > Network I/O Time > Server Time

But the results don't always follow this ordering. For example (real
test values):

ServerTime: 5978 NetworkIOTime: 7565 DriverTime: 40345 AppTime: 30

Here AppTime is in milliseconds; converted to microseconds it is 30000.
So DriverTime is larger than ApplicationTime, which should not be
possible by definition. This happens in roughly 30% of the cases.

Also, in roughly 10% of the cases, ServerTime comes back larger than
NetworkTime, which should not be possible either (see the quick check
below).
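
To make the comparison concrete, here is the kind of trivial sanity
check I run after converting AppTime to microseconds; nothing
DB2-specific, just the sample values from above:

public class OrderingCheck {
    public static void main(String[] args) {
        // AppTime is reported in milliseconds, the other three in microseconds.
        long serverMicros  = 5978L;
        long networkMicros = 7565L;
        long driverMicros  = 40345L;
        long appMicros     = 30L * 1000L;  // 30 ms -> 30000 microseconds

        // Expected nesting per the documentation:
        // application >= core driver >= network I/O >= server
        boolean orderingHolds = appMicros >= driverMicros
                && driverMicros >= networkMicros
                && networkMicros >= serverMicros;
        System.out.println("ordering holds: " + orderingHolds);  // false for this sample
    }
}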

I am really stuck and can't find any information anywhere. I'd
really appreciate any help on this issue.

Thanks.

PS: DB2 runs on OS/390 7.1.2. I am running tests on Windows XP, using
DB2 Application Development Client FixPak 13.
 

Rhino

Captain said:
[original message quoted in full; snipped]

In my opinion, this question has a lot more to do with DB2 than with
Java. You'll probably get a much better answer if you post it to
comp.databases.ibm-db2.

Be sure to indicate that you are referring to DB2 for OS/390, since the
bulk of the posts on that newsgroup concern DB2 for Unix/Linux/Windows.
Also allow a couple of days for answers to emerge, since there aren't
that many Java users in the newsgroup. I think you'll find that group
very helpful with your question.
 

Captain

Thanks, Rhino. I originally posted this same message to the DB2 group,
but nobody replied, so I still have no answer. I will try the same
thing on Linux and see what happens.
 
