jdbc mysql timeout error - prepared statements?


seth brundle

I have a mysql table with 9M records which I am reading row by row through
JDBC (Connector/J).

The program stops around 100,000 records with the following Exception(s):

com.mysql.jdbc.CommunicationsException: Communications link failure due to
underlying exception:
Last packet sent to the server was 2369890 ms ago.

From a little research it seemed that this was a problem with buffered
prepared statements creating too much idle time on the server connection and
the connection timing out, so I inserted the option
'useServerPrepStmts=false' in the connection string to process it one row at
a time from a suggestion on mysql website - but this did not fix or change
the problem in any way.

Any ideas? I'm kinda new to Java and never had this issue with Perl/DBI with
the same table...
 

David Harper

seth said:
[SNIP]
Any ideas? I'm kinda new to Java and never had this issue with Perl/DBI
with the same table...

Your application may be exceeding the wait_timeout server setting, as
described in the manual:

http://dev.mysql.com/doc/refman/5.0/en/instance-manager-command-options.html

If that's the case, then useServerPrepStmts=false isn't going to help
you, but you *can* set the wait_timeout server variable to a larger
value for all clients in the server config file, or on a per-connection
basis by executing this command on your connection:

set session wait_timeout = <value>

where <value> is the desired timeout value in seconds. I use this
method in my code:

public void setWaitTimeout(Connection conn, int timeout)
        throws SQLException {
    // wait_timeout is in seconds and applies to this session only
    Statement stmt = conn.createStatement();
    try {
        stmt.execute("set session wait_timeout = " + timeout);
    } finally {
        stmt.close();  // release the statement even if execute fails
    }
}

However, I'm tempted to ask whether you *really* need to retrieve all 9
million rows? Surely there must be a better way to approach your problem?

David Harper
Cambridge, England
 

UsenetBinaries.com

Your application may be exceeding the wait_timeout server setting, as
described in the manual:

The setting defaults to 8 hours, and mine is still at that default, but the
program stops after only 42 minutes or ~100,000 rows, so I don't think
that's the case.
However, I'm tempted to ask whether you *really* need to retrieve all 9
million rows?

Yes, it is going to be a program which runs regularly, transferring the
contents of a backup MySQL database to a Lucene index, so yeah, the entire
data set needs to be read.

What's really puzzling me is that I have been using Perl DBI on this table
for years in production and never ran into this issue.

I have also been seeing a recent surge of posts on the MySQL forums from
JDBC/Connector/J users with the same error, and none of the suggested
solutions has worked for me.

I realize I might be able to get around this problem by breaking the query
into LIMIT chunks, but I would much rather solve the problem than work
around it. This is not expected MySQL behavior, so either I am doing
something wrong or a recent update to MySQL/JDBC/Connector-J/JDK/JRE has a
bug.
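
For reference, the LIMIT workaround mentioned above is usually done by
keying on an indexed column rather than a growing OFFSET, so each chunk
stays cheap even millions of rows in. A rough sketch, with hypothetical
table and column names (`docs`, `id`, `body`):

```java
import java.sql.*;

public class ChunkedRead {
    static final int BATCH_SIZE = 10_000;

    // Pure helper: builds the query for the next chunk after lastId.
    // Keying on the primary key avoids the O(offset) cost of LIMIT offset,n.
    public static String nextBatchSql(long lastId, int batchSize) {
        return "SELECT id, body FROM docs WHERE id > " + lastId
                + " ORDER BY id LIMIT " + batchSize;
    }

    public static void dump(Connection conn) throws SQLException {
        long lastId = 0;
        boolean more = true;
        while (more) {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(nextBatchSql(lastId, BATCH_SIZE));
            more = false;  // stays false if this chunk comes back empty
            try {
                while (rs.next()) {
                    lastId = rs.getLong("id");
                    more = true;
                    // hand rs.getString("body") to the indexer here
                }
            } finally {
                rs.close();
                stmt.close();
            }
        }
    }
}
```

Because each chunk opens and finishes its own query, no single statement
holds the connection idle long enough to trip a server-side timeout.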
 

Guest

I realize I might be able to get around this problem by breaking the query
into LIMIT chunks, but I would much rather solve the problem than work
around it. This is not expected MySQL behavior, so either I am doing
something wrong or a recent update to MySQL/JDBC/Connector-J/JDK/JRE has a
bug.

This may be your client JVM getting progressively slower, perhaps
due to memory limitations, and having to scrabble for free memory
as you process your result set. Try:
1 - Start the JVM with arguments that give it more memory.
2 - Make sure you get forward-only, non-scrollable result sets,
and close every JDBC object as soon as possible.
3 - Try a plain statement for the query, not a prepared one. Unless
you're going to rerun the same prepared statement many times, there
may be no benefit and some loss in using them.
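
On point 2, one detail specific to MySQL Connector/J is worth adding: by
default the driver reads the entire result set into client memory, which
fits the symptoms here (the JVM bogging down around 100,000 rows). Streaming
mode is requested by combining a forward-only, read-only statement with a
fetch size of Integer.MIN_VALUE. A minimal sketch, with made-up table and
column names:

```java
import java.sql.*;

public class StreamingRead {
    // Configure a Statement for row-by-row streaming with MySQL Connector/J.
    // TYPE_FORWARD_ONLY + CONCUR_READ_ONLY + fetchSize = Integer.MIN_VALUE
    // is the driver's documented signal to stop buffering the whole result
    // set on the client side.
    public static Statement streamingStatement(Connection conn)
            throws SQLException {
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(Integer.MIN_VALUE);
        return stmt;
    }

    // Hypothetical table/column names; rows arrive one at a time.
    public static void dumpAll(Connection conn) throws SQLException {
        Statement stmt = streamingStatement(conn);
        ResultSet rs = stmt.executeQuery("SELECT id, body FROM docs");
        try {
            while (rs.next()) {
                String body = rs.getString("body");
                // process each row here
            }
        } finally {
            rs.close();
            stmt.close();
        }
    }
}
```

One caveat: while a streaming result set is open, no other statement can be
issued on that connection until it has been fully read or closed.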

Joe Weinstein at BEA Systems
 

David Harper

UsenetBinaries.com wrote:
[SNIP]
Yes, it is going to be a program which runs regularly, transferring the
contents of a backup MySQL database to a Lucene index, so yeah, the entire
data set needs to be read.

According to the Lucene web site (http://lucene.apache.org/java/docs/),

"Apache Lucene is a high-performance, full-featured text search engine
library written entirely in Java. It is a technology suitable for nearly
any application that requires full-text search, especially cross-platform."

MySQL already provides full-text search so I can't help but think that
you may not be making optimal use of the software components at hand.
Given that you need to store your text in a database, why not make use
of the full-text search functionality provided by MySQL?

David Harper
Cambridge, England
 

seth brundle

"Apache Lucene is a high-performance, full-featured text search engine
library written entirely in Java.
.... why not make use of the full-text search functionality provided by
MySQL?

I've been using MySQL fulltext for user web search on this dataset for about
4 years, but the requirements and dataset have outgrown MySQL's limited
search capabilties.
 
