please confirm something (jdbc)


stc

I've been googling for hours now and what I've found out is that there's
just no way to use the same code for handling BLOBs in Oracle and other
databases. Am I wrong?

All examples I've come across suggest that one should cast
java.sql.ResultSet to oracle.jdbc.driver.OracleResultSet when retrieving
oracle.sql.BLOB objects and using streams (or put/get bytes) to read from or
write to BLOBs.

There are methods in java.sql.PreparedStatement and java.sql.ResultSet
(get/set blob, get/set bytes, get/set binary stream) but they're not working
for me.

I'm using Oracle 9i database and ojdbc14.jar that contains Oracle JDBC
driver for Java 1.4 and beyond. So technically I could use Oracle JDBC
classes and write database-specific code but I would like to write code that
would work OK if I ever switch to DB2 or PostgreSQL.
 

Thomas Kellerer

stc wrote on 28.11.2007 20:03:
I've been googling for hours now and what I've found out is that there's
just no way to use the same code for handling BLOBs in Oracle and other
databases. Am I wrong?

All examples I've come across suggest that one should cast
java.sql.ResultSet to oracle.jdbc.driver.OracleResultSet when retrieving
oracle.sql.BLOB objects and using streams (or put/get bytes) to read from or
write to BLOBs.

There are methods in java.sql.PreparedStatement and java.sql.ResultSet
(get/set blob, get/set bytes, get/set binary stream) but they're not working
for me.

I'm using Oracle 9i database and ojdbc14.jar that contains Oracle JDBC
driver for Java 1.4 and beyond. So technically I could use Oracle JDBC
classes and write database-specific code but I would like to write code that
would work OK if I ever switch to DB2 or PostgreSQL.

I'm using the getBinaryStream() and setBinaryStream() methods without problems
across several databases (Oracle, Postgres, SQL Server, MySQL, ...)

For Oracle you should use a recent 10.x driver. They work very well with a 9.x
server and the support for BLOBs and CLOBs is a lot better in there.

The only drawback with the 10.x driver is that (I think) it is only available
for Java 1.5 and above, so if you are stuck with 1.4 you are probably out of
luck. But as 1.4 is de-supported soon, you should think about upgrading anyway.

Oracle's download page seems to be down right now, but do try the 10.x (or even
the 11.x) driver:

http://otn.oracle.com/software/tech/java/sqlj_jdbc/content.html

Thomas
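The portable pattern Thomas describes boils down to draining the stream returned by getBinaryStream() with a fixed-size buffer. A minimal sketch of such a helper, using only plain java.io (the sample data in main() is an arbitrary stand-in, not from the thread):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class BlobStreams {

    // Drain an InputStream (e.g. the one returned by
    // ResultSet.getBinaryStream()) into a byte array using a
    // fixed-size buffer. Never size the buffer from available().
    public static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "pretend this is a PDF".getBytes(StandardCharsets.UTF_8);
        byte[] copy = readFully(new ByteArrayInputStream(data));
        System.out.println(copy.length == data.length); // true
    }
}
```

In JDBC code this would be called as `readFully(rs.getBinaryStream("content"))`; the column name `content` is just the one used elsewhere in this thread.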
 

Mark Jeffcoat

stc said:
I've been googling for hours now and what I've found out is that there's
just no way to use the same code for handling BLOBs in Oracle and other
databases. Am I wrong?

All examples I've come across suggest that one should cast
java.sql.ResultSet to oracle.jdbc.driver.OracleResultSet when retrieving
oracle.sql.BLOB objects and using streams (or put/get bytes) to read from or
write to BLOBs.

I haven't had to do anything like that -- but I'm not supporting
anything older than 10g.

I have run into one Oracle-specific limitation, in that a BLOB,
if present, must be at the end of your PreparedStatement parameter
list. See

http://forums.oracle.com/forums/thread.jspa?threadID=415560

for more. That still didn't require any Oracle-specific code in
the end, since none of the other supported systems cared about
the parameter order.
 

stc

Thomas Kellerer said:
I'm using the getBinaryStream() and setBinaryStream() methods without
problems across several databases (Oracle, Postgres, SQL Server, MySQL,
...)

For Oracle you should use a recent 10.x driver. They work very well with
a 9.x server and the support for BLOBs and CLOBs is a lot better in
there.

The only drawback with the 10.x driver is that (I think) it is only
available for Java 1.5 and above, so if you are stuck with 1.4 you are
probably out of luck. But as 1.4 is de-supported soon, you should think
about upgrading anyway.

Oracle's download page seems to be down right now, but do try the 10.x
(or even the 11.x) driver:

http://otn.oracle.com/software/tech/java/sqlj_jdbc/content.html

Thomas

This is great news! I'm using Java 1.5 so it won't be a problem. I'll
download the driver tomorrow and try with getBinaryStream() and
setBinaryStream() methods.

Thanks a lot...
 

stc

Mark Jeffcoat said:
I haven't had to do anything like that -- but I'm not supporting
anything older than 10g.

I have run into one Oracle-specific limitation, in that a BLOB,
if present, must be at the end of your PreparedStatement parameter
list. See

http://forums.oracle.com/forums/thread.jspa?threadID=415560

for more. That still didn't require any Oracle-specific code in
the end, since none of the other supported systems cared about
the parameter order.

I'll take that into consideration. Thanks again...
 

John Maline

Thomas said:
The only drawback with the 10.x driver is that (I think) it is only
available for Java 1.5 and above, so if you are stuck with 1.4 you are
probably out of luck. But as 1.4 is de-supported soon, you should think
about upgrading anyway.

I don't believe that's true. I recently was testing out Oracle's
10.2.0.3 JDBC driver with Java 1.3.1. There's a classes12.zip version
(for Java 1.2 and 1.3.x, I think).

It looks like Oracle is being very good about supporting old Java
versions. Probably by necessity, since some of us enterprise types get
stuck with infrastructure tied to specific (old) Java versions.
 

Thomas Kellerer

Mark Jeffcoat wrote on 28.11.2007 21:38:
I haven't had to do anything like that -- but I'm not supporting
anything older than 10g.

I have run into one Oracle-specific limitation, in that a BLOB,
if present, must be at the end of your PreparedStatement parameter
list. See

http://forums.oracle.com/forums/thread.jspa?threadID=415560

for more. That still didn't require any Oracle-specific code in
the end, since none of the other supported systems cared about
the parameter order.

That thread is about LONG columns, which are something different.
Due to the nature of how they are stored, this limitation is indeed
present for LONG columns but not for BLOB columns.
I have happily updated and retrieved BLOB columns with getBinaryStream() at any
position in my prepared statement (even more than one BLOB at a time).

Note that LONG has been deprecated since 8.1 (or somewhere near that version).

Thomas
 

Mark Jeffcoat

Thomas Kellerer said:
That thread speaks about LONG columns which is something different.
Due to the nature of how they are stored, this limitation is indeed
present for LONG columns but not for BLOB columns.
I have happily updated and retrieved BLOB columns with
getBinaryStream() at any position in my prepared statement (even more
than one BLOB at a time).

You're right, I'd mis-remembered. That thread doesn't mention
BLOBs. On the other hand, the reason I found it in the
first place is by searching on the error message I got
on a column declared BLOB ("ORA-24816: Expanded non LONG bind
data supplied after actual LONG or LOB column"), and re-ordering
did solve my issue.

So there's some connection here. BLOB is a special case of
LOB, maybe?

I am using ResultSet.getBytes() instead of getBinaryStream();
I wonder if that's what made the difference.
 

Thomas Kellerer

Mark Jeffcoat, 29.11.2007 00:58:
You're right, I'd mis-remembered. That thread doesn't mention
BLOBs. On the other hand, the reason I found it in the
first place is by searching on the error message I got
on a column declared BLOB ("ORA-24816: Expanded non LONG bind
data supplied after actual LONG or LOB column"), and re-ordering
did solve my issue.

Is there a LONG column in the ResultSet, maybe?

So there's some connection here. BLOB is a special case of
LOB, maybe?

I wouldn't call it "special". A BLOB *is* a LOB (by definition).

I am using ResultSet.getBytes() instead of getBinaryStream();
I wonder if that's what made the difference.

Could well be. When I tried to find a way to support LOB data across
different DBMS, getBinaryStream() (and setBinaryStream()) was the only
method that worked reliably (using an Oracle 10.x driver).

Thomas
 

stc

Thomas Kellerer said:
Could well be. When I tried to find a way to support LOB data across
different DBMS, getBinaryStream() (and set..) was the only method that
worked reliably (using an Oracle 10.x driver)

I've downloaded the driver but still can't make it work. Here's the code:

// write
conn.setAutoCommit(false); // because some data is stored in other tables as well
PreparedStatement pstmt = conn.prepareStatement("insert into test (id, content) values (?, ?)");
pstmt.setString(1, "12345");
pstmt.setBinaryStream(2, bais, bais.available()); // bais contains bytes read from the local PDF file
pstmt.executeUpdate();
pstmt.close();
conn.commit();
bais.close();

This works fine, that is, there are no exceptions, but is there a way to
check what is stored in the table using Oracle Enterprise Manager Console? I
got a warning that BLOB columns cannot be displayed. If I could only see
the length of bytes stored in the column, I could compare it to the size of
the PDF files locally on the disk.

And here's the code to read the BLOB:

// read
conn.setAutoCommit(true);
pstmt = conn.prepareStatement("select content from test where id = ? for update");
pstmt.setString(1, "12345");
ResultSet rs = pstmt.executeQuery();
if (rs.next()) {
    InputStream is = rs.getBinaryStream("content");
    int n;
    byte[] tmp = new byte[is.available()]; // available() returns 0!
    FileOutputStream fos = new FileOutputStream("C:/Temp/12345.pdf");
    while ((n = is.read(tmp)) != -1) {
        fos.write(tmp, 0, n);
    }
    fos.close();
    is.close();
}
rs.close();
pstmt.close();
conn.close();

The problem is that "available()" method returns 0 and "read()" method
blocks. What am I doing wrong?
 

stc

stc said:
I've downloaded the driver but still can't make it work. Here's the code:

// write
conn.setAutoCommit(false); // because some data is stored in other tables as well
PreparedStatement pstmt = conn.prepareStatement("insert into test (id, content) values (?, ?)");
pstmt.setString(1, "12345");
pstmt.setBinaryStream(2, bais, bais.available()); // bais contains bytes read from the local PDF file
pstmt.executeUpdate();
pstmt.close();
conn.commit();
bais.close();

This works fine, that is, there are no exceptions, but is there a way to
check what is stored in the table using Oracle Enterprise Manager Console?
I got a warning that BLOB columns cannot be displayed. If I could only
see the length of bytes stored in the column, I could compare it to the
size of the PDF files locally on the disk.

Query "select length(content) from test" returns the correct number of bytes so
I guess storing BLOBs is working fine. The problem is with the reading of
BLOBs...
 

stc

stc said:
// read
conn.setAutoCommit(true);
pstmt = conn.prepareStatement("select content from test where id = ? for update");
pstmt.setString(1, "12345");
ResultSet rs = pstmt.executeQuery();
if (rs.next()) {
    InputStream is = rs.getBinaryStream("content");
    int n;
    byte[] tmp = new byte[is.available()]; // available() returns 0!
    FileOutputStream fos = new FileOutputStream("C:/Temp/12345.pdf");
    while ((n = is.read(tmp)) != -1) {
        fos.write(tmp, 0, n);
    }
    fos.close();
    is.close();
}
rs.close();
pstmt.close();
conn.close();

The problem is that "available()" method returns 0 and "read()" method
blocks. What am I doing wrong?

I managed to read the BLOB successfully using "getBytes()" method from
ResultSet. Then I tried to store the BLOB using "setBytes()" and it worked. So
the summary is as follows:

- Oracle 9i, 10g JDBC driver, Java 1.5
- writing BLOBs - both "setBinaryStream()" and "setBytes()" work
- reading BLOBs - works only with "getBytes()"

I would really like to have "getBinaryStream()" working so if you could help
I'd appreciate it...
 

Thomas Kellerer

stc, 29.11.2007 15:35:
I managed to read the BLOB successfully using "getBytes()" method from
ResultSet. Then I tried to store BLOB using "setBytes()" and it worked. So
the summary is as follows:

- Oracle 9i, 10g JDBC driver, Java 1.5
- writing BLOBs - both "setBinaryStream()" and "setBytes()" work
- reading BLOBs - works only with "getBytes()"

I would really like to have "getBinaryStream()" working so if you could help
I'd appreciate it...

I'm not sure that getBytes() works for other JDBC drivers as well. I
think get/setBinaryStream() was the only reliable (cross-dbms) way that
I found.

What I do to read the BLOB (error handling removed, obviously):

InputStream in = rs.getBinaryStream(1);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];

int bytesRead = in.read(buffer);

while (bytesRead > -1)
{
    baos.write(buffer, 0, bytesRead);
    bytesRead = in.read(buffer);
}

in.close();
baos.close();

Now you can e.g. use baos.toByteArray() to access the raw data as a byte
array.

Note that using ByteArrayOutputStream.toByteArray() will create a copy
of its internal buffer so that will double the amount of memory that you
need. That's the reason I wrote my own dynamic byte array (where I can
access the byte array without copying it around).

Thomas
 

Lew

Thomas said:
Note that using ByteArrayOutputStream.toByteArray() will create a copy
of its internal buffer so that will double the amount of memory that you
need. That's the reason I wrote my own dynamic byte array (where I can
access the byte array without copying it around)

Using ByteArrayOutputStream.toByteArray() will also guarantee that "the valid
contents of the buffer have been copied into [the resultant array]", and that
the resultant array is the right size. I find these guarantees to be worth
the copy overhead for reasonable-sized objects (under a couple MB, as is
typical for, say, graphic images, one use case for BLOBs).

Given all the memory re-allocation that goes on inside the
ByteArrayOutputStream anyway, I'm not sure that one more allocation is all
that much extra overhead. If you do it right, the internal array of the
ByteArrayOutputStream becomes eligible for GC right away anyway, so the
"efficiency" you gain from avoiding that one copy might not be worth the
extra risk and effort of maintaining a custom class.

It depends on how large your BLOB is, naturally. I can see how a large enough
object would require striped processing and other tricks, but like most
optimization this strikes me as one of those "Don't do it 'til you have to,
and don't think you have to until you've measured" scenarios.
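Lew's trade-off can be seen directly: toByteArray() returns a right-sized defensive copy, independent of the (usually larger) internal buffer. A small illustration:

```java
import java.io.ByteArrayOutputStream;

public class ToByteArrayCopy {
    public static void main(String[] args) {
        // Internal buffer starts at 64 bytes, much larger than the content.
        ByteArrayOutputStream baos = new ByteArrayOutputStream(64);
        baos.write(1);
        baos.write(2);
        baos.write(3);

        byte[] result = baos.toByteArray();
        System.out.println(result.length);         // 3 - trimmed to the valid content, not 64

        result[0] = 42;                            // mutating the returned copy...
        System.out.println(baos.toByteArray()[0]); // 1 - ...does not touch the stream's buffer
    }
}
```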
 

Martin Gregorie

stc said:
Query "select length(content) from test" returns correct number of bytes so
I guess storing BLOBs is working fine. The problem is with the reading of
BLOBs...

I think this might be a general problem with some InputStream instances
when they are asked to process byte streams. I'm being tentative, because
I found something that looks like the same problem last night, diagnosed
it just now, but haven't tried fixing it yet.

I'm using J2SE 1.4 at present because I want to get this application
running and stable before moving to Java 6.

In my case I'm reading from the InputStream returned by the JavaMail
Message.getContent() method. It returns a PipedInputStream instance
which should transfer the contents of a byte array. I plug this directly
into PreparedStatement.setBinaryStream() to write the message content
into a bytea (binary equivalent of text or varchar) in a Postgres
database, having got the field size from InputStream.available().

As part of my diagnostic code, if debugging is on I execute the
following before loading data into the database:
- I call Message.getContent() and read the stream a byte at a time
- convert each byte to a char and append it to a StringBuffer.
- the StringBuffer is displayed by writing it to System.err.

The diagnostic display is ALWAYS a complete version of the message
content, but the database field content varies depending on whether the
message content is MIME or non-MIME:
- non-MIME (plain-text UTF-8 or 7-bit): the database field is always
correct.

- MIME: the database field is sometimes empty and sometimes incomplete.

It looks to me as if, for a PipedInputStream, the value returned by
available() can be invalidated by the content of the byte array but that
read() will always correctly recognise the end of the byte array.

I'm about to modify my code to check this assumption and will let you
know the result.
 

Ramon F Herrera

I've been googling for hours now and what I've found out is that there's
just no way to use the same code for handling BLOBs in Oracle and other
databases. Am I wrong?

All examples I've come across suggest that one should cast
java.sql.ResultSet to oracle.jdbc.driver.OracleResultSet when retrieving
oracle.sql.BLOB objects and using streams (or put/get bytes) to read from or
write to BLOBs.

There are methods in java.sql.PreparedStatement and java.sql.ResultSet
(get/set blob, get/set bytes, get/set binary stream) but they're not working
for me.

I'm using Oracle 9i database and ojdbc14.jar that contains Oracle JDBC
driver for Java 1.4 and beyond. So technically I could use Oracle JDBC
classes and write database-specific code but I would like to write code that
would work OK if I ever switch to DB2 or PostgreSQL.

I tried JDBC and gave up. It's too darn slow for any non-trivial task.
All those layers and generality affect the performance. The same can
be said about ODBC.

-Ramon (a former JDBC user)
 

Thomas Kellerer

Ramon F Herrera, 29.11.2007 17:39:
I tried JDBC and gave up. It's too darn slow for any non-trivial task.

I cannot confirm that. Importing data using JDBC (and batched
statements) is nearly as fast as using SQL*Loader to load the same data.

Thomas
 

Steve Wampler

Ramon said:
I tried JDBC and gave up. It's too darn slow for any non-trivial task.
All those layers and generality affect the performance. The same can
be said about ODBC.

I'm curious - what are you using instead? (Or did you move off of Java
at the same time?)
 

Thomas Kellerer

stc, 29.11.2007 10:23:
// read
conn.setAutoCommit(true);
pstmt = conn.prepareStatement("select content from test where id = ? for update");
pstmt.setString(1, "12345");
ResultSet rs = pstmt.executeQuery();
if (rs.next()) {
    InputStream is = rs.getBinaryStream("content");
    int n;
    byte[] tmp = new byte[is.available()]; // available() returns 0!
    FileOutputStream fos = new FileOutputStream("C:/Temp/12345.pdf");
    while ((n = is.read(tmp)) != -1) {
        fos.write(tmp, 0, n);
    }
    fos.close();
    is.close();
}
rs.close();
pstmt.close();
conn.close();

The problem is that "available()" method returns 0 and "read()" method
blocks. What am I doing wrong?

available() cannot be used to retrieve the total number of bytes in an
InputStream.

From the Javadocs:

"Returns the number of bytes that can be read (or skipped over) from
this input stream without blocking by the next caller of a method for
this input stream"

It does not claim to return the length of the underlying stream.
Actually I doubt that it even knows the size of the "source".

You should create your tmp buffer with a fixed size (e.g. new
byte[1024]); then it should work. I'm not sure I understand what you
mean by "read() blocks", but that could well be caused by your
zero-length buffer: read(byte[]) returns 0 immediately when the array
has length zero, so your loop never terminates.

Thomas
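Thomas's point can be demonstrated without a database: the InputStream contract allows available() to return 0 while plenty of data is still readable, so sizing a buffer from it is unsafe. A sketch (the FilterInputStream below merely simulates such a driver stream; it is not Oracle's implementation):

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class AvailableIsNotLength {
    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000];

        // Simulate a driver stream whose available() reports 0 even
        // though data can still be read -- permitted by the contract.
        InputStream in = new FilterInputStream(new ByteArrayInputStream(data)) {
            @Override public int available() { return 0; }
        };

        // "new byte[in.available()]" would create an empty buffer here,
        // and read() on a zero-length array returns 0 forever.
        System.out.println(in.available()); // 0

        // A fixed-size buffer reads the full content regardless:
        byte[] buffer = new byte[1024];
        int n, total = 0;
        while ((n = in.read(buffer)) != -1) {
            total += n;
        }
        System.out.println(total); // 10000
    }
}
```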
 

Ramon F Herrera

I'm curious - what are you using instead? (Or did you move off of Java
at the same time?)



No, I didn't move away from Java. Never will. The thing is that over
the years I have been learning the (not too many!) aspects in which a
100% Java solution is not appropriate.

My solution, which works quite well (the speed improved by orders of
magnitude), is to write the server side in Oracle OCI (it is available
for C and C++), and the Java client makes a remote Unix request through
the iNetFactory library from these folks:

http://www.jscape.com

-Ramon
 
