Populating 'long' data column with Ruby OCI8 (trying again, need a solution badly)


Gennady Bystritsky

Hello,

Is there any way to insert a big chunk of data (say, 100K) into a column
of type long with Ruby OCI8 or by any other Ruby means? I saw that to do
it in C you must be prepared to handle an OCI error code inviting you to
insert another piece. But how do I do it in Ruby, especially with OCI8? If
I do:

require 'oci8'

db = OCI8.new 'system', 'manager'
db.exec("create table sample (id number, data long)")

data = 'A' * 1024 * 100

c = db.parse('insert into sample values (2, :data)')
c.bind_param(':data', data)

c.exec
db.commit

What ends up in column 'data' is host dependent (or DB block size
dependent?). I observed 14464 bytes on a 2K-block database on Solaris, and
34652 bytes on an 8K-block database on Linux.

UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). The reported error is:
`fetch': ORA-01406: fetched column value was truncated

Thank you,
Gennady Bystritsky.
 

Neil Kohl

Gennady Bystritsky said:
Is there any way to insert a big chunk of data (say, 100K) into
a column of type long with Ruby OCI8 or by any other Ruby means?

You need to do it in chunks. See the documentation for OCI8::BLOB#write().
Here's an example that works for me, loading data from a file. You'll
need to adapt it to load from a variable (see the sketch after the schema
below).

require 'oci8'

conn = OCI8.new(user, passwd, sid)

# test.txt is a file that's > 100K
name = "test.txt"

# create the row with an empty blob (id is the primary key, so it
# has to be supplied)
cursor = conn.parse("INSERT INTO nkrb_test (id, name, data) VALUES (1, :name, EMPTY_BLOB())")
cursor.exec(name)

# now load blob column with file contents
conn.exec("SELECT name, data FROM nkrb_test") do |name, data|
  chunk_size = data.chunk_size
  File.open(name, 'r') do |f|
    until f.eof?
      data.write(f.read(chunk_size))
    end
    data.size = f.pos   # trim the LOB to the number of bytes written
  end
end
conn.commit

Schema for nkrb_test is:

create table nkrb_test (
  id INT,
  name VARCHAR(255),
  data BLOB,
  CONSTRAINT nkrb_test_pk PRIMARY KEY (id)
);
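
For loading from a variable instead of a file, here's an untested sketch
along the same lines (the id value and the 'in-memory' name are
placeholders I made up):

require 'oci8'

conn = OCI8.new(user, passwd, sid)

payload = 'A' * 1024 * 100   # the 100K string from the original post

# create the row with an empty blob, then fill it chunk by chunk
conn.exec("INSERT INTO nkrb_test (id, name, data) VALUES (2, 'in-memory', EMPTY_BLOB())")

conn.exec("SELECT id, data FROM nkrb_test WHERE id = 2") do |id, lob|
  chunk = lob.chunk_size
  pos = 0
  while pos < payload.length
    lob.write(payload[pos, chunk])   # one chunk_size piece at a time
    pos += chunk
  end
  lob.size = payload.length          # trim the LOB to the exact length
end
conn.commit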
Gennady Bystritsky said:
UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). The reported error is:
`fetch': ORA-01406: fetched column value was truncated

Try calling OCI8::BLOB#truncate() with a length longer than the longest
data you expect to see in a row, or use OCI8::BLOB#read() to read row
data in chunks -- again, see the docs for an example.
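
Untested, but a chunked read along those lines would look something like
this (assuming #read advances the LOB position, IO-style):

require 'oci8'

conn = OCI8.new(user, passwd, sid)

conn.exec("SELECT name, data FROM nkrb_test") do |name, lob|
  chunk = lob.chunk_size
  buf = ''
  bytes_left = lob.size
  while bytes_left > 0
    piece = lob.read(chunk)     # read one chunk_size piece
    buf << piece
    bytes_left -= piece.length
  end
  puts "#{name}: #{buf.length} bytes"
end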

If you haven't figured it out already, LOBs are a real pain to deal with.
 

KUBO Takehiro

Gennady Bystritsky said:
Hello,

Is there any way to insert a big chunk of data (say, 100K) into a column
of type long with Ruby OCI8 or by any other Ruby means? I saw that to do
it in C you must be prepared to handle an OCI error code inviting you to
insert another piece. But how do I do it in Ruby, especially with OCI8? If
I do:

require 'oci8'

db = OCI8.new 'system', 'manager'
db.exec("create table sample (id number, data long)")

data = 'A' * 1024 * 100

c = db.parse('insert into sample values (2, :data)')
c.bind_param(':data', data)

c.exec
db.commit

What ends up in column 'data' is host dependent (or DB block size
dependent?). I observed 14464 bytes on a 2K-block database on Solaris, and
34652 bytes on an 8K-block database on Linux.

I don't know why. It may need piecewise inserts to handle long values.

I've not tested the following patch. It may work, but may not...
--- oci8.rb.bak 2005-11-13 16:21:33.000000000 +0900
+++ oci8.rb 2006-01-19 03:07:00.000000000 +0900
@@ -264,6 +264,14 @@
end
end

+ # get/set LONG
+ LONG = Object.new
+ class << LONG
+ def fix_type(env, val, length, precision, scale)
+ [OCI8::SQLT_LNG, val, length || (val.nil? ? nil : val.length)]
+ end
+ end
+
# get/set RAW
RAW = Object.new
class << RAW
@@ -867,7 +875,7 @@
# datatype type size prec scale
# -------------------------------------------------
# LONG SQLT_LNG 0 0 0
- BindType::Mapping[OCI8::SQLT_LNG] = BindType::String
+ BindType::Mapping[OCI8::SQLT_LNG] = BindType::LONG

# datatype type size prec scale
# -------------------------------------------------

UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). The reported error is:
`fetch': ORA-01406: fetched column value was truncated

How about editing oci8.rb at line 693 as follows:

from:
  when SQLT_LNG, SQLT_LBI
    # TODO: use OCI_DYNAMIC_FETCH
    datasize = 65535
  end
to:
  when SQLT_LNG, SQLT_LBI
    # TODO: use OCI_DYNAMIC_FETCH
    datasize = 100 * 1024
  end
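
If that edit works as intended, re-running the fetch from the original
post should return the full 100K value instead of raising ORA-01406; a
quick untested check:

require 'oci8'

db = OCI8.new 'system', 'manager'

# with datasize raised to 100 * 1024, the 100K long value should
# come back whole
db.exec('select id, data from sample') do |id, data|
  puts "row #{id}: #{data.length} bytes"
end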
 
