Gennady Bystritsky
Hello,
Is there any way to insert a big chunk of data (say, 100K) into a column
of type long with Ruby OCI8 or by any other Ruby means? I saw that to do
it in C you must be prepared to handle OCI error code inviting you to
insert another piece. But how to do it in Ruby, especially with OCI8? If
I do
require 'oci8'
db = OCI8.new('system', 'manager')
db.exec("create table sample (id number, data long)")
data = 'A' * 1024 * 100
c = db.parse('insert into sample values (2, :data)')
c.bind_param(':data', data)
c.exec
db.commit
What I end up with in column 'data' is host dependent (or db block size
dependent?). I observed 14464 bytes on a 2K-block database on Solaris,
and 34652 bytes on an 8K-block database on Linux.
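For what it's worth, the piecewise-insert loop that the C OCI docs describe does not seem to be exposed by OCI8 itself, but the chunking half of it is plain Ruby. A minimal sketch (the 32K chunk size is an arbitrary assumption, and `split_into_chunks` is just a local helper, not an OCI8 call):

```ruby
# Split a large value into pieces, the way the C OCI "data at exec"
# loop would feed them one OCI_NEED_DATA round at a time. Only the
# chunking is shown; the piecewise send is not available from OCI8.
def split_into_chunks(data, chunk_size)
  chunks = []
  offset = 0
  while offset < data.size
    chunks << data[offset, chunk_size]
    offset += chunk_size
  end
  chunks
end

data = 'A' * 1024 * 100                  # 102400 bytes
chunks = split_into_chunks(data, 32 * 1024)
# 4 pieces: 32768, 32768, 32768, 4096 bytes
```

Whether anything below the Ruby layer will accept those pieces one at a time is exactly the open question above.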
UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). Reported error is:
`fetch': ORA-01406: fetched column value was truncated
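On the read side, later ruby-oci8 releases document a per-connection buffer limit for fetching LONG columns (`OCI8#long_read_len=`), which is what ORA-01406 points at. A hedged sketch, untested against a real database here, with the accessor name assumed from the ruby-oci8 docs (it may not exist in older builds):

```ruby
# Hypothetical fetch of a large LONG value. The long_read_len=
# accessor is assumed from the ruby-oci8 documentation; the table
# and column names match the example above.
def fetch_long(user, pass, id, max_len = 1024 * 1024)
  require 'oci8'
  db = OCI8.new(user, pass)
  db.long_read_len = max_len   # raise the LONG fetch buffer above the default
  cur = db.exec('select data from sample where id = :1', id)
  row = cur.fetch
  db.logoff
  row && row[0]
end
```

If that accessor is not available, switching the column to CLOB (LONG has long been discouraged by Oracle) sidesteps both the insert and the fetch limits.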
Thank you,
Gennady Bystritsky.