Batch Insert with duplicate key


Felix Roberto

Does anyone know how to do a prepared statement batch insert so that, when
you get a PK exception, it keeps going with the rest of the batch instead
of breaking?

This is a simplified example of what I do. The problem is that as many as
1500 records can be generated, so how can I insert them into the db
without much performance sacrifice, and guarantee that if a record already
exists it just keeps going with the rest of the group?

PreparedStatement pstm = con.prepareStatement(
    "INSERT INTO " + TABLE + " (" + PK_1 + "," + PK_2 + ") VALUES (?,?)");

for (int x = 0; x < record1.length; x++) {
    for (int y = 0; y < record2.length; y++) {
        pstm.setString(1, record1[x]);
        pstm.setString(2, record2[y]);
        pstm.addBatch();
    }
}
pstm.executeBatch();
 

Furious George

Felix said:
> Anyone know how to do a prepared statement batch insert,

As you have it in the code you posted.

> and when you get a pk exception, instead of breaking the batch, keep
> going with the rest?

One way would be like this:

CREATE TEMPORARY TABLE t1 ( ... same as the permanent table but without
the primary key ... );
batch insert into the temporary table;
execute the batch;
INSERT INTO permanentTable SELECT * FROM t1 WHERE primary key logic;
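A rough JDBC sketch of that idea, reusing the variables from your snippet.
The staging table name T1_STAGE and the NOT EXISTS filter are just
assumptions about what the "primary key logic" might look like:

// Sketch only: assumes a staging table T1_STAGE with the same columns as
// the target table but no primary key constraint, and the same
// con/TABLE/PK_1/PK_2/record1/record2 as in the snippet above.
PreparedStatement stage = con.prepareStatement(
    "INSERT INTO T1_STAGE (" + PK_1 + "," + PK_2 + ") VALUES (?,?)");
for (int x = 0; x < record1.length; x++) {
    for (int y = 0; y < record2.length; y++) {
        stage.setString(1, record1[x]);
        stage.setString(2, record2[y]);
        stage.addBatch();
    }
}
stage.executeBatch();

// Copy across only the rows whose keys are not already in the permanent
// table; DISTINCT also drops duplicates within the generated data itself.
Statement copy = con.createStatement();
copy.executeUpdate(
    "INSERT INTO " + TABLE + " (" + PK_1 + "," + PK_2 + ") " +
    "SELECT DISTINCT s." + PK_1 + ", s." + PK_2 + " FROM T1_STAGE s " +
    "WHERE NOT EXISTS (SELECT 1 FROM " + TABLE + " t WHERE t." + PK_1 +
    " = s." + PK_1 + " AND t." + PK_2 + " = s." + PK_2 + ")");

Afterwards you would clear or drop the staging table so the next block of
generated data starts fresh.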
> this is a simplified example of what I do. The problem is I can get as
> many as 1500 records that are generated, so how can I insert them into
> the db without much performance sacrifice?

Maybe your db has a better way. For example, MySQL has a LOAD DATA
statement that is useful for loading large amounts of data.

> and guarantee that if a record already exists it just keeps going with
> the rest of the group.

You would really need to read the documentation for your particular
database to figure out the optimal solution.
 

Felix Roberto

Right now we are using Oracle 10g, but the problem is that the data is not
to be loaded, it is generated. It is calculated data that is generated in
one whole block, up to 10k records, and there is a big chance for the
primary key to be duplicated. So what I was trying to do was to execute
the batch, have it keep executing on error, and give me all the errors (or
something like that) at the end.

Or is there any other solution that I can implement using JDBC?
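For what it's worth, plain JDBC does report per-statement results through
BatchUpdateException, although whether the driver keeps processing the
rest of the batch after a failure is driver-dependent (the spec allows
either behaviour). A sketch of how the error reporting side might look,
using the pstm from the earlier snippet:

try {
    pstm.executeBatch();
} catch (BatchUpdateException e) {
    // If the driver continued past failures, counts has one entry per
    // batched insert; entries equal to Statement.EXECUTE_FAILED mark the
    // ones that failed (likely the duplicate keys). If the driver stopped
    // at the first failure, counts only covers the statements it ran.
    int[] counts = e.getUpdateCounts();
    for (int i = 0; i < counts.length; i++) {
        if (counts[i] == Statement.EXECUTE_FAILED) {
            System.err.println("insert #" + i + " failed");
        }
    }
    // The chained SQLExceptions describe each individual failure.
    for (SQLException next = e.getNextException(); next != null;
         next = next.getNextException()) {
        System.err.println(next.getMessage());
    }
}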

 

andrewmcdonagh


So your Java app is generating 10k records, some of whose primary keys
might already have been used?

If so...

Does the Java app really need to create the keys? Can't you use
auto-generated ones from Oracle?

If your app does need to create them, can it also keep track of which keys
it has already used before sending them to the db?

If it can't, then I'd suggest calling a stored procedure with the 10k
entries and having it process them. This would be much faster than the
Java app trying to do multiple small batches or single inserts. Take a
look at 'setArray()' from the JDBC classes, or, if Oracle is the only db,
look at their 'setARRAY()' version.


Andrew
NOTE: the stored procedure can also be written in Java - but it won't run
as fast as a normal SQL stored proc. It should, however, run as fast as a
PL/SQL stored proc.
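A rough sketch of the array-bound stored procedure call Andrew mentions,
using the Oracle-specific classes shipped with the 10g driver. The
collection type VARCHAR_TAB and the procedure INSERT_PAIRS are made-up
names, not anything defined in this thread; the procedure itself would be
expected to skip or log the duplicate keys on the database side:

import java.sql.CallableStatement;
import java.sql.Connection;
import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;

public class BulkInsertCall {
    // Assumes something like this exists on the database side:
    //   CREATE TYPE VARCHAR_TAB AS TABLE OF VARCHAR2(100);
    //   CREATE PROCEDURE INSERT_PAIRS(p_k1 VARCHAR_TAB, p_k2 VARCHAR_TAB) ...
    static void insertPairs(Connection con, String[] keys1, String[] keys2)
            throws Exception {
        // Bind the two key columns as Oracle collections in one round trip.
        ArrayDescriptor desc = ArrayDescriptor.createDescriptor("VARCHAR_TAB", con);
        ARRAY a1 = new ARRAY(desc, con, keys1);
        ARRAY a2 = new ARRAY(desc, con, keys2);

        CallableStatement cs = con.prepareCall("{ call INSERT_PAIRS(?, ?) }");
        cs.setArray(1, a1);   // oracle.sql.ARRAY implements java.sql.Array
        cs.setArray(2, a2);
        cs.execute();
        cs.close();
    }
}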
 
