T
Tim Haynes
Vincent Le-Texier said: If the query is:
my $str_trans = "INSERT INTO transcript (id,embl_acc) VALUES (?,?)";
my $s_trans = $dbh->prepare($str_trans);
For each entry I want to load into the transcript table, I do:
$s_trans->bind_param(1,$id);
$s_trans->bind_param(2,$embl_acc);
and $s_trans->execute;
OK. At least you're not preparing it as well as executing it every time.
I would like to know whether there are objects and methods in Perl DBI to
load in blocks of statements (execute one statement every 3000 entries, for
example) instead of one per entry.
Two thoughts:
a) do you have any indexes or primary keys on the table while you're
inserting all this stuff? You don't want the hold-up of maintaining the
index every time, so only create such things after all the data's in place;
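A minimal sketch of point (a), assuming a hypothetical index name
(idx_transcript_embl) and the transcript table from the original question --
substitute your own schema objects and DSN:

```perl
use DBI;

# DSN and credentials are placeholders; RaiseError makes DBI die on errors.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "", "", { RaiseError => 1 });

# Drop the index before the bulk load so it isn't maintained row by row...
$dbh->do("DROP INDEX idx_transcript_embl");

# ... perform the bulk insert here ...

# ...then rebuild it once, after all the data is in place.
$dbh->do("CREATE INDEX idx_transcript_embl ON transcript (embl_acc)");

$dbh->disconnect;
```
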
b) sure you can batch things up, with transactions. Turn off AutoCommit
behaviour, e.g. like:
| $dbh=DBI->connect("dbi:[stuff]", "", "", {AutoCommit => 0})
and then in your main loop, maintain a counter of number of rows and every
few hundred, do a commit. Fill in the blanks in the following:
| $rowcount = 0;
| $sth = $dbh->prepare (some_insert_statement);
|
| while (looping_over_input_values) {
|     $rc = $sth->bind_param(1, something);
|     $rc = $sth->bind_param(2, somethingelse);
|     $rc = $sth->execute;
|
|     $rowcount++;
|     if (!($rowcount % 100)) {
|         $dbh->commit;
|     }
| }
| $dbh->commit; # don't forget this after they're all in
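Filling in the blanks, a self-contained sketch might look like the following;
the DSN, the tab-separated input on STDIN, and the batch size of 100 are all
assumptions -- adjust for your own driver, input format, and workload:

```perl
use strict;
use warnings;
use DBI;

# AutoCommit => 0 starts an implicit transaction; RaiseError dies on failure.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "", "",
                       { AutoCommit => 0, RaiseError => 1 });

my $sth = $dbh->prepare("INSERT INTO transcript (id, embl_acc) VALUES (?, ?)");

my $rowcount = 0;
while (my $line = <STDIN>) {              # or however you read your input
    chomp $line;
    my ($id, $embl_acc) = split /\t/, $line;
    $sth->bind_param(1, $id);
    $sth->bind_param(2, $embl_acc);
    $sth->execute;

    $rowcount++;
    $dbh->commit unless $rowcount % 100;  # commit every 100 rows
}
$dbh->commit;        # pick up the final partial batch
$dbh->disconnect;
```

Note that $sth->execute($id, $embl_acc) would also work here and replaces the
two bind_param calls, which keeps the loop body shorter.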
HTH,
~Tim
--
Product Development Consultant
OpenLink Software
Tel: +44 (0) 20 8681 7701
Web: <http://www.openlinksw.com>
Universal Data Access & Data Integration Technology Providers