buffer them and insert them in a transaction 1000 at a time. Even with
ruby this should be a piece of cake.
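A minimal sketch of that batching idea in plain Ruby (the `messages` table and the SQL shape are hypothetical placeholders; any DB library would supply the actual `execute`/`transaction` calls): `each_slice` buffers the rows into chunks of 1000, and each chunk becomes a single multi-row INSERT that would run inside one transaction.

```ruby
# Sketch: buffer rows and build one multi-row INSERT per 1000-row batch.
# The "messages" table is a placeholder; in real code each statement
# would be executed inside db.transaction { ... } with your DB library.
BATCH_SIZE = 1000

def batched_insert_sql(rows, batch_size = BATCH_SIZE)
  rows.each_slice(batch_size).map do |batch|
    values = batch.map { |r| "(#{r})" }.join(", ")
    "INSERT INTO messages(content) VALUES #{values};"
  end
end

rows = Array.new(10_000) { rand.to_s }
statements = batched_insert_sql(rows)
# 10_000 rows at 1000 per batch -> 10 INSERT statements.
```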
Do any of the ruby db libraries offer support for doing this efficiently?
martin
pretty much all of them
[...]
cfp:~/rails_root > ./script/runner a.rb
using sqlite3
elapsed: 0.222311019897461
count: 10000
using ar
elapsed: 7.75591206550598
count: 10000
0.2 seconds for 10000 records seems plenty fast to me. 7 seconds, not so
much.
If your standard of performance is 10,000 records inserted in a minute, any
database should be able to satisfy your requirements.
And here's the amalgalite version of ara's test... embedded sqlite in a ruby
extension.
% cat am_inserts.rb
#!/usr/bin/env ruby
require 'rubygems'
require 'fileutils'
require 'amalgalite'

size     = Integer(ARGV.shift || 10_000)
messages = Array.new(size).map { rand.to_s }

Db = "speed-test.db"
FileUtils.rm_f Db if File.exist?( Db )

db = Amalgalite::Database.new( Db )
db.execute(" CREATE TABLE messages(content); ")

before = Time.now.to_f
db.transaction do |db_in_trans|
  messages.each do |m|
    db_in_trans.execute("insert into messages(content) values( #{m} )")
  end
end
after   = Time.now.to_f

elapsed = after - before
mps     = size / elapsed
puts "#{"%0.2f" % elapsed} seconds to insert #{size} records at #{"%0.2f" % mps} records per second"
% ruby am_inserts.rb
0.38 seconds to insert 10000 records at 25999.01 records per second
% ruby am_inserts.rb 100000
3.80 seconds to insert 100000 records at 26344.71 records per second
enjoy,
-jeremy