Raphael Bauduin
Hi,
We have to migrate an MSSQL database to Postgres and plan to do it with the
help of Ruby. As we didn't find a DBI module to access an MSSQL database
directly, we installed SQL Relay and access it through the corresponding DBD
module.
Performance on select queries returning many rows is not great: executing
the statement takes 5 minutes when retrieving 10000 rows.
Is there something wrong in our code? (we're new Ruby converts ;-)
Can we boost performance with some configuration? From our search on the
net, there's no known issue with the speed of DBI, so we hope we can correct
our problem.
Thanks.
Raph
Here's the code:
####################### CODE ############################################
require 'dbi'

db = DBI.connect("DBI:SQLRelay:host=blah;port=9000;socket=", "user", "pass")

# Prepare the statement to send to the MSSQL server.
stmt = db.prepare("select top 10000 * from customers")

# The next line takes 5 minutes when the query selects 10000 rows.
stmt.execute()

# Print the first row; to_s guards against non-String column values.
print "the first row:\n"
for i in stmt.fetch()
  print i.to_s + ",\t"
end
print "\n\n"

# Print the next three rows.
print "the next three rows:\n"
for i in stmt.fetch_many(3)
  for j in i
    print j.to_s + ",\t"
  end
  print "\n"
end

stmt.finish
db.disconnect
################## END OF CODE ##########################################
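One way to narrow the problem down is to time each phase separately: if `execute` alone accounts for the 5 minutes, the bottleneck is on the SQL Relay / MSSQL side rather than in Ruby DBI, whereas a slow fetch loop points at row transfer. A minimal sketch using the stdlib `Benchmark` module (the `sleep` is a placeholder standing in for the real `stmt.execute()` call, so the sketch runs without a database):

```ruby
require 'benchmark'

# Time one phase of the work; in the real script this block would wrap
# stmt.execute() and, separately, the fetch loop.
elapsed = Benchmark.realtime do
  sleep 0.01  # placeholder for stmt.execute(), so the sketch is self-contained
end

puts format("phase took %.3f s", elapsed)
```

Comparing the two timings tells you whether the delay is in running the query or in transferring rows; SQL Relay's client configuration may also let you tune result-set buffering, which can matter for large transfers.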