SQLBulkCopy memory issue question

SteveB

Hi All,

Using SqlBulkCopy to import big CSV files consumes almost all the memory on
the computer. While googling for a fix I found a post describing how to
resolve this, but no code was included. The author said he flushed his
temporary working tables every 100K records to avoid filling up memory.

If anyone has dealt with this, please post the code.

Thanks,

Steve
 
George

Here is how I did it. There are 2 options:

1. If the file resides on the same computer as SQL Server, run the 'BULK
INSERT' command through ADO (rough sketch below).
2. If the file is not on the same computer, run 'bcp.exe', which comes with
the SQL Server client tools.
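A rough sketch of option 1, assuming a hypothetical staging table dbo.Staging
and a CSV path that is visible to the SQL Server machine; the BATCHSIZE option
makes the server commit every 100K rows instead of loading the whole file in
one transaction:

using System.Data.SqlClient;   // Microsoft.Data.SqlClient on newer projects

class BulkInsertSketch
{
    static void Main()
    {
        // Hypothetical connection string, table and file path -- adjust to your setup.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true";

        // BATCHSIZE makes SQL Server commit every 100,000 rows instead of
        // holding the whole file in a single transaction.
        const string sql = @"
            BULK INSERT dbo.Staging
            FROM 'C:\data\bigfile.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n',
                  FIRSTROW = 2, BATCHSIZE = 100000);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0;   // large loads can run for a long time
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}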

George.
 
SteveB

Hi George,

I have many CSV files with 24 million records each, and the import is fully
automated, loading one file after another; working with .NET works great for
me. I need to find a way to avoid filling up memory when dealing with big
files.

How do I flush out the temp working tables, say, every 100K records processed?

Please advise.

Regards,

Steve
 
Paul Shapiro

Look up the batch size parameter in SQL Server Books Online for both of
those commands.
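
The same knob exists on the .NET side that Steve is using: SqlBulkCopy.BatchSize
(BATCHSIZE for BULK INSERT, -b for bcp.exe). A minimal sketch of that idea,
assuming a hypothetical two-column staging table dbo.Staging and plain
comma-separated fields with no quoting; it buffers 100K rows in a DataTable,
writes them, and clears the buffer so client memory stays roughly constant:

using System.Data;
using System.Data.SqlClient;   // Microsoft.Data.SqlClient on newer projects
using System.IO;

class SqlBulkCopyBatched
{
    const int BatchSize = 100000;

    static void Main()
    {
        // Hypothetical connection string, table, columns and path -- adjust to your schema.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            var bulkCopy = new SqlBulkCopy(connection)
            {
                DestinationTableName = "dbo.Staging",
                BatchSize = BatchSize,    // server commits every 100K rows
                BulkCopyTimeout = 0
            };

            // Buffer holds at most one batch; it is cleared after every write,
            // so memory use does not grow with the size of the file.
            var buffer = new DataTable();
            buffer.Columns.Add("Col1", typeof(string));
            buffer.Columns.Add("Col2", typeof(string));

            using (var reader = new StreamReader(@"C:\data\bigfile.csv"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Naive split; real imports should use a CSV parser that handles quoting.
                    var fields = line.Split(',');
                    buffer.Rows.Add(fields[0], fields[1]);

                    if (buffer.Rows.Count == BatchSize)
                    {
                        bulkCopy.WriteToServer(buffer);
                        buffer.Rows.Clear();   // flush the working rows every 100K records
                    }
                }
            }

            if (buffer.Rows.Count > 0)
                bulkCopy.WriteToServer(buffer);   // remaining partial batch

            bulkCopy.Close();
        }
    }
}

On .NET 4.5 and later, SqlBulkCopy.EnableStreaming combined with an IDataReader
over the file avoids the intermediate DataTable entirely, but the batching idea
is the same.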
 
