Best way of inserting 700,000 records

Discussion in 'ASP .Net' started by Morten Snedker, Dec 12, 2008.

  1. From a CSV file, 700,000+ records are imported with a simple INSERT INTO.

    The import is started by requesting an ASPX page.

    I'm using the code below. I gather the inserts 1,000 at a time and then
    execute the batch, to avoid 700,000 round trips to the database.

    However, I run into two types of errors:

    1.
    "There is insufficient system memory to run this query."
    This seems fixable with http://support.microsoft.com/kb/912439.

    2.
    The webpage returns "The request timed out before the page could be
    retrieved."

    The ScriptTimeout is set to one hour, but the timeout occurs before that.
    Why, and how do I handle it?

    Is there a better way of sending the inserts to SQL Server?


    ....
    Server.ScriptTimeout = 3600
    ....
    ' Only emit SET NOCOUNT OFF for the first statement of a batch
    If CountItems = 1 Then SQL.Append("SET NOCOUNT OFF;")

    ' Append one INSERT per CSV record to the current batch
    SQL.AppendFormat( _
        "INSERT INTO Cust_GiftToken (GiftTokenID, GiftTokenValidTo, GiftTokenAmount) VALUES ({0}, '{1}', {2});", _
        id, validTo.ToString("yyyy-MM-dd"), amount.ToString(enCultureInfo))

    ' Every 1,000 statements, execute the batch and reset the counters
    If CountItems = 1000 Then
        command.CommandText = SQL.ToString()
        command.ExecuteNonQuery()
        command.CommandText = String.Empty
        CountItems = 0
        SQL.Length = 0
    End If
    ....
    End If


    Any help is greatly appreciated - thanks in advance!

    /Morten
     
    Morten Snedker, Dec 12, 2008
    #1

  2. ML (Guest)

    Consider using SqlBulkCopy:
    http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

    ML, Dec 12, 2008
    #2

  3. Dan Guzman (Guest)

    I agree with Matija that SqlBulkCopy is the right method from managed code.
    However, I suggest you consider processing large files like this
    asynchronously as a batch process.
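
    For illustration, a minimal SqlBulkCopy sketch in VB.NET: it loads the CSV
    into a DataTable and streams it to the server in one bulk operation. The
    file path, delimiter, connection string and column types are assumptions
    based on the table in the original post; enCultureInfo is the culture
    object from that post.

    ' Requires Imports System.Data, System.Data.SqlClient and System.IO.
    ' Build an in-memory table matching Cust_GiftToken.
    Dim table As New DataTable()
    table.Columns.Add("GiftTokenID", GetType(Integer))
    table.Columns.Add("GiftTokenValidTo", GetType(DateTime))
    table.Columns.Add("GiftTokenAmount", GetType(Decimal))

    ' Read the CSV (assumed format: id;validTo;amount per line).
    For Each line As String In File.ReadAllLines("C:\import\tokens.csv")
        Dim parts() As String = line.Split(";"c)
        table.Rows.Add(Integer.Parse(parts(0)), _
                       DateTime.Parse(parts(1)), _
                       Decimal.Parse(parts(2), enCultureInfo))
    Next

    ' Stream the rows to SQL Server in one bulk operation.
    Using connection As New SqlConnection(connectionString)
        connection.Open()
        Using bulk As New SqlBulkCopy(connection)
            bulk.DestinationTableName = "Cust_GiftToken"
            bulk.BatchSize = 1000        ' commit in batches of 1,000 rows
            bulk.BulkCopyTimeout = 0     ' no timeout on the bulk load
            bulk.WriteToServer(table)
        End Using
    End Using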

    --
    Hope this helps.

    Dan Guzman
    SQL Server MVP
    http://weblogs.sqlteam.com/dang/

    "Morten Snedker" <> wrote in message
    news:...
    > From a CSV-fil +700,000 records are imported with a simple INSERT INTO.
    >
    > The import is called by calling an ASPX-page.
    >
    > I'm using the code below. I'm gathering updates 1000 at a time, and I then
    > execute the query. This to avoid 700,000 calls to the database.
    >
    > However, I run into two type of errors:
    >
    > 1.
    > "There is insufficient system memory to run this query."
    > This seems fixable with http://support.microsoft.com/kb/912439.
    >
    > 2.
    > The webpage returns "The request timed out before the page could be
    > retrieved."
    >
    > The scriptTimeout is set to one hour, but timeout occurs before this -
    > why, and how to handle?
    >
    > Is there a better way of sending the updates to the SQL-server?
    >
    >
    > ...
    > Server.ScriptTimeout = 3600
    > ...
    > If CountItems = 1 Then SQL.Append("SET NOCOUNT OFF;")
    > SQL.AppendFormat("INSERT INTO Cust_GiftToken (GiftTokenID,
    > GiftTokenValidTo, GiftTokenAmount) VALUES ({0}, '{1}', {2});", id,
    > validTo.ToString("yyyy-MM-dd"), amount.ToString(enCultureInfo))
    > If CountItems = 1000 Then
    > command.CommandText = SQL.ToString
    > command.ExecuteNonQuery()
    > command.CommandText = String.Empty
    > CountItems = 0
    > SQL.Length = 0
    > End If
    > ...
    > End If
    >
    >
    > Any help is greatly appreciated - thanks in advance!
    >
    > /Morten
     
    Dan Guzman, Dec 12, 2008
    #3
  4. George (Guest)

    "Mark Rae [MVP]" <> wrote in message
    news:...
    > "ML" <> wrote in message
    > news:...
    >
    >> Consider using SqlBulkCopy:
    >> http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

    >
    > Native BCP is *much* faster than SqlBulkCopy:
    > http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=112718
    >
    >
    > --
    > Mark Rae
    > ASP.NET MVP
    > http://www.markrae.net



    I'd just like to add that there is a BULK INSERT command in SQL. It's the
    same as the BCP utility, but you do not need to run an external executable
    (bcp.exe) from .NET, which is a plus. The minus is that you cannot import a
    file that resides on a machine other than the SQL Server itself.
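
    For illustration, a minimal sketch of issuing BULK INSERT from VB.NET. It
    assumes the CSV already sits on the SQL Server machine at a hypothetical
    path, that the SQL Server service account can read it, that the file's
    column order matches the table, and that connectionString is defined.

    ' BULK INSERT runs on the server, so the path is relative to the SQL Server machine.
    ' Requires Imports System.Data.SqlClient.
    Dim sql As String = _
        "BULK INSERT Cust_GiftToken " & _
        "FROM 'C:\import\tokens.csv' " & _
        "WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n')"

    Using connection As New SqlConnection(connectionString)
        connection.Open()
        Using command As New SqlCommand(sql, connection)
            command.CommandTimeout = 0   ' bulk loads can exceed the default 30 seconds
            command.ExecuteNonQuery()
        End Using
    End Using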

    George.
     
    George, Dec 12, 2008
    #4
  5. George (Guest)

    "Mark Rae [MVP]" <> wrote in message
    news:...
    > "George" <> wrote in message
    > news:%...
    >
    >> Just would like to add that there is a BULK INSERT command in SQL.

    >
    > Indeed.
    >
    > But the OP's problem specifically related to timeouts, which is why I
    > didn't mention it.
    > http://www.google.co.uk/search?sour...4GPTB_en-GBGB298GB298&q="BULK INSERT" timeout
    >
    > I could have mentioned it for completeness, I suppose...
    >
    > The advantage of firing off a bcp process is that ASP.NET doesn't have to
    > wait for it to complete, if you set the correct StartInfo parameters...
    >
    >
    > --
    > Mark Rae
    > ASP.NET MVP
    > http://www.markrae.net


    You're right, I did not think about timeouts. I have always used BULK
    INSERT from my external processing program. (I use MSMQ for lengthy
    operations.)

    George.
     
    George, Dec 12, 2008
    #5
  6. gerry (Guest)

    Hi George,

    Can you expand on "you can not import file that resides on external machine
    other than SQL server"?

    Just curious, as we have been successfully using BULK INSERT with files
    that exist on remote shares.



    "George" <> wrote in message
    news:%...
    >
    > "Mark Rae [MVP]" <> wrote in message
    > news:...
    >> "ML" <> wrote in message
    >> news:...
    >>
    >>> Consider using SqlBulkCopy:
    >>> http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

    >>
    >> Native BCP is *much* faster than SqlBulkCopy:
    >> http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=112718
    >>
    >>
    >> --
    >> Mark Rae
    >> ASP.NET MVP
    >> http://www.markrae.net

    >
    >
    > Just would like to add that there is a BULK INSERT command in SQL. It's
    > the same as BCP utility but you do not need to run external executable
    > (bcp.exe) from .NET which is a plus. Minus is that you can not import file
    > that resides on external machine other than SQL server.
    >
    > George.
     
    gerry, Dec 12, 2008
    #6
  7. bruce barker (Guest)

    As suggested, you should use SqlBulkCopy to do the insert. You should also
    create an import (staging) table just to hold the imported data, then run a
    proc to move the data from the import table to the live table. Sometimes
    it's best to have an actual import database, so you can have different
    database settings. A sketch of this staging approach follows below.

    For just 700K rows and a table this simple, you should be able to get this
    down to under 5 minutes.
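
    For illustration, a minimal sketch of that staging approach, assuming a
    hypothetical Cust_GiftToken_Import staging table with the same columns, a
    hypothetical stored procedure usp_MoveGiftTokenImport that moves the rows
    to the live table, and a DataTable (table) already filled from the CSV as
    in the SqlBulkCopy sketch above.

    ' Requires Imports System.Data and System.Data.SqlClient.
    Using connection As New SqlConnection(connectionString)
        connection.Open()

        ' 1. Bulk-load the CSV rows into the staging table.
        Using bulk As New SqlBulkCopy(connection)
            bulk.DestinationTableName = "Cust_GiftToken_Import"
            bulk.BulkCopyTimeout = 0
            bulk.WriteToServer(table)
        End Using

        ' 2. Move the rows from staging to live in one set-based operation.
        Using command As New SqlCommand("usp_MoveGiftTokenImport", connection)
            command.CommandType = CommandType.StoredProcedure
            command.CommandTimeout = 0
            command.ExecuteNonQuery()
        End Using
    End Using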


    -- bruce (sqlwork.com)


    "Morten Snedker" wrote:

    > From a CSV-fil +700,000 records are imported with a simple INSERT INTO.
    >
    > The import is called by calling an ASPX-page.
    >
    > I'm using the code below. I'm gathering updates 1000 at a time, and I
    > then execute the query. This to avoid 700,000 calls to the database.
    >
    > However, I run into two type of errors:
    >
    > 1.
    > "There is insufficient system memory to run this query."
    > This seems fixable with http://support.microsoft.com/kb/912439.
    >
    > 2.
    > The webpage returns "The request timed out before the page could be
    > retrieved."
    >
    > The scriptTimeout is set to one hour, but timeout occurs before this -
    > why, and how to handle?
    >
    > Is there a better way of sending the updates to the SQL-server?
    >
    >
    > ....
    > Server.ScriptTimeout = 3600
    > ....
    > If CountItems = 1 Then SQL.Append("SET NOCOUNT OFF;")
    > SQL.AppendFormat("INSERT INTO Cust_GiftToken (GiftTokenID,
    > GiftTokenValidTo, GiftTokenAmount) VALUES ({0}, '{1}', {2});", id,
    > validTo.ToString("yyyy-MM-dd"), amount.ToString(enCultureInfo))
    > If CountItems = 1000 Then
    > command.CommandText = SQL.ToString
    > command.ExecuteNonQuery()
    > command.CommandText = String.Empty
    > CountItems = 0
    > SQL.Length = 0
    > End If
    > ....
    > End If
    >
    >
    > Any help is greatly appreciated - thanks in advance!
    >
    > /Morten
    >
     
    bruce barker, Dec 12, 2008
    #7
  8. George (Guest)

    It's very possible that you did it successfully, but usually that only
    works under a lot of special circumstances.

    The BULK INSERT command takes a file name and runs under the SQL Server
    account, so that account must be able to access the file. That is where
    the problem usually is.

    In the common scenario, the file is uploaded to the IIS machine (via the
    web application) into some folder, and SQL Server must then be able to take
    it from there. That means the folder must be opened up to everybody (since
    usually SQL Server runs under a local account unknown to the IIS machine),
    and the file sharing protocol on the IIS machine must be enabled. A rare
    admin will do that for you on a machine that sits on the Internet. Plus, if
    you do not have a firewall between the IIS machine and the Internet, anyone
    could upload anything to that shared folder.

    If you have an intranet it might work, and you could talk an admin into it,
    but considering all that I prefer to use BCP. BCP grabs the file and bulk
    copies it to SQL Server over the standard SQL connection port, so there are
    fewer steps in setting up the application and fewer things to mess up when
    upgrading the server. :)
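
    For illustration, a minimal sketch of launching bcp.exe from VB.NET without
    waiting for it, which is the "fire off a bcp process" idea mentioned
    earlier in the thread. The database, server and file path are placeholders;
    -T uses a trusted connection, -c character format, and -t sets the field
    terminator.

    ' Requires Imports System.Diagnostics. bcp.exe must be on the PATH of the worker process.
    Dim startInfo As New ProcessStartInfo()
    startInfo.FileName = "bcp.exe"
    startInfo.Arguments = "MyDb.dbo.Cust_GiftToken in ""C:\import\tokens.csv"" -S MYSQLSERVER -T -c -t;"
    startInfo.UseShellExecute = False
    startInfo.CreateNoWindow = True

    ' Start the process and return immediately; the bulk load continues outside the request.
    Process.Start(startInfo)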



    George.


    "gerry" <> wrote in message
    news:eKUy9$...
    > [gerry's question and the earlier quoted replies snipped]
     
    George, Dec 12, 2008
    #8
  9. gerry (Guest)

    Thanks.

    I think I was misinterpreting your use of "external"; I was just reading it
    as "other", i.e. not the SQL Server machine.

    We do this over the local network with no Internet/intranet involved.


    "George" <> wrote in message
    news:%...
    > [George's explanation quoted in full; snipped]
     
    gerry, Dec 15, 2008
    #9
  10. Morten Snedker, Dec 17, 2008
    #10
  11. ML (Guest)

    ML, Dec 17, 2008
    #11
