Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

Discussion in 'Perl Misc' started by John, Mar 16, 2009.

  1. John

    John Guest

    Hi

I'm updating a MySQL table with about 2 million records. After about 100
    seconds the program ends with no error message. If I use LIMIT 800000,50000
    the part where it stopped completes fine, so the UPDATE itself is OK.

    Is there a timeout parameter? Or am I running out of memory? Can the
    memory be increased?

    Regards
    John
    John, Mar 16, 2009
    #1

  2. John

    J. Gleixner Guest

Re: Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

    John wrote:
    > Hi
    >
    > I'm updating a MySQL table with about 2 million records. After about 100
    > seconds the program ends with no error message. If I use LIMIT 800000,50000
    > the part where it stopped is OK so the UPDATE part is OK.
    >
    > Is there a time out parameter? Or am I running out of memory. Can the
    > memory be increased?


    Check line 24.
    J. Gleixner, Mar 16, 2009
    #2

  3. John

    John Guest

    "J. Gleixner" <> wrote in message
    news:49be71fd$0$89873$...
    > John wrote:
    >> Hi
    >>
    >> I'm updating a MySQL table with about 2 million records. After about 100
    >> seconds the program ends with no error message. If I use LIMIT
    >> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
    >>
    >> Is there a time out parameter? Or am I running out of memory. Can the
    >> memory be increased?

    >
    > Check line 24.


Line 24 says:

    our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output socket hot

    Can't see how that can affect it.

    Regards
    John
    John, Mar 16, 2009
    #3
  4. John <> wrote:

    > "J. Gleixner" <> wrote in message
    > news:49be71fd$0$89873$...
    > > John wrote:
    > >> I'm updating a MySQL table with about 2 million records. After about 100
    > >> seconds the program ends with no error message. If I use LIMIT
    > >> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
    > >>
    > >> Is there a time out parameter? Or am I running out of memory. Can the
    > >> memory be increased?

    > >
    > > Check line 24.


    > Line 24 says:-


    > our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output
    > socket hot


    > Can't see how that can affect it.


I rather suspect it was a gentle hint that your problem description
    is too vague to let anyone deduce what's going wrong. But your
    subject line may contain a hint: it looks as if the update is done
    from within a CGI script or something similar, running on a web
    server. Normally there's a CPU time limit set for such scripts,
    to catch e.g. scripts that run in an endless loop. That's what I
    would check first (note: if it's not set, an OS default limit may
    apply!). If you are using Apache it might be an RLimitCPU
    directive. But there could also be a limit on the memory consumption
    of such scripts (RLimitMEM with Apache2)... Perhaps a look at the
    web server's log files will give you some more hints.
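
    For reference, those directives look roughly like this in the Apache
    config (the numbers here are placeholders, not recommendations; the
    keyword "max" means the operating-system maximum):

    ```apache
    # Resource limits applied to processes forked from httpd children
    # (CGI scripts and the like).
    # RLimitCPU soft-limit [hard-limit], in seconds of CPU time
    RLimitCPU 300 600
    # RLimitMEM soft-limit [hard-limit], in bytes per process
    RLimitMEM 268435456 536870912
    ```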

    Regards, Jens
    --
    \ Jens Thoms Toerring ___
    \__________________________ http://toerring.de
    Jens Thoms Toerring, Mar 16, 2009
    #4
  5. John <> wrote:
    >
    > "J. Gleixner" <> wrote in message
    > news:49be71fd$0$89873$...
    >> John wrote:
    >>> Hi
    >>>
    >>> I'm updating a MySQL table with about 2 million records. After about 100
    >>> seconds the program ends with no error message. If I use LIMIT
    >>> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
    >>>
    >>> Is there a time out parameter? Or am I running out of memory. Can the
    >>> memory be increased?

    >>
    >> Check line 24.

    >
    > Line 24 says:-
    >
    > our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output
    > socket hot
    >
    > Can't see how that can affect it.



    Check line 17.


    --
    Tad McClellan
    email: perl -le "print scalar reverse qq/moc.noitatibaher\100cmdat/"
    Tad J McClellan, Mar 16, 2009
    #5
  6. Re: Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

    John wrote:
    > Hi
    >
    > I'm updating a MySQL table with about 2 million records. After about 100
    > seconds the program ends with no error message.


    Are you sure? If there were an error message, do you know where it
    would go and do you know you could find it?

    > If I use LIMIT 800000,50000
    > the part where it stopped is OK so the UPDATE part is OK.
    >
    > Is there a time out parameter?


I don't think there is one in DBD::mysql. The program that calls your
    perl script could have a time out, or your MySQL server might have a
    time out, but neither of those is a Perl issue.


    > Or am I running out of memory. Can the
    > memory be increased?


Unlike a select, there is no reason that a large update should consume
    much memory in the Perl client. And I don't use Debian specifically,
    but on other Linux systems running out of memory does produce an error
    somewhere, though not always in the first place one might think to look.

    Xho
    Xho Jingleheimerschmidt, Mar 17, 2009
    #6
  7. John

    John Guest

    "Jens Thoms Toerring" <> wrote in message
    news:-berlin.de...
    > John <> wrote:
    >
    >> "J. Gleixner" <> wrote in message
    >> news:49be71fd$0$89873$...
    >> > John wrote:
    >> >> I'm updating a MySQL table with about 2 million records. After about
    >> >> 100
    >> >> seconds the program ends with no error message. If I use LIMIT
    >> >> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
    >> >>
    >> >> Is there a time out parameter? Or am I running out of memory. Can
    >> >> the
    >> >> memory be increased?
    >> >
    >> > Check line 24.

    >
    >> Line 24 says:-

    >
    >> our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output
    >> socket hot

    >
    >> Can't see how that can affect it.

    >
    > I rather guess it was a gentle hint that your problem description
    > is too vague to allow to deduce what's going wrong. But your sub-
    > ject line may contain a hint: it looks as if the update is done
    > from within a CGI script or something similar, running on a web
    > server. And normally there's a CPU time limit set for such scripts
    > to catch e.g. script that run in an endless loop. That's what I
    > would check for first (note: if it's not set then an OS default
    > limit may be used!). If you are using Apache it might be a RLimitCPU
    > directive. But there could also be a limit on the memory consumption
    > of such scripts (RLimitMEM with Apache2)... Perhaps a look at the log
    > files of the web server will give you some more hints.
    >
    > Regards, Jens
    > --
    > \ Jens Thoms Toerring ___
    > \__________________________ http://toerring.de


Many thanks, that was very helpful. I have maxed out both RLimitCPU and
    RLimitMEM in Apache2 and that has led to better performance. There are
    severe problems in using MySQL once you go past about 2 million rows,
    irrespective of how well you tweak your settings.

    Regards
    John
    John, Mar 25, 2009
    #7
  8. John

    John Guest

    "Xho Jingleheimerschmidt" <> wrote in message
    news:49bf08ca$0$9725$...
    > John wrote:
    >> Hi
    >>
    >> I'm updating a MySQL table with about 2 million records. After about 100
    >> seconds the program ends with no error message.

    >
    > Are you sure? If there were an error message, do you know where it would
    > go and do you know you could find it?


    >
    >> If I use LIMIT 800000,50000 the part where it stopped is OK so the UPDATE
    >> part is OK.
    >>
    >> Is there a time out parameter?

    >
    > I don't think there is one in DBD::mysql. The program that calls your
    > perl script could have a time out, or your MySQL server might have a time
    > out, but neither of those are Perl issues
    >
    >
    >> Or am I running out of memory. Can the memory be increased?

    >
    > Unlike a select, there is no reason that large update should consume much
    > memory in the Perl client. And I don't use Debian specifically, but on
    > other linux systems running out of memory does produce an error somewhere,
    > but not always in the first place one might think to look.
    >
    > Xho


Yes, I have now found that error log. There is a special log for slow
    queries, so evidently the MySQL developers are aware that some
    constructs take a long time.
    I am not certain whether it is a Perl problem. My feeling now is that
    it is a MySQL problem, and that they are aware of similar issues.

    Regards
    John
    John, Mar 25, 2009
    #8
  9. John

    Guest

    On Wed, 25 Mar 2009 07:01:38 -0000, "John" <> wrote:

    >
    >"Jens Thoms Toerring" <> wrote in message
    >news:-berlin.de...
    >> John <> wrote:
    >>
    >>> "J. Gleixner" <> wrote in message
    >>> news:49be71fd$0$89873$...
    >>> > John wrote:
    >>> >> I'm updating a MySQL table with about 2 million records. After about
    >>> >> 100
    >>> >> seconds the program ends with no error message. If I use LIMIT
    >>> >> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
    >>> >>
    >>> >> Is there a time out parameter? Or am I running out of memory. Can
    >>> >> the
    >>> >> memory be increased?
    >>> >
    >>> > Check line 24.

    >>
    >>> Line 24 says:-

    >>
    >>> our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output
    >>> socket hot

    >>
    >>> Can't see how that can affect it.

    >>
    >> I rather guess it was a gentle hint that your problem description
    >> is too vague to allow to deduce what's going wrong. But your sub-
    >> ject line may contain a hint: it looks as if the update is done
    >> from within a CGI script or something similar, running on a web
    >> server. And normally there's a CPU time limit set for such scripts
    >> to catch e.g. script that run in an endless loop. That's what I
    >> would check for first (note: if it's not set then an OS default
    >> limit may be used!). If you are using Apache it might be a RLimitCPU
    >> directive. But there could also be a limit on the memory consumption
    >> of such scripts (RLimitMEM with Apache2)... Perhaps a look at the log
    >> files of the web server will give you some more hints.
    >>
    >> Regards, Jens
    >> --
    >> \ Jens Thoms Toerring ___
    >> \__________________________ http://toerring.de

    >
    >Many thanks. That was very helpful. I have maxed both RLimitCPU and
    >RLimitMEM in Apache2 and that has had led to better performance. There are
    >severe problems in using MySQL when you start hitting more than 2 million
    >rows irrespective how well you tweak your settings.
    >
    >Regards
    >John
    >
    >

I am for hire. I've never done your fancy server-names stuff,
    but I can assure you I am an expert at databases and Perl.
    The learning curve is like the next day.

    Pay me and you get quality software! Prices are negotiable.
    I'm looking for full-time at 140K but will work at $120/hr for
    minimal time.

    -sln
    , Mar 25, 2009
    #9
  10. Re: Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

    John wrote:
    >
    > Many thanks. That was very helpful. I have maxed both RLimitCPU and
    > RLimitMEM in Apache2 and that has had led to better performance. There are
    > severe problems in using MySQL when you start hitting more than 2 million
    > rows irrespective how well you tweak your settings.


    I have MySQL tables with 150 times that many rows. Obviously I don't
    try to run unrestricted update DML on them. It isn't how many rows you
    have, it is what you are trying to do with them.
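
    One common way to avoid a single monster statement is to update in
    bounded chunks and loop until nothing is left to do. A sketch (the
    table and column names are made up; note that MySQL's UPDATE ... LIMIT
    takes a row count only, with no offset):

    ```perl
    use strict;
    use warnings;

    # Run an UPDATE in chunks of $chunk rows until no rows match.
    # $dbh is assumed to be a connected DBI handle; do() returns the
    # number of affected rows ("0E0" when zero, which numifies to 0).
    sub chunked_update {
        my ($dbh, $chunk) = @_;
        my $total = 0;
        while (1) {
            my $n = $dbh->do(
                "UPDATE widgets SET processed = 1 WHERE processed = 0 LIMIT $chunk"
            );
            last if $n == 0;                       # "0E0" == 0 numerically
            $total += $n;
            $dbh->commit unless $dbh->{AutoCommit}; # keep transactions small
        }
        return $total;
    }
    ```

    Each chunk is its own short statement, so no single query runs long
    enough to trip a CPU limit, and the server's locks stay short-lived.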

    Xho
    Xho Jingleheimerschmidt, Mar 26, 2009
    #10
  11. John

    J. Gleixner Guest

Re: Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

    John wrote:
    > "Xho Jingleheimerschmidt" <> wrote in message
    > news:49bf08ca$0$9725$...
    >> John wrote:
    >>> Hi
    >>>
    >>> I'm updating a MySQL table with about 2 million records. After about 100
    >>> seconds the program ends with no error message.


    [...]

    >
    > Yes I have now found that error log. There is a special log for slow
    > queries. So evidently MySQL are aware that some constructs do take a long
    > time.


I'm confused. Are you doing a query or an update?

    A long running query should be noted so it can possibly be fixed
    by adding indexes, temp tables, etc.

    > I am not certain whether it is a Perl problem. My feeling now it is a MySQL
    > problem and that they are aware of similar problems.


It's not a Perl problem or a MySQL 'problem'. They both can handle
    running for a looooong time.

    If you're going to run a loooooong update, and have it kicked off by
    an Apache process (which, BTW, you didn't mention in the beginning),
    then don't mess with Apache; simply fork the MySQL update process
    so Apache doesn't have to keep the connection open.
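
    A minimal shape for that fork-and-detach pattern might be (the
    do_long_update call is a stand-in for the real work):

    ```perl
    use strict;
    use warnings;
    use POSIX qw(setsid);

    # Detach a long-running job from the web request so Apache can send
    # the response immediately. Error handling kept minimal for brevity.
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child: start a new session and close the request's handles so
        # the server isn't kept waiting on us.
        setsid();
        close STDIN;
        close STDOUT;
        close STDERR;
        # do_long_update();   # stand-in for the real DBI work
        exit 0;
    }

    # Parent: return to the browser right away.
    print "Update started in the background (pid $pid)\n";
    ```

    With DBI you would normally connect inside the child (or set
    InactiveDestroy on an inherited handle), since sharing one database
    connection across a fork tends to end badly.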
    J. Gleixner, Mar 26, 2009
    #11
