How to minimize server load when program is run

Discussion in 'Perl Misc' started by Justin C, Jun 13, 2013.

  1. Justin C

    Justin C Guest

    My web-hosts are running perl 5.8.8, other software there is of a
    similar age, and some things are missing (I wanted to 'nice' my
    program, but there is no 'nice').

    I have written a backup program to tar and gzip my entire directory
    tree on their site, and also to dump the db and add that to the tar.
    The program I have written runs one of my cores at 100% for two
    minutes, and uses almost 100MB RAM. If there is a way I'd like to
    reduce this load (as I can't 'nice' it).

    I haven't tried running the program yet, I don't want to get a
    bad name for maxing out the hardware. I've used core modules only,
    and I've used them as per documentation for the versions that were
    part of 5.8.8. I've pasted the code below, I'd be grateful for
    suggestions on how I could do the same while putting as little
    load on the server as possible.

    ~ $ cat bin/wp-backup.pl
    #!/usr/bin/perl
    use warnings;
    use strict;
    use Archive::Tar;
    use File::Find;

    # global vars
    chomp (my $now = `date +"%Y-%m-%d-%H%M"`);
    my $tar;
    my $file = "site.com.$now.tar.gz";
    my $backup_dir = '/var/sites/s/site.com/backups';
    my $www_dir = '/var/sites/s/site.com/public_html';

    create_archive();
    my $db_file = extract_db_data();

    $tar->add_files($db_file);
    $tar->write("$backup_dir/$file", 9);

    sub archive_it {
        # store each entry under 'public_html/' relative to the site root
        (my $old_name = $File::Find::name) =~ s/^\///;
        (my $new_name = $File::Find::name) =~ s/^\Q$www_dir\E/public_html/;
        $tar->add_files($File::Find::name);
        $tar->rename($old_name, $new_name);
    }

    sub create_archive {
        $tar = Archive::Tar->new; # declared in globals
        find(\&archive_it, $www_dir); # archive_it adds each file to the tar
    }

    sub extract_db_data {
        my $db = {
            user => 'name',
            pass => 'password',
            name => 'db',
            file => "site.com.$now.sql",
            host => '1.0.0.0',
        };

        # list-form system() bypasses the shell, so '>' redirection won't
        # work; mysqldump's --result-file writes the dump file directly
        my @args = ('mysqldump', '--add-drop-table', '--complete-insert',
            '--extended-insert', '--hex-blob', "--host=$db->{host}",
            "--user=$db->{user}", "--password=$db->{pass}",
            "--result-file=$backup_dir/$db->{file}", $db->{name});
        system(@args) == 0 or die "problem running mysqldump: $?";
        return "$backup_dir/$db->{file}";
    }

    __END__

    Thank you for any help or suggestions.


    Justin.

    --
    Justin C, by the sea.
    Justin C, Jun 13, 2013
    #1

  2. Dr.Ruud

    Dr.Ruud Guest

    On 13/06/2013 16:50, Justin C wrote:

    > The program I have written runs one of my cores at 100% for two
    > minutes, and uses almost 100MB RAM. If there is a way I'd like to
    > reduce this load (as I can't 'nice' it).


    If you want to nice it, see POSIX::nice().
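    A minimal sketch of that, assuming the host lets the process renice
    itself (raising your own nice value needs no privileges):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX ();

# Add 19 to our nice value, i.e. drop to the lowest scheduling
# priority; POSIX::nice returns the new value, or undef on failure.
my $niced = POSIX::nice(19);
defined $niced or warn "could not renice: $!";
```

    Dropped in near the top of wp-backup.pl, the kernel will then favour
    other processes whenever they compete for the CPU.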

    --
    Ruud
    Dr.Ruud, Jun 13, 2013
    #2

  3. Jim Gibson

    Jim Gibson Guest

    In article <>, Justin C
    <> wrote:

    > My web-hosts are running perl 5.8.8, other software there is of a
    > similar age, and some things are missing (I wanted to 'nice' my
    > program, but there is no 'nice').
    >
    > I have written a backup program to tar and gzip my entire directory
    > tree on their site, and also to dump the db and add that to the tar.
    > The program I have written runs one of my cores at 100% for two
    > minutes, and uses almost 100MB RAM. If there is a way I'd like to
    > reduce this load (as I can't 'nice' it).
    >
    > I haven't tried running the program yet, I don't want to get a
    > bad name for maxing out the hardware. I've used core modules only,
    > and I've used them as per documentation for the versions that were
    > part of 5.8.8. I've pasted the code below, I'd be grateful for
    > suggestions on how I could do the same while putting as little
    > load on the server as possible.


    What about doing a sleep(1) after every n files (or n bytes or n
    seconds)? Your program will still max out a CPU while it is active, but
    the average usage will be less. If you sleep(1) after each 1 second of
    execution, then your program will take 4 minutes to run and use 50% of
    a CPU.
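    A sketch of that idea, with a hypothetical helper and batch size
    ($chunk is a made-up number to tune):

```perl
use strict;
use warnings;

my $chunk = 100;    # hypothetical batch size between pauses
my $count = 0;

# Call this once per file handled: after every $chunk files it
# sleeps for a second, so average CPU use drops while the job
# still completes.
sub maybe_throttle {
    sleep 1 if ++$count % $chunk == 0;
    return $count;
}
```

    In the File::Find callback you would call maybe_throttle() right
    after each $tar->add_files(...).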

    --
    Jim Gibson
    Jim Gibson, Jun 13, 2013
    #3
  4. Willem

    Willem Guest

    Justin C wrote:
    ) My web-hosts are running perl 5.8.8, other software there is of a
    ) similar age, and some things are missing (I wanted to 'nice' my
    ) program, but there is no 'nice').
    )
    ) I have written a backup program to tar and gzip my entire directory
    ) tree on their site, and also to dump the db and add that to the tar.
    ) The program I have written runs one of my cores at 100% for two
    ) minutes, and uses almost 100MB RAM. If there is a way I'd like to
    ) reduce this load (as I can't 'nice' it).

    Odd, I would have expected a tar/gzip action to be I/O bound.
    That is, use 100% disk read/write capacity and not as much CPU.

    Have you timed how long a plain 'tar' command takes and how much CPU
    it uses? Or perhaps the database dump is the culprit. You should test
    those steps separately.
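    Time::HiRes (core since 5.8) makes that easy to measure; the steps
    below are placeholders for the real tar and mysqldump work:

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Run a step and report its wall-clock time.
sub timed {
    my ($label, $code) = @_;
    my $t0 = [gettimeofday];
    $code->();
    printf "%s: %.2fs\n", $label, tv_interval($t0);
}

# Placeholder steps -- swap in the real archive and dump calls.
timed('archive', sub { sleep 1 });
timed('db dump', sub { sleep 1 });
```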


    SaSW, Willem
    --
    Disclaimer: I am in no way responsible for any of the statements
    made in the above text. For all I know I might be
    drugged or something..
    No I'm not paranoid. You all think I'm paranoid, don't you !
    #EOT
    Willem, Jun 17, 2013
    #4
  5. Justin C

    Justin C Guest

    On 2013-06-13, Justin C <> wrote:
    > My web-hosts are running perl 5.8.8, other software there is of a
    > similar age, and some things are missing (I wanted to 'nice' my
    > program, but there is no 'nice').
    >
    > I have written a backup program to tar and gzip my entire directory
    > tree on their site, and also to dump the db and add that to the tar.
    > The program I have written runs one of my cores at 100% for two
    > minutes, and uses almost 100MB RAM. If there is a way I'd like to
    > reduce this load (as I can't 'nice' it).



    [snip]

    Apologies for the (very) late follow-up to this. I spent some time
    pondering the options and tried Ben's suggestion of
    Archive::Tar::Streamed, but it's not installed there (I did fix my
    bad date call, though, thank you Ben). In the end I used bash, and
    the program runs in about ten seconds.

    I realise that, had I written the program well enough, I might have
    got close to that short a time with Perl, but I'm happy with the
    bash solution.

    Thanks to all who replied, all suggestions were useful.

    Justin.

    --
    Justin C, by the sea.
    Justin C, Aug 9, 2013
    #5
  6. On Thursday, June 13, 2013 7:50:20 AM UTC-7, Justin C wrote:
    > My web-hosts are running perl 5.8.8, other software there is of a
    > similar age, and some things are missing (I wanted to 'nice' my
    > program, but there is no 'nice').

    [snip]

    bzip2 at its -1 level is the best combination of speed and
    compression on Linux. Consider LVM snapshots for the db backup.
    :)
    Use rsync, or put the archive where the HTTP server can see it and
    pull it with wget.
    johannes falcone, Aug 16, 2013
    #6
