How to minimize server load when a program is run

Justin C

My web host is running Perl 5.8.8; other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').

I have written a backup program to tar and gzip my entire directory
tree on their site, and also to dump the db and add that to the tar.
The program runs one of my cores at 100% for two minutes and uses
almost 100MB of RAM. If there is a way, I'd like to reduce this load
(as I can't 'nice' it).

I haven't tried running the program on their server yet; I don't
want to get a bad name for maxing out the hardware. I've used core
modules only, and I've used them as per the documentation for the
versions that were part of 5.8.8. I've pasted the code below; I'd be
grateful for suggestions on how I could do the same while putting as
little load on the server as possible.

~ $ cat bin/wp-backup.pl
#!/usr/bin/perl
use warnings;
use strict;
use Archive::Tar;
use File::Find;

# global vars
chomp (my $now = `date +"%Y-%m-%d-%H%M"`);
my $tar;
my $file = "site.com.$now.tar.gz";
my $backup_dir = '/var/sites/s/site.com/backups';

create_archive();
my $db_file = extract_db_data();

$tar->add_files($db_file);   # add the SQL dump to the archive
$tar->write($file, 9);       # write once, gzip-compressed at level 9

sub archive_it {
    # $File::Find::name holds the full path; keep everything after the
    # site root so subdirectories survive the rename
    (my $new_name = $File::Find::name) =~ s{^/var/sites/s/site\.com/}{};
    (my $old_name = $File::Find::name) =~ s{^/}{};  # Archive::Tar drops the leading slash
    $tar->add_files($File::Find::name);
    $tar->rename($old_name, $new_name);
}

sub create_archive {
    my $www_dir = '/var/sites/s/site.com/public_html';

    $tar = Archive::Tar->new;      # declared in globals
    find(\&archive_it, $www_dir);  # archive_it() adds each file to the tar
    # no write here: the archive is written once, compressed, after the
    # db dump has been added (writing it twice doubles the disk I/O)
}
sub extract_db_data {
    my $db = {
        user => 'name',
        pass => 'password',
        name => 'db',
        file => "site.com.$now.sql",
        host => '1.0.0.0',
    };

    # list-form system() bypasses the shell, so '>' redirection won't
    # work; let mysqldump write the file itself via --result-file
    my @args = ('mysqldump', '--add-drop-table', '--complete-insert',
        '--extended-insert', '--hex-blob', "--host=$db->{host}",
        "--user=$db->{user}", "--password=$db->{pass}",
        "--result-file=$backup_dir/$db->{file}", $db->{name});
    system(@args) == 0 or die "problem running mysqldump: $?";
    return "$backup_dir/$db->{file}";
}

__END__

Thank you for any help or suggestions.


Justin.
 
Dr.Ruud

The program runs one of my cores at 100% for two minutes and uses
almost 100MB of RAM. If there is a way, I'd like to reduce this load
(as I can't 'nice' it).

If you want to nice it, see POSIX::nice().
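
A minimal sketch (the increment of 19, the lowest priority, is
arbitrary; POSIX is a core module, so it should already be on a
stock 5.8.8):

#!/usr/bin/perl
use warnings;
use strict;
use POSIX ();

# Lower this process's priority by 19; POSIX::nice() returns undef
# and sets $! if the underlying nice(2) call fails.
defined POSIX::nice(19)
    or die "nice failed: $!";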
 
Jim Gibson

Justin C said:
My web host is running Perl 5.8.8; other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').

I have written a backup program to tar and gzip my entire directory
tree on their site, and also to dump the db and add that to the tar.
The program runs one of my cores at 100% for two minutes and uses
almost 100MB of RAM. If there is a way, I'd like to reduce this load
(as I can't 'nice' it).

I haven't tried running the program on their server yet; I don't
want to get a bad name for maxing out the hardware. I've used core
modules only, and I've used them as per the documentation for the
versions that were part of 5.8.8. I've pasted the code below; I'd be
grateful for suggestions on how I could do the same while putting as
little load on the server as possible.

What about doing a sleep(1) after every n files (or n bytes or n
seconds)? Your program will still max out a CPU while it is active, but
the average usage will be less. If you sleep(1) after each second of
execution, then your program will take four minutes to run and use
50% of a CPU.
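
For example, counting files in the wanted() callback; a minimal
sketch, with the 100-file threshold picked arbitrarily:

#!/usr/bin/perl
use warnings;
use strict;
use File::Find;

my $count = 0;

# Pause for a second after every 100 files so the average CPU and
# I/O load stays down; the actual archiving work is elided here.
sub throttled {
    # ... add $File::Find::name to the archive here ...
    sleep 1 if ++$count % 100 == 0;
}

find(\&throttled, '/var/sites/s/site.com/public_html');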
 
Willem

Justin C wrote:
) My web host is running Perl 5.8.8; other software there is of a
) similar age, and some things are missing (I wanted to 'nice' my
) program, but there is no 'nice').
)
) I have written a backup program to tar and gzip my entire directory
) tree on their site, and also to dump the db and add that to the tar.
) The program runs one of my cores at 100% for two minutes and uses
) almost 100MB of RAM. If there is a way, I'd like to reduce this load
) (as I can't 'nice' it).

Odd, I would have expected a tar/gzip action to be I/O bound.
That is, use 100% disk read/write capacity and not as much CPU.

Have you checked how long the 'tar' command takes, how much CPU it
uses, etc.? Or perhaps the database dump is the culprit. You should
test those separately.
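
Something along these lines would time the two phases independently
(all paths, credentials and the db name below are placeholders):

#!/usr/bin/perl
use warnings;
use strict;

# Run each phase on its own and report wall-clock time versus the
# CPU time consumed by the child process.
for my $cmd (
    [ 'tar', 'czf', '/tmp/site.tar.gz',
      '/var/sites/s/site.com/public_html' ],
    [ 'mysqldump', '--user=name', '--password=password',
      '--result-file=/tmp/site.sql', 'db' ],
) {
    my $wall   = time;
    my @before = times;
    system(@$cmd) == 0 or warn "'$cmd->[0]' failed: $?\n";
    my @after = times;
    printf "%-9s %3ds wall, %6.2fs child CPU\n", $cmd->[0],
        time - $wall,
        ($after[2] - $before[2]) + ($after[3] - $before[3]);
}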


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 
Justin C

My web host is running Perl 5.8.8; other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').

I have written a backup program to tar and gzip my entire directory
tree on their site, and also to dump the db and add that to the tar.
The program runs one of my cores at 100% for two minutes and uses
almost 100MB of RAM. If there is a way, I'd like to reduce this load
(as I can't 'nice' it).


[snip]

Apologies for the (very) late follow-up to this. I spent some time
pondering the options and tried Ben's suggestion of
Archive::Tar::Streamed, but it's not installed there (I did fix my
bad date call; thank you, Ben). In the end I used bash, and the
program runs in about ten seconds.

I realise that, had I written the program well enough, I might have
got close to that short a time with Perl, but I'm happy with the
bash solution.
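
For what it's worth, the Perl near-equivalent would presumably be to
shell out and let tar and mysqldump do the heavy lifting; a minimal
sketch, where the paths and credentials are placeholders and the
strftime() call is only a guess at Ben's date fix:

#!/usr/bin/perl
use warnings;
use strict;
use POSIX qw(strftime);

# Sketch of the same job done by external tools; strftime() replaces
# the backticked 'date' call from the original script.
my $now  = strftime '%Y-%m-%d-%H%M', localtime;
my $site = '/var/sites/s/site.com';

system('mysqldump', '--user=name', '--password=password',
       "--result-file=$site/backups/site.com.$now.sql", 'db') == 0
    or die "mysqldump failed: $?";

system('tar', 'czf', "$site/backups/site.com.$now.tar.gz",
       '-C', $site, 'public_html', "backups/site.com.$now.sql") == 0
    or die "tar failed: $?";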

Thanks to all who replied, all suggestions were useful.

Justin.
 
johannes falcone

My web host is running Perl 5.8.8; other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').

[snip]

bzip2 at level -1 is the best combination of speed and compression
on Linux. Consider LVM snapshots for the db backup. :)
Use rsync, or place the archive on an HTTP server and pull it with
wget.
 
