Feed a directory listing to a script

Shabam

I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4 GB, and it won't allow me to
restore accounts individually.

Thanks for any help.
 
Anno Siegel

Shabam said:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

So what have you tried so far, and how does it fail?
Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4 GB,
So?

and it won't allow me to
restore accounts individually.

Ah, but it does. The problem is, you'd have to read through the entire
tar file, but you can restore any selection of files you want.
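
For instance, restoring one account from a big archive is a single tar
invocation. A minimal Perl sketch (the archive name and member path
here are hypothetical examples):

# Pull just one account's files back out of the archive.
# "users-backup.tar" and the member path are made-up examples.
system('tar', 'xf', 'users-backup.tar', 'Users/j/jason') == 0
    or warn "tar exited with status $?\n";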

Anno
 
Joe Smith

Shabam said:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.

Here is a hint:

perl -le 'while (($user,$pw,$uid,$gid,$q,$c,$name,$home) = getpwent)
    { print "~$user = $home for $name" if $uid > 100 }'
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

OK, here's another hint. Replace the print() part with this:

system "tar cvf $user.tar $home >$user.dir 2>>error.log";

-Joe

P.S. Next time, do not include comp.lang.perl; it has been replaced
by the comp.lang.perl.misc newsgroup.
 
Anno Siegel

[newsgroups trimmed]

Shabam said:
You don't get it, do you?

What don't I get? Some file systems have a size limit (usually at 2 GB,
not 4), but others don't. I have built and used backups that were much
larger.

As for your original question, you visit each home directory and run your
backup script on it with an individual output file. That's what thousands
of sysadmins are doing. What's the problem?
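
In outline, something like this (backup_account stands in for your
existing script, and the naming scheme is just an example):

# Visit each account directory; give each backup its own output file.
for my $home (glob '/Users/*/*') {
    next unless -d $home;
    (my $user = $home) =~ s{.*/}{};    # last path component
    system('backup_account', $home, "$user.tar") == 0
        or warn "backup of $user failed\n";
}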

Anno
 
Matthew King

Shabam said:
My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

You don't even need to use perl, you can do this directly in bash:

for k in /Users/*/*/; do run_backup_script "$k"; done

The perl equivalent would look similar but IIRC be a bit more involved.
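
For what it's worth, a Perl sketch along the same lines
(run_backup_script is the OP's existing script, as in the bash loop):

# Glob the two-level layout and hand each account directory
# to the existing backup script.
for my $dir (glob '/Users/*/*/') {
    system('run_backup_script', $dir) == 0
        or warn "backup failed for $dir\n";
}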

Matthew
 
Joe Smith

Shabam said:
You don't get it, do you?

Get what? Modern versions of tar can create archive files larger
than 2 or 4 gigabytes.

linux% ls -l 5gigabyte.zip
-rw-r--r-- 1 jms jms 5751592946 May 3 19:37 5gigabyte.zip
linux% tar cf 5gb.tar 5gigabyte.zip
linux% ls -l 5gb.tar
-rw-r--r-- 1 jms jms 5751603200 Aug 9 22:30 5gb.tar

So why do you say 4 GB won't work?

-Joe
 
Justin C

Joe Smith said:
Get what? Modern versions of tar can create archive files larger
than 2 or 4 gigabytes.

Maybe the OP has a DAT drive that doesn't support tapes bigger than
2/4GB?

Justin.
 
Tim X

Shabam said:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4 GB, and it won't allow me to
restore accounts individually.

Firstly, if you're not a perl programmer, why do you plan to use perl
for this task? This could easily be done with just a bash script.

Secondly, your statement about not being able to extract individual
account data from a single tar file is incorrect. You can extract
individual files or groups of files from a tar archive.

The basic building blocks for your script are two loops. The outer
loop goes through the outer list of directories and for each of those,
the inner loop goes through the user accounts in each directory and
processes them in whatever way you want.

The perl functions you probably want are opendir and readdir. Try
perldoc -f readdir, but to be honest, if you're not a perl programmer,
save yourself the time and just use bash (unless you want to learn perl).
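
An untested sketch of those two loops, printing where the backup
call would go:

#!/usr/bin/perl
use strict;
use warnings;

# Outer loop: the one-character buckets under /Users.
# Inner loop: the account directories inside each bucket.
opendir my $top, '/Users' or die "can't open /Users: $!";
for my $bucket (sort readdir $top) {
    next if $bucket =~ /^\.\.?$/;           # skip . and ..
    my $path = "/Users/$bucket";
    next unless -d $path;
    opendir my $sub, $path or die "can't open $path: $!";
    for my $account (sort readdir $sub) {
        next if $account =~ /^\.\.?$/;
        my $home = "$path/$account";
        print "would back up $home\n" if -d $home;  # call your script here
    }
    closedir $sub;
}
closedir $top;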

Tim
 
