Argument list too long on Linux System


Brad Tilley

I was using Python to tar up some files each day in a directory by
calling os.popen("/bin/tar ...").
Everything was working well until the app that generated the files was
set to generate a file every two seconds for nine hours each day (16,200
files each day). After this I began getting this error:

"Argument list too long"

This would happen from the script or from the command line invoking tar
natively. So, I wrote my way around this by using this code:

import os
import tarfile

tar_it = tarfile.open("%s.tar" % real_time, "w")
for root, dirs, files in os.walk(path):
    for f in files:
        jpg = os.path.splitext(f)
        ## If the file has a '.jpg' extension, add it to the tar file.
        if jpg[1] == '.jpg':
            tar_it.add(os.path.join(root, f))
tar_it.close()

But I don't understand why I got this error in the first place. I use a
rather robust Debian GNU/Linux system (3.2 GHz processor, 2 GB DDR400 RAM,
15K SCSI drives); what causes this type of error? This is not really a
Python issue, but I thought some knowledgeable users on the list might be
willing to explain it.

Thanks,

Brad
 

Diez B. Roggisch

> But, I don't understand why I got this error in the first place. I use a
> rather robust Debian GNU/Linux system (3.2GHz proc, 2GB DDR400 RAM, 15K
> SCSI drives) what causes this type of error? This is not really a Python
> issue, but I thought some knowledgeable users on the list might be
> willing to explain it.

It's a system limit - googling reveals that it is usually set to 128 KB on
Linux systems.
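As an aside (not from the thread): the limit Diez mentions is the kernel's ARG_MAX, which caps the combined size of the argument list and environment passed to a single exec() call; a shell expanding thousands of *.jpg names blows past it. On older kernels it was around 128 KiB, while newer ones often allow more. A quick way to check the value on your own box, assuming a POSIX system:

```python
import os

# ARG_MAX is the ceiling on the combined bytes of argv + environ
# for one exec() call; "Argument list too long" means it was exceeded.
arg_max = os.sysconf("SC_ARG_MAX")
print("ARG_MAX on this system: %d bytes" % arg_max)
```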

Using the -T option of tar might help you here - just create a temporary
file, store your filenames in there and pass it to tar.
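The suggestion above can be sketched in Python roughly as follows. This is a minimal illustration, not code from the thread; `tar_with_filelist` is a hypothetical helper name, and it assumes a `tar` binary is on PATH:

```python
import os
import subprocess
import tempfile

def tar_with_filelist(archive, path, ext=".jpg"):
    """Write matching filenames to a temp file and hand it to tar -T,
    so the file list never has to fit on the command line."""
    with tempfile.NamedTemporaryFile("w", delete=False) as listfile:
        for root, dirs, files in os.walk(path):
            for f in files:
                if os.path.splitext(f)[1] == ext:
                    listfile.write(os.path.join(root, f) + "\n")
    try:
        # tar reads the names from the list file, one per line
        subprocess.run(["tar", "-cf", archive, "-T", listfile.name],
                       check=True)
    finally:
        os.unlink(listfile.name)
```

Because the names travel through a file rather than through exec()'s argument vector, the ARG_MAX limit never comes into play, no matter how many files accumulate.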
 

Jaime Wyant

If your python script processed '*.jpg' files, then you could probably
get away with using the glob module ->

import glob

files_to_process = glob.glob("/path/to/files/*.jpg")
for f in files_to_process:
    tar_it.add(f)

Something like that ought to be close to what you need. I'm not sure
what the limitation on command-line arguments is, so I can't be much
help there :(.

jw

tar_it = tarfile.open("%s.tar" % real_time, "w")
for root, dirs, files in os.walk(path):
    for f in files:
        jpg = os.path.splitext(f)
        ## If the file has a '.jpg' extension add it to the tar file.
        if jpg[1] == '.jpg':
            tar_it.add(f)

 
