how to create multiple zip files that are no larger than 2 GB each

texasreddog

I have an interesting problem: I am zipping up a large archive
of jpg images, and the zip file can be no larger than 2 GB. If I try
to zip up n images, where n can vary a lot, and the zip file reaches
2 GB, the zip process fails.

Would there be some way I could track the .zip file as it is
being zipped, so that when it reaches 2 GB in size, a new zip file
could be created and the zip process could continue? I don't think
the split command will work, because that runs after zip finishes.

Or is there a way to start another zip file when the first one
fails, and then somehow force zip to pick up where it left off?

Any suggestions/ideas/tips are much appreciated! Thanks!
 
xhoster

texasreddog said:
I have an interesting problem: I am zipping up a large archive
of jpg images, and the zip file can be no larger than 2 GB. If I try
to zip up n images, where n can vary a lot, and the zip file reaches
2 GB, the zip process fails.

Would there be some way I could track the .zip file as it is
being zipped, so that when it reaches 2 GB in size, a new zip file
could be created and the zip process could continue?

I wouldn't expect jpgs to compress by any meaningful amount, so you could
probably assume the size of the archive will be slightly more than the sum
of the sizes of the original files.
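
[Editor's sketch, not part of the original post: one way to act on that
estimate is to walk the file list once, summing uncompressed sizes, and
start a new group whenever adding the next file would cross the limit.
The variable names and the limit constant here are assumptions.]

use strict;
use warnings;

my $LIMIT  = 2047 * 1024 * 1024;   # stay just under 2 GB
my @files  = @ARGV;                # assumed: jpg paths given on the command line
my @groups = ([]);                 # each element is one zip archive's file list
my $sum    = 0;                    # running uncompressed size of current group

for my $file (@files) {
    my $size = -s $file or next;   # skip missing or empty files
    if ($sum + $size > $LIMIT and @{ $groups[-1] }) {
        push @groups, [];          # current group is full; start a fresh one
        $sum = 0;
    }
    push @{ $groups[-1] }, $file;
    $sum += $size;
}
printf "group %d: %d files\n", $_ + 1, scalar @{ $groups[$_] } for 0 .. $#groups;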

I don't think the split command will work, because that runs after
zip finishes.

Or is there a way to start another zip file when the first one
fails, and then somehow force zip to pick up where it left off?

I don't know the answer, but I expect it will depend on what program/module
you are using to do the zipping.
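
[Editor's aside, not from the thread: if the zip in use is Info-ZIP's
zip 3.0 or later, it can write split archives directly with the -s
option, which avoids tracking the size at all. Whether that version is
installed is an assumption; `zip -v` will say.]

# Only valid for Info-ZIP zip 3.0+; older versions lack -s.
system('zip', '-s', '2g', '-r', 'archive.zip', 'images') == 0
    or die "zip failed with exit status ", $? >> 8, "\n";
# writes archive.z01, archive.z02, ... plus a final archive.zip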

Xho
 
Joe Smith

texasreddog said:
I have an interesting problem: I am zipping up a large archive
of jpg images, and the zip file can be no larger than 2 GB. If I try
to zip up n images, where n can vary a lot, and the zip file reaches
2 GB, the zip process fails.

Would there be some way I could track the .zip file as it is
being zipped, so that when it reaches 2 GB in size, a new zip file
could be created and the zip process could continue?

Here is what I am currently using for that purpose.
-Joe

use strict;
use warnings;

our $MAXSIZE = 2047 * 1024 * 1024;   # just under 2 GB
our $verbose = 0;                    # set true to list every file queued

# Minimal stand-in for the bytes_count() helper the original posting
# called but did not show; formats a byte count for display.
sub bytes_count {
    my $n = shift;
    return sprintf '%.1f MB', $n / (1024 * 1024);
}

sub create_zip {
    my ($out_file, @files) = @_;
    for (my $N = 'a'; @files; $N++) {            # out_a.zip, out_b.zip, ...
        my $zip_file = "$out_file$N.zip";
        my $name  = shift @files;                # always do at least one file per zip
        my @names = ($name);
        my $bytes = -s $name or next;            # skip a missing or empty file
        while (@files and $bytes < $MAXSIZE) {
            $name   = shift @files;
            $bytes += -s $name || 0;             # cumulative uncompressed size
            if ($bytes < $MAXSIZE) {
                push @names, $name;              # OK to add this one
            } else {
                unshift @files, $name;           # put it back for the next archive
            }
        }
        # -y stores symlinks as links; -u updates an archive that already
        # exists; -@ makes zip read the file list from stdin.
        my $cmd = '|zip ' . (-f $zip_file ? '-yu' : '-y ') . " $zip_file -@";
        print "$cmd (" . @names . " files)\n";   # number of files being zipped
        print "  ", join("\n  ", @names), "\n" if $verbose;
        if (open ZIP, $cmd) {
            print ZIP join("\n", @names, '') or warn "print(ZIP): $! ($?)";
            close ZIP;
            my ($err, $sig) = ($? >> 8, $? & 0x7f);   # exit code 12 = "nothing to do"
            warn "zip returned error code $err\n" if $err and $err != 12;
            warn "At least one file is in use; cannot be backed up\n" if $err == 18;
            next if $err == 12;
            print "$zip_file = " . bytes_count(-s $zip_file) . "\n\n";
            sleep 1;
        } else {
            warn "open('$cmd') failed: $! ($?)";
        }
    }
}
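
[A hypothetical call, to show the resulting naming scheme; the photos_
prefix is made up for the example.]

create_zip('photos_', glob '*.jpg');   # writes photos_a.zip, photos_b.zip, ...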
 
Peter Wyzl

texasreddog said:
I have an interesting problem: I am zipping up a large archive
of jpg images, and the zip file can be no larger than 2 GB. If I try
to zip up n images, where n can vary a lot, and the zip file reaches
2 GB, the zip process fails.

Would there be some way I could track the .zip file as it is
being zipped, so that when it reaches 2 GB in size, a new zip file
could be created and the zip process could continue? I don't think
the split command will work, because that runs after zip finishes.

Or is there a way to start another zip file when the first one
fails, and then somehow force zip to pick up where it left off?

Any suggestions/ideas/tips are much appreciated! Thanks!

You probably need to update your Perl to a more recent version, one
built with large-file support, if you can't create files over 2 GB in
size.
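
[Editor's note: one quick way to check is the Config module, since perl
records at build time whether it has large-file support; this snippet is
an illustration, not from the thread.]

use Config;
# 'define' means this perl was built to handle files larger than 2 GB
print "uselargefiles = ", $Config{uselargefiles} || 'undef', "\n";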

P
 
