Multipart form upload causes script to hang after 16K printed

John

Firstly, the OS is FreeBSD 4.7-RELEASE-p27 (VKERN) #33, Perl is
v5.6.1, and Apache is 1.3.27.

I'm using the following (very pared-down) HTML form for submitting
up to 5 files as uploads:

<html><body>
<form enctype="multipart/form-data" action="upload.cgi" method="post"
name="upload_form">
<input type="file" name="file[1]" size=25
accept="image/*,application/*"><br>
<input type="file" name="file[2]" size=25 ... (this repeats until
file[5])
<input type="submit" value="Upload" name="Upload">
<input type="reset" value="Reset" name="Reset">
</form>
</body></html>

After a ton of debugging, I consistently encounter a problem whenever
the upload data is over 16K and the script attempts to output
-ANYTHING- over 16K. And I mean even when the script doesn't
process the submitted data at all!

I've debugged the script down to virtually nothing, and the processing
of the uploaded data doesn't seem to have anything to do with the
problem. For example, here's upload.cgi which has really nothing to
do with the submitted data anymore:

#------------------------
#!/usr/local/bin/perl -w

use strict;

my $header_page = $ENV{'DOCUMENT_ROOT'} .
'/scripts/uploadheader.html';

# $|=0; #nope this doesn't help
# $|=1; #neither does this

print "Content-type: text/html\n\n<HTML><HEAD><TITLE>File
Upload</TITLE>\n";
&spitout_file($header_page);
exit;
#-----------------
sub spitout_file {
    my $file    = $_[0];
    my $endbyte = (-s $file);
    my $string  = '';
    if ($endbyte > 0) {
        # Note: "open FILE, $file || die ..." would never trigger the
        # die, since "||" binds to $file alone; low-precedence "or"
        # actually reports open failures.
        open FILE, $file or die "Unable to open file $file: $!";
        read FILE, $string, $endbyte;
        close FILE;
        print $string;
        undef $string;
    } else {
        return 0;
    }
    return 1;
}

I've also tried spitout_file as this:

#-----------------
sub spitout_file {
    my $file = $_[0];
    open(AFILE, "<$file") || die "Unable to open file $file: $!";
    while (<AFILE>) { print $_; }
    close(AFILE);
}
#-----------------

I've tried other variations on spitout_file as well, including reading
1000, 8192, and 16384 bytes at a time. Every method, every time, fails
to print the contents of $file if it's more than 16384 bytes and the
submitted data is more than 16384 bytes. All methods -will- read the
file, but they won't print. It's as if "print" is broken, or some
buffer is full and can't take any more, so perl just quits.

If the submitted data is <16384 bytes, the script works -or- if $file
is <16384 bytes, the script works. And, of course, the script works
fine if called by itself (not as the action of this upload form).

I've got no error messages, no Apache error_log messages, and nothing
to go on other than what's above, and have tried everything I can
think of. I've searched the groups and the web and cannot find
anything resembling this.

What am I missing?
 
Brian McCauley

John said:
Firstly, the OS is FreeBSD 4.7-RELEASE-p27 (VKERN) #33, Perl is
v5.6.1, and Apache is 1.3.27.

I'm using the following (very pared-down) HTML form for submitting
up to 5 files as uploads:
[...]
After a ton of debugging, I consistently encounter a problem whenever
the upload data is over 16K and the script attempts to output
-ANYTHING- over 16K. And I mean even when the script doesn't
process the submitted data at all!

If the CGI script does not process (or at least discard) the data
presented to it on STDIN by an HTTP server, then that data will just
be left sitting in the FIFO between the HTTP server process and the
CGI process. If the data exceeds the size of a FIFO buffer, and the
HTTP server process doesn't bother trying to read from the CGI's
STDOUT until it has finished writing the CGI request to the CGI's
STDIN, then the web server (or at least one handler thread/subprocess)
will stall. I don't know enough about the internals of Apache 1.3 to
be sure it behaves this way, but from what you describe I'd guess this
is what's happening.
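
If that is the mechanism, reading the body off STDIN before producing
any output should clear the pipe. A minimal sketch (the routine name
and the 16K chunk size are illustrative, not from the script above;
under CGI the length argument would come from $ENV{'CONTENT_LENGTH'}):

```perl
#!/usr/local/bin/perl -w
use strict;

# Sketch: pull the entire POSTed body off STDIN before printing
# anything, so the server's write end of the pipe can never fill up
# and block. drain_stdin() and the 16384-byte chunk are illustrative.
sub drain_stdin {
    my ($remaining) = @_;
    my $body = '';
    while ($remaining > 0) {
        my $want = $remaining > 16384 ? 16384 : $remaining;
        my $got  = read(STDIN, my $chunk, $want);
        last unless $got;    # EOF or error: stop rather than loop forever
        $body      .= $chunk;
        $remaining -= $got;
    }
    return $body;            # parse this, or simply throw it away
}
```

Only after this returns is it safe to print a response of any size.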

--
\\ ( )
. _\\__[oo
.__/ \\ /\@
. l___\\
# ll l\\
###LL LL\\
 
John

Brian McCauley said:
If the CGI script does not process (or at least discard) the data
presented to it on STDIN by an HTTP server, then that data will just
be left sitting in the FIFO between the HTTP server process and the
CGI process. If the data exceeds the size of a FIFO buffer, and the
HTTP server process doesn't bother trying to read from the CGI's
STDOUT until it has finished writing the CGI request to the CGI's
STDIN, then the web server (or at least one handler thread/subprocess)
will stall. I don't know enough about the internals of Apache 1.3 to
be sure it behaves this way, but from what you describe I'd guess this
is what's happening.

Hi Brian,

The actual script does process the form data. I just omitted that
from my post since the script wouldn't even get that far. Regardless,
you are correct: I changed the script to process the submitted data
prior to performing any output, and it works fine!
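
The reordered flow might look roughly like this (a sketch with
illustrative names, not the actual production script; the length and
path arguments stand in for $ENV{'CONTENT_LENGTH'} and the
DOCUMENT_ROOT path so the routine can run outside Apache):

```perl
#!/usr/local/bin/perl -w
use strict;

# Sketch of the reordered upload.cgi: consume the POST body *before*
# emitting any output.
sub handle_request {
    my ($content_length, $header_page) = @_;

    # 1. Drain STDIN so the server's write side never blocks on a
    #    full pipe. Perl's buffered read() keeps going until it has
    #    $content_length bytes or hits EOF.
    read STDIN, my $form_data, $content_length;

    # 2. Only now is it safe to print a response of any size.
    print "Content-type: text/html\n\n";
    print "<HTML><HEAD><TITLE>File Upload</TITLE>\n";
    spitout_file($header_page);
}

sub spitout_file {
    my $file    = $_[0];
    my $endbyte = -s $file;
    return 0 unless $endbyte;
    open FILE, $file or die "Unable to open file $file: $!";
    read FILE, my $string, $endbyte;
    close FILE;
    print $string;
    return 1;
}
```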

Funny how something so obvious is hard to see after hours of mulling
over code. Thanks for pointing it out and solving the problem!

Regards,
Dave
 
