WEBrick fails with large files

Roger Pack

So...apparently WEBrick does the following when serving pages: it loads the page into a string object, then sends that string to the requestor. Unfortunately, this means that if a very large file is being served, that buffer string will get very large; with a 700MB file it will fail because Ruby runs out of memory. This fact also defeats the 'streaming' aspect of some Ruby functions: for example, the RoR send_file has an option to send a file 'a chunk at a time' to the client--however, these chunks are all conglomerated within WEBrick and then sent, so it doesn't stream out as it was hoped to. I therefore see this as a bug in WEBrick and was wondering what others thought.
Cheers!
-Roger
 
Eric Hodel

Roger Pack said:
So...apparently WEBrick does the following when serving pages: it loads the page into a string object, then sends that string to the requestor. Unfortunately, this means that if a very large file is being served, that buffer string will get very large.

Don't dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.
 
Roger Pack

Eric said:
Don't dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.

I believe that Rails must use strings, then--any ideas for that case?
Thanks!
 
Mat Schaffer

Roger Pack said:
So...apparently WEBrick does the following when serving pages: it loads the page into a string object, then sends that string to the requestor. Unfortunately, this means that if a very large file is being served, that buffer string will get very large; with a 700MB file it will fail because Ruby runs out of memory. This fact also defeats the 'streaming' aspect of some Ruby functions: for example, the RoR send_file has an option to send a file 'a chunk at a time' to the client--however, these chunks are all conglomerated within WEBrick and then sent, so it doesn't stream out as it was hoped to. I therefore see this as a bug in WEBrick and was wondering what others thought.
Cheers!
-Roger

I doubt WEBrick was ever really intended for that sort of work. You
could try Mongrel, although it may yield the same result. Rock-solid
Ruby deployment is still something of a work in progress, I feel.
-Mat
 
Brian Candler

Mat Schaffer said:
I doubt WEBrick was ever really intended for that sort of work. You
could try Mongrel, although it may yield the same result. Rock-solid
Ruby deployment is still something of a work in progress, I feel.

Or run your code as a FastCGI under Apache.

This doesn't stop you from trying to read a 700MB file into a string, of
course, but it does give you the option to simply open the file, read it
a chunk at a time, and squirt it to STDOUT.
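The read-and-squirt loop might look like this sketch in plain Ruby (the `stream_file` name and the 64KB chunk size are my own choices, not anything prescribed by FastCGI; under FastCGI the output object would be the request's output stream, e.g. `$stdout`):

```ruby
# Stream a file to an output IO a chunk at a time, so memory use stays
# at roughly one chunk regardless of how large the file is.
CHUNK_SIZE = 64 * 1024 # arbitrary; any modest fixed size works

def stream_file(path, out)
  # When generating the response ourselves, we emit the HTTP headers
  # first, then the body.
  out.write "Content-Type: application/octet-stream\r\n\r\n"
  File.open(path, 'rb') do |f|
    while (chunk = f.read(CHUNK_SIZE))
      out.write(chunk)
    end
  end
end

# Usage under FastCGI (not run here):
#   stream_file('/data/big_file.bin', $stdout)
```

`IO#read(n)` returns at most `n` bytes and `nil` at end of file, so the loop terminates without ever holding more than one chunk in memory.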

I believe Rails will run happily under FastCGI. You'll just need to tell it
not to render anything if you're generating your own HTTP headers and body
directly.

FastCGI also has the advantage of automatically making your program thread
safe, since each instance is a separate process. The downside is that if
you're handling (say) 5 concurrent client requests, you'll have five
Ruby+Rails processes spawned from scratch, each with its own memory
footprint.

Regards,

Brian.
 
