Mirroring large files over HTTP

Lars Haugseth

I'm working on a script where I want to download large files off a remote
web server and store on a local filesystem.

At the moment I'm using code like this:

require 'open-uri'

open(filename, 'w') do |file|
  file.write(open(remote_url).read)
end

I assume this will read the complete content of the remote file into
memory before writing it to the local file. If that assumption is
correct, what is the best/easiest way to do a buffered piecemeal
fetch/store? I've looked at the net/http library but haven't found
anything in there that looks relevant to this.
 
Lars Haugseth

* Lars Haugseth said:
> I'm working on a script where I want to download large files off a remote
> web server and store on a local filesystem.
>
> At the moment I'm using code like this:
>
> require 'open-uri'
>
> open(filename, 'w') do |file|
>   file.write(open(remote_url).read)
> end
>
> I assume this will read the complete content of the remote file into
> memory before writing it to the local file. If that assumption is
> correct, what is the best/easiest way to do a buffered piecemeal
> fetch/store? I've looked at the net/http library but haven't found
> anything in there that looks relevant to this.

Turns out the OpenURI module is indeed fetching the remote resource
in segments and storing to a temporary file. However, my code above
will read the complete contents of that file into memory before
writing it back out to another file.

By inspecting the OpenURI source code I've learned that this is how
it's done (sans proxy handling, error handling etc.):

require 'net/http'
require 'uri'

uri = URI.parse(url)
http = Net::HTTP.new(uri.host, uri.port)
http.request_get(uri.path) do |response|
  open(filename, 'w') do |file|
    response.read_body do |segment|
      file.write(segment)
    end
  end
end

I'm a little surprised not to find any convenience method in the standard
libraries doing all this for me, though.
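The streaming pattern above can be wrapped into exactly the kind of convenience method asked for later in this thread. A minimal sketch, assuming a hypothetical `Mirror` module (the module and method names are illustrative, not from any library):

```ruby
require 'net/http'
require 'uri'

# Hypothetical convenience wrapper -- not part of any standard library.
module Mirror
  # Download url to filename, streaming the body to disk in segments
  # so the whole file is never held in memory.
  def self.mirror(url, filename)
    uri = URI.parse(url)
    Net::HTTP.start(uri.host, uri.port) do |http|
      http.request_get(uri.path) do |response|
        File.open(filename, 'wb') do |file|
          response.read_body { |segment| file.write(segment) }
        end
      end
    end
  end
end
```

Usage would then be a single call, e.g. `Mirror.mirror('http://example.com/big.iso', 'big.iso')`. Note the `'wb'` mode, which avoids newline translation mangling binary files on some platforms.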
 
Eric Hodel

* Lars Haugseth said:
> Turns out the OpenURI module is indeed fetching the remote resource
> in segments and storing to a temporary file. However, my code above
> will read the complete contents of that file into memory before
> writing it back out to another file.
>
> I'm a little surprised not to find any convenience method in the
> standard libraries doing all this for me, though.

Why? It's all of one line:

output.write input.read(16384) until input.eof?
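For context, here is that loop in a runnable form, with StringIO objects standing in for the HTTP response body and the local file:

```ruby
require 'stringio'

input  = StringIO.new('a' * 50_000)  # stands in for the response body
output = StringIO.new                # stands in for the local file

# Copy in 16 KB chunks instead of slurping the whole input at once.
output.write input.read(16384) until input.eof?

output.string.bytesize  # => 50_000
```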
 
Lars Haugseth

* Eric Hodel said:
> Why? It's all of one line:
>
> output.write input.read(16384) until input.eof?

Nice enough, but one will need a bit more than that single line to do the
whole operation from start to finish.

I was thinking more of something like SomeClass.mirror(url, filename).
 
Lars Haugseth

* Lars Haugseth said:
> Nice enough, but one will need a bit more than that single line to do the
> whole operation from start to finish.
>
> I was thinking more of something like SomeClass.mirror(url, filename).

Today I came across the curb¹ gem (Ruby bindings for libcurl) while
reading a blog posting² about net/http performance, and this gem provides
a convenient class method that does exactly what I want:

require 'curb'
Curl::Easy.download(url, filename)

It also provides lots of other nice stuff, so I will definitely look
into using this one for my future HTTP client needs.

[1] http://curb.rubyforge.org/
[2] http://apocryph.org/analysis_ruby_18x_http_client_performance
 
