What is the best way to download files from the internet (HTTP) that
are greater than 1GB?
Here's the whole story....
I was trying to use Ruby Net::HTTP to manage a download from
Wikipedia... specifically, all current versions of the English one...
But anyways, as I was downloading it, I got a memory error as I ran
out of RAM.
My current code:
require 'net/http'

open(@opts[:out], "w") do |f|
  http = Net::HTTP.new(@url.host, @url.port)
  c = http.start do |http|
    a = Net::HTTP::Get.new(@url.page)
    http.request(a)
  end
  # the entire response body is buffered in memory before this write
  f.write(c.body)
end
I was hoping there'd be some method I could attach a block to, so
that it would call the block for each byte as it arrives.
Is there some way to write the bytes to the file as they come in, not
at the end?
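Something like this rough sketch is what I'm after (assuming @url and
@opts[:out] are set up as in my code above, and that passing a block
to http.request lets read_body yield the body in chunks as they
arrive, instead of buffering it all):

require 'net/http'

Net::HTTP.start(@url.host, @url.port) do |http|
  req = Net::HTTP::Get.new(@url.page)
  http.request(req) do |response|
    open(@opts[:out], "wb") do |f|
      # read_body with a block yields each chunk as it comes off
      # the socket, so the 1GB+ body never has to fit in RAM
      response.read_body do |chunk|
        f.write(chunk)
      end
    end
  end
end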
Thanks,
---------------------------------------------------------------|
~Ari
"I don't suffer from insanity. I enjoy every minute of it" --1337est
man alive