A big thank you to Robby Russell...

Gavin Kistner

...for providing another RubyForge mirror via his company,
PlanetArgon.
All mirror providers are listed here:

With the sudden influx of mirrors, what's the expected bandwidth
these days? I might be able to offer up another mirror if it's down
to not insane levels.
 
Tom Copeland

With the sudden influx of mirrors, what's the expected bandwidth
these days? I might be able to offer up another mirror if it's down
to not insane levels.

It's up to about 225 GB per month overall, so perhaps 50 GB per mirror.
But Dennis Oelkers kindly volunteered to shoulder half the load himself
on http://lauschmusik.de/ , so that cuts the bandwidth usage for the
other mirrors to perhaps 30 GB per month.

Yours,

Tom
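
For the arithmetic behind those figures, here's a quick check in Ruby; the
count of four non-Dennis mirrors is an assumption read from later in the
thread, not something stated above:

    # Rough check of the numbers above; the count of four "other" mirrors
    # is an assumption taken from later in this thread.
    total_gb      = 225
    dennis_share  = total_gb / 2.0          # Dennis carries half => 112.5 GB
    other_mirrors = 4
    puts (total_gb - dennis_share) / other_mirrors
    # => 28.125, i.e. roughly 30 GB per remaining mirror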
 
Sam Mayes

What's the process for becoming a mirror?

Sam

It's up to about 225 GB per month overall, so perhaps 50 GB per mirror.
But Dennis Oelkers kindly volunteered to shoulder half the load himself
on http://lauschmusik.de/ , so that cuts the bandwidth usage for the
other mirrors to perhaps 30 GB per month.

Yours,

Tom

 
Tom Copeland

What's the process for becoming a mirror?

Procedurally, just send me or Rich Kilmer an email and we can work out
the details. Technically, you just need to be able to rsync down about
1.1 GB of data to a host somewhere (rubyforge.yourdomain.org or
whatever) and then be able to handle the bandwidth load...

Yours,

Tom
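
For the curious, the sync itself can be as small as a cron-driven script
along these lines; the rsync endpoint shown is a placeholder, since the real
one is arranged by email as Tom describes:

    #!/usr/bin/env ruby
    # Minimal mirror-sync sketch. SOURCE is a made-up placeholder, not the
    # real RubyForge rsync endpoint; get the actual details from Tom or Rich.
    SOURCE = "rsync://rubyforge.example.org/rubyforge/"
    DEST   = "/var/www/rubyforge-mirror/"

    # -a preserves permissions and times, -z compresses, --delete removes
    # files that have disappeared upstream.
    ok = system("rsync", "-az", "--delete", SOURCE, DEST)
    abort("rsync failed") unless ok

The initial pull is the ~1.1 GB mentioned above; after that, only changed
files transfer.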
 
Austin Ziegler

Procedurally, just send me or Rich Kilmer an email and we can work out
the details. Technically, you just need to be able to rsync down about
1.1 GB of data to a host somewhere (rubyforge.yourdomain.org or
whatever) and then be able to handle the bandwidth load...

As the first mirror, I had committed to about 50 GB bandwidth out of my
available 100 GB. The 50 GB was blown in the first month (thanks, Curt!
-- it was because of Curt's RoR article) but settled down after that
with the addition of Dennis Oelkers's mirror. By the time the fourth and
fifth mirrors were announced, I was at about 70 GB again (which wasn't
a problem; the machine isn't used for much else right now), so I
expect to go below 30 GB per month *for now*, as Dennis is soaking up
50% of the total bandwidth and the other four mirrors are handling the
50% that's left. I wouldn't be surprised if the next move is about 35%
Dennis, 35% Robby, and 30% the rest of us, because Robby has bandwidth
to spare for RubyForge, too.

The commitment is likely to be about 30 GB initially.

-austin
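
Taking that projected split at face value (a sketch only; the 225 GB/month
total is carried over from Tom's earlier figure, and the remaining-mirror
count is a guess):

    total_gb = 225
    dennis   = total_gb * 0.35    # => 78.75 GB
    robby    = total_gb * 0.35    # => 78.75 GB
    rest     = total_gb * 0.30    # => 67.5 GB shared by everyone else
    puts rest / 3                 # assuming three remaining mirrors => 22.5 GB each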
 
Han Holl

Did you consider using bittorrent? The more it's used for legitimate
purposes, the harder it will be to do away with it. And you'll _never_
get slashdotted.

Cheers,

Han Holl


 
Austin Ziegler

Did you consider using bittorrent? The more it's used for legitimate
purposes, the harder it will be to do away with it. And you'll _never_
[get] slashdotted.

Well, honestly, BitTorrent requires that there be continuous interest in
the files being torrented. Unless the files are well seeded and/or
downloaded often, BitTorrent downloads will often be much slower than
simple FTP downloads with round-robin mirrors as have been set up for
RubyForge.

There is, by the way, exactly one file that would even remotely qualify
for BitTorrenting based on the popularity profile -- ruby182-15.exe.

-austin
 
Han Holl

Well, honestly, BitTorrent requires that there be continuous interest in
the files being torrented. Unless the files are well seeded and/or
downloaded often, BitTorrent downloads will often be much slower than
simple FTP downloads with round-robin mirrors as have been set up for
RubyForge.

Oh, I didn't know that. In my naivety I thought that if 4 servers were all
continuously offering these bittorrent files, simple FTP download speed
would be the lower boundary. Apparently there is more overhead in the
bittorrent protocol than I thought.
 
Tom Copeland

Did you consider using bittorrent? The more it's used for legitimate
purposes, the harder it will be to do away with it. And you'll _never_
get slashdotted.

Yup, we hosted some large files on BT for a bit... but the torrents
really didn't get used much. Also, our tracker got hijacked (due to a
misconfiguration, my fault) which kind of left me with a bad taste for
the whole thing.

Yours,

Tom
 
Tom Copeland

There is, by the way, exactly one file that would even remotely qualify
for BitTorrenting based on the popularity profile -- ruby182-15.exe.

Yup, and even that one didn't get downloaded via BT much; folks still
just went to the file releases page.

Yours,

Tom
 
Morgan

[Delayed reaction due to me still catching up on reading ruby-talk after
no mail access for a while... Up to October 1 now...]

Han said:
I thought that if 4 servers were all continuously offering these bittorrent
files, simple FTP download speed would be the lower boundary. Apparently
there is more overhead in the bittorrent protocol than I thought.

The problem as I understand it is that bittorrent is really best for
distributing small numbers of large files. A single tracker can coordinate
things for lots of files, but as far as actual seeding goes (the act most
analogous to a system providing an ftp download), it's unusual for more than
a few files to be seeded at a time. (The original bt client, I believe,
would only allow three instances to be open at a time.) This works well
enough for, say, having downloads of linux cd images. (Which is how I got
mine the last time I tried doing anything with it.) Not so good for
something like rubyforge, where you have a very large number of small files
- you could make a torrent of the entire thing maybe, which would be useful
to almost no one...

I've thought off and on about a system that could deal with this sort of
thing better. Something with file verification systems like bittorrent's
(both to protect against ordinary data corruption and malicious file
modifications), that allowed passively seeding files - if someone wants it,
it's available, but if not, resources aren't wasted on it. Also less
dependence on a single central server, since running a high-usage bt tracker
seems to be very hard on a system. Throw in some public/private key to allow
validation of <whatever file replaces .torrent> files that were acquired
through an untrusted source...

Mostly there's three things that have kept me from actually trying to make
this:
1. I don't know enough about ruby (particularly thread handling) to make it
   work.
2. I don't know enough about designing network protocols to make it work.
3. I doubt my ability to convince enough other people to use it to make it
   worthwhile.

-Morgan, hates seeing "seeds: 0"...
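
Morgan's verification-plus-signing idea can be sketched with nothing but
Ruby's standard library; the file names, manifest layout, and key handling
below are invented for illustration and aren't part of any existing protocol:

    require "digest"
    require "openssl"

    # Build a tiny manifest: file name => SHA-256 digest, so a downloader
    # can detect ordinary corruption or malicious modification.
    files    = Dir.glob("releases/*.gem")
    manifest = files.map { |f|
      "#{File.basename(f)} #{Digest::SHA256.file(f).hexdigest}"
    }.join("\n")

    # Sign the manifest so it can be fetched from an untrusted mirror and
    # still be validated against the project's published public key.
    key       = OpenSSL::PKey::RSA.new(File.read("project_private.pem"))
    signature = key.sign(OpenSSL::Digest::SHA256.new, manifest)

    File.write("MANIFEST", manifest)
    File.write("MANIFEST.sig", signature)

    # A client would verify with:
    #   pub = OpenSSL::PKey::RSA.new(File.read("project_public.pem"))
    #   pub.verify(OpenSSL::Digest::SHA256.new,
    #              File.read("MANIFEST.sig"), File.read("MANIFEST"))

A downloader fetches MANIFEST and MANIFEST.sig from any untrusted mirror,
checks the signature against the project's published public key, then checks
each file's digest against the manifest.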
 
