BitTorrent Idea, thoughts?

Discussion in 'Ruby' started by Jp Hastings-spital, Jun 11, 2009.

  1. (Bear with me, there's a reason this is on ruby-forum)
    BitTorrent downloads can be a pain when there are no seeders at all. I
    propose a special client, run in tandem with any BitTorrent client
    (there are some requirements, which I'll go through later), that would
    allow an appropriately set-up torrent to always have at least one peer.

    The basic concept: this new client finds the "pieces" of the torrent's
    data from alternate sources, namely the http-web (sites like
    rapidshare.com and mediafire.com, but not exclusively so; even ftp and
    basic hosting could be used).

    In detail:
    For the uninitiated: torrents work by splitting the target files into
    pieces (usually 64KB, 128KB, 512KB or 1MB in size). Each piece has its
    SHA1 hash taken and put into the .torrent file, along with a 'tracker'
    url (eg. piratebay.org), and this .torrent file is then distributed.
    When a peer wants to download the files in the torrent, it will (very
    broadly) query the 'tracker' to get a list of computers with that
    specific torrent available, go through the list of SHA1 hashes, and then
    ask each of those computers for some of the pieces it needs (verifying
    each downloaded piece against its SHA1 hash).

    The problem arises when the tracker lists no computers as having the
    torrent available; then there is nowhere to download the file from!
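    To make the piece/hash relationship concrete, here's a minimal sketch of how the per-piece SHA1 hashes are derived; the function name and default piece length are my own choices for illustration, not rubytorrent's API:

```ruby
require 'digest/sha1'

# Illustrative sketch: derive per-piece SHA1 hashes the way a .torrent
# does. 512KB is one of the piece sizes mentioned above.
PIECE_LENGTH = 512 * 1024

def piece_hashes(path, piece_length = PIECE_LENGTH)
  hashes = []
  File.open(path, 'rb') do |f|
    # Read the file one piece at a time; the final piece may be shorter.
    while (chunk = f.read(piece_length))
      hashes << Digest::SHA1.hexdigest(chunk)
    end
  end
  hashes
end
```

    (A real .torrent stores the raw 20-byte digests concatenated, rather than hex strings, but hex is handier for filenames and tags as described below.)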

    My proposed solution: the initial seeder (or anyone with the torrent and
    the files) generates hundreds of files, each containing exactly one
    piece of the torrent data, and named by the SHA1 hash referring to that
    piece. A number of these files are put into a zip file and uploaded to
    rapidshare.com/mediafire.com/an ftp site/anywhere. The URL of the zip
    file is put into a bookmarking site (I've actually built a very simple
    dedicated bookmarking system for this purpose) and the URL is tagged
    with something like:
    'torrent:piece=01234_the_pieces_sha1_hash_here_BCDEF' - once for each of
    the pieces contained within the zip. (The URL would also be tagged with
    the 'infohash' of the torrent, allowing the file to be associated with a
    specific torrent)
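    The seeder-side step above could be sketched like this (all names are mine; bundling the piece files into a zip and uploading them is left out, since rubyzip or the `zip` CLI would handle that):

```ruby
require 'digest/sha1'
require 'fileutils'

# Sketch of the seeder side: split the payload into pieces, write each
# piece to a file named after its SHA1 hash, and collect the tags that
# would be attached to the uploaded zip's URL on the bookmarking site.
def export_pieces(path, out_dir, piece_length = 512 * 1024)
  FileUtils.mkdir_p(out_dir)
  tags = []
  File.open(path, 'rb') do |f|
    while (chunk = f.read(piece_length))
      hash = Digest::SHA1.hexdigest(chunk)
      # One file per piece, named by the piece's hex SHA1 hash
      File.binwrite(File.join(out_dir, hash), chunk)
      tags << "torrent:piece=#{hash}"
    end
  end
  tags
end
```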

    My proposed client would search the bookmarking site for any URLs
    containing any pieces that might be needed by the 'real' bittorrent
    client, would download them, extract the zip and provide the piece to
    the 'real' bittorrent client as if it were a local peer.
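    The lookup step could work roughly like this. The tag index shape here (piece hash => URLs of zips containing that piece) is an assumption about what the bookmarking site would return; a live client would build it by querying the site over HTTP rather than passing it in:

```ruby
# Sketch: given the pieces the 'real' client still needs and a tag
# index from the bookmarking site, greedily pick zip URLs so that each
# download covers as many still-needed pieces as possible.
def zips_to_fetch(needed_hashes, tag_index)
  remaining = needed_hashes.to_a.uniq
  chosen = []
  until remaining.empty?
    # Pick the URL covering the most still-needed pieces
    url, covered = tag_index
      .flat_map { |hash, urls| urls.map { |u| [u, hash] } }
      .group_by(&:first)
      .map { |u, pairs| [u, pairs.map(&:last) & remaining] }
      .max_by { |_, cov| cov.length }
    break if url.nil? || covered.empty?  # nothing left to gain
    chosen << url
    remaining -= covered
  end
  chosen
end
```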

    --

    So, I've written a (painfully hack-ish) Ruby script to create these zips
    and upload them, but I'd now like to go about modifying rubytorrent
    (http://rubytorrent.rubyforge.org/) - old though it is - to work as one
    of these 'fake' peers.

    Do you think this is a reasonable idea?
    Can 'Local Peer Discovery' be used in reverse to find out what
    torrents/pieces a specific client is looking for?
    - (The proposed bonjour/zeroconf announcing of torrents/pieces would
    have worked for this, but I'm unsure as to how Local Peer Discovery
    works - as implemented by uTorrent etc)

    I'd love to hear anyone's opinions on usability, feasibility or any
    other 'bilities. Cheers, JP
    --
    Posted via http://www.ruby-forum.com/.
     
    Jp Hastings-spital, Jun 11, 2009
    #1

  2. Roger Pack Guest

    > My proposed client would search the bookmarking site for any URLs
    > containing any pieces that might be needed by the 'real' bittorrent
    > client, would download them, extract the zip and provide the piece to
    > the 'real' bittorrent client as if it were a local peer.


    isn't this a single point of failure, though? and who's going to give
    all the free bandwidth?
    -=r
     
    Roger Pack, Jun 12, 2009
    #2

  3. Roger Pack wrote:
    >> My proposed client would search the bookmarking site for any URLs
    >> containing any pieces that might be needed by the 'real' bittorrent
    >> client, would download them, extract the zip and provide the piece to
    >> the 'real' bittorrent client as if it were a local peer.

    >
    > isn't this a single point of failure, though? and who's going to give
    > all the free bandwidth?
    > -=r


    Cheers Roger,
    Do you mean the bookmarking site, or the piece hosting?

    The bookmarking system can be adapted to use almost any webservice where
    you can tag objects (the object would contain the URL information: eg.
    tumblr posts, or URLs on delicious, furl, reddit etc, or even a URL
    encoded in a 2D matrix barcode, so image sites like flickr could be
    used). Granted, it looks like it would be a centralised system, but if
    a number of sites could be used then my proposed client could query
    each of these systems for pieces. On top of that, you could automate
    the decentralising by having each tag-site tag every other tag-site's
    URL with 'torrent-tag-site' or something similar.
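    That self-decentralising directory could be sketched as a simple crawl; `fetch_tagged_urls` here is a stand-in for a real HTTP query against each bookmarking service:

```ruby
# Sketch: starting from one known tag-site, follow each site's
# 'torrent-tag-site' bookmarks to discover all the others, breadth-first.
def discover_tag_sites(seed_url, fetch_tagged_urls)
  known = [seed_url]
  queue = [seed_url]
  until queue.empty?
    site = queue.shift
    fetch_tagged_urls.call(site, 'torrent-tag-site').each do |url|
      next if known.include?(url)  # skip sites we've already seen
      known << url
      queue << url
    end
  end
  known
end
```

    The breadth-first walk means any tag-site reachable from the seed gets found, even if the sites only link to each other indirectly.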

    The piece hosting could be provided by any mirror service, by a private
    hoster (eg. I could host the pieces of my own comic book, in order to
    make sure it's always available), or by rapidshare, mediafire or any
    other service that provides free hosting.
     
    Jp Hastings-spital, Jun 12, 2009
    #3
