Seth Morabito
I've done some searching through the archives, but so far I haven't found an
answer to this question.

I have an application that allows users to request arbitrary URLs. The
underlying mechanism uses Net::HTTP.get() to fetch the object at the URL and
attempts to parse it as an XML document.
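For concreteness, the current mechanism is essentially the following (a minimal sketch; `fetch_and_parse` and `parse_xml` are just names I've picked for illustration):

```ruby
require 'net/http'
require 'uri'
require 'rexml/document'

# Parsing split out from fetching, purely for clarity.
def parse_xml(body)
  REXML::Document.new(body)
end

# Current approach: Net::HTTP.get reads the *entire* response body into
# memory before returning -- which is exactly the exposure described below.
def fetch_and_parse(url)
  parse_xml(Net::HTTP.get(URI.parse(url)))
end
```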
That all works fine, but it leaves open a fairly trivial DoS attack -- a
user can create a CGI that spews back content continuously, for example. To
lessen this potential, I would really like to specify a byte limit for the
GET, i.e., "Stop reading and close the socket if you have read more than
1MB". HTTP 'Range' doesn't seem like an option, because there's no reason to
expect a malicious server to respect it in the request.
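Roughly what I'm hoping for, sketched with Net::HTTP's streaming read_body (untested; `fetch_limited`, `append_with_limit`, and `TooLargeError` are names I've made up):

```ruby
require 'net/http'
require 'uri'

class TooLargeError < StandardError; end

ONE_MEGABYTE = 1_048_576

# Append a chunk to the buffer, raising once the running total passes the
# limit. Pulled out as its own method so the cap logic is easy to test.
def append_with_limit(buffer, chunk, limit)
  buffer << chunk
  raise TooLargeError, "response exceeded #{limit} bytes" if buffer.length > limit
  buffer
end

# Stream the body instead of using Net::HTTP.get, bailing out mid-read.
def fetch_limited(url, limit = ONE_MEGABYTE)
  uri = URI.parse(url)
  body = ''
  Net::HTTP.start(uri.host, uri.port) do |http|
    http.request_get(uri.request_uri) do |response|
      response.read_body { |chunk| append_with_limit(body, chunk, limit) }
    end
  end
  body
end
```

Raising out of read_body abandons the rest of the response, and Net::HTTP.start closes the connection as the exception propagates -- but I'm not sure this is the cleanest way, hence the question.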
Does anyone have any ideas, or pointers?
Thanks,
-Seth