web hoster won't secure CGI


wana

According to Lincoln Stein's book on CGI.pm, to make CGI safe, you
have to make configuration changes at the beginning of the CGI.pm
file. I asked my web hosting company if they had made these changes
to protect me from multi-megabyte uploads or large entries in
textfields and they told me that they make no changes to Perl modules
and I have no access to it myself.

Is there another way to provide this protection?

Thanks,

wana
 

Matt Garrish

wana said:
According to Lincoln Stein's book on CGI.pm, to make CGI safe, you
have to make configuration changes at the beginning of the CGI.pm
file. I asked my web hosting company if they had made these changes
to protect me from multi-megabyte uploads or large entries in
textfields and they told me that they make no changes to Perl modules
and I have no access to it myself.

You should be more specific, because most people will probably not have read
the book. With respect to the example you cite, there's no reason why you
would need to change anything in the source. If you read the documentation
for the module you will quickly discover how to accomplish what you're after
(see in particular the section "Avoiding Denial of Service Attacks"). I do
find it a bit hard to believe that he would have only mentioned the source
code method in his book, though.

Matt
 

A. Sinan Unur

Read the CGI.pm documentation:

http://search.cpan.org/~lds/CGI.pm-3.05/CGI.pm#Avoiding_Denial_of_Service_Attacks

Michael Vilain said:
Don't use CGI.pm. Pick up a copy of CGI Programming on the World Wide
Web (1996, the 1st edition--NOT the 2nd), from O'Reilly, if you can
find it.

OK ... The blind leading the deaf or whatever the phrase is comes to
mind. Now, Gunnar, before you interject and say "it is OK not to use CGI
for such and such reasons", let me point out that it is not OK for this
particular poster to do that before actually learning Perl.
Here's the code from that book:

In the main script, call the function:

&parse_form_data(*FORM);

Do you know what the & does? What's up with the typeglob?
which will put your variables into the %FORM hash.

and here's the routine:

sub parse_form_data
{
local (*FORM_DATA) = @_;

local ( $request_method, $query_string, @key_value_pairs,
$key_value, $key, $value, $arg_value, $arg_index );

Oh, beautiful!
@key_value_pairs = split (/&/, $query_string);

Ta - da!
I stopped using this when CGI.pm came out and haven't looked
back since.

Your advice above and this statement contradict each other. I think you
should follow the latter.

Sinan.
 

Eric Schwartz

Michael Vilain said:
Don't use CGI.pm.

Wow, what a bad idea, in 90% of cases, including yours.
Here's the code from that book:

In the main script, call the function:

&parse_form_data(*FORM);

Wow, I can think of at least three things wrong with that function
call already-- can you?
which will put your variables into the %FORM hash.

and here's the routine:

sub parse_form_data
{
local (*FORM_DATA) = @_;

Why would you do this? eeeevil. If you must do such a thing, declare
the hash %FORM_DATA with 'my', and return it at the end of this
function. Better yet, don't, and use CGI.pm.
local ( $request_method, $query_string, @key_value_pairs,
$key_value, $key, $value, $arg_value, $arg_index );

All bad! Use lexical variables here instead. It's easy enough to
do-- swap 'my' for 'local' there. You haven't had to use 'local' for
private variables since perl 5 came out ages and ages ago. It's also
bad style in general to use package variables when they aren't needed.
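The swap is mechanical. A minimal sketch of the same parsing loop with lexicals throughout (the query string is a made-up example, and error handling is omitted):

```perl
use strict;
use warnings;

# The same routine with lexical variables: 'my' confines each variable
# to this block, while 'local' merely saves and restores a package
# global -- which is almost never what you want for private variables.
sub parse_form_data {
    my %form_data;                               # no typeglob needed
    my $query_string = $ENV{QUERY_STRING} || '';

    for my $key_value ( split /&/, $query_string ) {
        my ( $key, $value ) = split /=/, $key_value, 2;
        $form_data{$key} = $value if defined $key;
    }
    return %form_data;                           # return it; don't alias it
}

$ENV{QUERY_STRING} = 'name=wana&lang=perl';
my %form = parse_form_data();
```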
$request_method = $ENV{'REQUEST_METHOD'};
if ( $request_method eq "" && $ENV{'SHELL'} ne "" ) {
$request_method = "SHELL";
}

Note: CGI.pm handles all this for you. And better than this code
does.
if ($request_method eq "GET") {
$query_string = $ENV{'QUERY_STRING'};
} elsif ($request_method eq "POST") {
read (STDIN, $query_string, $ENV{'CONTENT_LENGTH'});

You don't check that you actually read $ENV{CONTENT_LENGTH} bytes (no quotes are needed there, or anywhere else you have used a hash key here). That could be bad, but you'll never know!
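A sketch of the missing check, in case you really are reading the body by hand rather than letting CGI.pm do it:

```perl
use strict;
use warnings;

# read() returns the number of bytes actually read, or undef on error,
# so compare the result against CONTENT_LENGTH rather than assuming it.
# A truncated body would otherwise be parsed silently.
my $length = $ENV{CONTENT_LENGTH} || 0;
my $bytes  = read STDIN, my $query_string, $length;

unless ( defined $bytes && $bytes == $length ) {
    die sprintf "short read: wanted %d bytes, got %s\n",
        $length, defined $bytes ? $bytes : 'an error';
}
```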
} elsif ($request_method eq "SHELL") {
#
# for the shell, build a string of arguments so it can be
# parsed later on
#
$arg_index = 1;
foreach $arg_value (@ARGV) {
$arg_value =~ s/=/%3D/g;
$value = sprintf ("ARG%d=%s",$arg_index,$arg_value);
if ($arg_index == 1) {
$query_string = $value;
} else {
$query_string = join ("&", $query_string, $value);
}
$arg_index++;
}

You don't do proper URL escaping on your keys or values there. CGI.pm
handles this properly.
} else {
&return_error (500, "Server Error",
"Server uses unsupported method");

Why are you calling functions with &? That hasn't been necessary for
years now.
}

@key_value_pairs = split (/&/, $query_string);

What if your user agent submits parameters separated by ';' instead of
'&'? This is perfectly legal to do, and you don't handle it at all.
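Handling both separators is a one-character-class change, e.g.:

```perl
use strict;
use warnings;

# The CGI conventions also allow ';' between key=value pairs, so split
# on a class that accepts either separator instead of '&' alone.
my $query_string    = 'name=wana;lang=perl&q=cgi';   # made-up example
my @key_value_pairs = split /[&;]/, $query_string;
```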
foreach $key_value (@key_value_pairs) {
($key, $value) = split (/=/, $key_value);
$value =~ tr/+/ /;
$value =~ s/%([\dA-Fa-f][\dA-Fa-f])/pack ("C", hex ($1))/eg;

I haven't checked, but I would bet this isn't a complete URL-decoding
algorithm.
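One concrete gap: the book's routine never decodes the *key*, only the value. A small decoder applied to both halves (a sketch, not a complete RFC 3986 implementation):

```perl
use strict;
use warnings;

# Decode one URL-encoded component: '+' becomes a space, then each
# '%XX' escape becomes the byte it names.
sub url_decode {
    my ($text) = @_;
    $text =~ tr/+/ /;
    $text =~ s/%([0-9A-Fa-f]{2})/chr hex $1/eg;
    return $text;
}

my %form;
for my $pair ( split /[&;]/, 'user+name=wana&note=100%25+perl' ) {
    my ( $key, $value ) = split /=/, $pair, 2;
    $form{ url_decode($key) } = url_decode($value);   # decode both halves
}
```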
$value =~ s/[;><&\*`\|\!]//g; # prevent sub-shell nasties

What sub shell? I see no sub shell here....
if (defined($FORM_DATA{$key})) {
$FORM_DATA{$key} = join ("\0", $FORM_DATA{$key}, $value);

Ick. How annoying; it makes you detect multiple values yourself and parse them apart again by hand. CGI.pm will happily hand you a list if you get multiple values for one named parameter.
} else {
$FORM_DATA{$key} = $value;
}
}
}
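To make the comparison concrete, here is what the "\0" scheme forces on every caller, next to CGI.pm's interface (shown as a comment, since it needs a live request):

```perl
use strict;
use warnings;

# The book's scheme: repeated fields are joined with "\0", and every
# caller must remember to split them apart again by hand.
my %form       = ( color => "red\0blue" );
my @book_style = split /\0/, $form{color};

# CGI.pm instead returns all the values directly in list context, e.g.:
#   my @colors = $q->param('color');    # ('red', 'blue')
```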


I stopped using this when CGI.pm came out and haven't looked back since.

Good advice! I was afraid you were still using this rather buggy
code. I would advise anyone else still using this code to stop using
it as well, and use CGI.pm.

And to forestall Gunnar, yes, it is possible to write your own correct
URL-parsing code, if you know what you're doing. This isn't it.
That's why, unless you have a good reason not to, you should use
CGI.pm.

-=Eric
 

brian d foy


wana said:
According to Lincoln Stein's book on CGI.pm, to make CGI safe, you
have to make configuration changes at the beginning of the CGI.pm
file. I asked my web hosting company if they had made these changes
to protect me from multi-megabyte uploads or large entries in
textfields and they told me that they make no changes to Perl modules
and I have no access to it myself.
Is there another way to provide this protection?

You don't need to change the module itself because you can set the
variable values yourself. Load CGI.pm, then change the values.

use CGI;

$CGI::PRIVATE_TEMPFILES = 1;

my $query = CGI->new();

You can install your own version of CGI.pm and change it however you
like. Just tell your script where to find it.

Maybe the world needs an interface to all of these variables, so
you could just use something like:

CGI->enable_private_tempfiles;
CGI->set_post_max( 1024 );
 

Gunnar Hjalmarsson

A. Sinan Unur said:
Now, Gunnar, before you interject and say "it is OK not to use CGI
for such and such reasons", let me point out that it is not OK for
this particular poster to do that before actually learning Perl.

:)

For some reason, both you and Eric mentioned me in this thread. Let me
explain (again): I have a relaxed relation to Perl modules, and I use
modules when I find it suitable to do so. I often use CGI.pm, and I
normally advise others to use it for parsing CGI data. At the same time,
to me it's not a gathering catastrophe each time somebody considers
using some other code.

Personally I pay more heed to CGI security, and it's satisfying to
notice that the OP knew that CGI.pm does not protect against DoS attacks
by default, despite several posts here that have given the opposite
impression. As others have pointed out, the CGI docs describe
alternative methods to enable such protection.

Michael's schizophrenic post didn't even address the OP's question, and
I had no intention to comment on it.
 

Tassilo v. Parseval

Also sprach wana:
According to Lincoln Stein's book on CGI.pm, to make CGI safe, you
have to make configuration changes at the beginning of the CGI.pm
file.

You don't have to alter CGI.pm's source code, though. These changes
happen through package variables that are accessible from outside the
module.
I asked my web hosting company if they had made these changes
to protect me from multi-megabyte uploads or large entries in
textfields and they told me that they make no changes to Perl modules
and I have no access to it myself.

Is there another way to provide this protection?

By setting $CGI::POST_MAX or even $CGI::DISABLE_UPLOADS to some sane
value after use()ing the module. See "Avoiding Denial of Service
Attacks" near the end of CGI.pm's perldocs.
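In a script, that might look like this minimal sketch (the 100K limit is just an example value):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Per-script limits -- no changes to the CGI.pm source are needed.
$CGI::POST_MAX        = 1024 * 100;   # reject request bodies over 100K
$CGI::DISABLE_UPLOADS = 1;            # refuse file uploads entirely

my $q = CGI->new;

# When a request exceeds POST_MAX, CGI.pm discards the parameters and
# records an error string that cgi_error() returns.
if ( my $error = $q->cgi_error ) {
    print $q->header( -status => $error );
    exit;
}
```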

Tassilo
 

A. Sinan Unur

Gunnar Hjalmarsson wrote:

:)

For some reason, both you and Eric mentioned me in this thread.

I forgot to put a smiley at the end ... It was meant in good humor,
especially because you and I had an exchange related to some other
home-cooked CGI parsing routine.
I have a relaxed relation to Perl modules, and I use modules when I find
it suitable to do so.

And I tend to go look for a module first and only rarely give them up (for
example, File::Copy in response to Abigail's comments).
to me it's not a gathering catastrophe each time somebody considers
using some other code.

True, but you have to condition that statement on the OP's identity.
it's satisfying to notice that the OP knew that CGI.pm does not protect
against DoS attacks by default, despite several posts here that have
given the opposite impression.

On the other hand, it does not speak well of the OP that he did not, upon
uncovering this, go and read the CGI.pm docs first. Instead, he went to his
hosting company and wanted them to change the site-wide CGI.pm to suit his
needs only. While we agree that it is useful to put a limit on POST data
size, it is far from clear that that limit ought to be the same for every
script.

Sinan.
 

wana

Gunnar said:
:)

For some reason, both you and Eric mentioned me in this thread. Let me
explain (again): I have a relaxed relation to Perl modules, and I use
modules when I find it suitable to do so. I often use CGI.pm, and I
normally advise others to use it for parsing CGI data. At the same time,
to me it's not a gathering catastrophe each time somebody considers
using some other code.

Personally I pay more heed to CGI security, and it's satisfying to
notice that the OP knew that CGI.pm does not protect against DoS attacks
by default, despite several posts here that have given the opposite
impression. As others have pointed out, the CGI docs describe
alternative methods to enable such protection.

Michael's schizophrenic post didn't even address the OP's question, and
I had no intention to comment on it.

Thanks, I read the CGI docs and the book chapter, and it does explain how
to protect scripts globally (by changing the CGI.pm file) or on a
script-by-script basis.  The book words it a little differently and
emphasizes changing the CGI.pm file more.  The script-by-script method is
mentioned as a way to override the changes you made to CGI.pm, so
individual scripts may open themselves up to larger uploads if they want,
although, on second reading, it is clear you can do it either way.

It seems to me that a web hosting service would probably want to set the
defaults in such a way to prevent attacks:

               $CGI::POST_MAX=1024 * 100;  # max 100K posts
               $CGI::DISABLE_UPLOADS = 1;  # no uploads

Then individual scripts still have the option to override.

I don't mean to put down my web provider.  They have given me incredible
service for the small price I pay.  It is like a home away from home.  I
didn't even know what Perl was when I signed up 2 years ago.  Now that I am
learning, I find that they have hundreds of modules installed and I have
had no problems running some cool programs there.  They just have a policy
of not modifying any modules, which makes sense.

By the way, the book 'MySQL and Perl for the Web' by Paul Dubois clued me in
on the importance of security in CGI scripts and the complete lack of
security that html forms provide.
 

Gunnar Hjalmarsson

wana said:
Thanks, I read the CGI docs and the book chapter, and it does explain
how to protect scripts globally (by changing the CGI.pm file) or on a
script-by-script basis. The book words it a little differently and
emphasizes changing the CGI.pm file more.

If that's the case, Lincoln presumably did not have shared web hosting
environments in mind when he wrote it.
It seems to me that a web hosting service would probably want to set
the defaults in such a way to prevent attacks:

$CGI::POST_MAX=1024 * 100; # max 100K posts
$CGI::DISABLE_UPLOADS = 1; # no uploads

I'd say that it would be most ill-advised to do so, for precisely the
same reason why Lincoln Stein does not change the default behaviour of
the module.
They just have a policy of not modifying any modules, which makes
sense.

Indeed. Actually, if they had agreed to change it after a request from
one of their customers, *that* would have been a reason to consider
putting them down. ;-)
By the way, the book 'MySQL and Perl for the Web' by Paul Dubois
clued me in on the importance of security in CGI scripts and the
complete lack of security that html forms provide.

That's great. Making DoS attacks more difficult is one thing you can
do. Validating the data and enabling taint mode are two other
important steps to reduce the inherent risks of CGI.
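For anyone unfamiliar with taint mode, a minimal sketch of the untainting idiom (run real CGI scripts with perl -T; the username pattern below is just an example whitelist):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# In a real CGI this script would be run with perl -T (taint mode):
# request data then can't reach the shell or the filesystem until it
# has passed through a regex capture, which is the untainting step.
sub untaint_username {
    my ($raw) = @_;
    return $1 if $raw =~ /\A(\w{1,32})\z/;   # captured text counts as clean
    return undef;                            # reject everything else
}

my $clean = untaint_username('wana');        # passes the whitelist
```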
 

ctcgag

According to Lincoln Stein's book on CGI.pm, to make CGI safe, you
have to make configuration changes at the beginning of the CGI.pm
file. I asked my web hosting company if they had made these changes
to protect me from multi-megabyte uploads or large entries in
textfields and they told me that they make no changes to Perl modules
and I have no access to it myself.

I am not an expert on this, but I believe that the OS provides the
capability to limit user processes in various ways which, if employed,
would be just as effective against multi-megabyte uploads as the changes
to CGI.pm would be. (And would also protect against other things that
changing CGI.pm does not.)
Is there another way to provide this protection?

You can do it yourself as described in the docs, but that only protects
you against a DoS aimed at you. A DoS aimed at your co-hostees could
still take you down along with the intended target.

Xho
 

Robert Sedlacek

On 01 Nov 2004 19:55:15 GMT, ctcgag wrote:
I am not an expert on this, but I believe that the OS provides the
capability to limit user processes in various ways which, if employed,
would be just as effective against multi-megabyte uploads as the changes
to CGI.pm would be. (And would also protect against other things that
changing CGI.pm does not.)

Sure. (man) ulimit [-a] shows that it's possible to limit file sizes,
maximum locked memory, maximum memory size, open files, CPU time,
maximum user processes, virtual memory size, and some others.

g,
Robert
 
