gets() - dangerous?


Jordan Abel

(e-mail address removed) said:


Who cares? What would the press know about it?


I like C. I read the Wikipedia article on C. It's very anti-C, and clearly
written by someone who doesn't know C well enough to use it properly.

The Wikipedia attitude to C is like that of an amateur cook railing against
the dangers of carving knives, and recommending butter knives for all your
carving needs.

I also read some of the Wiki articles on other languages - C++, Python,
Lisp. No such anti-language sentiment there.

From a comp.lang.c perspective, then, Wikipedia sucks.

So edit it to give it a more neutral point of view.
 

Joe Wright

Richard said:

Or even simple partisanship. Wikipedia shows considerable dislike for C, but
is very positive about, say, C++, Python, and Lisp.

Encyclopaediae are supposed to be impartial. Whoever wrote the C bit did not
honour this tradition. Not sure why - for heaven's sake, it's only a
programming language! But take a look, and you'll see a "criticisms"
section - which is not something I noticed in the C++, Python or Lisp
sections. Does this mean those languages are beyond criticism? Or does it
simply mean the Wikids don't understand C very well?
Clearly an opportunity for you, Richard, to set the record straight. You
have credentials enough to be accepted by Wiki, I'm sure. And you'd be
good at it; it's only writing, after all. Not reading. :)
 

Richard Heathfield

Joe Wright said:
Clearly an opportunity for you, Richard, to set the record straight.

Don't tempt me.
You have credentials enough to be accepted by Wiki, I'm sure.

That would change as soon as I started deleting entire swathes of useless
material and told them to read a decent book about C and spend a few years
writing it properly before expressing an opinion on it.
 

those who know me have no need of my name

in comp.lang.c i read:
Why did it make it into the standard, then?

because it was in wide use. see the rationale if you would like more
words.
 

S.Tobias

Emmanuel Delahaye said:
(e-mail address removed) wrote:
Personally, I am waiting for the compiler that implements gets as the
following:

char *gets(char *s) {
    ....
    remove(__FILE__);
    return s;
}

BTW, I suppose that you want some [C99] 'inline'. If not, the effect
would be limited (the implementation file is probably not that close).

I don't think `inline' would help anything. Inline functions are not
expanded the same way as macros are, and `__FILE__' in the above code
(after adding `inline') would still resolve to the name of the
implementation file.

#define gets(s) (remove(__FILE__), gets(s))
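
For illustration, a minimal sketch of why the macro version works where
`inline' would not: the macro expands at each call site, so __FILE__ names
the caller's own source file. (The file name badcode.c and the buffer are
made up for the example, this assumes a pre-C11 library that still
declares gets(), and redefining a standard library name is itself not
strictly conforming - but that is rather the point of the joke.)

/* badcode.c - hypothetical caller */
#include <stdio.h>

/* defined after <stdio.h> so it does not mangle the library's declaration */
#define gets(s) (remove(__FILE__), gets(s))

int main(void)
{
    char buf[80];
    gets(buf);   /* expands here: remove("badcode.c"), then the real gets(buf) */
    return 0;
}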
 

Kenny McCormack

Richard Heathfield said:
Or even simple partisanship. Wikipedia shows considerable dislike for C, but
is very positive about, say, C++, Python, and Lisp.

I don't see where you get this. I read the Wikipedia entry for C, and,
yes, it includes a Criticism section. I consider the article quite
balanced, and it sounds like you are saying that any criticism is bad.
This is the position one expects from defenders of a faith. A "you're
either with us or against us" mentality.

I think we all agree that C has its problems. First of all, any language
that was designed to be a "portable assembler" is going to have problems
in the safety department; and second, the existence of all this "UB" in
the language definition (i.e., that which is the primary subject of this
newsgroup) is not a Good Thing. Necessary, of course, given what C is, but
not in any sense desirable in the abstract.

I think it is entirely fair and reasonable (aye, in fact to not do so would
be irresponsible) to warn potential newcomers to the language of its
dangers.
 

Kenny McCormack

Richard Heathfield said:
That would change as soon as I started deleting entire swathes of useless
material and told them to read a decent book about C and spend a few years
writing it properly before expressing an opinion on it.

But that's precisely the problem - most (counted by sheer numbers - i.e.,
one human being, one vote) people who use C don't and never will use it
correctly. I think it is for them that the Wikipedia article is written.

Then, of course, there is also the universal phenomenon that whenever the
media (media of any nature, and yes, in today's world, that includes
Wikipedia) report on *anything* that you (the rhetorical "you") know anything
about, they get it all wrong. Or, more precisely, all you see are the
errors. Just something we all have to live with.
 

Kenny McCormack

Joe Wright said:
When was it that use of gets() became widely known as evil? I started C
fifteen or more years ago and it was evil then. Why are some now just
discovering it is evil? Is anyone listening?

Ah, I see. You are a newcomer to the language.
 

Kenny McCormack

How could anyone do so once the executable binary has been generated? [...]

Why don't you ask the FBI Computer Task Force?

Google for "Smashing the stack for fun and profit".
 

Malcolm

Richard Heathfield said:
That would change as soon as I started deleting entire swathes of useless
material and told them to read a decent book about C and spend a few years
writing it properly before expressing an opinion on it.
I once expressed scepticism about formal methods (my point was that usually
the formal method is so complex and burdensome that it is easier for a human
to write correct code first time than to follow the method in all its
intricacies).

Some supporter of formal methods said that I didn't have the experience to
make such an assertion.

I replied that I had been on a six month training course on formal methods.

"Ha," said the supporter of formal methods, "the people who devise these
methods have often spent twenty years developing them. And you are rejecting
them on the basis of a six month course."

The problem with that argument is that the number of six month courses I can
go on is strictly limited. I wouldn't say I am necessarily right about
formal methods, but I have a great deal of experience in programming. If
someone cannot convince me of the value of his approach in six months, then
I have a strong, if qualified, case that the approach is not, in fact,
valuable.
 

Richard Heathfield

Kenny McCormack said:
I don't see where you get this. I read the Wikipedia entry for C, and,
yes, it includes a Criticism section. I consider the article quite
balanced, and it sounds like you are saying that any criticism is bad.

Not at all. But if you're going to have a "criticisms" section for one
language, why not for all of them? Are all languages flawless, perfect, and
beyond reproach, except for C and Pascal?
This is the position one expects from defenders of a faith. A "you're
either with us or against us" mentality.

Nonsense. I'm not asking for partiality. I'm asking for impartiality.

I think we all agree that C has its problems.

And C++ doesn't?
First of all, any language that was designed to be a "portable assembler"
is going to have problems in the safety department; and second, the
existence of all this "UB" in the language definition (i.e., that which is
the primary subject of this newsgroup) is not a Good Thing. Necessary, of
course, given what C is, but not in any sense desirable in the abstract.

Of course it's desirable. It gives the C programmer and implementation lots
of scope for inventiveness in a tight spot. Take, for example, this code:

void printat(const char *s, int x, int y)
{
    unsigned char *p = (unsigned char *)0xb8000000UL + 160 * y + 2 * x;
    int c = (g_fg << 4) | g_bg;
    while (*s)
    {
        *p++ = *s++;
        *p++ = c;
    }
}

Extremely badly-behaved code, utterly undefined behaviour, but it works just
fine on the right platform, and is extremely quick compared to the
alternatives on that platform. For the Standard to mandate the behaviour of
this code would be meaningless, and for the Standard to forbid this code
would be overly restrictive. Making the behaviour undefined makes perfect
sense. Basically, it's saying "weeeelllll, okay, if that's what you want to
do, I won't try to stop you", which is fine by me.
I think it is entirely fair and reasonable (aye, in fact to not do so
would be irresponsible) to warn potential newcomers to the language of its
dangers.

That is true of all programming languages of any power, so now you are
effectively suggesting that Wikipedia is irresponsibly lax in not warning
of the dangers of other languages.
 

Richard Heathfield

Malcolm said:
"Ha," said the supporter of formal methods, "the people who devise these
methods have often spent twenty years developing them. And you are
rejecting them on the basis of a six month course."

And you had every right to choose not to use formal methods, on the strength
of your six month course spent learning about them. What your six month
course does /not/ give you is the credentials necessary for writing an
authoritative encyclopaedia article criticising formal methods.
 

Grumble

Lee said:
Whenever I use the gets() function, the gnu c compiler gives a
warning that it is dangerous to use gets(). Is this due to the
possibility of array overflow? Is it correct that the program flow can
be altered by giving some specific calculated inputs to gets()? How
could anyone do so once the executable binary has been generated? I
have heard that many of the security problems and other bugs are due to
array overflows.

http://www.insecure.org/stf/smashstack.txt
 

Malcolm

Richard Heathfield said:

And you had every right to choose not to use formal methods, on the
strength of your six month course spent learning about them. What your six
month course does /not/ give you is the credentials necessary for writing
an authoritative encyclopaedia article criticising formal methods.
It would be rare for someone to say "I have spent twenty years studying and
developing formal methods, and I conclude that I have basically wasted my
time and they cannot generally improve productivity or error-rate".
Not impossible or unheard of, but rare.
 

Richard Heathfield

Malcolm said:
It would be rare for someone to say "I have spent twenty years studying
and developing formal methods, and I conclude that I have basically wasted
my time and they cannot generally improve productivity or error-rate".
Not impossible or unheard of, but rare.

Someone who has spent 20 years studying and developing a discipline is
indeed likely to look favourably upon it; but he or she will also know a
great deal about it. And that's not a bad place from which to write an
encyclopaedia article, in many circumstances. (Not all; I did think of a
few counter-examples to this!) Certainly a position of relative ignorance
is a /bad/ place from which to write an encyclopaedia article.

Let's take one or two of the Wiki criticisms and look at them more closely:

"In other words, C permits many operations that are generally not
desirable".

So what? Just because Mr Generally doesn't want this particular operation,
it doesn't mean /I/ don't want it or /you/ don't want it. Good for C!

"many simple errors made by a programmer are not detected by the compiler or
even when they occur at runtime."

But programmers tend to make these errors less and less as they gain
knowledge of and experience with C, and so this is a diminishing problem.
If they're bright, the programmers will in any case learn from other
programmers' experiences rather than their own. So this really isn't as big
a problem as it is made to sound.

"One problem with C is that automatically and dynamically allocated objects
are not initialized; they initially have whatever value is present in the
memory space they are assigned."

That isn't a problem at all! It's common sense that the bits in a space
aren't the bits you want until you set them to be the bits you want.
Setting them arbitrarily to zero as a matter of language policy is just a
pointless intermediate step. If /you/ want a given object to have a value
of 0, C makes that easy to do: T obj = {0};
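
For illustration, a minimal sketch (the struct is invented for the
example) of how cheap that is when you actually want the zeros:

#include <stdio.h>

struct point { int x; int y; double weight; };

int main(void)
{
    struct point p = {0};   /* every member zero-initialized, on request */
    int n[5] = {0};         /* the same idiom works for arrays */

    printf("%d %d %g %d\n", p.x, p.y, p.weight, n[4]);
    return 0;
}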

"Another common problem is that heap memory cannot be reused until it is
explicitly released by the programmer with free()".

That's simply not true. As long as you have a pointer to it, you can and may
keep on using it. And if you don't, you mustn't. The article writer's
answer to this tiny housekeeping matter is automatic garbage collection,
which has a whole bundle of its own issues.
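
A minimal sketch of the distinction the article misses: the block is the
program's to use for as long as it holds a pointer to it; free() is only
about handing it back once the program has finished with it:

#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *buf = malloc(64);
    if (buf == NULL)
        return EXIT_FAILURE;

    strcpy(buf, "still in use");   /* legal to go on using the block... */
    buf[0] = 'S';                  /* ...for as long as we keep the pointer */

    free(buf);                     /* handed back only when we are done with it */
    return 0;
}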

"Pointers are one primary source of danger; because they are unchecked, a
pointer can be made to point to any object of any type, including code, and
then written to, causing unpredictable effects."

Actually, you have to be fighting the type system if you want to get a
pointer of any object type to point to something else. For void *, sure,
but void * has redeeming features which make it worth keeping. I doubt
whether any experienced C programmer really considers this to be a problem.
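
A small sketch of that point (the comparison function is invented for the
example): an object pointer will not silently point at some other type -
you have to cast, i.e. fight the type system - whereas void * converts
implicitly, which is exactly the redeeming feature that makes generic
interfaces such as qsort() work:

#include <stdlib.h>

static int cmp_int(const void *a, const void *b)
{
    const int *x = a, *y = b;   /* void * converts implicitly, no cast */
    return (*x > *y) - (*x < *y);
}

int main(void)
{
    double d = 3.14;
    /* int *ip = &d; */         /* constraint violation: the compiler objects */
    int *ip = (int *)&d;        /* only possible by fighting the type system */
    (void)ip;                   /* (and dereferencing it would be asking for trouble) */

    int v[] = {3, 1, 2};
    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
    return 0;
}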

"Although C has native support for static arrays, it does not verify that
array indexes are valid (bounds checking). For example, one can write to
the sixth element of an array with five elements, yielding generally
undesirable results. This is called a buffer overflow. This has been
notorious as the source of a number of security problems in C-based
programs."

Absolutely true. And if you put morons into Mack trucks, the accident rate
for Macks will go up. Put those morons into Mercs, and the accident rate
for Mercs will climb. Now set those morons to writing C code, and look what
happens to the accident rate for C programs.

Buffer overflow is a known and very minor problem. The reason it's a minor
problem is this: it's a simple, easy-to-understand problem. It's not always
easy to understand how it can be exploited, but that's irrelevant. The
weakness itself is simple, and simply avoided. This is like saying "if you
go walking on the motorway, you might get killed; DON'T WALK ON THE
MOTORWAY". People still go walking on the motorway, and people still get
killed doing so. That is not the fault of the motorway.
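
To show how simple the weakness - and its avoidance - really is, a minimal
sketch (not taken from the article):

#include <stdio.h>

int main(void)
{
    int a[5];

    /* The error the article describes: a[5] is the sixth element of a
       five-element array, so writing to it is undefined behaviour. */
    /* a[5] = 42; */

    /* The avoidance is equally simple: keep the index within bounds. */
    for (size_t i = 0; i < sizeof a / sizeof a[0]; i++)
        a[i] = 0;

    printf("%d\n", a[4]);
    return 0;
}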

Incidentally, the article was very complimentary about "Numerical Recipes in
C" for its innovative approach to arrays before I corrected it a few weeks
or months ago (along with one or two other minor corrections I had made as
a prelude to an overhaul, which I abandoned when I found that my
corrections had been edited!). I pointed out that the Numerical Recipes
"solution" isn't a solution at all, being based on utter ignorance of the
rules of C - but that's been modded down to "there is also a solution based
on negative addressing". Stupid stupid stupid.

Well, I could go on to address the other crits if I could be bothered. Let
the Wikids put the above right first. At present, I cannot recommend the
Wiki's C article to anyone. It is, quite simply, riddled with wrongs.
 

James Dow Allen

Jack said:
The solution is simple: don't use gets(). Not ever.

I suppose my presence in these c.l.c dialogs is a form of masochism but
....

I use gets() every day! I think I get the "dangerous" message from almost
every make I do! Frankly, if I used a gets() alternative to avoid the
"danger" I'd probably end up using a strcpy() with the same danger!

You don't need to URL me to the "dangerous" explanation: I used to design
exploits myself. But the fact is, most of the programs I write are not for
distribution and are run only on my personal machine, usually with an
impermeable firewall. Who's going to exploit me? My alter ego? My
10-year-old daughter? The input to my gets() is coming from a file I or my
software created, and which has frequent line-feeds. The buffer into which
I gets() is ten times as big as necessary. If one of my data files gets
corrupted, diagnosing the gets() overrun would be the least of my worries.

I'll agree that such coding should be avoided in programs with
unpredictable input, but there are *lots* and *lots* of code fragments
that suffer from the same "danger" as gets(), so to me the suggestion that
gets() specifically be barred by the compiler seems like a joke. Or do you
think strcpy() should be barred also?

Glad to help,
James
 

Richard Heathfield

James Dow Allen said:
I'll agree that such coding should be avoided in programs with
unpredictable input,

And therefore the only safe advice here in clc is "don't use gets()", since
we can never predict what input will be provided to it.
but there are *lots* and *lots* of code fragments that suffer
from the same "danger" as gets(), so to me the suggestion that gets()
specifically be barred by the compiler seems like a joke. Or do you
think strcpy() should be barred also?

The thing about gets() is that you can't code to make it robust, in a way
that you can with strcpy(). You shouldn't ever, ever, EVER call the
strcpy() function until you have first checked that your target buffer is
big enough to receive the input string. You see? You can hack up a check to
make a strcpy() call safe. You can't do the same for gets(), because to do
so would involve seeing into the future.
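
A minimal sketch of the difference (the buffer sizes and messages are made
up for the example):

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *src = "some input string";
    char dst[32];
    char line[32];

    /* strcpy() can be made safe: check the length against the target first. */
    if (strlen(src) < sizeof dst)
        strcpy(dst, src);
    else
        fputs("input too long, refusing to copy\n", stderr);

    /* gets() cannot be checked: there is no way to tell it how big line is,
       and no way to know in advance how long the next line of stdin will be. */
    /* gets(line); */                       /* unbounded write into line */
    if (fgets(line, sizeof line, stdin))    /* fgets() takes the size, so it can stop */
        printf("read: %s", line);

    return 0;
}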
 

Giannis Papadopoulos

James said:
I suppose my presence in these c.l.c dialogs is a form of masochism but
...

I use gets() every day! I think I get the "dangerous" message from almost
every make I do! Frankly, if I used a gets() alternative to avoid the
"danger" I'd probably end up using a strcpy() with the same danger!

You don't need to URL me to the "dangerous" explanation: I used to design
exploits myself. But the fact is, most of the programs I write are not for
distribution and are run only on my personal machine, usually with an
impermeable firewall. Who's going to exploit me? My alter ego? My
10-year-old daughter? The input to my gets() is coming from a file I or my
software created, and which has frequent line-feeds. The buffer into which
I gets() is ten times as big as necessary. If one of my data files gets
corrupted, diagnosing the gets() overrun would be the least of my worries.

I'll agree that such coding should be avoided in programs with
unpredictable input, but there are *lots* and *lots* of code fragments
that suffer from the same "danger" as gets(), so to me the suggestion that
gets() specifically be barred by the compiler seems like a joke. Or do you
think strcpy() should be barred also?

Glad to help,
James

Although it may seem that it is OK to use such code for personal apps, it
is not a very good idea overall.

And I think it is a good practice to write robust and safe programs,
even for personal use.
 

Eric Sosman

James said:
I use gets() every day! [...]
[...] The input to my gets()
is coming from a file I or my software created, and which has
frequent line-feeds. The buffer into which I gets() is ten times
as big as necessary. [...]

Semi-rhetorical question: If you're sure that the
file's lines are of reasonable length, why make the
buffer ten times larger than needed? Pascal's Wager?
 

Richard Heathfield

Eric Sosman said:
James said:
I use gets() every day! [...]
[...] The input to my gets()
is coming from a file I or my software created, and which has
frequent line-feeds. The buffer into which I gets() is ten times
as big as necessary. [...]

Semi-rhetorical question: If you're sure that the
file's lines are of reasonable length, why make the
buffer ten times larger than needed? Pascal's Wager?

Something like that, I guess.

For the record, if I'm cranking out some one-off code, I use oversized
buffers too - much, much larger than I could possibly need - but I still
use fgets rather than gets.

(If I'm not sure that the code is going to be discarded fairly soon, I would
not use fgets either; instead, I'd use something that can handle
arbitrarily long lines.)
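
For what it's worth, one hedged sketch of what "something that can handle
arbitrarily long lines" might look like - an fgets()-plus-realloc() loop;
the function name, starting size, and growth factor are just illustrative
choices:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Read one line of any length from fp; the caller frees the result.
   Returns NULL on end-of-file with nothing read, or on allocation failure. */
char *read_line(FILE *fp)
{
    size_t cap = 128, len = 0;
    char *line = malloc(cap);

    if (line == NULL)
        return NULL;

    while (fgets(line + len, (int)(cap - len), fp) != NULL)
    {
        len += strlen(line + len);
        if (len > 0 && line[len - 1] == '\n')
            return line;                 /* a complete line, newline and all */

        char *bigger = realloc(line, cap *= 2);
        if (bigger == NULL)
        {
            free(line);
            return NULL;
        }
        line = bigger;
    }

    if (len == 0)                        /* end-of-file before any characters */
    {
        free(line);
        return NULL;
    }
    return line;                         /* last line had no trailing newline */
}

Typical use would then be something like
for (char *s; (s = read_line(stdin)) != NULL; free(s)) { /* ... */ }.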
 
