Andrey Tarasevich said:
Memory allocated by the 'alloca' function obeys rather artificial,
non-C-ish lifetime rules, which can easily lead to rather nasty
surprises when it is used without caution (did anyone mention
newcomers?). In C99, VLAs provide support for run-time-sized
automatic memory, essentially eliminating the need for 'alloca'.
And *if* alloca were to be standardized, the standard would have to
specify its behavior in considerably more rigorous detail than has
ever been done by any of the existing implementations of it. The
behavior wouldn't necessarily have to be well defined in all cases,
but the standard would at least have to specify precisely the cases in
which the behavior is undefined. The behavior when alloca() is called
as part of a function argument would probably be left undefined, but
what about other corner cases?
Just one example off the top of my head:
    {
        int n = rand() % 1000 + 1;
        int *p;
        int vla[(p = alloca(n), n)];
    }
Obviously that's a silly thing to do, but the standard would need
either to define its behavior or to make it clear that it's undefined.
It might be easier to define the cases where the behavior *is*
defined, perhaps something similar to the tight restrictions on
setjmp() and longjmp().
And if the standard were to define the behavior in cases that don't
work properly in all existing implementations, then existing
implementations would be non-conforming, and code that depends on the
new standard's semantics would be *quietly* non-portable to older
implementations.
Furthermore, the idea of an allocation function that can't signal an
out-of-memory condition is disturbing. (I wish VLAs had a way to do
this.)
alloca was IMHO a nice idea, but it's turned out to be largely
impractical in real life.
I note, however, that jacob hasn't actually advocated that alloca
should be standardized, merely that it should be used. He's
indirectly performed a public service, by giving us an opportunity to
warn newbies away from it.