Hi Barry,
[snip program]
As far as I can see, this is a perfectly valid C program that should
always reach the 'return 0' statement, were it not for the fact that,
in practice, it runs out of resources.
It is valid in the theoretical sense, but it executes in the real
world.
Of course I agree, but this being comp.lang.c, I think it's valid to
look at the ill-lit corners of the language as it is specified.
I mean, people here sometimes insist that code like:
#include <stdio.h>

int main(void)
{
    int a[1];
    printf("%d", a[1]);  /* reads one past the end of a: undefined behavior */
    return 0;
}
...could cause a nuclear detonation or even (perish the thought!)
hard-drive formatting, so I think there's a bit of a tradition of
projecting theoretical issues onto practice.
All systems have limited resources. Even virtual memory is backed by
some kind of file. I could not find in the standard a minimum number
of recursive function calls an implementation is required to support.
It does state that a function can be called recursively at least once.
Of course you're right. I would already be half satisfied if there were
a statement in the standard to the effect that "function calls are
allowed to consume an unspecified amount of one or more unspecified
types of resources, which the function will release upon completion of
execution. Undefined behavior can occur when the amount of said
resources in use at any one time exceeds an unspecified threshold."
The downside of such a statement, of course, would be that any program
performing function calls could cause undefined behavior. But right now
that seems to be an accurate representation of the state of affairs
anyway, so the question is: which of the two evils is the more evil?
Please realize that I'm playing devil's advocate here... I'm just
curious whether I'm missing something crucial in the Standard, or
whether a wording could be found to mend the situation, so that
something we sometimes observe in practice (stack overflow) is covered
by the Standard.
[[ some numbers on estimate of memory usage in the sample program ]]
The sample program was of course devised exactly to trigger a stack
overflow on any (or at least most) practical implementations.
An interesting point is that by recognizing tail recursion, a very smart
compiler could even come up with a translation that does, indeed,
eventually reach the 'return 0' statement.
Would you be asking your question if your program simply issued malloc
requests until no more memory was available? The only difference is
that malloc fails "politely" by returning NULL.
That's a good analogous case, yes, with the exception (as you point
out) that there is a perfectly good provision within the Standard for
handling it.
There doesn't seem to be any way for a recursion failure to be "polite." Both fail for
exactly the same reason. Only the symptoms of that failure are
different.
I beg to differ on your first statement: I think there is a way, but it
would require an amendment to the standard.
A function invocation could, for example, check whether it would exceed
the stack resource and, if so, abort execution. This would require a
change to the standard and would have performance ramifications. But it
would be possible.
Welcome to the limitations of practicality.
That's all fine and dandy, but we still seem to be stuck in a situation
where a fully compliant program triggers a segmentation fault, which can
only signify undefined behavior (since the Standard doesn't mention
segmentation faults). If that situation could be improved (and the point
of this discussion, for me at least, is to see whether it can), I for
one would welcome it!
Best regards,
Sidney Cadot
The Netherlands