William Pursell said:
What is the definition of "correct"?
There are many, but here's a topical one...
A strictly conforming program shall use only those
features of the language and library specified in
this International Standard. It shall not produce
output dependent on any unspecified, undefined, or
implementation-defined behavior, and shall not
exceed any minimum implementation limit.
You'll note that it includes one non-output criterion
of the sort Jacob was asking for.
In 2005 I posted 21 lines of code that (at the time)
crashed gcc and lcc-win32. It had no output statements,
so it was highly deterministic! It was quite simply a
declaration of an array. It was syntactically and
semantically valid, but it exceeded a minimum implementation limit.
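That code is long gone, but here is a sketch in the same spirit
(mine, not the original 21 lines). The declaration is well formed,
yet its size is far beyond the 65535-byte minimum object size an
implementation must support (C99 5.2.4.1), so an implementation is
free to reject it (or, as gcc and lcc-win32 did with my original,
to fall over trying):

/* Not the 2005 code: just a well-formed declaration whose
   object size exceeds the minimum implementation limits. */
static char big[65536][65536][65536];

int main(void)
{
    return 0;   /* no output statements here either */
}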
but it is generally not feasible to check all possible
inputs.
Precisely. However... it _is_ generally possible to prove
program correctness without having to test every input.
[Or at least to write code for which such a proof is possible.]
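A toy of my own (nothing from the thread) to show the idea: the
loop below can be argued correct for _every_ value of n at once,
by termination and invariant arguments, without running it on a
single input:

/* Sums 1..n.  Writing N for the original argument:
 * Termination: n strictly decreases, so the loop always exits.
 * Invariant: sum + n*(n+1)/2 == N*(N+1)/2 (mod UINT_MAX+1);
 * it holds on entry and each iteration preserves it, so at exit
 * (n == 0) we have sum == N*(N+1)/2.  One argument, all inputs. */
static unsigned sum_to(unsigned n)
{
    unsigned sum = 0;
    while (n > 0) {
        sum += n;
        n--;
    }
    return sum;
}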
So, Jacob is right in the sense that the output is the
only thing that matters in an academic sense of determining
the correctness of the program,
Ever heard of denotational semantics?
Of course, you don't have to go to that extreme, but the
fact is, there are plenty of ways to determine correctness
that don't require running the program and checking the
output.
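To give the flavor with a toy of my own (a far cry from a real
treatment): give every expression a mathematical meaning, defined
compositionally, and reason about the meanings instead of about
runs of the program:

#include <stdio.h>

enum tag { LIT, ADD, MUL };

struct expr {
    enum tag tag;
    int lit;                      /* when tag == LIT */
    const struct expr *l, *r;     /* when tag == ADD or MUL */
};

/* The valuation function [[e]]: the meaning of an expression is
 * built from the meanings of its parts, so [[ (2+3)*4 ]] can be
 * unfolded on paper to 20 before anything is compiled or run. */
static int denote(const struct expr *e)
{
    switch (e->tag) {
    case LIT: return e->lit;
    case ADD: return denote(e->l) + denote(e->r);
    case MUL: return denote(e->l) * denote(e->r);
    }
    return 0;   /* unreachable for well-formed expressions */
}

int main(void)
{
    struct expr two   = { .tag = LIT, .lit = 2 };
    struct expr three = { .tag = LIT, .lit = 3 };
    struct expr sum   = { .tag = ADD, .l = &two, .r = &three };
    struct expr four  = { .tag = LIT, .lit = 4 };
    struct expr prod  = { .tag = MUL, .l = &sum, .r = &four };
    printf("%d\n", denote(&prod));   /* 20 */
    return 0;
}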
[And yes, I've heard of the Knuth quote: "I have only
proved it correct, not tried it." ;-]
and Santosh is right in
the practical sense that a program that contains a buffer
overflow will probably generate incorrect output on some
input string.
I agree that Santosh was right, but would point out that
what he said is not the same as what you've said he
said.
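For completeness, the practical point itself is easy enough to
demonstrate (this sketch is mine, not Santosh's). With any input
string of eight characters or more, gets() writes past the end of
buf; the behavior is undefined, and "incorrect output on some
input string" is merely the symptom you are most likely to see:

#include <stdio.h>

int main(void)
{
    char buf[8];
    if (gets(buf) != NULL)   /* the bug: no bounds check at all */
        puts(buf);
    return 0;
}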