[snips]
When the construct used in virtually every piece of C code one runs
across reads "count++", and "count++" is so common an idiom that
avoiding it suggests there is some reason (either neophyte status,
or something less obvious) for doing so, then yes, lacking comments
explaining precisely _why_ such a screwball construct is being used, the
result _is_ obscure code.
Without additional explanation (e.g. "Imported from MATLAB, which uses
this sort of construct") there is no readily apparent reason for using
such a construct. If we assume the coder is not a neophyte, it then follows he
is using this screwball notation for a specific purpose, which implies
there is some behaviour involved which shows up in "count = count + 1"
but _does not_ show up in "count++".
Which means now we have to scratch our heads, go running for the standard
(and the compiler documentation), check "count" to see if there's some
special magic associated with it, and try to figure out _what_ the
different behaviour is that's being relied upon.
When the search fails (assuming it does, i.e. we find no special magic)
we're left not with confidence the construct works as we'd expect, but
rather the uneasy feeling it is relying on some bizarre behaviour, quite
possibly of an implementation-specific optimizer, or some equivalent,
which we'll never be able to fully understand, let alone rely upon. The
code, as a result, simply cannot be trusted.
There are languages in which "count = count + 1" is the common idiom. To
people used to those languages, such a construct may be clear and
concise. C is not one of those languages.
Indeed, the very fact this has engendered a discussion as involved as
this should be sufficient to show that such constructs are _not_ trusted
by C coders, but _are_ treated as flags suggesting extreme review is
warranted.
Actually, there are good reasons for dropping all use of pre and post
increment in favor of normal expressions:
* C "standardization" has considerably messed up the evaluation order
of these constructs: in a striking reversal of the intentions of most
standardization efforts, the standards geeks have gravely pronounced
that the order in which their side effects occur is Heisenbergianly
uncertain (unspecified, or even undefined, between sequence points)
* They apply only to lvalues, which makes them non-orthogonal to most
other operators
* A modern compiler will generate the same code for count++ as for
count = count + 1
* A modern optimizing compiler will override the scheduling that the
programmer attempts in "if (a++) foo();", compiling it and "if (a)
foo(); a = a + 1;" in the same optimized way.
The problem is what "technological anthropologist" Diane Vaughan calls
"normalized deviance". She developed this anthropological construct to
explain why an all-male team at a NASA subcontractor was pressured to
approve the January 1986 launch of the Challenger Space Shuttle (the
"teacher in space" mission which exploded shortly after leaving the
launch pad).
Vaughan, the author of "The Challenger Launch Decision" (University of
Chicago Press, 1996), realized that in male workgroups, a "macho"
attitude causes
technical men to decide collectively to abandon good practice and to
mistreat dissidents.
This phenomenon seems to me well advanced in C. C was developed in a
deviant fashion: an adolescent prank intended to show the "grownups"
of the Multics (PL/I based) project that bearded hippy weirdos could
program better than grey flannel suits. For this reason, and because
Ritchie and Thompson had no visible way of demonstrating their
superiority except by an easily measured development time, C was from
the start a mishmash of ill-digested notions.
These included the preprocessor, a macro facility developed when the
serious problems of such processing were becoming evident in macro
assemblers; the pre and post increment operators, which were (I
believe) merely implemented to use attractive machine instructions;
and above all, the unspeakably amateurish choice of NUL to terminate
strings, which permanently deprived a common and useful character of
membership in strings.
The psychology was adolescent in contrast to the mature and adult
effort to develop Algol on the part of serious scientists, and it was
funded solely by the monopolistic market failure that was the Bell
system of the time.
However, the fact that Ritchie and Thompson got away with this nonsense,
and the even bigger nonsense of Unix, normalized deviance into mythos.
The real contributions to the advance of software of a slightly older
generation were stolen by irresponsible adolescents and today, this
has created the sort of antics that occur on this newsgroup, including
Seebach's resentment-based analysis of code, and his substitution of
personal hatred for science.