Stephen said:
> There are plenty of things I don't like about C++, but that's no excuse
> not to import the _useful_ improvements and reduce the conflict between
> real C and the C-like subset of C++.
The debate, of course, is about "useful," and different
people will have different notions of utility. My own bias
is that a Really Strong case justifies a change, but an It
Would Be Nice doesn't. I'm in sympathy with Kernighan and
Plauger's remark: "When in doubt, treat `feature' as a
pejorative."
> It drives me nuts that in C a "const int" cannot be a constant integer
> expression, e.g. for switch case statements.
extern const int foo;                  /* value not known at compile time */
switch (getchar()) { case foo: ... }   /* so the label can't be checked */

const int bar = rand();                /* value not known until run time */
switch (getchar()) { case bar: ... }

const int baz = rand() & 1;
switch (rand() & 1) {
case 0: ...
case baz: ...                          /* duplicate case label? or not? */
}
... and so on; all the usual gotchas. Yes, all could be made
well-defined by adopting suitable definitions, but has anything
useful been gained?
> It also bothers me that a string literal is not const.
The practical reason for literals being unalterable but
non-const is that millyuns and millyuns of lines of existing
code would break, requiring legions and legions of programmers
to spend time on what amounts to busy-work, "fixing" already-
correct code, inevitably committing silly misteaks along the
way and thereby adding bugs. It's regrettable, yes -- but it's
a sobering example of what happens when a new feature (`const')
is added to a language that didn't always have it. See also
the unholy mess that "generics" made of Java.
> Namespaces, default arguments, and even
> function overloading are useful--and can be made backwards compatible.
Perhaps. "Probably," even. But there's a cost, too: Today
if I write fputc('\n') the compiler bleats, I give myself a
dope slap, and the silly mistake is fixed right away. But
when some bright lad decides that the second argument to fputc()
should default to stdout, the compiler holds its peace[*] and I
don't learn about my error until the program misbehaves. That's
not an undiluted improvement ...
[*] Yes, the compiler can always issue a warning. But if the
nice new feature provokes a warning every time it's used, it will
not be pleasant to use. Especially in shops that adopt a "No
warnings allowed" policy.[**]
[**] Yes, I know such a policy is silly, even hopeless, when
carried to the extreme. But even without going to extremes, it
would be no fun at all if half the calls to a function elicited
warnings.
Finally, let's take a look at the O.P.'s case from a software
engineering viewpoint. He suggests a function that originally
took just one argument, an int, but was later enhanced to take
a second char* argument. For the benefit of existing one-arg
callers, he decides the second argument should default to NULL
if not provided explicitly. Fine, but then what?
So programmers write calls to the new, improved foo(), and
rely on the fact that the second argument defaults to NULL if
not specified. They happily write foo(42) instead of writing
foo(42, NULL), secure in the knowledge that the defaulter will
supply the missing NULL for them ...
... and then somebody decides the second argument should
default to "" instead ...
"Nobody would ever do *that*" I hear you cry, "That's
changing the published API!" Well, the API has *already*
changed once, hasn't it? Isn't that prima facie evidence that
it's unstable, subject to further change? And doesn't that
mean that the prudent programmer would write foo(42, NULL)
anyhow, Just In Case? I betcha "Don't use default arguments"
would make it into everybody's coding guidelines right rapidly --
leaving a feature that's actively shunned by careful coders ...