Is it acceptable practice to have a #define with a semicolon in it,
such as:
#define SMALL 1;
I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:
"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'
Correct - but in reality what I actually do is exploit that to make it
intentionally fail!
e.g. I could easily write
if(SMALL) { do something } else { do something else }
That is bad programming - for the most part, I know I would never
write if(SMALL) ... because if I set SMALL to 2, 3, or 4, then everything
is OK when configuring the software, but if I accidentally set SMALL to
0, the branch taken by the if() statement changes, and that would be
an unintentional side effect.
If I accidentally wrote the code with if(SMALL) and it did not fail,
the mistake would be especially hard to spot if it were buried in a
complex formula. And there is no warning of impending doom.
So by putting a semicolon in #define SMALL 1; I've made sure that the
code is guaranteed to fail to compile when the macro is used out of context."
So that's the whole quote (which I don't see as differing from what I
said before), so if you feel differently about it being poor coding
practice, I would like to hear why again. Also, I'm sorry I jokingly
called someone I don't know my very good friend. Thanks.