#define with semicolon

C

cc

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).
 
K

Keith Thompson

cc said:
Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).

Why would you want to prevent it from being used in an expression?
I think "1;" is a poor example of what your friend is talking about.
I'd be interested in seeing a better example.

A #define can contain any token sequence you like. The macro name
will be expanded to that token sequence every time you use it.
If you want that token sequence to include a semicolon, then you
should have a semicolon in the definition.

But most of the time, a macro expansion is used either in an
expression context (in which case it *shouldn't* have any semicolons,
and it should be protected by parentheses where necessary), or
in a statement context (in which case, if it consists of multiple
substatements, you need to use the "do { ... } while (0)" trick).
 
C

cc

Why would you want to prevent it from being used in an expression?
I think "1;" is a poor example of what your friend is talking about.
I'd be interested in seeing a better example.

That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

A #define can contain any token sequence you like.  The macro name
will be expanded to that token sequence every time you use it.
If you want that token sequence to include a semicolon, then you
should have a semicolon in the definition.

I know what #define does. I was asking about coding standards more or
less, and if a #define with a semicolon was commonly used and accepted
practice.
But most of the time, a macro expansion is used either in an
expression context (in which case it *shouldn't* have any semicolons,
and it should be protected by parentheses where necessary), or
in a statement context (in which case, if it consists of multiple
substatements, you need to use the "do { ... } while (0)" trick).

Right. So you see no logical reason to ever use something like #define
SMALL 1;? I don't either, but I was just making sure there wasn't
something I missed.
 
D

Dr Nick

cc said:
That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

Flippin' heck. I hope I'm nowhere near the keyboard when anything the
kernel controls gets close to that. Mainline or not.
I know what #define does. I was asking about coding standards more or
less, and if a #define with a semicolon was commonly used and accepted
practice.


Right. So you see no logical reason to ever use something like #define
SMALL 1;? I don't either, but I was just making sure there wasn't
something I missed.

I can't think of one.

I had a quick look through my source collection and the only example I
could find where I had a #define ending with a ; was one of those things
where you define a macro one way then include a file, then define it
another and include the file again.
 
K

Keith Thompson

cc said:
That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

How else would he use it?
Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

I suspect that's just an error. Perhaps it's only used in contexts
where the extra semicolon is harmless, such as
voltage = LDO_MAX_VOLT;
which expands to
voltage = 3300;;
which is an assignment statement followed by an expression statement.

Or, worse, if it's used like this:
voltage = LDO_MAX_VOLT + 1;
then it expands to
voltage = 3300; + 1;
where the "+ 1;" is an expression statement that discards the result
(and voltage gets the wrong value).
I know what #define does. I was asking about coding standards more or
less, and if a #define with a semicolon was commonly used and accepted
practice.

I'd say no. It's more commonly a mistake -- and if you're unlucky,
the compiler won't warn you about it.
Right. So you see no logical reason to ever use something like #define
SMALL 1;? I don't either, but I was just making sure there wasn't
something I missed.

I won't say there's *never* a reason to do something like that.
There are cases where macros will expand to something other than
an expression or a statement. It usually means you're messing with
the language syntax, which is dangerous but *sometimes* useful.

Many years ago, I wrote something like:

#define EVER ;;
...
for (EVER) {
...
}

but I got better.
 
H

Harald van Dijk

Right. So you see no logical reason to ever use something like #define
SMALL 1;? I don't either, but I was just making sure there wasn't
something I missed.

I do, though it does not apply to your case. There are lint-like tools
that allow you to declare that a macro expands to a statement. The
tool will verify that it is only ever used as a statement, but in
return, it has to actually *be* a statement, or it gets very confused.
A macro expansion that would be a statement if you add a semicolon
does not qualify.
 
K

Keith Thompson

Harald van Dijk said:
I do, though it does not apply to your case. There are lint-like tools
that allow you to declare that a macro expands to a statement. The
tool will verify that it is only ever used as a statement, but in
return, it has to actually *be* a statement, or it gets very confused.
A macro expansion that would be a statement if you add a semicolon
does not qualify.

From your description, it sounds like there are some bad lint-like
tools out there.

A macro that's intended to expand to a statement (and not to an
expression) should use the "do { ... } while (0)" trick to avoid
problems when used with if/else.
 
H

Harald van Dijk

From your description, it sounds like there are some bad lint-like
tools out there.

A macro that's intended to expand to a statement (and not to an
expression) should use the "do { ... } while (0)" trick to avoid
problems when used with if/else.

That depends. As long as it warns for empty statements, which includes
the cases where the macro is immediately followed by a semicolon, it
is fine. Regardless of whether the macro appears in an if statement,
it expects the macro to always be used by itself. And if it is always
used by itself, it causes no problems before an else: it looks just as
you would normally use it. It is just as valid as far as C is
concerned. The main thing it has going against it is that it gets very
confusing when you mix it with macros that do expect to be followed by
a semicolon. (I don't use it myself, by the way.)
 
J

Joe Pfeiffer

cc said:
That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

I was curious enough I went and looked that one up -- it's the only
#define in the file that ends with a semicolon (even LDO_MIN_VOLT
doesn't), and a recursive grep fails to turn the symbol up anywhere else
in the kernel. I'm guessing the reason for this one was an
overly-clever way of keeping anybody from using it (for anything!) in
what seems to be a fairly new driver.
 
K

Keith Thompson

Joe Pfeiffer said:
I was curious enough I went and looked that one up -- it's the only
#define in the file that ends with a semicolon (even LDO_MIN_VOLT
doesn't), and a recursive grep fails to turn the symbol up anywhere else
in the kernel. I'm guessing the reason for this one was an
overly-clever way of keeping anybody from using it (for anything!) in
what seems to be a fairly new driver.

I'm guessing that it's just a mistake that nobody has fixed yet.

Adding the semicolon won't keep it from being used. In many cases, it
won't change anything:

voltage = LDO_MAX_VOLT;

and in others it can silently change the meaning of the code:

voltage = LDO_MAX_VOLT + 1;
 
B

Barry Schwarz

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).

Acceptable is in the eye of the beholder. If you are at work, it is
whatever standards your company adopts. If you are at home, it is
whatever your preference is.

The only thing perfect about it is that it is perfectly legal syntax.

I don't find it acceptable at all personally but there is no reason
why you or anyone else reading this should care what I think.
 
B

Ben Bacarisse

cc said:
Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

It's quite clear from the context that it's a typo. Fortunately the
macro is not used anywhere!

<snip>
 
H

Hallvard B Furuseth

Keith said:
I won't say there's *never* a reason to do something like that.
There are cases where macros will expand to something other than
an expression or a statement. It usually means you're messing with
the language syntax, which is dangerous but *sometimes* useful.

Many years ago, I wrote something like:

#define EVER ;;
...
for (EVER) {
...
}

but I got better.

More generally, you can end up with such strange-looking macros if you
use the preprocessor to extend the language. E.g. macros for a poor
man's exception facility, typically with setjmp at the core. These
might also use ugliness like

#define FOO_BEGIN(x) { <something>
#define FOO_END(x) <something else> }

You should then try to keep the ugliness inside macro definitions,
so code using the macros will not look too bad. That can lead to
strange definitions like '#define SMALL 1;'.

You might even deliberately design the facility so the 'x' macro
parameter above takes an argument of format '<number> <semicolon>',
if user code is always supposed to pass a macro like SMALL, never a
number. It could still be naughty and pass FOO_BEGIN(1;) directly
instead of FOO_BEGIN(SMALL), but that will at least look strange.
Maybe that's what the OP's friend is talking about. However it makes
no sense to speak of that in isolation, without reference to the macro
set which uses SMALL.
 
P

Phil Carmody

Joe Pfeiffer said:
I was curious enough I went and looked that one up -- it's the only
#define in the file that ends with a semicolon (even LDO_MIN_VOLT
doesn't), and a recursive grep fails to turn the symbol up anywhere else
in the kernel. I'm guessing the reason for this one was an
overly-clever way of keeping anybody from using it (for anything!) in
what seems to be a fairly new driver.

There's worse.

$ git grep define\ DELAY_1

Ug.

Phil
 
W

Walter Banks

cc said:
Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).

I wish I had a nickel for every customer call I have taken over the last
30 years where a semicolon at the end of a #define changed an
expression. The worst ones are when an expression is split in two and
creates two valid statements. No compiler errors or warnings, just
application anguish.

Regards,
 
G

Gene

I won't say there's *never* a reason to do something like that.
There are cases where macros will expand to something other than
an expression or a statement.  It usually means you're messing with
the language syntax, which is dangerous but *sometimes* useful.

Many years ago, I wrote something like:

    #define EVER ;;
    ...
    for (EVER) {
        ...
    }

but I got better.

A recovering macroholic?
 
C

cc

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).

It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make it
intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }


That is bad programming - for the most part, I know I would never
write if(SMALL) ... because if I set SMALL to 2,3,4, then everything
is OK when configuring the software, but if I accidentally set SMALL
to 0 the execution of the if() statement will change and that would
have been an unintentional side effect.

If I accidentally wrote the code with if(SMALL) it would not fail,
and the mistake is especially hard to spot if it is buried in a
complex formula. And there is no warning of impending doom.

So by putting semicolon in #define SMALL 1; I've made sure on
compiling it it is guaranteed to fail when used out of context."

So that's the whole quote (of which I see no difference in what I said
before), so if you feel differently about it being poor coding
practice I would like to hear why again. Also I'm sorry I jokingly
called someone I don't know, my very good friend. Thanks.
 
A

Anders Wegge Keller

if (some_condition)
    some_var = SMALL + some_other_var++;


I wish you a merry time debugging code like this.
 
R

Roberto Waltman

Keith said:
Many years ago, I wrote something like:

#define EVER ;;
...
for (EVER) {
...
}

but I got better.

I am tempted to do that often, because with some compilers this,

while (1) { ... }

generates a warning about the "expression being constant", while your
example is accepted silently.
 
