#define with semicolon

Discussion in 'C Programming' started by cc, Jul 13, 2011.

  1. cc

    cc Guest

    Is it acceptable practice to have a #define with a semicolon in it,
    such as:

    #define SMALL 1;

    I didn't think it was, but a very good friend of mine claims it's
    perfectly acceptable if you want to prevent the #define from being
    used in an expression like if(SMALL).
     
    cc, Jul 13, 2011
    #1

  2. cc <> writes:
    > Is it acceptable practice to have a #define with a semicolon in it,
    > such as:
    >
    > #define SMALL 1;
    >
    > I didn't think it was, but a very good friend of mine claims it's
    > perfectly acceptable if you want to prevent the #define from being
    > used in an expression like if(SMALL).


    Why would you want to prevent it from being used in an expression?
    I think "1;" is a poor example of what your friend is talking about.
    I'd be interested in seeing a better example.

    A #define can contain any token sequence you like. The macro name
    will be expanded to that token sequence every time you use it.
    If you want that token sequence to include a semicolon, then you
    should have a semicolon in the definition.

    But most of the time, a macro expansion is used either in an
    expression context (in which case it *shouldn't* have any semicolons,
    and it should be protected by parentheses where necessary), or
    in a statement context (in which case, if it consists of multiple
    substatements, you need to use the "do { ... } while (0)" trick).
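
    To make that concrete, here is a minimal sketch (SWAP_INT is just a
    made-up name for illustration, not anything standard): a statement-like
    macro written with the do/while wrapper, so that the caller's trailing
    semicolon and an if/else both behave.

    #define SWAP_INT(a, b) do { int tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)

    if (x > y)
        SWAP_INT(x, y);            /* the caller's ';' completes the do/while */
    else
        puts("already in order");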

    --
    Keith Thompson (The_Other_Keith) <http://www.ghoti.net/~kst>
    Nokia
    "We must do something. This is something. Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
     
    Keith Thompson, Jul 13, 2011
    #2

  3. cc

    cc Guest

    On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    > cc <> writes:
    > > Is it acceptable practice to have a #define with a semicolon in it,
    > > such as:

    >
    > > #define SMALL 1;

    >
    > > I didn't think it was, but a very good friend of mine claims it's
    > > perfectly acceptable if you want to prevent the #define from being
    > > used in an expression like if(SMALL).

    >
    > Why would you want to prevent it from being used in an expression?
    > I think "1;" is a poor example of what your friend is talking about.
    > I'd be interested in seeing a better example.


    That was his example. That was also his explanation of why he did it
    (so the compiler would complain if he used it as an expression).

    Another example was from the linux kernel.

    /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    #define LDO_MAX_VOLT 3300;


    > A #define can contain any token sequence you like.  The macro name
    > will be expanded to that token sequence every time you use it.
    > If you want that token sequence to include a semicolon, then you
    > should have a semicolon in the definition.


    I know what #define does. I was asking about coding standards more or
    less, and if a #define with a semicolon was commonly used and accepted
    practice.

    > But most of the time, a macro expansion is used either in an
    > expression context (in which case it *shouldn't* have any semicolons,
    > and it should be protected by parentheses where necessary), or
    > in a statement context (in which case, if it consists of multiple
    > substatements, you need to use the "do { ... } while (0)" trick).
    >


    Right. So you see no logical reason to ever use something like #define
    SMALL 1;? I don't either, but I was just making sure there wasn't
    something I missed.
     
    cc, Jul 13, 2011
    #3
  4. cc

    Ben Pfaff Guest

    cc <> writes:

    > Is it acceptable practice to have a #define with a semicolon in it,
    > such as:
    >
    > #define SMALL 1;


    No, I'd assume that was a typo.
    --
    Ben Pfaff
    http://benpfaff.org
     
    Ben Pfaff, Jul 13, 2011
    #4
  5. cc

    Dr Nick Guest

    cc <> writes:

    > On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    >> cc <> writes:
    >> > Is it acceptable practice to have a #define with a semicolon in it,
    >> > such as:

    >>
    >> > #define SMALL 1;

    >>
    >> > I didn't think it was, but a very good friend of mine claims it's
    >> > perfectly acceptable if you want to prevent the #define from being
    >> > used in an expression like if(SMALL).

    >>
    >> Why would you want to prevent it from being used in an expression?
    >> I think "1;" is a poor example of what your friend is talking about.
    >> I'd be interested in seeing a better example.

    >
    > That was his example. That was also his explanation of why he did it
    > (so the compiler would complain if he used it as an expression).
    >
    > Another example was from the linux kernel.
    >
    > /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    > #define LDO_MAX_VOLT 3300;


    Flippin' heck. I hope I'm nowhere near the keyboard when anything the
    kernel controls gets close to that. Mainline or not.

    >> A #define can contain any token sequence you like.  The macro name
    >> will be expanded to that token sequence every time you use it.
    >> If you want that token sequence to include a semicolon, then you
    >> should have a semicolon in the definition.

    >
    > I know what #define does. I was asking about coding standards more or
    > less, and if a #define with a semicolon was commonly used and accepted
    > practice.
    >
    >> But most of the time, a macro expansion is used either in an
    >> expression context (in which case it *shouldn't* have any semicolons,
    >> and it should be protected by parentheses where necessary), or
    >> in a statement context (in which case, if it consists of multiple
    >> substatements, you need to use the "do { ... } while (0)" trick).
    >>

    >
    > Right. So you see no logical reason to ever use something like #define
    > SMALL 1;? I don't either, but I was just making sure there wasn't
    > something I missed.


    I can't think of one.

    I had a quick look through my source collection and the only example I
    could find where I had a #define ending with a ; was one of those things
    where you define a macro one way then include a file, then define it
    another and include the file again.
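
    (Sketched with invented names, that pattern looks roughly like this;
    note that the first definition of the macro is the one ending in a
    semicolon:)

    /* handlers.def lists each handler exactly once */
    HANDLER(start)
    HANDLER(stop)

    /* first inclusion: declare the functions; the macro ends with ';' */
    #define HANDLER(name) void handle_##name(void);
    #include "handlers.def"
    #undef HANDLER

    /* second inclusion: build a table of function pointers */
    #define HANDLER(name) handle_##name,
    static void (*handlers[])(void) = {
    #include "handlers.def"
    };
    #undef HANDLER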
    --
    Online waterways route planner | http://canalplan.eu
    Plan trips, see photos, check facilities | http://canalplan.org.uk
     
    Dr Nick, Jul 13, 2011
    #5
  6. cc <> writes:
    > On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    >> cc <> writes:
    >> > Is it acceptable practice to have a #define with a semicolon in it,
    >> > such as:

    >>
    >> > #define SMALL 1;

    >>
    >> > I didn't think it was, but a very good friend of mine claims it's
    >> > perfectly acceptable if you want to prevent the #define from being
    >> > used in an expression like if(SMALL).

    >>
    >> Why would you want to prevent it from being used in an expression?
    >> I think "1;" is a poor example of what your friend is talking about.
    >> I'd be interested in seeing a better example.

    >
    > That was his example. That was also his explanation of why he did it
    > (so the compiler would complain if he used it as an expression).


    How else would he use it?

    > Another example was from the linux kernel.
    >
    > /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    > #define LDO_MAX_VOLT 3300;


    I suspect that's just an error. Perhaps it's only used in contexts
    where the extra semicolon is harmless, such as
    voltage = LDO_MAX_VOLT;
    which expands to
    voltage = 3300;;
    which is an assignment statement followed by an expression statement.

    Or, worse, if it's used like this:
    voltage = LDO_MAX_VOLT + 1;
    then it expands to
    voltage = 3300; + 1;
    where the "+ 1;" is an expression statement that discards the result
    (and voltage gets the wrong value).

    >> A #define can contain any token sequence you like.  The macro name
    >> will be expanded to that token sequence every time you use it.
    >> If you want that token sequence to include a semicolon, then you
    >> should have a semicolon in the definition.

    >
    > I know what #define does. I was asking about coding standards more or
    > less, and if a #define with a semicolon was commonly used and accepted
    > practice.


    I'd say no. It's more commonly a mistake -- and if you're unlucky,
    the compiler won't warn you about it.

    >> But most of the time, a macro expansion is used either in an
    >> expression context (in which case it *shouldn't* have any semicolons,
    >> and it should be protected by parentheses where necessary), or
    >> in a statement context (in which case, if it consists of multiple
    >> substatements, you need to use the "do { ... } while (0)" trick).

    >
    > Right. So you see no logical reason to ever use something like #define
    > SMALL 1;? I don't either, but I was just making sure there wasn't
    > something I missed.


    I won't say there's *never* a reason to do something like that.
    There are cases where macros will expand to something other than
    an expression or a statement. It usually means you're messing with
    the language syntax, which is dangerous but *sometimes* useful.

    Many years ago, I wrote something like:

    #define EVER ;;
    ...
    for (EVER) {
    ...
    }

    but I got better.

    --
    Keith Thompson (The_Other_Keith) <http://www.ghoti.net/~kst>
    Nokia
    "We must do something. This is something. Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
     
    Keith Thompson, Jul 13, 2011
    #6
  7. On Jul 13, 8:56 pm, cc <> wrote:
    > Right. So you see no logical reason to ever use something like #define
    > SMALL 1;? I don't either, but I was just making sure there wasn't
    > something I missed.


    I do, though it does not apply to your case. There are lint-like tools
    that allow you to declare that a macro expands to a statement. The
    tool will verify that it is only ever used as a statement, but in
    return, it has to actually *be* a statement, or it gets very confused.
    A macro expansion that would be a statement if you add a semicolon
    does not qualify.
     
    Harald van Dijk, Jul 13, 2011
    #7
  8. Harald van Dijk <> writes:
    > On Jul 13, 8:56 pm, cc <> wrote:
    >> Right. So you see no logical reason to ever use something like #define
    >> SMALL 1;? I don't either, but I was just making sure there wasn't
    >> something I missed.

    >
    > I do, though it does not apply to your case. There are lint-like tools
    > that allow you to declare that a macro expands to a statement. The
    > tool will verify that it is only ever used as a statement, but in
    > return, it has to actually *be* a statement, or it gets very confused.
    > A macro expansion that would be a statement if you add a semicolon
    > does not qualify.


    From your description, it sounds like there are some bad lint-like
    tools out there.

    A macro that's intended to expand to a statement (and not to an
    expression) should use the "do { ... } while (0)" trick to avoid
    problems when used with if/else.
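
    A quick sketch of the failure being avoided (LOG_AND_RESET is an
    invented name): with a bare-block macro, the semicolon the caller
    naturally writes becomes an extra empty statement between the if and
    the else, and the else then fails to parse; the do/while form absorbs
    that semicolon cleanly.

    /* bare block: breaks when the call is followed by ';' before an else */
    #define LOG_AND_RESET_BAD() { puts("reset"); errors = 0; }

    /* do/while: the caller's ';' completes the statement */
    #define LOG_AND_RESET_OK()  do { puts("reset"); errors = 0; } while (0)

    if (errors > 10)
        LOG_AND_RESET_OK();        /* fine */
    else
        puts("carrying on");
    /* swap in LOG_AND_RESET_BAD() above and the else becomes a syntax error */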

    --
    Keith Thompson (The_Other_Keith) <http://www.ghoti.net/~kst>
    Nokia
    "We must do something. This is something. Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
     
    Keith Thompson, Jul 13, 2011
    #8
  9. On Jul 13, 10:01 pm, Keith Thompson <> wrote:
    > Harald van Dijk <> writes:
    > > On Jul 13, 8:56 pm, cc <> wrote:
    > >> Right. So you see no logical reason to ever use something like #define
    > >> SMALL 1;? I don't either, but I was just making sure there wasn't
    > >> something I missed.

    >
    > > I do, though it does not apply to your case. There are lint-like tools
    > > that allow you to declare that a macro expands to a statement. The
    > > tool will verify that it is only ever used as a statement, but in
    > > return, it has to actually *be* a statement, or it gets very confused.
    > > A macro expansion that would be a statement if you add a semicolon
    > > does not qualify.

    >
    > From your description, it sounds like there are some bad lint-like
    > tools out there.
    >
    > A macro that's intended to expand to a statement (and not to an
    > expression) should use the "do { ... } while (0)" trick to avoid
    > problems when used with if/else.


    That depends. As long as it warns for empty statements, which includes
    the cases where the macro is immediately followed by a semicolon, it
    is fine. Regardless of whether the macro appears in an if statement,
    it expects the macro to always be used by itself. And if it is always
    used by itself, it causes no problems before an else: it looks just as
    you would normally use it. It is just as valid as far as C is
    concerned. The main thing it has going against it is that it gets very
    confusing when you mix it with macros that do expect to be followed by
    a semicolon. (I don't use it myself, by the way.)
     
    Harald van Dijk, Jul 13, 2011
    #9
  10. cc

    Joe Pfeiffer Guest

    cc <> writes:

    > On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    >> cc <> writes:
    >> > Is it acceptable practice to have a #define with a semicolon in it,
    >> > such as:

    >>
    >> > #define SMALL 1;

    >>
    >> > I didn't think it was, but a very good friend of mine claims it's
    >> > perfectly acceptable if you want to prevent the #define from being
    >> > used in an expression like if(SMALL).

    >>
    >> Why would you want to prevent it from being used in an expression?
    >> I think "1;" is a poor example of what your friend is talking about.
    >> I'd be interested in seeing a better example.

    >
    > That was his example. That was also his explanation of why he did it
    > (so the compiler would complain if he used it as an expression).
    >
    > Another example was from the linux kernel.
    >
    > /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    > #define LDO_MAX_VOLT 3300;


    I was curious enough I went and looked that one up -- it's the only
    #define in the file that ends with a semicolon (even LDO_MIN_VOLT
    doesn't), and a recursive grep fails to turn the symbol up anywhere else
    in the kernel. I'm guessing the reason for this one was an
    overly-clever way of keeping anybody from using it (for anything!) in
    what seems to be a fairly new driver.
     
    Joe Pfeiffer, Jul 14, 2011
    #10
  11. Joe Pfeiffer <> writes:
    > cc <> writes:
    >> On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    >>> cc <> writes:
    >>> > Is it acceptable practice to have a #define with a semicolon in it,
    >>> > such as:
    >>>
    >>> > #define SMALL 1;
    >>>
    >>> > I didn't think it was, but a very good friend of mine claims it's
    >>> > perfectly acceptable if you want to prevent the #define from being
    >>> > used in an expression like if(SMALL).
    >>>
    >>> Why would you want to prevent it from being used in an expression?
    >>> I think "1;" is a poor example of what your friend is talking about.
    >>> I'd be interested in seeing a better example.

    >>
    >> That was his example. That was also his explanation of why he did it
    >> (so the compiler would complain if he used it as an expression).
    >>
    >> Another example was from the linux kernel.
    >>
    >> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    >> #define LDO_MAX_VOLT 3300;

    >
    > I was curious enough I went and looked that one up -- it's the only
    > #define in the file that ends with a semicolon (even LDO_MIN_VOLT
    > doesn't), and a recursive grep fails to turn the symbol up anywhere else
    > in the kernel. I'm guessing the reason for this one was an
    > overly-clever way of keeping anybody from using it (for anything!) in
    > what seems to be a fairly new driver.


    I'm guessing that it's just a mistake that nobody has fixed yet.

    Adding the semicolon won't keep it from being used. In many cases, it
    won't change anything:

    voltage = LDO_MAX_VOLT;

    and in others it can silently change the meaning of the code:

    voltage = LDO_MAX_VOLT + 1;

    --
    Keith Thompson (The_Other_Keith) <http://www.ghoti.net/~kst>
    Nokia
    "We must do something. This is something. Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
     
    Keith Thompson, Jul 14, 2011
    #11
  12. On Wed, 13 Jul 2011 11:19:53 -0700 (PDT), cc <>
    wrote:

    >Is it acceptable practice to have a #define with a semicolon in it,
    >such as:
    >
    >#define SMALL 1;
    >
    >I didn't think it was, but a very good friend of mine claims it's
    >perfectly acceptable if you want to prevent the #define from being
    >used in an expression like if(SMALL).


    Acceptable is in the eye of the beholder. If you are at work, it is
    whatever standards your company adopts. If you are at home, it is
    whatever your preference is.

    The only thing perfect about it is that it is perfectly legal syntax.

    I don't find it acceptable at all personally but there is no reason
    why you or anyone else reading this should care what I think.

    --
    Remove del for email
     
    Barry Schwarz, Jul 14, 2011
    #12
  13. cc <> writes:
    <snip>
    > Another example was from the linux kernel.
    >
    > /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    > #define LDO_MAX_VOLT 3300;


    It's quite clear from the context that it's a typo. Fortunately the
    macro is not used anywhere!

    <snip>
    --
    Ben.
     
    Ben Bacarisse, Jul 14, 2011
    #13
  14. Keith Thompson writes:
    >> Right. So you see no logical reason to ever use something like #define
    >> SMALL 1;? I don't either, but I was just making sure there wasn't
    >> something I missed.

    >
    > I won't say there's *never* a reason to do something like that.
    > There are cases where macros will expand to something other than
    > an expression or a statement. It usually means you're messing with
    > the language syntax, which is dangerous but *sometimes* useful.
    >
    > Many years ago, I wrote something like:
    >
    > #define EVER ;;
    > ...
    > for (EVER) {
    > ...
    > }
    >
    > but I got better.


    More generally, you can end up with such strange-looking macros if you
    use the preprocessor to extend the language. E.g. macros for a poor
    man's exception facility, typically with setjmp at the core. These
    might also use ugliness like

    #define FOO_BEGIN(x) { <something>
    #define FOO_END(x) <something else> }

    You should then try to keep the ugliness inside macro definitions,
    so code using the macros will not look too bad. That can lead to
    strange definitions like '#define SMALL 1;'.

    You might even deliberately design the facility so the 'x' macro
    parameter above takes an argument of format '<number> <semicolon>',
    if user code is always supposed to pass a macro like SMALL, never a
    number. User code could still be naughty and pass FOO_BEGIN(1;)
    directly instead of FOO_BEGIN(SMALL), but that will at least look
    strange. Maybe that's what the OP's friend is talking about. However,
    it makes no sense to speak of that in isolation, without reference to
    the macro set which uses SMALL.
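
    (For illustration only, here is a rough, hedged sketch of the kind of
    macro pair being described; the TRY_*/THROW names and the single,
    non-nesting jmp_buf are assumptions of this sketch, not taken from any
    actual library:)

    #include <setjmp.h>

    static jmp_buf try_env;              /* one shared buffer: no nesting */

    #define TRY_BEGIN()  if (setjmp(try_env) == 0) {
    #define TRY_CATCH()  } else {
    #define TRY_END()    }
    #define THROW(code)  longjmp(try_env, (code))

    /* usage:
         TRY_BEGIN()
             risky_call();               can call THROW(1) on failure
         TRY_CATCH()
             handle_failure();
         TRY_END()
       The braces are deliberately unbalanced inside each definition,
       which is exactly the "ugliness kept inside the macros" described
       above. */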

    --
    Hallvard
     
    Hallvard B Furuseth, Jul 14, 2011
    #14
  15. cc

    Phil Carmody Guest

    Joe Pfeiffer <> writes:
    > cc <> writes:
    >
    > > On Jul 13, 3:38 pm, Keith Thompson <> wrote:
    > >> cc <> writes:
    > >> > Is it acceptable practice to have a #define with a semicolon in it,
    > >> > such as:
    > >>
    > >> > #define SMALL 1;
    > >>
    > >> > I didn't think it was, but a very good friend of mine claims it's
    > >> > perfectly acceptable if you want to prevent the #define from being
    > >> > used in an expression like if(SMALL).
    > >>
    > >> Why would you want to prevent it from being used in an expression?
    > >> I think "1;" is a poor example of what your friend is talking about.
    > >> I'd be interested in seeing a better example.

    > >
    > > That was his example. That was also his explanation of why he did it
    > > (so the compiler would complain if he used it as an expression).
    > >
    > > Another example was from the linux kernel.
    > >
    > > /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
    > > #define LDO_MAX_VOLT 3300;

    >
    > I was curious enough I went and looked that one up -- it's the only
    > #define in the file that ends with a semicolon (even LDO_MIN_VOLT
    > doesn't), and a recursive grep fails to turn the symbol up anywhere else
    > in the kernel. I'm guessing the reason for this one was an
    > overly-clever way of keeping anybody from using it (for anything!) in
    > what seems to be a fairly new driver.


    There's worse.

    $ git grep define\ DELAY_1

    Ug.

    Phil
    --
    "At least you know where you are with Microsoft."
    "True. I just wish I'd brought a paddle." -- Matthew Vernon
     
    Phil Carmody, Jul 14, 2011
    #15
  16. cc

    Walter Banks Guest

    cc wrote:

    > Is it acceptable practice to have a #define with a semicolon in it,
    > such as:
    >
    > #define SMALL 1;
    >
    > I didn't think it was, but a very good friend of mine claims it's
    > perfectly acceptable if you want to prevent the #define from being
    > used in an expression like if(SMALL).


    I wish I had a nickel for every customer call I have taken over the last
    30 years where a semicolon at the end of a #define changed an
    expression. The worst ones are when an expression is split in two and
    creates two valid statements. No compiler errors or warnings, just
    application anguish.
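
    As a small, compilable illustration of that "split in two" case (the
    names are invented for the sketch), both halves of the split line are
    syntactically valid statements, so the code may well build without
    complaint:

    #include <stdio.h>

    #define MAX_COUNT 100;                  /* stray semicolon */

    int main(void)
    {
        int retries = 3;
        int count;

        count = MAX_COUNT - retries;        /* expands to: count = 100; - retries; */
        printf("%d\n", count);              /* prints 100, not 97 */
        return 0;
    }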

    Regards,

    --
    Walter Banks
    Byte Craft Limited
    http://www.bytecraft.com
     
    Walter Banks, Jul 14, 2011
    #16
  17. cc

    Gene Guest

    On Jul 13, 4:30 pm, Keith Thompson <> wrote:
    > I won't say there's *never* a reason to do something like that.
    > There are cases where macros will expand to something other than
    > an expression or a statement.  It usually means you're messing with
    > the language syntax, which is dangerous but *sometimes* useful.
    >
    > Many years ago, I wrote something like:
    >
    >     #define EVER ;;
    >     ...
    >     for (EVER) {
    >         ...
    >     }
    >
    > but I got better.
    >
    > --
    > Keith Thompson (The_Other_Keith)  <http://www.ghoti.net/~kst>


    A recovering macroholic?
     
    Gene, Jul 14, 2011
    #17
  18. cc

    cc Guest

    On Jul 13, 2:19 pm, cc <> wrote:
    > Is it acceptable practice to have a #define with a semicolon in it,
    > such as:
    >
    > #define SMALL 1;
    >
    > I didn't think it was, but a very good friend of mine claims it's
    > perfectly acceptable if you want to prevent the #define from being
    > used in an expression like if(SMALL).


    It seems as though some people have taken issue with my
    characterization of the situation. First off, it wasn't a very good
    friend, but actually someone I don't even know. It was a Usenet thing.
    That was supposed to be a joke for others reading, but one person was
    very upset and called me a liar. So no, it wasn't "a very good
    friend." Also, they seem to have an issue with the way I presented the
    situation. Here is the full post:

    "'The semi-colon will be expanded as part of the macro, causing the
    printf to fail to compile.'

    Correct - but in reality what I actually do is exploit that to make it
    intentionally fail!

    e.g. I could easily write


    if(SMALL) { do something } else { do something else }


    That is bad programming - for the most part, I know I would never
    write if(SMALL) ... because if I set SMALL to 2,3,4, then everything
    is OK when configuring the software, but if accidentally set SMALL to
    0 the execution of the if() statement will change and that would have
    been an unintentional side effect.

    If I accidentally wrote the code with if(SMALL) it will not fail, and
    that makes the mistake especially hard to spot if it is buried in a
    complex formula. And there is no warning of impending doom.

    So by putting a semicolon in #define SMALL 1; I've made sure that on
    compiling it is guaranteed to fail when used out of context."

    So that's the whole quote (of which I see no difference in what I said
    before), so if you feel differently about it being poor coding
    practice I would like to hear why again. Also I'm sorry I jokingly
    called someone I don't know, my very good friend. Thanks.
     
    cc, Jul 15, 2011
    #18
  19. cc <> writes:

    if (some_condition)
        some_var = SMALL + some_other_var++;
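
    (With SMALL expanding to "1;", that is:)

    if (some_condition)
        some_var = 1;              /* the if now governs only this assignment */
    + some_other_var++;            /* runs unconditionally; its value is discarded */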


    I wish you a merry time debugging code like this.

    --
    /Wegge

    Leder efter redundant peering af dk.*,linux.debian.*
     
    Anders Wegge Keller, Jul 15, 2011
    #19
  20. Keith Thompson wrote:
    >Many years ago, I wrote something like:
    >
    > #define EVER ;;
    > ...
    > for (EVER) {
    > ...
    > }
    >
    >but I got better.


    I am tempted to do that often, because with some compilers this,

    while (1) { ... }

    generates a warning about the "expression being constant", while your
    example is accepted silently.

    --
    Roberto Waltman

    [ Please reply to the group.
    Return address is invalid ]
     
    Roberto Waltman, Jul 15, 2011
    #20