#define

Discussion in 'C++' started by srinivas reddy, Jul 8, 2003.

  1. I have defined some variables using the #define preprocessor directive,
    and later I checked whether a given variable had been defined. Sometimes,
    even though a variable has been defined, the #if !defined(var) construct
    returns true. I am using gcc 3.0.1 on SunOS 8. I would appreciate it if
    anybody could tell me whether this is a bug or I am doing something
    wrong.

    tia,
    Srinivas
    srinivas reddy, Jul 8, 2003
    #1

  2. srinivas reddy

    David White Guest

    srinivas reddy <> wrote in message
    news:...
    > I have defined some variables using the #define preprocessor directive,
    > and later I checked whether a given variable had been defined. Sometimes,
    > even though a variable has been defined, the #if !defined(var) construct
    > returns true. I am using gcc 3.0.1 on SunOS 8. I would appreciate it if
    > anybody could tell me whether this is a bug or I am doing something
    > wrong.


    I would have said that this surely is a bug, but I wouldn't put anything
    past the C++ preprocessor.

    Incomprehensibly, #if var1 == var2 simply converts var1 and var2 to "0"
    (yes, "0", even though the preprocessor is a _text_ replacer) if it hasn't
    come across definitions of them (something like Basic assuming that any
    undefined variable it comes across must be an int; and I thought C++ got rid
    of implicit this and implicit that because they are thought unsafe). 0 == 0
    is true, of course.

    I can only assume that those with influence who wish to see the end of the
    preprocessor altogether are trying to accelerate its death by ensuring that
    it works as badly as possible.

    DW
    David White, Jul 8, 2003
    #2

  3. srinivas reddy

    Pete Becker Guest

    David White wrote:
    >
    > I can only assume that those with influence who wish to see the end of the
    > preprocessor altogether are trying to accelerate its death by ensuring that
    > it works as badly as possible.
    >


    Then it must be that the folks who originally came up with the idea of
    the preprocessor thirty years ago tried to accelerate its death, because
    replacing undefined symbols with 0 in arithmetic expressions has been
    the rule since the beginning.
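
    That rule is easy to check in a single file; the macro name below is
    invented purely for the illustration:

```cpp
// NEVER_DEFINED_ANYWHERE is not #defined anywhere, so inside the #if
// expression the preprocessor replaces it with 0, and 0 == 0 selects
// the first branch -- exactly the behavior described above.
#if NEVER_DEFINED_ANYWHERE == 0
const int branch_taken = 1;   // this branch is compiled
#else
const int branch_taken = 2;
#endif

// defined() is the reliable way to ask whether a macro exists:
#if defined(NEVER_DEFINED_ANYWHERE)
const int was_defined = 1;
#else
const int was_defined = 0;    // this branch is compiled
#endif
```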

    --

    Pete Becker
    Dinkumware, Ltd. (http://www.dinkumware.com)
    Pete Becker, Jul 8, 2003
    #3
  4. srinivas reddy

    Howard Guest

    "srinivas reddy" <> wrote in message
    news:...
    > I have defined some variables using the #define preprocessor directive,
    > and later I checked whether a given variable had been defined. Sometimes,
    > even though a variable has been defined, the #if !defined(var) construct
    > returns true. I am using gcc 3.0.1 on SunOS 8. I would appreciate it if
    > anybody could tell me whether this is a bug or I am doing something
    > wrong.
    >
    > tia,
    > Srinivas


    Perhaps you have defined the variable, but not in the compilation unit in
    which your #if statement is written? In other words, if you #define the
    variable in unit1.h, and make your check in unit2.cpp, then you need to add
    #include "unit1.h" in unit2.cpp before checking if the variable exists.
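
    A single-file sketch of that ordering (names invented): the preprocessor
    only sees definitions that appear earlier in the translation unit, so a
    check placed above the #define behaves as though the macro does not exist.

```cpp
// Check BEFORE the definition: the macro is not visible here yet.
#if defined(CONFIG_FLAG)
const int seen_before = 1;
#else
const int seen_before = 0;   // selected: definition hasn't been seen
#endif

#define CONFIG_FLAG 1        // stands in for a #define pulled in from a header

// Check AFTER the definition: now defined() reports it.
#if defined(CONFIG_FLAG)
const int seen_after = 1;    // selected
#else
const int seen_after = 0;
#endif
```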

    (I usually put my #define's in a precompiled header if they're going to be
    widely used in my projects. But that's with CodeWarrior...I don't know how
    to use gcc so it may be different.)

    Just a thought...

    -Howard
    Howard, Jul 8, 2003
    #4
  5. David White wrote:
    > srinivas reddy <> wrote in message
    > news:...
    >> I have defined some variables using the #define preprocessor directive,
    >> and later I checked whether a given variable had been defined. Sometimes,
    >> even though a variable has been defined, the #if !defined(var) construct
    >> returns true. I am using gcc 3.0.1 on SunOS 8. I would appreciate it if
    >> anybody could tell me whether this is a bug or I am doing something
    >> wrong.

    >
    > I would have said that this surely is a bug, but I wouldn't put
    > anything past the C++ preprocessor.
    >
    > Incomprehensibly, #if var1 == var2 simply converts var1 and var2 to
    > "0" (yes, "0", even though the preprocessor is a _text_ replacer) if
    > it hasn't come across definitions of them (something like Basic
    > assuming that any undefined variable it comes across must be an int;
    > and I thought C++ got rid of implicit this and implicit that because
    > they are thought unsafe). 0 == 0 is true, of course.
    >
    > I can only assume that those with influence who wish to see the end
    > of the preprocessor altogether are trying to accelerate its death by
    > ensuring that it works as badly as possible.


    It isn't the preprocessor that is bad--even the conversion to 0 that you mention
    here. It is *misuse* of the preprocessor that is bad. The preprocessor is
    actually a critical component of the C and C++ compilation process. It makes it
    possible to write code that works on multiple platforms, as well as write code
    that works on various current compilers (as opposed to the idealistic perfect
    C++ implementation).

    Regards,
    Paul Mensonides
    Paul Mensonides, Jul 8, 2003
    #5
  6. srinivas reddy

    David White Guest

    Pete Becker <> wrote in message
    news:...
    > David White wrote:
    > >
    > > I can only assume that those with influence who wish to see the end of
    > > the preprocessor altogether are trying to accelerate its death by
    > > ensuring that it works as badly as possible.
    > >

    >
    > Then it must be that the folks who originally came up with the idea of
    > the preprocessor thirty years ago tried to accelerate its death, because
    > replacing undefined symbols with 0 in arithmetic expressions has been
    > the rule since the beginning.


    I accept that, but why hasn't it been fixed along with everything else?
    Implicit int, matching of function argument types, insistence that function
    definitions be present, etc. have been some of the many improvements to C++
    since C. I don't think anyone disputes that these are all good things. The
    more programmer errors you can detect at compile time the better. Why leave
    something there that's so obviously bad?

    DW
    David White, Jul 8, 2003
    #6
  7. srinivas reddy

    Pete Becker Guest

    David White wrote:
    >
    > Pete Becker <> wrote in message
    > news:...
    > > David White wrote:
    > > >
    > > > I can only assume that those with influence who wish to see the end of
    > > > the preprocessor altogether are trying to accelerate its death by
    > > > ensuring that it works as badly as possible.
    > > >

    > >
    > > Then it must be that the folks who originally came up with the idea of
    > > the preprocessor thirty years ago tried to accelerate its death, because
    > > replacing undefined symbols with 0 in arithmetic expressions has been
    > > the rule since the beginning.

    >
    > I accept that, but why hasn't it been fixed along with everything else?


    Because it's not broken.

    > Implicit int, matching of function argument types, insistence that function
    > definitions be present, etc. have been some of the many improvements to C++
    > since C. I don't think anyone disputes that these are all good things. The
    > more programmer errors you can detect at compile time the better. Why leave
    > something there that's so obviously bad?
    >


    The fact that you don't understand it doesn't make it bad.

    --

    Pete Becker
    Dinkumware, Ltd. (http://www.dinkumware.com)
    Pete Becker, Jul 8, 2003
    #7
  8. srinivas reddy

    David White Guest

    Pete Becker <> wrote in message
    news:...
    > David White wrote:
    > >
    > >
    > > I accept that, but why hasn't it been fixed along with everything else?

    >
    > Because it's not broken.
    >
    > > Implicit int, matching of function argument types, insistence that
    > > function definitions be present, etc. have been some of the many
    > > improvements to C++ since C. I don't think anyone disputes that these
    > > are all good things. The more programmer errors you can detect at
    > > compile time the better. Why leave something there that's so obviously
    > > bad?
    > >

    > The fact that you don't understand it doesn't make it bad.


    What have I said that indicates that I don't understand it? Did I describe
    it wrongly?

    I'm interested to know: do you think that assuming that an undefined
    preprocessor symbol is "0" is a good thing, or something that wouldn't be
    improved by a compiler error saying that the symbol is undefined? If so, why
    not extend the principle to assuming that any symbol in a C++ expression is
    an 'int'?

    myVariable = 7;
    // myVariable not defined anywhere: so it must be an 'int'.

    Okay?

    myVariable = myFunction(3, "abc", 2.65);
    // myFunction not defined anywhere: so it must be
    // int myFunction(int, char *, double);

    Okay?

    DW
    David White, Jul 9, 2003
    #8
  9. srinivas reddy

    Pete Becker Guest

    David White wrote:
    >
    > Pete Becker <> wrote in message
    > news:...
    > > David White wrote:
    > > >
    > > >
    > > > I accept that, but why hasn't it been fixed along with everything else?

    > >
    > > Because it's not broken.
    > >
    > > > Implicit int, matching of function argument types, insistence that
    > > > function definitions be present, etc. have been some of the many
    > > > improvements to C++ since C. I don't think anyone disputes that
    > > > these are all good things. The more programmer errors you can detect
    > > > at compile time the better. Why leave something there that's so
    > > > obviously bad?
    > > >

    > > The fact that you don't understand it doesn't make it bad.

    >
    > What have I said that indicates that I don't understand it? Did I describe
    > it wrongly?


    You said earlier that the preprocessor is "a _text_ replacer."

    >
    > I'm interested to know: do you think that assuming that an undefined
    > preprocessor symbol is "0" is a good thing, or something that wouldn't be
    > improved by a compiler error saying that the symbol is undefined?


    No. It would make some things much more verbose, and would only help
    beginners.

    > If so, why
    > not extend the principle to assuming that any symbol in a C++ expression is
    > an 'int'?


    Non sequitur.

    --

    Pete Becker
    Dinkumware, Ltd. (http://www.dinkumware.com)
    Pete Becker, Jul 9, 2003
    #9
  10. srinivas reddy

    David White Guest

    Pete Becker <> wrote in message
    news:...
    > David White wrote:
    > >
    > > What have I said that indicates that I don't understand it? Did I
    > > describe it wrongly?

    >
    > You said earlier that the preprocessor is "a _text_ replacer."


    Yes, and that statement was _clearly_ made in the context of #define and
    #if.

    #define X Y

    Doesn't this replace the symbol 'X' found anywhere in the source code with
    the text 'Y'?

    Also: "Because they rearrange the program text before the compiler proper
    sees it, macros are..." - The C++ Programming Language (3rd ed.), page 160.

    Given that macros _do_ replace text, why should an undefined symbol become
    '0' rather than ''?

    > > I'm interested to know: do you think that assuming that an undefined
    > > preprocessor symbol is "0" is a good thing, or something that wouldn't
    > > be improved by a compiler error saying that the symbol is undefined?

    >
    > No. It would make some things much more verbose,


    Such as?

    And is the increased verbosity worse than no message from the compiler when
    a symbol is used without having been defined?

    Speaking of verbosity, the way to ensure that preprocessor symbols are not
    silently converted to 0 is:

    #if !defined(REACTOR_TYPE) || !defined(REACTOR_NEW_MODEL)
    #error REACTOR_TYPE or REACTOR_NEW_MODEL not defined
    #endif

    Apart from the fact that if one remembers to do this then one would have
    ensured that the symbols were defined, is it not verbose to place this in
    every source file in which these symbols are used?
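
    For illustration, with both symbols supplied (the values below are
    invented) the guard compiles cleanly; deleting either #define makes the
    #error fire at compile time instead of silently comparing 0 == 0:

```cpp
#define REACTOR_TYPE       2   // invented values standing in for project config
#define REACTOR_NEW_MODEL  2

// The guard: fail the build loudly if either symbol was forgotten.
#if !defined(REACTOR_TYPE) || !defined(REACTOR_NEW_MODEL)
#error REACTOR_TYPE or REACTOR_NEW_MODEL not defined
#endif

// With the guard satisfied, this comparison is meaningful rather than
// a silent 0 == 0 between two forgotten symbols.
#if REACTOR_TYPE == REACTOR_NEW_MODEL
const int is_new_model = 1;
#else
const int is_new_model = 0;
#endif
```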

    > and would only help
    > beginners.


    I see. So, only beginners would ever forget to ensure that both of these are
    #defined somewhere?

    #if REACTOR_TYPE == REACTOR_NEW_MODEL

    > > If so, why
    > > not extend the principle to assuming that any symbol in a C++
    > > expression is an 'int'?

    >
    > Non sequitur.


    void f(int reactorType)
    {
        // No definition of REACTOR_NEW_MODEL given
        if(reactorType == REACTOR_NEW_MODEL)
        {
            // ...
        }
    }

    Why should this be an error, but not the preprocessor version?

    DW
    David White, Jul 9, 2003
    #10
  11. srinivas reddy

    David White Guest

    "Paul Mensonides" <> wrote in message
    news:ZoHOa.12065$H17.3639@sccrnsc02...
    > David White wrote:
    > > Incomprehensibly, #if var1 == var2 simply converts var1 and var2 to
    > > "0" (yes, "0", even though the preprocessor is a _text_ replacer) if
    > > it hasn't come across definitions of them (something like Basic
    > > assuming that any undefined variable it comes across must be an int;
    > > and I thought C++ got rid of implicit this and implicit that because
    > > they are thought unsafe). 0 == 0 is true, of course.
    > >
    > > I can only assume that those with influence who wish to see the end
    > > of the preprocessor altogether are trying to accelerate its death by
    > > ensuring that it works as badly as possible.

    >
    > It isn't the preprocessor that is bad--even the conversion to 0 that you
    > mention here.


    Well, I think the conversion to 0 _is_ bad. Given that you can use #ifdef or
    #if defined() for things such as:
    #ifdef __cplusplus

    how can the implicit conversion to 0 of an undefined symbol be a good thing?
    Why is it better than issuing an error?

    > It is *misuse* of the preprocessor that is bad. The preprocessor is
    > actually a critical component of the C and C++ compilation process.


    I agree. That's why I'd like it to work _safely_.

    > It makes it
    > possible to write code that works on multiple platforms, as well as
    > write code that works on various current compilers (as opposed to the
    > idealistic perfect C++ implementation).


    Yes, but I want to do it safely. I do not want the outcome of an #if to be
    one of these two possibilities:
    1. The result of the expression of previously defined symbols.
    2. A programmer's mistake in forgetting to include the defined symbols.

    This is inherently unsafe. The possibility of no. 2 is the reason that C++
    insists on all function definitions being present and that there is a
    suitable match for every argument. Does not one other person here think that
    this is a problem?

    DW
    David White, Jul 9, 2003
    #11
  12. David White wrote:
    [...]
    > This is inherently unsafe. The possibility of no. 2 is the reason that C++
    > insists on all function definitions being present and that there is a
    > suitable match for every argument. Does not one other person here think that
    > this is a problem?


    http://groups.google.com/groups?threadm=
    (Subject: Using a define that hasn't been #defined)

    regards,
    alexander.

    --
    "Status quo, you know, that is Latin for ``the mess we're in.''"

    -- Ronald Reagan
    Alexander Terekhov, Jul 9, 2003
    #12
  13. srinivas reddy

    Ron Natalie Guest

    "David White" <> wrote in message news:KbLOa.8912$...

    >
    > Doesn't this replace the symbol 'X' found anywhere in the source code with
    > the text 'Y'?


    No, it doesn't. Pete is right, you seem not to understand the preprocessor.

    > Given that macros _do_ replace text, why should an undefined symbol become
    > '0' rather than ''?


    They do not replace text, they replace tokens.
    Ron Natalie, Jul 9, 2003
    #13
  14. Ron Natalie wrote:
    > "David White" <> wrote in message
    > news:KbLOa.8912$...
    >
    >>
    >> Doesn't this replace the symbol 'X' found anywhere in the source
    >> code with the text 'Y'?

    >
    > No, it doesn't. Pete is right, you seem not to understand the
    > preprocessor.
    >
    >> Given that macros _do_ replace text, why should an undefined symbol
    >> become '0' rather than ''?

    >
    > They do not replace text, they replace tokens.


    Even more specifically, they replace macro invocations:

    #define X() Y

    X // X
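
    The distinction shows up directly with the usual two-step stringizing
    helpers: with parentheses the invocation expands to Y, while the bare
    name X is left untouched (helper names here are the conventional idiom,
    not anything from the thread):

```cpp
#define X() Y

// Standard two-step stringizing so macro arguments are expanded first.
#define STR2(a) #a
#define STR(a)  STR2(a)

// X() is an invocation of the function-like macro, so it expands to Y
// before being stringized; bare X is not an invocation and survives as-is.
const char* with_parens    = STR(X());  // "Y"
const char* without_parens = STR(X);    // "X"
```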

    Regards,
    Paul Mensonides
    Paul Mensonides, Jul 9, 2003
    #14
  15. David White wrote:

    >> It isn't the preprocessor that is bad--even the conversion to 0 that
    >> you mention here.

    >
    > Well, I think the conversion to 0 _is_ bad. Given that you can use
    > #ifdef or #if defined() for things such as:
    > #ifdef __cplusplus
    >
    > how can the implicit conversion to 0 of an undefined symbol be a good
    > thing? Why is it better than issuing an error?


    Because it is a "reasonable default." Reasonable defaults make code less
    verbose. This happens with templates also:

    template<class T> void f(int reactorType)
    {
        // No definition of REACTOR_NEW_MODEL given
        if(reactorType == T::REACTOR_NEW_MODEL)
        {
            // ...
        }
    }

    The compiler will pass this with no problem even though it still parses the
    expression, etc.. The reasonable default here is "non-type". The point being
    that the language has to deal with unknown names and make assumptions about what
    they mean in various places.

    That is just the way it is. You know what the behavior is, so writing "safe"
    code is up to you to use the language in safe ways. C and C++ certainly don't
    protect you from unsafe usage in many areas, why should they do that here?

    If you changed the behavior to an error, how would you do this in a non-verbose
    way:

    # if !__cplusplus && __STDC_VERSION__ >= 199901L

    You'd have to do something really annoying because you cannot use any
    conditional test that uses the name outside the defined operator. You can't
    even do this:

    #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L

    ...because that constitutes an error under your model if __STDC_VERSION__ is not
    defined. You'd have to separate the test for definition from the conditional
    expression:

    # if !defined __cplusplus && defined __STDC_VERSION__
    # if __STDC_VERSION__ >= 199901L
    # // 1
    # else
    # // 2
    # endif
    # else
    # // 2
    # endif

    ...and that is a code doubler for point 2.

    If you changed the behavior to expanding to nil instead of 0, you'd have silent
    changes in other ways. You'd also end up seeing a lot of "hacks" like this:

    # if !(__cplusplus+0) && (__STDC_VERSION__+0) >= 199901L

    In order to simulate the common scenario that we already have built into the
    preprocessor.
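
    The +0 trick is already familiar today from a related case: a macro
    defined with an empty replacement, where #if FLAG alone is a syntax error
    but FLAG+0 quietly evaluates to 0 (the macro name is invented for the
    sketch):

```cpp
#define EMPTY_FLAG            // defined, but with an empty replacement

// #if EMPTY_FLAG             // would not compile: expands to an empty expression
#if EMPTY_FLAG+0              // expands to +0, which evaluates to 0
const int flag_value = 1;
#else
const int flag_value = 0;     // selected
#endif

#if defined(EMPTY_FLAG)       // defined() still reports the definition
const int flag_defined = 1;   // selected
#else
const int flag_defined = 0;
#endif
```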

    >> It is *misuse* of the preprocessor that is bad. The preprocessor is
    >> actually a critical component of the C and C++ compilation process.

    >
    > I agree. That's why I'd like it to work _safely_.


    It does work safely if used correctly. I said it before already: The #if and
    #elif directives are not designed to implicitly perform the kind of verification
    that you want--because that kind of verification (if done by default) is
    downright annoying.

    Further, the root problem here is 1) forgetting to include a file, or 2) design
    error. Assuming that it is just a case of forgetting to include the file that
    defines the symbols, there are many ways in which a program can silently change
    meaning in C++ by not including a file (e.g. silently choosing different
    function overloads or different template specializations).

    >> It makes it
    >> possible to write code that works on multiple platforms, as well as
    >> write code that works on various current compilers (as opposed to
    >> the idealistic perfect C++ implementation).

    >
    > Yes, but I want to do it safely. I do not want the outcome of an #if
    > to be one of these two possibilities:
    > 1. The result of the expression of previously defined symbols.


    It is totally ill-conceived. You can do what you want reasonably, but you
    cannot do what it already does reasonably. You already have the option to do
    what you want:

    #if defined(REACTOR_TYPE) \
    && defined(REACTOR_NEW_MODEL) \
    && REACTOR_TYPE == REACTOR_NEW_MODEL

    You can't go back the other way.

    > 2. A programmer's mistake in forgetting to include the defined
    > symbols.
    >
    > This is inherently unsafe.


    No it isn't _inherently_ unsafe. It can be unsafe in certain contexts, and you
    have to be aware of that when you write code. However, the alternative is much
    worse. You can simulate what you want with a small amount of code; you cannot
    simulate what it already does with a small amount of code.

    > The possibility of no. 2 is the reason
    > that C++ insists on all function definitions being present and that
    > there is a suitable match for every argument. Does not one other
    > person here think that this is a problem?


    C++ does not insist that all function declarations you've defined in a group
    of files be present at each overload resolution--which can cause silent
    differences in overload resolution, etc.

    Regards,
    Paul Mensonides
    Paul Mensonides, Jul 9, 2003
    #15
  16. srinivas reddy

    David White Guest

    Ron Natalie <> wrote in message
    news:3f0c2841$0$87915$...
    >
    > "David White" <> wrote in message

    news:KbLOa.8912$...
    >
    > >
    > > Doesn't this replace the symbol 'X' found anywhere in the source code

    with
    > > the text 'Y'?

    >
    > No, it doesn't. Pete is right, you seem not to understand the

    preprocessor.

    I do. I just didn't use precise enough language in this pedantic place. I am
    aware that the 'X' in 'MAX' would not be replaced by 'Y'.

    > > Given that macros _do_ replace text, why should an undefined symbol
    > > become '0' rather than ''?

    >
    > They do not replace text, they replace tokens.


    Okay. Why didn't you correct Stroustrup's loose language as well? :)

    Getting off the track.

    DW
    David White, Jul 10, 2003
    #16
  17. srinivas reddy

    David White Guest

    Paul Mensonides <> wrote in message
    news:k1%Oa.19433$Ph3.1902@sccrnsc04...
    > David White wrote:
    >
    > >> It isn't the preprocessor that is bad--even the conversion to 0 that
    > >> you mention here.

    > >
    > > Well, I think the conversion to 0 _is_ bad. Given that you can use
    > > #ifdef or #if defined() for things such as:
    > > #ifdef __cplusplus
    > >
    > > how can the implicit conversion to 0 of an undefined symbol be a good
    > > thing? Why is it better than issuing an error?

    >
    > Because it is a "reasonable default." Reasonable defaults make code less
    > verbose. This happens with templates also:


    In the case of an expression of the type "#if var1 == var2" it is not a
    reasonable default.

    Even if there is a case in some instances for 'var1' to have an assumed
    value of 0, it's hard to see why 'var2' being undefined is not always an
    error.

    > template<class T> void f(int reactorType)
    > {
    > // No definition of REACTOR_NEW_MODEL given
    > if(reactorType == T::REACTOR_NEW_MODEL)
    > {
    > // ...
    > }
    > }
    >
    > The compiler will pass this with no problem even though it still parses
    > the expression, etc. The reasonable default here is "non-type". The
    > point being that the language has to deal with unknown names and make
    > assumptions about what they mean in various places.


    The programmer no doubt knows that T might not have REACTOR_NEW_MODEL
    defined, and would find it a pain, and downright wrong, to have to define it
    for all possible T. But that is not analogous to #defining certain symbols
    once, which does not suffer the problem of polluting a bunch of classes with
    a name that has nothing to do with them.

    > That is just the way it is. You know what the behavior is, so writing
    > "safe" code is up to you to use the language in safe ways. C and C++
    > certainly don't protect you from unsafe usage in many areas, why should
    > they do that here?
    >
    > If you changed the behavior to an error, how would you do this in a
    > non-verbose way:
    >
    > # if !__cplusplus && __STDC_VERSION__ >= 199901L
    >
    > You'd have to do something really annoying because you cannot use any
    > conditional test that uses the name outside the defined operator. You
    > can't even do this:
    >
    > #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
    >
    > ...because that constitutes an error under your model if __STDC_VERSION__
    > is not defined. You'd have to separate the test for definition from the
    > conditional expression:
    >
    > # if !defined __cplusplus && defined __STDC_VERSION__
    > # if __STDC_VERSION__ >= 199901L
    > # // 1
    > # else
    > # // 2
    > # endif
    > # else
    > # // 2
    > # endif


    How about:
    # if defined __cplusplus || __STDC_VERSION__ < 199901L
    // 2
    #else
    // 1
    #endif

    The compiler can ignore __STDC_VERSION__ if __cplusplus is true because it's
    redundant.

    > ...and that is a code doubler for point 2.
    >
    > If you changed the behavior to expanding to nil instead of 0, you'd have
    > silent changes in other ways. You'd also end up seeing a lot of "hacks"
    > like this:
    >
    > # if !(__cplusplus+0) && (__STDC_VERSION__+0) >= 199901L
    >
    > In order to simulate the common scenario that we already have built into
    > the preprocessor.
    >
    > >> It is *misuse* of the preprocessor that is bad. The preprocessor is
    > >> actually a critical component of the C and C++ compilation process.

    > >
    > > I agree. That's why I'd like it to work _safely_.

    >
    > It does work safely if used correctly. I said it before already: The #if
    > and #elif directives are not designed to implicitly perform the kind of
    > verification that you want--because that kind of verification (if done
    > by default) is downright annoying.
    >
    > Further, the root problem here is 1) forgetting to include a file, or 2)
    > design error. Assuming that it is just a case of forgetting to include
    > the file that defines the symbols, there are many ways in which a
    > program can silently change meaning in C++ by not including a file
    > (e.g. silently choosing different function overloads or different
    > template specializations).
    >
    > >> It makes it
    > >> possible to write code that works on multiple platforms, as well as
    > >> write code that works on various current compilers (as opposed to
    > >> the idealistic perfect C++ implementation).

    > >
    > > Yes, but I want to do it safely. I do not want the outcome of an #if
    > > to be one of these two possibilities:
    > > 1. The result of the expression of previously defined symbols.

    >
    > It is totally ill-conceived. You can do what you want reasonably, but you
    > cannot do what it already does reasonably. You already have the option
    > to do what you want:
    >
    > #if defined(REACTOR_TYPE) \
    > && defined(REACTOR_NEW_MODEL) \
    > && REACTOR_TYPE == REACTOR_NEW_MODEL


    That doesn't do what I want, because it doesn't give an error under any
    circumstance. In my case it's one model or the other or it's an error. No
    model at all is an error.

    > You can't go back the other way.
    >
    > > 2. A programmer's mistake in forgetting to include the defined
    > > symbols.
    > >
    > > This is inherently unsafe.

    >
    > No it isn't _inherently_ unsafe. It can be unsafe in certain contexts,
    > and you have to be aware of that when you write code. However, the
    > alternative is much worse. You can simulate what you want with a small
    > amount of code; you cannot simulate what it already does with a small
    > amount of code.
    >
    > > The possibility of no. 2 is the reason
    > > that C++ insists on all function definitions being present and that
    > > there is a suitable match for every argument. Does not one other
    > > person here think that this is a problem?

    >
    > C++ does not insist that all function declarations you've defined in a
    > group of files be present at each overload resolution--which can cause
    > silent differences in overload resolution, etc.


    Yes, but how likely is it really that function(int) and function(float) are
    going to be defined separately?

    You've raised a lot of good points. I'm not trying to suggest that you can
    eliminate all sources of error. I just ask for a way to eliminate what I
    believe is a _highly likely_ source of error in a project with a
    considerable amount of conditional compiling.

    Scenario (similar to my real case but on a larger scale):
    A large project has to be available in a "Lite" form, which has just been given
    the green light by management. The only practicable way to do it is to
    excise certain features by conditional compilation. You examine the hundreds
    of source files, searching for certain keywords perhaps, to locate all those
    places that require conditional compilation. There are a total of 173 source
    and header files that require the addition of some sort of preprocessor
    condition, say using LITE_VERSION and NORMAL_VERSION (or just #ifdef
    NORMAL_VERSION #else, or whatever you prefer). How would you ensure that all
    code for a given version is included, and all code for the other version is
    excluded, when either version of the project is built?

    DW
    David White, Jul 10, 2003
    #17
  18. srinivas reddy

    David White Guest

    Sam Holden <> wrote in message
    news:...
    > On Thu, 10 Jul 2003 12:31:36 +1000,
    > David White <> wrote:
    > >
    > > The main problem, as I see it, is that you are tediously going through
    > > the source files, adding something like:
    > > #if BUILDTYPE == NORMAL_VERSION
    > > or
    > > #ifdef NORMAL_VERSION
    > > in all the appropriate places.
    > >
    > > You do this over days, in 173 different files. Apparently, you are
    > > expected not to forget, in all 173 files, to include the file that
    > > contains the definitions of the relevant symbols. Furthermore, you
    > > have other programmers maintaining your code after you. They notice
    > > these #ifs in your code and add them themselves to their code. They,
    > > too, must never forget to include the appropriate header, day after
    > > day, new source file after new source file, or they will silently
    > > compile the wrong code.
    > >
    > > I am simply saying that this is a deficiency in a language in which
    > > compile-time checking is held up as one of its major virtues.

    >
    > The preprocessor is a low level processor.
    >
    > By using it for such things, instead of using C++ itself you are trading
    > a strongly typed language with compile time checking for a simple text
    > processor which substitutes tokens and has primitive selection syntax.
    >
    >
    > If you write C++ in which all your pointers are void* and you cast (with
    > C-style casts) to the appropriate type when you use them you lose the
    > benefit of strong typing. It's still C++ though, but you can't complain
    > about the lack of compile time checking.
    >
    > The preprocessor is an old language and its behaviour is reasonably
    > well understood, and lots of existing code relies on the behaviour you
    > don't like.
    >
    > It's not going to change. No matter how much you complain about it.


    Yes, I realized that very quickly. I can't even get anyone here to agree
    that anything needs to be fixed.

    > Using #if in anything but a header file is madness in my opinion. I
    > guess using it to give some platform specific optimisation might be OK,
    > but I'd try and find a way to move the #if itself into a header.


    You can't always use C++ itself. How, for example, do you make a declaration
    depend on whether you are compiling a NORMAL or LITE version of the
    software? Suppose you also have EXTRA_LITE and PREMIUM versions. And suppose
    the following constant has a different value for each one:

    #if MODEL == EXTRA_LITE
    const int NrWheels = 2;
    #elif MODEL == LITE
    const int NrWheels = 4;
    // etcetera

    I agree with you about the preprocessor. It's horrible and I wouldn't use it
    if I could easily avoid it.

    DW
    David White, Jul 10, 2003
    #18
  19. srinivas reddy

    David White Guest

    Alexander Terekhov <> wrote in message
    news:...
    >
    > David White wrote:
    > [...]
    > > This is inherently unsafe. The possibility of no. 2 is the reason that C++
    > > insists on all function definitions being present and that there is a
    > > suitable match for every argument. Does not one other person here think that
    > > this is a problem?

    >
    > http://groups.google.com/groups?threadm=
    > (Subject: Using a define that hasn't been #defined)


    Thanks, Alexander. I am not alone after all.

    DW
    David White, Jul 10, 2003
    #19
  20. srinivas reddy

    Sam Holden Guest

    On Thu, 10 Jul 2003 13:35:24 +1000, David White <> wrote:
    > Sam Holden <> wrote in message
    > news:...
    >> On Thu, 10 Jul 2003 12:31:36 +1000,
    >> David White <> wrote:
    >> >

    >
    >> Using #if in anything but a header file is madness in my opinion. I
    >> guess using it to give some platform specific optimisation might be OK,
    >> but I'd try and find a way to move the #if itself into a header.

    >
    > You can't always use C++ itself. How, for example, do you make a declaration
    > depend on whether you are compiling a NORMAL or LITE version of the
    > software? Suppose you also have EXTRA_LITE and PREMIUM versions. And suppose
    > the following constant has a different value for each one:
    >
    > #if MODEL == EXTRA_LITE
    > const int NrWheels = 2;
    > #elif MODEL == LITE
    > const int NrWheels = 4;
    > // etcetera


    By moving those values into the header file. They are obviously in some
    way dependent on the MODEL, so having them in one place rather than
    scattered throughout many source files is nice (even if they are in fact
    only used once - it unifies the location of all such data).

    #define MODEL_NRWHEELS 2
    or
    #define MODEL_NRWHEELS 4

    which get set by the appropriate model specific header.

    and then

    const int NrWheels = MODEL_NRWHEELS;

    This has the disadvantage that when the code is changed, sometimes
    the model header files will have to be modified as well. But it
    has the advantage that when adding another model you will
    hopefully just have to add the appropriate header file, set
    the appropriate define, and all the code now works - as opposed
    to finding all those #ifs and adding #elif MODEL == NEW_MODEL.
    Of course, in practice things never work out so neatly and you have
    to modify a bit of the code.

    But I'd personally be using the C++ type system rather than
    #defines to tell things apart. I've been bitten by inconsistent
    #define settings in different compile runs or files in C
    too many times.

    --
    Sam Holden
    Sam Holden, Jul 10, 2003
    #20
