gcc, class forward declarations and destructor calls

Discussion in 'C++' started by Juha Nieminen, Mar 5, 2007.

  1. I once made my own smart pointer implementation for a project and
    at one point fought long and hard to track down a nasty bug. The
    program was not working and I couldn't figure out why.
    The source of the problem was that MSVC++ was not giving me a
    warning even though it should have. The problem was that I was
    creating a smart pointer from a forward-declaration of a class
    which, it seems, makes it impossible for the smart pointer to call
    the destructor of the class properly.
    In other words, I had something like this:

    class SomeClass;

    class AnotherClass
    {
    SmartPtr<SomeClass> ptr;
    ...
    };

    After including the full definition of SomeClass instead of just
    forward-declaring it, it started working.
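
    For concreteness, here is a stripped-down stand-in for the smart pointer
    (this is not the actual SmartPtr.hh; the only detail that matters here is
    that its destructor ends up doing a plain delete on the stored pointer):

    class SomeClass;  // forward declaration only: SomeClass is incomplete

    template<typename T>
    class SmartPtr
    {
     public:
        explicit SmartPtr(T* p = 0): ptr(p) {}
        ~SmartPtr() { delete ptr; }  // deleting an object of incomplete type:
                                     // undefined behaviour if T turns out to
                                     // have a non-trivial destructor or a
                                     // class-specific operator delete
     private:
        T* ptr;
    };

    // When an AnotherClass (as above) is destroyed in a translation unit
    // that has only seen the forward declaration, ~SmartPtr<SomeClass> is
    // instantiated while SomeClass is still incomplete, and in practice the
    // memory is freed without ~SomeClass() ever running -- which is exactly
    // what gcc's warning below describes.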

    Now, I decided to see what gcc says about this. When I did, it was
    way more informative. It said, among other things:

    SmartPtr.hh:80: warning: possible problem detected in invocation
    of delete operator:
    SmartPtr.hh:80: note: neither the destructor nor the class-specific
    operator delete will be called, even if they are declared when the
    class is defined.

    Ok, I can buy that. I assume this agrees with the C++ standard?

    However, now comes the weird stuff: If I compile a test program
    with gcc with either no optimizations or just with "-O" then it
    indeed does not call the destructor of that class. However, if
    I compile with "-O2" or higher, it *does* call the destructor!
    (It still gives the warning, though.)

    I even tried making SomeClass be derived from a base class with
    a virtual destructor, and both destructors were properly called
    when compiling with "-O2" (but none when compiling with "-O" or
    without it).
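
    (The exact test code isn't shown in this post; presumably the setup was
    something along these lines:)

    class BaseClass
    {
     public:
        virtual ~BaseClass();  // virtual destructor in the base class
    };

    class SomeClass: public BaseClass
    {
     public:
        ~SomeClass();          // with -O2 both ~SomeClass() and ~BaseClass()
                               // ran; with -O or no optimization neither did
    };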

    I'm curious: Why?
    Juha Nieminen, Mar 5, 2007
    #1

  2. Lionel B (Guest)

    On Mon, 05 Mar 2007 14:37:55 +0200, Juha Nieminen wrote:

    > I once made my own smart pointer implementation for a project and
    > at one point fought long and hard to track down a nasty bug. The
    > program was not working and I couldn't figure out why.
    > The source of the problem was that MSVC++ was not giving me a
    > warning even though it should have. The problem was that I was
    > creating a smart pointer from a forward-declaration of a class
    > which, it seems, makes it impossible for the smart pointer to call
    > the destructor of the class properly.
    > In other words, I had something like this:
    >
    > class SomeClass;
    >
    > class AnotherClass
    > {
    > SmartPtr<SomeClass> ptr;
    > ...
    > };
    >
    > After including the full definition of SomeClass instead of just
    > forward-declaring it, it started working.
    >
    > Now, I decided to see what gcc says about this. When I did, it was
    > way more informative. It said, among other things:
    >
    > SmartPtr.hh:80: warning: possible problem detected in invocation
    > of delete operator:
    > SmartPtr.hh:80: note: neither the destructor nor the class-specific
    > operator delete will be called, even if they are declared when the
    > class is defined.
    >
    > Ok, I can buy that. I assume this agrees with the C++ standard?
    >
    > However, now comes the weird stuff: If I compile a test program
    > with gcc with either no optimizations or just with "-O" then it
    > indeed does not call the destructor of that class. However, if
    > I compile with "-O2" or higher, it *does* call the destructor!
    > (It still gives the warning, though.)


    Sounds like you are invoking Undefined Behaviour - in which case all bets
    are off and the compiler can do whatever it likes (including defrosting
    your fridge). So not so much "weird", perhaps, as "irrelevant".

    > I even tried making SomeClass be derived from a base class with
    > a virtual destructor, and both destructors were properly called when
    > compiling with "-O2" (but none when compiling with "-O" or without it).
    >
    > I'm curious: Why?


    Who cares? It's UB. My guess might be that something gets inlined
    somewhere or perhaps the order of some function calls is different between
    the two cases.

    --
    Lionel B
    Lionel B, Mar 5, 2007
    #2

  3. JLS (Guest)

    On Mar 5, 8:08 am, Lionel B <> wrote:
    > On Mon, 05 Mar 2007 14:37:55 +0200, Juha Nieminen wrote:
    > > I once made my own smart pointer implementation for a project and
    > > at one point fought long and hard to track down a nasty bug. The
    > > program was not working and I couldn't figure out why.
    > > The source of the problem was that MSVC++ was not giving me a
    > > warning even though it should have. The problem was that I was
    > > creating a smart pointer from a forward-declaration of a class
    > > which, it seems, makes it impossible for the smart pointer to call
    > > the destructor of the class properly.
    > > In other words, I had something like this:

    >
    > > class SomeClass;

    >
    > > class AnotherClass
    > > {
    > > SmartPtr<SomeClass> ptr;
    > > ...
    > > };

    >
    > > After including the full definition of SomeClass instead of just
    > > forward-declaring it, it started working.

    >
    > > Now, I decided to see what gcc says about this. When I did, it was
    > > way more informative. It said, among other things:

    >
    > > SmartPtr.hh:80: warning: possible problem detected in invocation
    > > of delete operator:
    > > SmartPtr.hh:80: note: neither the destructor nor the class-specific
    > > operator delete will be called, even if they are declared when the
    > > class is defined.

    >
    > > Ok, I can buy that. I assume this agrees with the C++ standard?

    >
    > > However, now comes the weird stuff: If I compile a test program
    > > with gcc with either no optimizations or just with "-O" then it
    > > indeed does not call the destructor of that class. However, if
    > > I compile with "-O2" or higher, it *does* call the destructor!
    > > (It still gives the warning, though.)

    >
    > Sounds like you are invoking Undefined Behaviour - in which case all bets
    > are off and the compiler can do whatever it likes (including defrosting
    > your fridge). So not so much "weird", perhaps, as "irrelevant".
    >
    > > I even tried making SomeClass be derived from a base class with
    > > a virtual destructor, and both destructors were properly called when
    > > compiling with "-O2" (but none when compiling with "-O" or without it).

    >
    > > I'm curious: Why?

    >
    > Who cares? It's UB. My guess might be that something gets inlined
    > somewhere or perhaps the order of some function calls is different between
    > the two cases.
    >
    > --
    > Lionel B- Hide quoted text -
    >
    > - Show quoted text -


    It may be undefined behavior, but that doesn't mean that someone
    cannot be curious as to what the compiler is doing. As such, it seems
    like a reasonable question, even if others do not care. It seems to
    me, however, that since this is a compiler-specific question, it would
    be better asked in a forum geared to that particular compiler.

    I would think that a better understanding of the operations of a
    particular compiler would allow users of that compiler to better
    understand issues which they may, or may not have, with that compiler.
    Knowing what error cases a compiler will catch and report, and which
    ones will be ignored, is of obvious use, if you use that compiler. It
    may be undefined behavior, but a good compiler will catch and report
    that. We are long past the stage where "garbage in - garbage out" was
    acceptable to anyone but a newbie. The purpose of a compiler is to
    make it easier for a developer to accomplish a given task. Defrosting
    your fridge may be acceptable by the language definition, but it is
    not acceptable for the implementation of the language. Generating a
    warning, is.

    MSVC++ has several warning levels as well as an option to catch
    warnings that are only detectable when optimization is turned on. I
    would make sure that the warning levels are at 4 and the other option
    turned on. There is also some value in running programs through static
    code analyzers such as PC-Lint, Fortify, Coverity, among others. You
    will find errors in your program. This may be one of them.
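
    Besides compiler warnings and external analyzers, this particular trap
    can also be closed inside the smart pointer itself. The sketch below is
    the usual sizeof-based guard (the same idea as boost::checked_delete),
    not anything from the original SmartPtr: sizeof(T) does not compile for
    an incomplete type, so the bad delete becomes a hard compile error
    instead of silent undefined behavior.

    // Refuse to delete through a pointer to an incomplete type.
    template<typename T>
    inline void checked_delete(T* p)
    {
        // sizeof(T) is ill-formed if T is incomplete (and on compilers that
        // instead evaluate it to 0, the array gets an illegal negative
        // size), so this line fails to compile when T is incomplete.
        typedef char type_must_be_complete[sizeof(T) ? 1 : -1];
        (void) sizeof(type_must_be_complete);
        delete p;
    }

    // A smart pointer's destructor would then call checked_delete(ptr)
    // instead of doing "delete ptr" directly.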
    JLS, Mar 5, 2007
    #3
  4. Lionel B (Guest)

    On Mon, 05 Mar 2007 07:04:40 -0800, JLS wrote:

    > On Mar 5, 8:08 am, Lionel B <> wrote:
    >> On Mon, 05 Mar 2007 14:37:55 +0200, Juha Nieminen wrote:
    >> > I once made my own smart pointer implementation for a project and
    >> > at one point fought long and hard to track down a nasty bug. The
    >> > program was not working and I couldn't figure out why.


    [snip]

    >> > However, now comes the weird stuff: If I compile a test program
    >> > with gcc with either no optimizations or just with "-O" then it
    >> > indeed does not call the destructor of that class. However, if
    >> > I compile with "-O2" or higher, it *does* call the destructor!
    >> > (It still gives the warning, though.)

    >>
    >> Sounds like you are invoking Undefined Behaviour - in which case all bets
    >> are off and the compiler can do whatever it likes (including defrosting
    >> your fridge). So not so much "weird", perhaps, as "irrelevant".
    >>
    >> > I even tried making SomeClass be derived from a base class with
    >> > a virtual destructor, and both destructors were properly called when
    >> > compiling with "-O2" (but none when compiling with "-O" or without it).

    >>
    >> > I'm curious: Why?

    >>
    >> Who cares? It's UB. My guess might be that something gets inlined
    >> somewhere or perhaps the order of some function calls is different between
    >> the two cases.
    >>
    >> --
    >> Lionel B


    Please don't quote sigs.

    >> - Hide quoted text -
    >> - Show quoted text -


    or this stuff.

    > It may be undefined behavior, but that doesn't mean that someone
    > cannot be curious as to what the compiler is doing. As such, it seems
    > like a reasonable question, even if others do not care. It seems to
    > me, however, that since this is a compiler-specific question, it would
    > be better asked in a forum geared to that particular compiler.


    Definitely. The OP's query was technically OT. Your commentary on the
    other hand is, to my mind, borderline on-topic ;)

    > I would think that a better understanding of the operations of a
    > particular compiler would allow users of that compiler to better
    > understand issues which they may, or may not have, with that compiler.
    > Knowing what error cases a compiler will catch and report, and which
    > ones will be ignored, is of obvious use, if you use that compiler. It
    > may be undefined behavior, but a good compiler will catch and report
    > that.


    As did one of the OP's compilers...

    > We are long past the stage where "garbage in - garbage out" was
    > acceptable to anyone but a newbie. The purpose of a compiler is to
    > make it easier for a developer to accomplish a given task. Defrosting
    > your fridge may be acceptable by the language definition, but it is
    > not acceptable for the implementation of the language. Generating a
    > warning, is.


    Agreed. However, in practice, since a compiler cannot possibly report every
    instance of potential UB, I believe it is actually dangerous to rely on
    your compiler to detect badly-written or semantically incorrect code (and
    bad form to whinge about it when your compiler fails to nanny you to
    your satisfaction).

    > MSVC++ has several warning levels as well as an option to catch
    > warnings that are only detectable when optimization is turned on. I
    > would make sure that the warning levels are at 4 and the other option
    > turned on. There is also some value in running programs through static
    > code analyzers such as PC-Lint, Fortify, Coverity, among others. You
    > will find errors in your program. This may be one of them.


    Sure, I always have my warning level turned up to 11 (and if possible set
    my compiler to treat warnings as errors). But bugs still get through and
    even if it makes me feel better I cannot in all honesty blame the
    compiler for not spotting my own cruddy coding errors.

    --
    Lionel B
    Lionel B, Mar 5, 2007
    #4