wrong compiler warning?

zeppe

Hi all,

I have a problem. The code that follows produces a warning in both gcc
and Visual C++. However, I think it's correct: basically, there is a
function that returns an object of a derived class, which is bound to a
base-class reference to delay the destruction of the actual object
until the end of the reference's scope. Actually, I don't use the
reference: the code that matters is in the destructor, and I want it to
be executed at the end of the method.

Really strange: both compilers warn me about the "useless" reference...
I'm afraid that if the compilers "think" the reference is useless, they
could try some dangerous optimization...

Here follows a simple example:


#include <iostream>

class Base
{
public:
    Base() { std::cout << "Base\n"; std::cout.flush(); }
    virtual ~Base() { std::cout << "~Base\n"; std::cout.flush(); }

    virtual const char* Name() const { return "Base"; }
};

class Derived : public Base
{
public:
    Derived() { std::cout << "Derived\n"; std::cout.flush(); }
    ~Derived() { std::cout << "~Derived\n"; std::cout.flush(); }

    const char* Name() const { return "Derived"; }
};

Derived foo()
{
    std::cout << "foo function\n"; std::cout.flush();
    return Derived();
}

int main()
{
    std::cout << "begin of the program\n"; std::cout.flush();
    const Base& obj = foo();
    std::cout << "this is the program body\n"; std::cout.flush();

    return 0;
}




Thank you,

Zeppe
 
Victor Bazarov

zeppe said:
I have a problem. The code that follows produces a warning in both gcc
and Visual C++. However, I think it's correct: basically, there is a
function that returns an object of a derived class, which is bound to a
base-class reference to delay the destruction of the actual object
until the end of the reference's scope. Actually, I don't use the
reference: the code that matters is in the destructor, and I want it to
be executed at the end of the method.

Just declare an object instead of a reference.

V
 
zeppe

Victor said:
Just declare an object instead of a reference.

V

If I declare an object instead of a reference, I lose polymorphism.

i.e., output with reference:

begin of the program
foo function
Base
Derived
this is the program body
~Derived
~Base


output with object:

begin of the program
foo function
Base
Derived
~Derived
~Base
this is the program body
~Base


Of course, in the real world I don't know that foo returns a Derived,
so the object would have to be declared as a Base.

Zeppe
 
Victor Bazarov

zeppe said:
If I declare an object instead of a reference, I lose
polymorphism.

What polymorphism? Polymorphism in deleting it? D-tors cannot call
virtual functions polymorphically. If you want to rely on the proper
d-tor being called, you probably need to use 'auto_ptr' and return
a pointer from your function, not a temporary.

i.e., output with reference:

[..]

Of course in the real world i don't know that foo returns Derived, so
the object should be Base.

Huh?

V
 
zeppe

Victor said:
What polymorphism? Polymorphism in deleting it? D-tors cannot call
virtual functions polymorphically. If you want to rely on proper
d-tor to be called, you probably need to use 'auto_ptr' and return
a pointer from your function, not a temporary.

I don't see the difference. If I have

class Derived : public Base

and I return a temporary Derived and bind it to a const Base&
reference, I get polymorphism in the destructor; that is, I delay the
temporary's destruction until the end of the reference's lifetime.

I have

Derived foo();

I can solve this by writing:

Derived obj = foo();

but if I have a more complex situation and I don't know that foo
returns a Derived, I can only rely on auto_ptr or on references.

Bye,

Zeppe
 
Victor Bazarov

zeppe said:
[..]
I have

Derived foo();

I can solve this by writing:

Derived obj = foo();

but if I have a more complex situation and I don't know that foo
returns a Derived, I can only rely on auto_ptr or on references.

Your 'foo' returns _an_object_. How can you not know what it is?
Your 'foo' doesn't return a reference, in which case you could claim
the ability to use polymorphism. If it's slicing you're trying to
avoid, your 'foo' has already sliced everything; binding a reference
to it is pointless.

V
 
zeppe

Victor Bazarov wrote:
Your 'foo' returns _an_object_. How can you not know what it is?
Your 'foo' doesn't return a reference, in which case you could claim
the ability to use polymorphism.

You are right. But actually, in a more complex example, the real object
could be somewhat hidden, for example with traits. Consider the
following example (a bit longer):

#include <iostream>

class Base
{
public:
    Base() { std::cout << "Base\n"; std::cout.flush(); }
    virtual ~Base() { std::cout << "~Base\n"; std::cout.flush(); }

    virtual const char* Name() const { return "Base"; }
};

class Derived : public Base
{
public:
    Derived() { std::cout << "Derived\n"; std::cout.flush(); }
    ~Derived() { std::cout << "~Derived\n"; std::cout.flush(); }

    const char* Name() const { return "Derived"; }
};

template <class T>
struct Bar_traits
{
    typedef Base ret_type;
};

template <>
struct Bar_traits<int>
{
    typedef Derived ret_type;
};

template <class T>
typename Bar_traits<T>::ret_type foo()
{
    std::cout << "foo function\n"; std::cout.flush();
    return typename Bar_traits<T>::ret_type();
}

int main()
{
    std::cout << "begin of the program\n"; std::cout.flush();
    const Base& obj = foo<int>();
    std::cout << "this is the program body\n"; std::cout.flush();

    return 0;
}



I know I could declare the object via the traits, i.e.
Bar_traits<int>::ret_type obj = foo<int>(); but it is not always clean
to do so. I was asking if binding the returned temporary
polymorphically to a reference is allowed.

If it's slicing you're trying to
avoid, your 'foo' has already sliced everything; binding a reference
to it is pointless.

Why has it sliced? I returned a 'Derived', that's all. Is binding a
'Derived' to a 'Base' reference illegal (I don't think so)? Does it
produce slicing?

Thank you for answers.

Zeppe
 
Victor Bazarov

zeppe said:
Victor Bazarov wrote:


You are right. But actually, in a more complex example, the real
object could be somewhat hidden, for example with traits. Consider the
following example (a bit longer):

[..]

I know I could declare the object via the traits, i.e.
Bar_traits<int>::ret_type obj = foo<int>(); but it is not always clean
to do so.

Not always? What are you talking about?

All you need to do is use the function itself to introduce the
type. With the introduction of TR1 it's going to be very easy, AFAIUI:

std::tr1::result_of<foo<int> > obj = foo<int>();

Meanwhile you can probably use 'Boost'.
I was asking if binding the returned temporary polymorphically to a
reference is allowed.

Yes, it's allowed. That's why you have a *warning* and not an *error*
in your program. Simply ignore your warning if you don't want to make
changes to have your program compile cleanly.
Why has it sliced? I returned a 'Derived', that's all. Is binding a
'Derived' to a 'Base' reference illegal (I don't think so)? Does it
produce slicing?

No. Binding a reference does not introduce slicing. Read what I wrote
more carefully, and think about it before replying.

Now, think about it. Next time somebody makes your 'foo<int>' return
a type that is not a descendant of 'Base'. What do you do then? That
is why you shouldn't stop mid-stream when making your stuff generic.
If your function relies on some traits template to return a proper
object type, use the same traits to declare the receiving object.

Is there anything else here that needs clarification?

V
 
zeppe

Victor Bazarov wrote:
Not always? What are you talking about?

Because sometimes the template matching is automatic, based on the
function argument list, and the template arguments are many; I don't
want to write them explicitly, because it is useless, dirty and
error-prone. Simply that.
All you need to do is to use the function itself to introduce the
type. With introduction of TR1 it's going to be very easy, AFAIUI:

std::tr1::result_of<foo<int> > obj = foo<int>();

Meanwhile you can probably use 'Boost'.

Thank you for the suggestion. Actually, I could use traits in a similar
way, but I would rather avoid explicitly listing the template arguments.
No. Binding a reference does not introduce slicing. Read what I wrote
more carefully, and think about it before replying.

Yes, I answered like that because I can't see where the slicing could
take place in my original program. Since it doesn't seem to me that foo
could cause slicing (maybe I haven't yet fully understood your initial
comment), I was asking whether some other piece of code was illegal.
Now, think about it. Next time somebody makes your 'foo<int>' return
a type that is not a descendant of 'Base'. What do you do then?

A compile-time error? I want the reference to the correct object
because I want the correct destructor to be called at the end of the
reference variable's life, but I *do* want the object returned by foo
to derive from a certain type (i.e., sometimes I need to use some
'Base' methods).

That
is why you shouldn't stop mid-stream when making your stuff generic.
If your function relies on some traits template to return a proper
object type, use the same traits to declare the receiving object.

Is there anything else here that needs clarification?

For the reasons above, I want to know that the returned object is
derived from Base; I don't want the stuff to be too generic.

In the end, I still don't understand why the compiler gives me the
error that the variable isn't used. It *is* used (in the destructor),
and I don't think my code is unsafe.

Thank you for your patience.

Zeppe
 
Victor Bazarov

zeppe said:
[..]
In the end, I still don't understand why the compiler gives me the
error that the variable isn't used.

Uh... It doesn't give you the *error*. It gives you a *warning*.
Your code is not ill-formed. Just ignore the warning, will you? Just
let it go. Let it be. Forgedaboudit. Disable it using compiler-
specific means (like a command-line switch or a pragma), if you are
startled by its appearing in your compiler output.

It is not a common idiom to return an object, bind it to a reference,
and wait till the reference goes out of scope. Yes, it does prolong
the lifetime of the temporary. It's just not used that often; that's
why the compiler writers decided to let you know that you might be
missing something, probably. I am not sure I need to explain to you
the merit of compiler *warnings*, but here is an example:

Blah blah1, &blah2 = somefunction();

blah1.foo();
blah1.foo(); // intended to say 'blah2.foo()', but made a typo

Here, the compiler warns you of 'blah2' being unused. And it is
unused, for all intents and purposes. Make it const, and it does
change something, but not enough to justify dropping the warning.

Again, I feel a bit strange trying to justify a compiler warning to
you. Contact your compiler makers if you feel that a warning is not
appropriate in that case, and see what they tell you.
It *is* (in the destructor), and
I don't think my code is unsafe.

What makes you think that warning has anything to do with safety?

V
 
zeppe

Victor Bazarov wrote:
Here, the compiler warns you of 'blah2' being unused. And it is
unused, for all intents and purposes. Make it const, and it does
change something, but not enough to justify dropping the warning.

Again, I feel a bit strange trying to justify a compiler warning to
you.

For example, I don't understand why:

Derived obj = foo();

doesn't give me any warning, with obj not being used, whereas

const Derived& obj = foo();

does.
What makes you think that warning has anything to do with safety?

Uhm. You are right. I've noticed that often the compiler is quite
"smart": for instance,

Derived obj = foo();

doesn't give me any warning even if obj is not used (I thought that was
because of the destructor being called), whereas

int obj = foo2(); // foo2 returns int

does (obj not used). So I've always thought that the compiler warns me
about code that is unsafe, unsafe on other architectures (i.e., some
implicit casts), or error-prone (missing parentheses, etc.): but always,
where the code is correct, there is a simple way to make it compile
without warnings without changing the program logic.

Is it possible that, if the compiler warns me about an unused object,
the optimizer removes that object, introducing a bug in my program?

Thank you!

Zeppe
 
Victor Bazarov

zeppe said:
[..]
Is it possible that, if the compiler warns me about an unused object,
the optimizer removes that object, introducing a bug in my program?

I strongly doubt that. There are several explicitly allowed types of
optimization the compiler can use, and this is not one of them. No
compiler exists without a flaw, either, so if you find something out
of the ordinary (by using good test cases), contact the compiler vendor
or manufacturer.

V
 
