Problem with Visual C++ 7.1.3088 and bit-fields?

Dennis McCrohan

Hi-

I'm wondering if anyone has any feedback or knowledge of the following
problem. I have searched the MSDN Knowledge Base and not found anything
relevant...

I have an enum similar to:

enum FruitType { apple, pear, peach, mango };

And a class FruitContainer that has a member variable of FruitType,
declared with a bit-width of 2:

FruitType m_myFruit : 2;

and access functions:

FruitType GetMyFruit () { return m_myFruit; }

void SetMyFruit(FruitType myFruit) { m_myFruit = myFruit; }

What we are seeing in the Visual C++ 7.1 debugger is that if we call
SetMyFruit() with a value of apple or pear, and then retrieve the value
using GetMyFruit(), everything works as expected. But if we pass
SetMyFruit() the value mango, it ends up being interpreted as the
integer value -1, and that is what is returned by GetMyFruit(). It
appears that the compiler or run-time is treating the enum value as a
signed quantity. The same code works correctly on Solaris (compiled
with Workshop) and on Linux (with gcc). If I remove the bit-width from
the declaration, the code works correctly on Windows. It also works if
I set the bit-width to 3.
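
For reference, here is a minimal compilable version of what we have
(condensed from the description above; the real code is similar):

#include <iostream>

enum FruitType { apple, pear, peach, mango };

class FruitContainer {
public:
    FruitType GetMyFruit() { return m_myFruit; }
    void SetMyFruit(FruitType myFruit) { m_myFruit = myFruit; }
private:
    FruitType m_myFruit : 2;  // two bits hold the patterns for 0..3, but
                              // whether the field is signed depends on the
                              // underlying type the compiler picks
};

int main() {
    FruitContainer c;
    c.SetMyFruit(mango);                  // mango == 3, bit pattern 11
    std::cout << c.GetMyFruit() << '\n';  // 3 on gcc/Workshop, -1 on VC++ 7.1
    return 0;
}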

Thanks,

-Dennis McCrohan

Any biases, misguided opinions, or outright idiocy expressed in this
message are absolutely my own, and do not represent those of my
employer...
 
P

Pete Becker

Dennis said:
enum FruitType { apple, pear, peach, mango };


FruitType m_myFruit : 2;

The underlying type for an enumeration has to be an integral type that's
large enough to represent the values. There's no requirement that it be
unsigned if there are no negative values. Looks like the compiler is
using a signed type, which is allowed. So either change your bitfield
width to 3, declare the bitfield type as unsigned (which will require
some casts), or get rid of the bitfield entirely. That last one is
probably best: bitfields are best used for things like hardware access,
where bit positions matter.
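
For instance, the unsigned-bitfield route might look something like this
(a sketch, with the one cast the compiler will want on the way out):

enum FruitType { apple, pear, peach, mango };

class FruitContainer {
public:
    FruitType GetMyFruit() { return static_cast<FruitType>(m_myFruit); }
    void SetMyFruit(FruitType myFruit) { m_myFruit = myFruit; }
private:
    unsigned m_myFruit : 2;  // unsigned, so the pattern 11 reads back as 3
};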
 
Victor Bazarov

Dennis said:
enum FruitType { apple, pear, peach, mango };

FruitType m_myFruit : 2;

[...] if we pass SetMyFruit() the value mango, it ends up being
interpreted as the integer value -1 [...]

The underlying type for the 'enum FruitType' is apparently signed. You
have no control over that, I believe; the compiler picks the underlying
type for you (see 7.2/5). The compiler does not "get confused", it
simply picks whatever it considers appropriate, and a signed type is a
legal choice here.

If you want to make sure what type is used, do

typedef unsigned char FruitType;
FruitType const apple = 0, pear = 1, peach = 2, mango = 3;
....

Of course this has a side effect: FruitType is no longer a distinct
type, merely an alias for unsigned char, but at least you know it's
unsigned...
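
Filled out, the idea might look like this (a sketch; the accessors are
taken from the original post):

typedef unsigned char FruitType;
FruitType const apple = 0, pear = 1, peach = 2, mango = 3;

class FruitContainer {
public:
    FruitType GetMyFruit() { return m_myFruit; }
    void SetMyFruit(FruitType myFruit) { m_myFruit = myFruit; }
private:
    FruitType m_myFruit : 2;  // underlying type is now definitely
                              // unsigned, so mango (3) reads back as 3
};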

Victor
 
Nils O. Selåsdal

Dennis said:
enum FruitType { apple, pear, peach, mango };

And a class FruitContainer that has a member variable of FruitType,
declared with a bit-width of 2:
Then the enum's underlying type is probably a plain int (not an
unsigned one). I'd guess that if you make the width 3 it will work: a
signed three-bit field can hold -4 through 3, so mango (3) fits.

But why use a bitfield here?
 
Old Wolf

Pete Becker said:
bitfields are best used for things like hardware access,
where bit positions matter.

How can you say that, given that in C++ the alignment and
allocation of bitfields is implementation-defined? In fact
I think they don't even have to be in the order in which they
were declared.
(C99 is a bit stricter: the bits have to be in the order
declared and with no padding -- but it's still implementation-
defined whether the bits are numbered left-to-right or right-to-left,
and what happens across unit boundaries.)
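
To make that concrete: a register map like the following is only
correct if the compiler's allocation order happens to match the device
(the layout here is hypothetical):

// Hypothetical 32-bit device status register. Which physical bit each
// field lands in is implementation-defined in C++, so this mapping is
// only right by the grace of the particular compiler and target.
struct StatusRegister {
    unsigned ready   : 1;   // intended: bit 0
    unsigned error   : 1;   // intended: bit 1
    unsigned channel : 3;   // intended: bits 2-4
    unsigned         : 27;  // pad out the rest of the 32-bit unit
};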
 
Pete Becker

Old Wolf said:
How can you say that, given that in C++ the alignment and
allocation of bitfields is implementation-defined?

The compiler typically lines things up to match the hardware that it's
targeting.
 
