enums and signed/unsigned


Marcel Müller

The following code generates a warning about a signed/unsigned integer
comparison. I don't understand why.


typedef enum
{ TATTR_NONE = 0x00U
, TATTR_SONG = 0x01U
, TATTR_PLAYLIST = 0x02U
, TATTR_INVALID = 0x08U
, TATTR_WRITABLE = 0x10U
} ATTRIBUTES;

inline static ATTRIBUTES operator|(ATTRIBUTES l, ATTRIBUTES r)
{ return (ATTRIBUTES)((unsigned)l|r); }
inline static ATTRIBUTES operator&(ATTRIBUTES l, ATTRIBUTES r)
{ return (ATTRIBUTES)((unsigned)l&r); }
inline static ATTRIBUTES& operator|=(ATTRIBUTES& l, ATTRIBUTES r)
{ return l = (ATTRIBUTES)((unsigned)l|r); }
inline static ATTRIBUTES& operator&=(ATTRIBUTES& l, ATTRIBUTES r)
{ return l = (ATTRIBUTES)((unsigned)l&r); }
inline static ATTRIBUTES operator*(bool l, ATTRIBUTES r)
{ return (ATTRIBUTES)(-l&(unsigned)r); }
inline static ATTRIBUTES operator*(ATTRIBUTES l, bool r)
{ return (ATTRIBUTES)((unsigned)l&-r); }
inline static ATTRIBUTES operator~(ATTRIBUTES a)
{ return (ATTRIBUTES)~(unsigned)a; }

static unsigned tattr = TATTR_NONE;

int main()
{ if ( (tattr & (TATTR_PLAYLIST|TATTR_WRITABLE|TATTR_INVALID))
== (TATTR_PLAYLIST|TATTR_WRITABLE) )
return 1;
return 0;
}


It seems that the expression
(unsigned) & (enum type)
always evaluates to signed int regardless of the definition of the enum
constants.

Are enums always signed? And if so, why is

enum X
{ value = UINT_MAX
};

not an error?


Marcel
 

James Kanze

The following code generates a warning about a signed/unsigned integer
comparison. I don't understand why.
[quoted code snipped; see the original post above]
It seems that the expression
(unsigned) & (enum type)
always evaluates to signed int regardless of the definition of the enum
constants.
Are enums always signed?

An enum defines a new type, which is neither signed nor
unsigned. When used in an expression, however, in most cases
(and always when it is an operand to a binary operator, but
*not* when it is an argument to a function, such as a user
defined overloaded operator), integral promotion takes place,
and the enum is converted to the first of int, unsigned, long,
unsigned long, long long or unsigned long long which can
represent all of its legal values. In your case, this would be
an int.
And if so, why is
enum X
{ value = UINT_MAX
};
not an error?

Because int cannot represent UINT_MAX, so the underlying type
must be something else. In this case, it must be unsigned int,
because an implementation isn't allowed to make it a type larger
than int if either int or unsigned int would work.

On the other hand, the following

enum X { a = UINT_MAX, b = -1 };

must be of a type larger than int, since neither int nor
unsigned int can represent all of its values; as soon as any
value is negative, it must be a signed type. Which raises the
question: what should the type of:

enum X { a = ULLONG_MAX, b = -1 };

be? I would expect it to be a compiler error, but it compiles
under g++. And g++'s typeinfo stuff is designed to be almost
totally unusable: typeid(anX | 0).name() returns "n", which of
course tells me absolutely nothing (but sizeof tells me it has
16 bytes, which is a good start---except that INT128_MAX
isn't defined, even when I include <stdint.h>).
 

James Kanze

On Sun, 12 May 2013 16:50:47 +0200, Marcel Müller
Yes, enums are of type int, not unsigned int.

That's completely wrong. Each enum defines a new type. And the
underlying type can be any promoted integral type, depending on
the values in the enum.
Because UINT_MAX is converted to int and value is evaluated as -1.

Only if the compiler is completely broken.
 

Balog Pal

The following code generates a warning about a signed/unsigned integer
comparison. I don't understand why.


typedef enum
{ TATTR_NONE = 0x00U
, TATTR_SONG = 0x01U
, TATTR_PLAYLIST = 0x02U
, TATTR_INVALID = 0x08U
, TATTR_WRITABLE = 0x10U
} ATTRIBUTES;

It seems that the expression
(unsigned) & (enum type)
always evaluates to signed int regardless of the definition of the enum
constants.

The enumerators do not have a type like int or unsigned int of their
own, and do not take on the type of their "initializers" -- only the
value matters.
Are enums always signed? And if so, why is

The underlying type is chosen by the compiler so that it can represent
every value (and all of them OR-ed together). In your case it could be
char, unsigned char, or even a 5-bit type.

If you want to force the underlying type, you can do that with C++11
(and a compiler supporting that feature); look up the changes for enums
("enum class", "strongly typed enums", and fixed underlying types) --
those may solve your problem.
 
