order of bit fields

  • Thread starter Martin Vorbrodt

Martin Vorbrodt

If I have this:

struct {
    unsigned char bit7 : 1;
    unsigned char bit6 : 1;
    unsigned char bit5 : 1;
    unsigned char bit4 : 1;
    unsigned char bit3 : 1;
    unsigned char bit2 : 1;
    unsigned char bit1 : 1;
    unsigned char bit0 : 1;
};

Can I assume that bit0 is the lowest (2^0) bit and bit7 the highest (2^7)? Is this guaranteed by the standard, or is it implementation-dependent?
 

Guest

Martin Vorbrodt said:
If I have this:

struct {
    unsigned char bit7 : 1;
    unsigned char bit6 : 1;
    unsigned char bit5 : 1;
    unsigned char bit4 : 1;
    unsigned char bit3 : 1;
    unsigned char bit2 : 1;
    unsigned char bit1 : 1;
    unsigned char bit0 : 1;
};

Can I assume that bit0 is the lowest (2^0) bit and bit7 the highest (2^7)? Is this guaranteed by the standard, or is it implementation-dependent?

The order of the bits, the amount of padding, and possibly other layout details are implementation-dependent.

Ali
 

Jack Klein

If I have this:

struct {
    unsigned char bit7 : 1;
    unsigned char bit6 : 1;
    unsigned char bit5 : 1;
    unsigned char bit4 : 1;
    unsigned char bit3 : 1;
    unsigned char bit2 : 1;
    unsigned char bit1 : 1;
    unsigned char bit0 : 1;
};

Can I assume that bit0 is the lowest (2^0) bit and bit7 the highest (2^7)? Is this guaranteed by the standard, or is it implementation-dependent?

No, you can't. And you can't assume that the compiler will use only a single 8-bit char (assuming CHAR_BIT is 8 on your platform) to store it.
 

Greg

Martin said:
If I have this:

struct {
    unsigned char bit7 : 1;
    unsigned char bit6 : 1;
    unsigned char bit5 : 1;
    unsigned char bit4 : 1;
    unsigned char bit3 : 1;
    unsigned char bit2 : 1;
    unsigned char bit1 : 1;
    unsigned char bit0 : 1;
};

Can I assume that bit0 is the lowest (2^0) bit and bit7 the highest (2^7)? Is this guaranteed by the standard, or is it implementation-dependent?

The order of the bits and the size of an allocated bitfield are not just implementation-dependent: they are implementation-defined.

Therefore, although the standard mandates no particular bit order or allocation size for a bitfield, every C++ compiler must nonetheless document the bit order and the allocation size that it uses.

Greg
 

Martin Vorbrodt

Greg said:

The order of the bits and the size of an allocated bitfield are not
just implementation-dependent - they are implementation-defined.

Therefore, although the standard mandates no particular bit order or
allocation size of a bitfield, every C++ compiler must nonetheless
document the bit order and the allocation size of a bitfield compiled
with that compiler.

Greg

Do you know of two different compilers with different bit-field orders? I'm asking because so far I've tested gcc and MSVC++ and they seem to be consistent: bits go from least significant to most significant, and when I use unsigned char for bitfields <= 8 bits, they are allocated at a byte boundary. Do you know of a compiler I could test with that has radically different behaviour?
 

Ian

Martin said:
Do you know of two different compilers with different bit-field orders? I'm asking because so far I've tested gcc and MSVC++ and they seem to be consistent: bits go from least significant to most significant, and when I use unsigned char for bitfields <= 8 bits, they are allocated at a byte boundary. Do you know of a compiler I could test with that has radically different behaviour?
Well, I've used quite a few over the years and never seen one that didn't do this.

Ian
 

Greg

Martin said:
Do you know of two different compilers with different bit-field orders? I'm asking because so far I've tested gcc and MSVC++ and they seem to be consistent: bits go from least significant to most significant, and when I use unsigned char for bitfields <= 8 bits, they are allocated at a byte boundary. Do you know of a compiler I could test with that has radically different behaviour?

The bit order tends to correlate with the endianness of the target processor architecture: big-endian compilers tend to lay out the bits in the reverse of the order used by little-endian compilers.

Some compilers allow the user to override the default bitfield order. For example, Metrowerks CodeWarrior supports a #pragma reverse_bitfields directive that reverses the order of the bits in a bitfield from the order that would otherwise have applied.

Greg
 
