order of bit fields

Discussion in 'C++' started by Martin Vorbrodt, Nov 1, 2005.

  1. say i have this:

    struct {
        unsigned char bit7 : 1;
        unsigned char bit6 : 1;
        unsigned char bit5 : 1;
        unsigned char bit4 : 1;
        unsigned char bit3 : 1;
        unsigned char bit2 : 1;
        unsigned char bit1 : 1;
        unsigned char bit0 : 1;
    };

    can i assume that bit0 is the lowest (2^0) and bit7 is the highest (2^7)
    bit? is this guaranteed by the standard or is it implementation dependent?
    Martin Vorbrodt, Nov 1, 2005
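    One way to see what a particular compiler actually does is to set a single
    field and inspect the underlying byte. This is a sketch for checking one
    implementation's choice, not a portable guarantee; the struct name `Bits`
    is illustrative:

    ```cpp
    #include <cstdio>
    #include <cstring>

    struct Bits {
        unsigned char bit7 : 1;
        unsigned char bit6 : 1;
        unsigned char bit5 : 1;
        unsigned char bit4 : 1;
        unsigned char bit3 : 1;
        unsigned char bit2 : 1;
        unsigned char bit1 : 1;
        unsigned char bit0 : 1;
    };

    int main() {
        Bits b = {};
        b.bit0 = 1;               // set only the field named bit0
        unsigned char raw = 0;
        std::memcpy(&raw, &b, 1); // inspect the first byte of storage
        // 0x01 if the last-declared field got the low bit,
        // 0x80 if it got the high bit - depends on the compiler
        std::printf("raw = 0x%02X\n", raw);
        return 0;
    }
    ```

    Run this under each compiler you care about; different outputs on
    different compilers are exactly the implementation-defined behaviour
    being discussed.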

  2. Martin Vorbrodt

    Guest

    The order of the bits and the amount of padding (and possibly others) are
    implementation dependent.

    Guest, Nov 1, 2005

  3. Martin Vorbrodt

    Jack Klein

    No, you can't. And you can't assume that the compiler will only use
    an 8-bit char (assuming CHAR_BIT is 8 on your platform) to store it.
    Jack Klein, Nov 2, 2005
  4. Martin Vorbrodt

    Greg

    The order of the bits and the size of an allocated bitfield are not
    just implementation-dependent - they are implementation-defined.

    Therefore, although the standard mandates no particular bit order or
    allocation size of a bitfield, every C++ compiler must nonetheless
    document the bit order and the allocation size of a bitfield compiled
    with that compiler.
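
    If the goal is a layout that is identical on every compiler, one option is
    to skip bit fields entirely and do the shifting and masking explicitly,
    since the standard does pin down the behaviour of shifts on unsigned
    types. The helper names `set_bit` and `get_bit` below are illustrative,
    not standard functions:

    ```cpp
    #include <cassert>
    #include <cstdint>

    // Explicit shifts and masks: bit positions are fixed by the code
    // itself, not by the implementation's bit field layout.
    constexpr std::uint8_t set_bit(std::uint8_t byte, unsigned pos) {
        return static_cast<std::uint8_t>(byte | (1u << pos));
    }

    constexpr bool get_bit(std::uint8_t byte, unsigned pos) {
        return (byte >> pos) & 1u;
    }

    int main() {
        std::uint8_t flags = 0;
        flags = set_bit(flags, 0); // bit 0 is always 2^0, on every compiler
        flags = set_bit(flags, 7); // bit 7 is always 2^7
        assert(flags == 0x81);
        assert(get_bit(flags, 0) && get_bit(flags, 7));
        return 0;
    }
    ```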

    Greg, Nov 2, 2005
  5. do you know of two different compilers with different bit field order? i'm
    asking because so far i've tested gcc and msvc++ and they seem to be
    consistent: bits go from least significant to most significant, and when
    i use unsigned char for bitfields <= 8 bits, it allocates them at a byte
    boundary. do you know a compiler i could test with that has a radically
    different behaviour?
    Martin Vorbrodt, Nov 2, 2005
  6. Martin Vorbrodt

    Ian

    Well I've used quite a few over the years and never seen one that didn't
    do this.

    Ian, Nov 3, 2005
  7. Martin Vorbrodt

    Greg

    The bit order tends to correlate with the endianness of the target
    processor architecture. Big-endian compilers tend to lay out the bits
    in an order that is the reverse of the order used by a little-endian
    compiler.
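
    The endianness of the target can itself be checked at runtime. This is a
    minimal sketch; note the correlation with bit field order is only a
    tendency, not something the standard guarantees:

    ```cpp
    #include <cstdint>
    #include <cstdio>
    #include <cstring>

    int main() {
        std::uint32_t x = 1;
        unsigned char first;
        std::memcpy(&first, &x, 1); // read the lowest-addressed byte
        // little-endian stores the least significant byte first
        std::printf("%s-endian\n", first == 1 ? "little" : "big");
        return 0;
    }
    ```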
    Some compilers allow the user to override the default bitfield order.
    For example, Metrowerks CodeWarrior (and more recently, gcc) supports a
    #pragma reverse_bitfields directive that will reverse the order of the
    bits in a bitfield from the order that would otherwise have applied.

    Greg, Nov 3, 2005
