Syntax for union parameter

Discussion in 'C Programming' started by Rick C. Hodgin, Jan 29, 2014.

  1. Rick C. Hodgin

    David Brown Guest

    You both seem to misunderstand that standards-defined headers, such as
    <stdint.h>, /are/ part of the language. They are not "hacks" or
    "add-ons" - the C standards say exactly what should and should not be in
    them.

    Why do you think it matters if int32_t is defined in a header that comes
    as part of the toolchain, or if it is built in like "int"? The language
    defines the type "int32_t", and says exactly what it should be. You use
    it in the same way regardless of how it happens to be implemented.

    It is common in C, and in a great many other languages, for features to
    be implemented as part of the standard library rather than inside the
    compiler itself. This greatly simplifies the implementation, while also
    making it easier to make code compatible across different versions of
    the language. (When using pre-C99 C, you can write your own int32_t
    definitions - if it were built in like "int", you could not do so - or
    at least, not as easily.)
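    To make that concrete, here is a minimal sketch of what such a
    hand-rolled pre-C99 definition might look like. The names i32/u32 are
    hypothetical (they avoid the reserved int32_t/uint32_t); the <limits.h>
    checks pick whichever native type is exactly 32 bits on the
    implementation at hand:

    ```c
    /* Hand-rolled fixed-width types for a pre-C99 compiler: a sketch only.
       The names i32/u32 are hypothetical; <limits.h> tells us which native
       type is exactly 32 bits on this implementation. */
    #include <limits.h>

    #if INT_MAX == 2147483647
    typedef int i32;
    typedef unsigned int u32;
    #elif LONG_MAX == 2147483647
    typedef long i32;
    typedef unsigned long u32;
    #else
    #error "no native 32-bit integer type found"
    #endif
    ```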

    And of course, you are both making wildly unsupported assumptions that
    the types "int32_t", etc., are defined as typedefs of native types.

    First, native types are /not/ variable-sized on any given platform. The
    sizes are implementation-dependent - that means that the sizes can vary
    between implementations, but within a given implementation, the sizes are
    fixed. So the people writing <stdint.h> for any given toolchain know
    /exactly/ what size "int", "short", "long", etc., are - and if those
    sizes ever change, they change <stdint.h> to match.

    Secondly, there is absolutely no requirement that these types be
    implemented as typedefs (although obviously that is quite a common
    method). The compiler is free to implement them as built-in types
    (though it can't treat them as keywords, as it does for "int", "long"
    and "short"), or using compiler-specific extensions. As an example, in
    avr-gcc they are defined as typedefs but use a compiler-specific
    extension rather than plain "int", "short", etc.:

    (excerpt from <stdint.h> from avr-gcc-4.5.1)

    typedef int int8_t __attribute__((__mode__(__QI__)));
    typedef unsigned int uint8_t __attribute__((__mode__(__QI__)));
    typedef int int16_t __attribute__ ((__mode__ (__HI__)));
    typedef unsigned int uint16_t __attribute__ ((__mode__ (__HI__)));
    typedef int int32_t __attribute__ ((__mode__ (__SI__)));
    typedef unsigned int uint32_t __attribute__ ((__mode__ (__SI__)));
    #if !__USING_MINT8
    typedef int int64_t __attribute__((__mode__(__DI__)));
    typedef unsigned int uint64_t __attribute__((__mode__(__DI__)));
    #endif /* !__USING_MINT8 */

    The reason this is done is that this particular toolchain supports a
    non-standard command-line flag to make "int" 8-bit. It is not often
    used, but for some code it is vital to make the object code as small and
    fast as possible (the AVR is 8-bit). So the header file uses this
    compiler extension which gives exact type sizes.
    David Brown, Feb 8, 2014

  2. Rick C. Hodgin

    David Brown Guest

    Note that the documentation here can often be written as:

    #include <limits.h>

    _Static_assert(CHAR_BIT == 8,
    "This code is written with the assumption that char is 8-bits");
    _Static_assert(sizeof(int) == 4,
    "This code is written with the assumption that int is 32-bits");

    This sort of thing makes the assumptions clear, and makes the code fail
    to compile (with a clear error message) if the assumptions do not hold.

    (If your compiler does not support C11 _Static_assert() yet, then it's
    not hard to make your own with a couple of macros. But the native static
    assertion is the best choice if it is supported.)
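    One common shape for such a pre-C11 macro, sketched here under the
    hypothetical name STATIC_ASSERT: it declares a typedef whose array size
    turns negative, and therefore fails to compile, when the condition is
    false.

    ```c
    /* Pre-C11 static assertion: the array size is 1 when cond holds and -1
       (a compile error) when it does not.  STATIC_ASSERT and the name token
       are hypothetical; real projects use many variants of this trick. */
    #include <limits.h>

    #define STATIC_ASSERT(cond, name) \
        typedef char static_assert_##name[(cond) ? 1 : -1]

    STATIC_ASSERT(CHAR_BIT == 8, char_is_8_bits);
    ```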
    David Brown, Feb 8, 2014

  3. There is very little practical difference.
    C requires long long to be at least 64 bits, and I have yet to see a
    C compiler where it isn't exactly 64 bits.
    Keith Thompson, Feb 9, 2014
  4. You've said that you care about the CPU instructions generated for your

    Write a C program that uses, for example, uint32_t. Write an equivalent
    program that uses unsigned int on a platform where that type is 32 bits
    wide. Compare the generated code.
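    A minimal pair for that experiment, assuming a platform where unsigned
    int is 32 bits wide; compiled with optimisation (e.g. gcc -O2 -S), the
    two functions should produce identical instructions:

    ```c
    #include <stdint.h>

    /* Same arithmetic, once via the fixed-width alias and once via the
       native type it resolves to on this (assumed 32-bit-int) platform. */
    uint32_t add_fixed(uint32_t a, uint32_t b) { return a + b; }

    unsigned int add_native(unsigned int a, unsigned int b) { return a + b; }
    ```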

    You obsess about things that don't matter.
    Keith Thompson, Feb 9, 2014
  5. Rick C. Hodgin

    Ian Collins Guest

    Even if a platform were to have a long long bigger than 64 bits, there's
    int64_t... So there really isn't anything lacking from C in this
    context (unless the platform had something obscure like CHAR_BIT=9).
    Ian Collins, Feb 9, 2014
  6. Damn! There goes yet another irony-meter.

    You owe me big time, now.
    Kenny McCormack, Feb 9, 2014
  7. Rick C. Hodgin

    Ian Collins Guest

    It would be interesting to see Rick's manual functions for 64 bits ints
    for that machine...
    Ian Collins, Feb 9, 2014
  8. Rick C. Hodgin

    BartC Guest

    Why are they optional then? If I leave out stdint.h, I get: "error: unknown
    type name 'int32_t'". Doesn't look like it's an essential, fundamental part
    of the language!
    I might use 'int32_t' for example, to guarantee a certain bitwidth, when
    using 'int' would be just too vague. What's odd is then finding out that
    int32_t is defined in terms of int anyway!
    It's less common for essential primitive types to be defined in a standard
    library.
    Not really. It's just another entry in a symbol table which happens to mean
    the same as the equivalent int. However having them in the compiler *would*
    simplify (1) the distribution by not needing stdint.h, and (2) a million
    user programs which don't need to explicitly include stdint.h.
    Except they wouldn't; they would use s32 or u32 (or i32 and u32 in my case).
    'int32_t' et al are just too much visual clutter in a language which already
    has plenty.
    BartC, Feb 9, 2014
  9. Exactly. You understand, Bart.

    There is an external requirement in using stdint.h, one that is not part
    of the compiler's built-in abilities. It is an add-on hack which was, as
    you stated, the easier way to provide apparently bit-size compatibility
    without really doing anything. And whereas it may be defined as part
    of the official spec ... any developer could've arrived upon that solution
    on their own (as I did, and as you indicate below you did with i32 and u32).
    Exactly! C is silly in all of its allowances (int will be at least 16-bits,
    but can be anything really, just check with the specific implementation and
    learn about your compiler author, his whims, his preferences, and the
    underlying computer architecture, and then you will know what you're
    dealing with ... oh, and if you want an explicit size ... then you're out
    of luck unless your compiler is C99 compliant and/or includes the stdint.h
    file, or unless you've manually determined that a particular type is a
    particular size on a particular platform in a particular version for a
    particular build).

    "Oh, but it allows for faster executable code!"

    I say in reply (with a very specific countenance, well known inflection,
    and easily identifiable intonation): "Seriously?"
    Ding ding ding! That is the winning answer. :)
    It's amazing to me, BartC. It's like there's this total division in what
    people place value on occurring here. They either see it the C way (where
    including an external file (or several to obtain other similar features),
    one which uses a wholly clunky naming convention, is a good thing), or
    they recognize that such information should be part of the innate compiler without
    any external requirement.

    To be honest, I do not understand this disparity at all, though it resonates
    in all great circles to our antipodal point of the universe.

    Best regards,
    Rick C. Hodgin
    Rick C. Hodgin, Feb 9, 2014
  10. Rick C. Hodgin

    Tonton Th Guest

    Using another language, like Perl or INTERCAL.
    Tonton Th, Feb 9, 2014
  11. Rick C. Hodgin

    James Kuyper Guest

    To enable legacy code, which might contain conflicting definitions of
    the same identifiers, to be compilable without any special handling.
    Such code will generally NOT contain #include <stdint.h>, so no
    modifications are needed for it to compile properly.

    This isn't the only way they could have achieved that result, a new
    standard #pragma would have worked, too. However, <stdint.h> can be
    added to an existing implementation relatively easily - it's just normal
    C code containing typedefs - the only tricky part is choosing the right
    type for each typedef. Adding recognition of a new #pragma, and enabling
    special handling of those identifiers depending upon whether or not that
    #pragma had been turned on, is a bit more complicated.
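    A sketch of the legacy-code scenario being described: a translation unit
    that never includes <stdint.h> is free to carry its own, even
    conflicting, definition of the same identifier, and needs no special
    handling to compile. (The deliberately "wrong" typedef below is purely
    illustrative.)

    ```c
    /* Legacy translation unit: does NOT include <stdint.h>, so it may
       define the identifier itself -- here deliberately as long rather
       than the exact-width type a C99 implementation would choose. */
    typedef long int32_t;

    int32_t legacy_counter = 42;
    ```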
    James Kuyper, Feb 9, 2014
    I thought your solution was a duplicated set of typedefs that need to be
    checked manually? That's not what stdint.h is.

    Seriously? You know enough about what is common in a great many other
    languages? I can't say you don't, but I've come to a different
    conclusion from my limited experience.

    Of course it's a winning answer in one respect -- it can't be wrong,
    because any type so defined can be declared to be not primitive.

    Ben Bacarisse, Feb 9, 2014
  13. Rick C. Hodgin

    BartC Guest

    There must be better ways of dealing with legacy code that don't require (1)
    supporting everything that ever existed in the language and (2) new
    additions to keep getting uglier and uglier (and longer) to avoid name
    conflicts (eg '_Bool', 'uint64_t').

    (Since most modules need stdio.h, a language version code could be applied
    to that.)

    (There could also be better ways of getting minimum and maximum type limits.
    INT_MAX works, but suppose I have:

    typedef int32_t T;

    How do I get the maximum value of T in a tidy manner? What I'm saying is
    that a trivial addition to the syntax could allow you to say: T'MAX or
    short'MIN or long int'BITS, some attribute that can be applied to any type,
    without needing include files full of sprawling lists of #defines. However
    doing things elegantly doesn't seem to be the C style...)
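    There is no T'MAX in C, but C11's _Generic gets part of the way there
    for a known set of types. TYPE_MAX below is a hypothetical macro covering
    a few signed types only; note that it still leans on the very <limits.h>
    constants being complained about:

    ```c
    #include <limits.h>
    #include <stdint.h>

    /* Maps an expression's type to the matching <limits.h> maximum.
       Only a handful of types are covered; extending it means adding a
       line per type.  Requires a C11 compiler. */
    #define TYPE_MAX(x) _Generic((x), \
        short: SHRT_MAX,              \
        int: INT_MAX,                 \
        long: LONG_MAX,               \
        long long: LLONG_MAX)

    typedef int32_t T;
    ```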
    BartC, Feb 9, 2014
  14. Rick C. Hodgin

    Jorgen Grahn Guest

    From my fairly narrow perspective, what's left undefined is more or
    less what couldn't be defined without:
    - a speed penalty on (once) popular architectures
    - preventing C from running on architectures which seemed
    to be popular and important
    - forcing incompatible changes in things that are bigger than
    the C language. E.g., it is (or used to be) hard to convince
    Unix vendors to put more intelligence in the linker.

    Frankly, I don't understand why people are so hung up on the phrase
    "undefined" nowadays. It wasn't like that until recently. Some kind
    of fashion? Or is there an influx of Java refugees?

    I don't think I ever saw it as anything but a necessity -- and I came
    to C after having studied more pure and academical languages. I mean,
    it's a fact: if you want to implement something like C pointer
    arithmetic on (say) a 7MHz Motorola 68000 and you want it to be fast,
    you cannot define what happens at out of bounds access. And so on.

    Jorgen Grahn, Feb 9, 2014
  15. Duplicated in what way? With regards to stdint.h? Visual Studio 2008
    doesn't come with stdint.h. And, from what I've read, even Visual
    Studio 2010, which does come with stdint.h, uses u_int32_t, rather than
    uint32_t, for unsigned, so a duplicate typedef is required for code
    there as well. And, I'll admit that Microsoft's C99 support is lacking,
    so no surprise there.
    Someone had to check the ones in stdint.h. And I would estimate also
    that any self-respecting developer would check those. In fact, when
    I run configure scripts on Linux source files to build some version of
    an application, I almost always see "checking to see integer size" and
    other similar messages during the build script.
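    What those configure-time probes boil down to is a tiny program along
    these lines (a sketch; real scripts compile and run something similar
    and capture the output):

    ```c
    #include <stdio.h>
    #include <limits.h>

    /* Report the sizes a build would rely on, configure-script style. */
    void report_sizes(void)
    {
        printf("CHAR_BIT      = %d\n", CHAR_BIT);
        printf("sizeof(short) = %u\n", (unsigned)sizeof(short));
        printf("sizeof(int)   = %u\n", (unsigned)sizeof(int));
        printf("sizeof(long)  = %u\n", (unsigned)sizeof(long));
    }
    ```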

    People know these things vary like hairdos ... so they must always
    test for them.
    I've never had another language where fundamental data types are of
    variable size. From assembly through Java, they are a known size.
    Only in the land of C, the home of the faster integer for those
    crucial "for (i=0; i<10; i++)" loops, do we find them varying in size.
    Yes, the epitome of the bolt-on [strike]work[/strike]hack-around.

    Best regards,
    Rick C. Hodgin
    Rick C. Hodgin, Feb 9, 2014
  16. Yes, that would be handy. I think the C way would be to introduce new
    operators like sizeof that can be applied to type names or expressions,
    but the syntax is not the issue.
    The interesting question (to me) is what motivates some changes getting
    into the language and not others. I think the committee is conservative
    by nature, based, maybe, on a general sense of "it's been OK so far".
    Maybe there is knock-on effect from C99 being ignored by Microsoft.

    Perhaps the existence of C++ has taken the pressure off from endlessly
    adding new features to C. Fortran, which is even older, has had no fork
    of "Fortran++", so everything has gone into the core standard. I think
    C benefits from not being C++ despite the fact that I use C++ every time
    I actually want to get something done.
    Ben Bacarisse, Feb 9, 2014
  17. Rick C. Hodgin

    Robbie Brown Guest

    I've just completed my first project in C, very simple, very basic first
    year computer science stuff, A 'Bounded Array' component and an
    encapsulated Linked List and a Stack and FIFO Queue that 'extended' the
    List by using only selected functions that 'make sense' in terms of
    those latter two structures. Everything is done with pointers for
    maximum flexibility, no assumption is made about the 'type' of data
    stored and there are no hard coded 'magic numbers'

    I spent 10% of the time figuring out the logic and writing code and 90%
    of the time fretting about whether there was some arcane rule that I had
    violated that meant that although the thing worked as far as I could
    tell there was no way of actually knowing because unless I read and
    inwardly digested the entire language spec something may be 'undefined' ...

    This is my particular problem at the moment, something can appear to
    work no matter how hard I test it yet still be 'undefined' in terms of
    the language spec ... it just makes me feel uneasy.

    Oh yes, and apart from never wearing brown shoes I don't 'do fashion'.
    And how long do you think this can go on for? Will 'the committee' still
    be determined to support a Babbage difference engine in 100 years' time?
    How can something progress and improve when it has to support everything
    that ever existed?

    Anyway, I look on this as an interesting academic exercise now. If I
    want to be productive I'll use Java; if I want to really exercise the
    old gray matter I'll keep on with C. I've just spent the day deconstructing
    the (main) stack (frame) from 0x7fffffffefff all the way down to
    0x7fffff7af000 ... I would NEVER have bothered before.

    Not sure my wife is so enthusiastic though.
    Robbie Brown, Feb 9, 2014
  18. It's backwards, Dr Nick. It should be int is fixed, and fint is the fast
    form which exists on a given platform. If I need speed, I use fint. If I
    need a certain size, I use int. And if I want to do the research to find
    out how big fint is so I can use it in lieu of int, then I can do that as
    well, and then always use fint if it's big enough.
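    For what it's worth, C99's <stdint.h> already provides exactly this
    split, just with the names reversed from the proposal above: int32_t is
    the fixed-size type and int_fast32_t is the "fast" type of at least that
    width that the implementation considers quickest.

    ```c
    #include <stdint.h>

    int32_t      exact_width = 0; /* exactly 32 bits, where the type exists */
    int_fast32_t fast_width  = 0; /* at least 32 bits, chosen for speed */
    ```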

    C is backwards. It is backwards in this area of unknown sizes, but minimum
    allowed sizes, and it is backwards in the area of undefined behavior, where
    it should have defined behavior and overrides.

    That's my position. It's not just lip service. It's where I'm proceeding
    from with my position about deficiencies in C, and in my efforts to build
    RDC without those deficiencies.

    Best regards,
    Rick C. Hodgin
    Rick C. Hodgin, Feb 9, 2014
    Written out more than once. In the program I tried, it had them in the
    source code of the test program. They are also used elsewhere, so they
    must be written out in at least one more place.

    Yes. Someone who knows all that needs to be known to get the definitions
    right. And they do that once for all the thousands of developers who
    use the definitions.

    If you think this is the same as just writing out your own wherever they
    are needed, well, I must bite my tongue.
    Your estimation would be wrong, then.

    You've never heard of Fortran, Pascal, C++, Ada, Haskell or Python? Or
    did you just not know enough about them to know what the various
    standards say about the fundamental types?
    No, not only in the land of C.

    Ben Bacarisse, Feb 9, 2014
  20. Snap! (nearly.) I left Lisp off the list because the status of fixnum
    has changed a bit over the years and I was not sure where things stood
    right now, but I am sure you are right that it is not 100% prescribed.
    Ben Bacarisse, Feb 9, 2014