Re: short int always 16 bits or not?

Discussion in 'C Programming' started by Malcolm McLean, Apr 20, 2013.

  1. On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:
    > Hello. I am reading the C99 standard as available from: http://www.open-std.org/jtc1/sc22/WG14/www/docs/n1256.pdf
    >
    > I note that it specifies (on p 34) macros defining the minimum and maximum
    > values of a short int corresponding to a size of 16 bits. However it doesn't
    > explicitly say that short int-s should be of 16 bits size. So can I trust
    > short int-s to be 16 bits size or not?
    >
    > Also, doesn't prescribing #define-s for integer type min/max values conflict
    > with the general (?) understanding that the size of these types are
    > implementation defined? I mean, is the general understanding wrong? (For
    > instance see: http://en.wikipedia.org/wiki/Short_integer#cnote_b_grp_notesc)
    >
    >
    >
    > Finally, why would anyone want char to be other than 8 bits? *Is* char on any
    > platform *not* 8 bits?
    >


    Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
    holds a character in a human-readable language) and "byte" (the smallest
    addressable unit of memory) the same thing.
    256 characters aren't enough for some purposes. And whilst most computers use
    8-bit bytes internally, this isn't universal, particularly on big machines.

    So C has a bit of a problem. The solution, which sort of works, is to allow char
    to be more than 8 bits on some platforms, to solve the byte issue, and to
    introduce wchar_t to solve the big-alphabet issue.
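
    A minimal sketch of the wchar_t approach, assuming a hosted C99
    implementation whose locale can display the character; note that the
    width of wchar_t is itself implementation-defined (commonly 16 or 32
    bits):

        #include <locale.h>
        #include <stdio.h>
        #include <wchar.h>

        int main(void)
        {
            /* wchar_t can hold characters outside the 8-bit range. */
            setlocale(LC_ALL, "");
            wchar_t omega = L'\u03A9';  /* GREEK CAPITAL LETTER OMEGA */
            wprintf(L"%lc occupies a wchar_t of %zu bytes\n",
                    omega, sizeof(wchar_t));
            return 0;
        }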

    As for redefining every basic type, this is often done by people with a limited
    understanding of software engineering, who think that they are making the
    program more robust by allowing the possibility of redefining the type. In
    practice, such a redefinition is almost certain to break things, and the
    introduction of new types causes more problems than it solves; in particular,
    it makes it hard to integrate code from two programs.
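
    For what it's worth, the standard-sanctioned alternative to home-made
    typedefs is C99's <stdint.h>. A minimal sketch, assuming a hosted
    implementation (int16_t is optional, but it is present wherever the
    platform has a 16-bit two's complement type):

        #include <limits.h>
        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            /* The standard guarantees SHRT_MAX >= 32767, i.e. short has
               at least 16 value bits, but it is allowed to be wider. */
            printf("CHAR_BIT      = %d\n", CHAR_BIT);
            printf("sizeof(short) = %zu\n", sizeof(short));
            printf("short range   = %d..%d\n", SHRT_MIN, SHRT_MAX);

            /* When exactly 16 bits are required, use int16_t rather
               than redefining short. */
            int16_t x = INT16_MAX;
            printf("INT16_MAX     = %d\n", (int)x);
            return 0;
        }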

    --
    Malcolm's website
    http://www.malcolmmclean.site11.com/www
     
    Malcolm McLean, Apr 20, 2013
    #1

  2. James Kuyper (Guest)

    On 04/20/2013 04:49 AM, Malcolm McLean wrote:
    > On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:

    ....
    >> Finally, why would anyone want char to be other than 8 bits? *Is* char on any
    >> platform *not* 8 bits?
    >
    > Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
    > holds a character in a human-readable language) and "byte" (the smallest
    > addressable unit of memory) the same thing.


    Not quite. 'char' is a data type, while 'byte' is a unit for measuring
    the amount of memory required to store an object. As a data type, 'char'
    has an integer conversion rank, and if signed, it might have 1's
    complement, 2's complement, or sign-magnitude representation. As a unit
    for measuring storage, a byte has none of those things. Ritchie decided
    to make sizeof(char) == 1 byte.
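
    A minimal sketch of the distinction, assuming only a hosted C99
    implementation (on machines where a byte is not 8 bits, CHAR_BIT
    reports as much):

        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            /* sizeof measures in bytes, and sizeof(char) is 1 by
               definition; the number of bits in that byte is CHAR_BIT,
               which must be at least 8 but may be larger (some DSPs
               use 16 or 32). */
            printf("sizeof(char) = %zu\n", sizeof(char)); /* always 1 */
            printf("CHAR_BIT     = %d\n", CHAR_BIT);      /* >= 8 */
            return 0;
        }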

    C would arguably have been better if designed from the start with
    something similar to the current wchar_t and size-named types,
    preferably with different names, rather than with char, short, int, and
    long. I'd recommend thinking along those lines when designing a new
    language. However, it would break too much legacy code to ever move C in
    that direction.
    --
    James Kuyper
     
    James Kuyper, Apr 20, 2013
    #2

  3. Eric Sosman (Guest)

    On 4/20/2013 6:59 AM, James Kuyper wrote:
    > On 04/20/2013 04:49 AM, Malcolm McLean wrote:
    >> On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:

    > ...
    >>> Finally, why would anyone want char to be other than 8 bits? *Is* char on any
    >>> platform *not* 8 bits?
    >>
    >> Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
    >> holds a character in a human-readable language) and "byte" (the smallest
    >> addressable unit of memory) the same thing.

    >
    > Not quite. 'char' is a data type, while 'byte' is a unit for measuring
    > the amount of memory required to store an object. As a data type, 'char'
    > has an integer conversion rank, and if signed, it might have either 1's
    > complement, 2's complement, or sign-magnitude representation. As a unit
    > for measuring storage, a byte has none of those things. He decided to
    > make sizeof(char) == 1 byte.
    >
    > C would arguably have been better if designed from the start with
    > something similar to the current wchar_t and size-named types,
    > preferably with different names, rather than with char, short, int, and
    > long. I'd recommend thinking along those lines when designing a new
    > language. However, it would break too much legacy code to ever move C in
    > that direction.


    Also, keep in mind the amount of memory on the machines where
    early C and Unix were born. Quoth one DMR:

    "During [B's] development, [Thompson] continually struggled
    against memory limitations: each language addition inflated
    the compiler so it could barely fit, but each rewrite taking
    advantage of the feature reduced its size."

    In that sort of environment, one hasn't the luxury of adding every
    desirable feature.

    --
    Eric Sosman
     
    Eric Sosman, Apr 20, 2013
    #3
