What Defines the "C" Language?


Randy Yates

In Harbison and Steele's text (fourth edition, p.111)
it is stated,

The C language does not specify the range of integers that the
integral types will represent, except to say that type int may not
be smaller than short and long may not be smaller than int.

They go on to say,

Many implementations represent characters in 8 bits, type short in
16 bits, and type long in 32 bits, with type int using either 16 or
32 bits depending on the implementation. ISO C requires
implementations to use at least these widths.

If the C language is not defined by ISO C, then what defines it?
 

Alex Fraser

Randy Yates said:
In Harbison and Steele's text (fourth edition, p.111)
it is stated,

The C language does not specify the range of integers that the
integral types will represent, except to say that type int may not
be smaller than short and long may not be smaller than int.

They go on to say,

Many implementations represent characters in 8 bits, type short in
16 bits, and type long in 32 bits, with type int using either 16 or
32 bits depending on the implementation. ISO C requires
implementations to use at least these widths.

If the C language is not defined by ISO C, then what defines it?

There are requirements for conforming implementations, specified in ISO
standard documents, such as those you quote above. There is some leeway in
these requirements, as illustrated above, presumably to allow appropriate
choices to be made when creating an implementation for some specific
platform.

What is in the quotes that makes you think it is not defined by ISO C?

Alex
 

Nils O. Selåsdal

Alex said:
There are requirements for conforming implementations, specified in ISO
standard documents, such as those you quote above. There is some leeway in
these requirements, as illustrated above, presumably to allow appropriate
choices to be made when creating an implementation for some specific
platform.

What is in the quotes that makes you think it is not defined by ISO C?
Looks to me like he's asking: if ISO doesn't define this'n'that (integer
sizes etc.), who does for a given arch/compiler?
 

Randy Yates

Nils O. Selåsdal said:
Looks to me like he's asking: if ISO doesn't define this'n'that (integer
sizes etc.), who does for a given arch/compiler?

Prexactly. :)

I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?
 

Merrill & Michele

"Randy Yates"
Nils O. Selåsdal said:
Alex Fraser wrote:
"Randy Yates"
In Harbison and Steele's text (fourth edition, p.111) [implementation of integer stuff]
What is in the quotes that makes you think it is not defined by ISO
C?

Looks to me like he's asking: if ISO doesn't define this'n'that (integer
sizes etc.), who does for a given arch/compiler?

Prexactly. :)

I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

You are delusional if you think that the assiduous study of H&S won't reveal
a rich, flexible language that uses the ANSI/ISO standard as a bulwark. MPJ
 

Goran Larsson

Randy Yates said:
I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

Nine bits are not enough for the required ranges of integer types
such as short, int and long. It is, however, a perfectly valid
size for a byte, i.e. a char, unsigned char or signed char, in C.
A nine bit byte is only a strange size for people that view the
world through a PC.
 

Michael Mair

Merrill said:
"Randy Yates"

[implementation of integer stuff]
Prexactly. :)

I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

The standard demands CHAR_BIT>=8 which gives you no problem with
9-bit-bytes. Furthermore, the effective range of signed int/unsigned int
according to the standard is that of a 16 bit number (but for INT_MIN),
so sizeof(int)*CHAR_BIT==18 (or 32, nowadays) gives you no problem,
either. The same for 36-bit-longs.
IMO, H&S have that right. Or did I misunderstand your question, too?

BTW: The good old machines with 6 bits to a byte probably do not
have any C implementations to speak of (... I wait to be contradicted).
You are delusional if you think that the assiduous study of H&S won't reveal
a rich, flexible language that uses the ANSI/ISO standard as a bulwark. MPJ

Your point was... ? Do you want to encourage/contradict/... the OP?


Cheers
Michael
 

Randy Yates

Nine bits are not enough for the required ranges of integer types
such as short, int and long.

What ranges are those? Where are they specified? This statement contradicts
H&S: "The C language does not specify the range of integers that the
integral types will represent...".
 

dandelion

Randy Yates said:
"Nils O. Selåsdal" <[email protected]> writes:
Prexactly. :)

I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

Back in the Bad Old Days (tm) of bitslicers, any number of bits was possible.
I recall a conversation with an "old pro" who told me about 47-bit
computers.
 

Merrill & Michele

"Michael Mair"
Merrill said:
"Randy Yates"
"Nils O. Selåsdal" <[email protected]>

Alex Fraser wrote:

"Randy Yates"
In Harbison and Steele's text (fourth edition, p.111)

[implementation of integer stuff]
What is in the quotes that makes you think it is not defined by ISO
C?

Looks to me like he's asking: if ISO doesn't define this'n'that (integer
sizes etc.), who does for a given arch/compiler?

Prexactly. :)

I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

The standard demands CHAR_BIT>=8 which gives you no problem with
9-bit-bytes. Furthermore, the effective range of signed int/unsigned int
according to the standard is that of a 16 bit number (but for INT_MIN),
so sizeof(int)*CHAR_BIT==18 (or 32, nowadays) gives you no problem,
either. The same for 36-bit-longs.
IMO, H&S have that right. Or did I misunderstand your question, too?

BTW: The good old machines with 6 bits to a byte probably do not
have any C implementations to speak of (... I wait to be contradicted).
You are delusional if you think that the assiduous study of H&S won't reveal
a rich, flexible language that uses the ANSI/ISO standard as a bulwark.
MPJ

Your point was... ? Do you want to encourage/contradict/... the OP?

The sentence stands by itself. The OP had internal contradiction. Motives
are OT. MPJ
 

Merrill & Michele

Goran Larsson said:
Nine bits are not enough for the required ranges of integer types
such as short, int and long. It is, however, a perfectly valid
size for a byte, i.e. a char, unsigned char or signed char, in C.
A nine bit byte is only a strange size for people that view the
world through a PC.

What about a 62-bit byte? MPJ
 

Randy Yates

Merrill & Michele said:
[...]
The OP had internal contradiction.

What was the contradiction?
Motives are OT.

You must be kidding, right? I'm asking a very fundamental question
of how C is defined. Isn't that smack in the center of the topicality
for this group?
 

Dan Pop

In said:
Prexactly. :)

The ISO standard delegates certain decisions to the implementor. Some
of them MUST be documented by the implementor, others need not be.

Whenever the ISO standard says: "this or that is implementation-defined",
the implementor must make a choice and document it. Otherwise, the
implementor is not required to document his choice, e.g. he is under
no obligation to reveal the exact definition of size_t.

Quite often, the standard specifies only limits, leaving the actual
choice to the implementor. For example, the character types must have
*at least* 8 bits, but the actual value is up to the implementor.
I seem to recall conversations in years past of old machines that
had strange integer sizes (9 bits?) which C would support. Am I
delusional?

K&R1 mentions such an implementation. The only standard C types
suitable for 9-bit integers are the character types.

Dan
 

Richard Tobin

A nine bit byte is only a strange size for people that view the
world through a PC.

If you excise PCs from history, you will still find a trend towards
8-bit bytes. There will probably never be a new architecture with
9-bit bytes.

-- Richard
 

Richard Tobin

The OP had internal contradiction.
What was the contradiction?

I don't know about a contradiction, but your question was rather
strange because you asked "If the C language is not defined by ISO C,
then what defines it?" without citing anything that said that
the C language was not defined by ISO C.

You quoted these paragraphs:

The C language does not specify the range of integers that the
integral types will represent, except ot say that type int may not
be smaller than short and long may not be smaller than int.

and

Many implementations represent characters in 8 bits, type short in
16 bits, and type long in 32 bits, with type int using either 16 or
32 bits depending on the implementation. ISO C requires
implementations to use at least these widths.

I suppose one could take them to mean "the C language does not require
types to be of those minimum sizes, but ISO C does", but I don't think
that is the intended meaning. If it *is* the intended meaning, then
it is presumably referring to pre-standard C.

-- Richard
 

Dan Pop

In said:
Nine bits are not enough for the required ranges of integer types
such as short, int and long.

Last time I checked, signed char was an integer type. Ditto about
unsigned char:

4 There are five standard signed integer types, designated as
signed char, short int, int, long int, and long long int.

6 For each of the signed integer types, there is a corresponding
(but different) unsigned integer type (designated with the keyword
unsigned) that uses the same amount of storage (including sign
information) and has the same alignment requirements.
It is, however, a perfectly valid
size for a byte, i.e. a char, unsigned char or signed char, in C.

Only unsigned char qualifies as "byte". The other character types may
ignore certain bits or combinations of bits in a byte.
A nine bit byte is only a strange size for people that view the
world through a PC.

Or through pretty much anything else in current use today: Unix
workstations, supercomputers, SCSI and IDE disks, TCP/IP networks and
the underlying networking hardware, USB devices. Ditto for most of
the open source software in use today. Our current hosted computing
world revolves around 8-bit bytes and there is no indication that
this is going to change anytime in the future.

Dan
 

Dan Pop

In said:
BTW: The good old machines with 6 bits to a byte probably do not
have any C implementations to speak of (... I wait to be contradicted).

On those machines, 6 bits was one of the available options for the size
of a byte, rather than the hardwired size of a byte. They were word
addressed machines with word sizes of 12, 18 or 36 bits. At least one
such machine, the PDP-10, had K&R C implemented on it, but I guess that
it used a larger byte size.

Dan
 

Alex Fraser

Randy Yates said:
What ranges are those? Where are they specified? This statement
contradicts H&S: "The C language does not specify the range of integers
that the integral types will represent...".

No, it does not contradict H&S: the (exact) ranges are not specified, but
"minimum" (ie smallest absolute value) ranges are. These minimum ranges are:

Type                   Minimum         Name       Maximum         Name
signed char            <= -127         SCHAR_MIN  >= 127          SCHAR_MAX
unsigned char          0                          >= 255          UCHAR_MAX
(signed) short (int)   <= -32767       SHRT_MIN   >= 32767        SHRT_MAX
unsigned short (int)   0                          >= 65535        USHRT_MAX
signed/(signed) int    <= -32767       INT_MIN    >= 32767        INT_MAX
unsigned (int)         0                          >= 65535        UINT_MAX
(signed) long (int)    <= -2147483647  LONG_MIN   >= 2147483647   LONG_MAX
unsigned long (int)    0                          >= 4294967295   ULONG_MAX

The Name columns refer to constants defined in <limits.h> containing the
actual value(s) for the corresponding type in a given implementation.

Type 'char' is either the same as 'signed char' or the same as 'unsigned
char'; CHAR_MIN (equal to 0 or SCHAR_MIN) and CHAR_MAX (equal to SCHAR_MAX
or UCHAR_MAX) describe its range.

The above implies that type char must be at least 8 bits, types short and
int at least 16 bits, and type long at least 32 bits. Hence the statement
ending the second H&S quote you gave: "ISO C requires implementations to use
at least these widths."

C99 extended the above list by introducing '(signed) long long (int)' and
'unsigned long long (int)' types and <limits.h> constants: LLONG_MIN
<= -(2^63 - 1), LLONG_MAX >= 2^63 - 1, and ULLONG_MAX >= 2^64 - 1.

Alex
 

Randy Yates

What was the contradiction?

I don't know about a contradiction, but your question was rather
strange because you asked "If the C language is not defined by ISO C,
then what defines it?" without citing anything that said that
the C language was not defined by ISO C.

I thought I had, by the paragraphs I cited and you repeated below:
You quoted these paragraphs:

The C language does not specify the range of integers that the
integral types will represent, except to say that type int may not
be smaller than short and long may not be smaller than int.

and

Many implementations represent characters in 8 bits, type short in
16 bits, and type long in 32 bits, with type int using either 16 or
32 bits depending on the implementation. ISO C requires
implementations to use at least these widths.

I suppose one could take them to mean "the C language does not require
types to be of those minimum sizes, but ISO C does",

Yes, that is how I interpreted the statements.
but I don't think
that is the intended meaning.

How could it be otherwise? First the authors state that the language
does not specify the range of integers, then they state that ISO C
requires a minimum width, implying a minimum range. The two statements
can't be both true for the same standard.
If it *is* the intended meaning, then
it is presumably referring to pre-standard C.

That is exactly how I was interpreting it, but what defines
"pre-standard C"???
 

Richard Tobin

I suppose one could take them to mean "the C language does not require
types to be of those minimum sizes, but ISO C does",
Yes, that is how I interpreted the statements.
How could it be otherwise?

I took the second to just be a completion of the first.

But supposing you're right:
That is exactly how I was interpreting it, but what defines
"pre-standard C"???

The consensus represented by K&R 1 and the compilers that existed
before the first ANSI / ISO standard. I don't have a K&R 1 to hand to
check what they said.

-- Richard
 
