rationale for #define true 1 in stdbool.h


Ben Hinkle

I'm curious, what was the rationale for making a builtin type _Bool but then
having
#define true 1
#define false 0
in stdbool.h? That seems very odd that true and false don't have type _Bool.
In particular I'm poking around with some language extensions to C and one
of the most obvious extensions is overloading. Since "true" doesn't have
type _Bool it makes overloading behavior with _Bool very odd. You'd think
that at least it could be
#define true ((bool)1)

I notice in the C99 spec it says the true and false defines "are suitable
for use in #if preprocessor directives". Was it anticipated that true and
false would be used primarily for #if directives? One would imagine that a
more important property would be something like sizeof(bool) ==
sizeof(true).

thanks,
-Ben
 
Eric Sosman

Ben Hinkle wrote on 01/10/06 16:18:
I'm curious, what was the rationale for making a builtin type _Bool but then
having
#define true 1
#define false 0
in stdbool.h? That seems very odd that true and false don't have type _Bool.
In particular I'm poking around with some language extensions to C and one
of the most obvious extensions is overloading. Since "true" doesn't have
type _Bool it makes overloading behavior with _Bool very odd. You'd think
that at least it could be
#define true ((bool)1)

I notice in the C99 spec it says the true and false defines "are suitable
for use in #if preprocessor directives". Was it anticipated that true and
false would be used primarily for #if directives? One would imagine that a
more important property would be something like sizeof(bool) ==
sizeof(true).

"Suitable for use in #if" is one reason (bool)1 wouldn't
work. Types do not yet exist when the preprocessor operates,
so casts can't be evaluated. (In fact, #if true would turn
into #if (bool)1 and then #if (0)1, eliciting a diagnostic.)

As for the sizeof complaint, although opinions obviously
vary it doesn't strike me as an "important" property. IMHO
it is usually -- not always, but usually -- a poor idea to
write sizeof(type) when sizeof *ptr is practical. Besides,
we've already got sizeof 'x' > sizeof(char) on most systems,
and the only people it seems to bother are defectors to the
Dark Side With The Plus Signs.

Personally, I still don't understand the motivation for
adding _Bool to the language. The Rationale draws attention
to some properties of _Bool, but sheds no light on why those
properties were so desirable as to prompt the addition of a
whole new type -- especially since everything that can be
done with _Bool seems eminently do-able without it. Perhaps
the C9X committee suffered from Pascal envy?
 
Old Wolf

Eric said:
Personally, I still don't understand the motivation for
adding _Bool to the language.

For me, it's desirable because assigning any non-zero value
to it causes it to have a non-zero value. This is not true for
any builtin type except for unsigned long long, which would
be a waste of memory if it were used as a boolean type.

I have accidentally written code like this:

bool b = (flags & FLAG_FOO);

where FLAG_FOO is something like 0x100. It took a
long debugging session to track down the problem; even
when I'd isolated the problem to this one block of code,
I still couldn't for the life of me figure out what was going on,
until I looked up the definition of 'bool'. (It turned out to
be a typedef for unsigned char).
 
Keith Thompson

Ben Hinkle said:
I'm curious, what was the rationale for making a builtin type _Bool but then
having
#define true 1
#define false 0
in stdbool.h? That seems very odd that true and false don't have type _Bool.
In particular I'm poking around with some language extensions to C and one
of the most obvious extensions is overloading. Since "true" doesn't have
type _Bool it makes overloading behavior with _Bool very odd. You'd think
that at least it could be
#define true ((bool)1)

I notice in the C99 spec it says the true and false defines "are suitable
for use in #if preprocessor directives". Was it anticipated that true and
false would be used primarily for #if directives? One would imagine that a
more important property would be something like sizeof(bool) ==
sizeof(true).

Character constants don't have type char either; they're of type int
(sizeof('a')==sizeof(int)).

Making true and false be of type _Bool wouldn't be very useful, since
they'd be promoted to int in most contexts anyway.

If the language had been changed so that all conditions must be of
type _Bool, rather than of any scalar type, making false and true be
of type _Bool might have made more sense -- but that kind of change
would break existing code.
 
Ben Hinkle

Eric Sosman said:
Ben Hinkle wrote on 01/10/06 16:18:

"Suitable for use in #if" is one reason (bool)1 wouldn't
work. Types do not yet exist when the preprocessor operates,
so casts can't be evaluated. (In fact, #if true would turn
into #if (bool)1 and then #if (0)1, eliciting a diagnostic.)

Right. I wouldn't consider using true and false in #if's important.
As for the sizeof complaint, although opinions obviously
vary it doesn't strike me as an "important" property. IMHO
it is usually -- not always, but usually -- a poor idea to
write sizeof(type) when sizeof *ptr is practical. Besides,
we've already got sizeof 'x' > sizeof(char) on most systems,
and the only people it seems to bother are defectors to the
Dark Side With The Plus Signs.

I'm with Them, then. Justifying one "mistake" (#define true 1) with another
(type of 'a' isn't char) doesn't make me feel warm and fuzzy. I assume there
are good reasons for things, though.
Personally, I still don't understand the motivation for
adding _Bool to the language. The Rationale draws attention
to some properties of _Bool, but sheds no light on why those
properties were so desirable as to prompt the addition of a
whole new type -- especially since everything that can be
done with _Bool seems eminently do-able without it. Perhaps
the C9X committee suffered from Pascal envy?

Not enough Pascal envy, perhaps ;-)
 
Peter Nilsson

Eric said:
Personally, I still don't understand the motivation for
adding _Bool to the language. The Rationale draws attention
to some properties of _Bool, but sheds no light on why those
properties were so desirable as to prompt the addition of a
whole new type -- especially since everything that can be
done with _Bool seems eminently do-able without it. Perhaps
the C9X committee suffered from Pascal envy?

Or perhaps the C9X committee could see the plethora of
programs that already have varying (and subtly incompatible)
kludges for the same thing that was missing from C originally,
namely, a basic boolean type.
 
Eric Sosman

Old said:
Eric Sosman wrote:
[snip]

For me, it's desirable because assigning any non-zero value
to it causes it to have a non-zero value. This is not true for
any builtin type except for unsigned long long, which would
be a waste of memory if it were used as a boolean type.

I have accidentally written code like this:

bool b = (flags & FLAG_FOO);

where FLAG_FOO is something like 0x100. It took a
long debugging session to track down the problem; even
when I'd isolated the problem to this one block of code,
I still couldn't for the life of me figure out what was going on,
until I looked up the definition of 'bool'. (It turned out to
be a typedef for unsigned char).

Accidents will happen (and have to me, most certainly).
This particular accident isn't one that has beset my path
and laid a trap for my unwary feet; in the "isolate the
bit" context I tend to think of (and write) the result in
the same type as the original flags, not as a truth value
saying "was it set, or was it not?"

Still, if one wants the WISOWIN semantics, one can
always resort to the double negative:

bool b = !!(flags & FLAG_FOO);

or the (possibly clearer)

bool b = (flags & FLAG_FOO) != 0;

Thirty-five-plus years of C somehow scraped by without
anything more, and I still don't see what _Bool brings to
the party. Well, maybe I'm just a party pooper.
 
Chris Torek

Or perhaps the C9X committee could see the plethora of
programs that already have varying (and subtly incompatible)
kludges for the same thing that was missing from C originally,
namely, a basic boolean type.

In other words, there was a demand, so they filled it.

Of course, there is also a lot of demand for methamphetamine. :)

(Or as Rob Pike said of the X Window System: "Sometimes when you
fill a vacuum, it still sucks.")
 
Chuck F.

Eric said:
.... snip ...

Thirty-five-plus years of C somehow scraped by without anything
more, and I still don't see what _Bool brings to the party.
Well, maybe I'm just a party pooper.

You can always avoid it by simply failing to #include <stdbool.h>.
Then the only evidence remaining is an identifier in the
implementor's namespace. But for people who do want to define true,
false, and bool, everything is standardized.

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
 
Alex Fraser

[snip]
Making true and false be of type _Bool wouldn't be very useful, since
they'd be promoted to int in most contexts anyway.

Which leads to the obvious question: why are "small" types promoted to int?
If the language had been changed so that all conditions must be of
type _Bool, rather than of any scalar type, making false and true be
of type _Bool might have made more sense -- but that kind of change
would break existing code.

I guess you would also want to change the type of the result of the equality
and relational operators.

Alex
 
Keith Thompson

Alex Fraser said:
[snip]
Making true and false be of type _Bool wouldn't be very useful, since
they'd be promoted to int in most contexts anyway.

Which leads to the obvious question: why are "small" types promoted to int?

Historical reasons and code efficiency, I think.

For many CPUs, arithmetic operations on single-word operands are more
efficient than operations on smaller operands. Given:

short x, y, z;
... x * y + z ...

promoting each operand from short (perhaps 16 bits) to int (perhaps 32
bits) is likely to result in faster code than performing the operations
on the short operands (which might not even be directly supported by
the hardware).

Making the semantics consistent across different CPUs is probably
better than doing whatever is most efficient for each system, which
would make some apparently straightforward operations
implementation-defined.
 
Ben Hinkle

Chuck F. said:
You can always avoid it by simply failing to #include <stdbool.h>. Then
the only evidence remaining is an identifier in the implementors
namespace. But for people who do want to define true, false, and bool,
everything is standardized.

But if the intention is that stdbool be used for standard boolean coding,
then it will appear in headers, and once you include someone's header that
uses it, you are using it too. So if the intention is that stdbool gets
included in, say, 90% of headers that declare functions taking a boolean
parameter, then as an individual there's not much one can do to avoid it.
However, if the intention is that it gets used by a few individuals (more
particularly, not in library headers), then how it behaves isn't a big deal.

-Ben
 
lawrence.jones

Eric Sosman said:
Personally, I still don't understand the motivation for
adding _Bool to the language.

The main reason was that a huge number of third-party packages define
some sort of boolean type, but while they usually don't agree on the
definition, they frequently agree on the name, leading to problems for
anyone trying to use them together. Having a standard boolean type was
viewed as the obvious solution to that problem.

-Larry Jones

They say winning isn't everything, and I've decided
to take their word for it. -- Calvin
 
lawrence.jones

Old Wolf said:
For me, it's desirable because assigning any non-zero value
to it causes it to have a non-zero value. This is not true for
any builtin type except for unsigned long long [...]

unsigned long long l = 0.1;

-Larry Jones

I'll be a hulking, surly teen-ager before you know it!! -- Calvin
 
Jordan Abel

But if the intention is that stdbool be used for standard boolean coding,
then it will appear in headers, and once you include someone's header that
uses it, you are using it too. So if the intention is that stdbool gets
included in, say, 90% of headers that declare functions taking a boolean
parameter, then as an individual there's not much one can do to avoid it.
However, if the intention is that it gets used by a few individuals (more
particularly, not in library headers), then how it behaves isn't a big deal.

-Ben

function declarations in headers could just use _Bool instead.
 
Old Wolf

Eric said:
Accidents will happen (and have to me, most certainly).
This particular accident isn't one that has beset my path
and laid a trap for my unwary feet;

In fact my situation was a bit more disguised:

bool test_foo(FLAGS flags)
{
    return flags & FLAG_FOO;
}

bool b = !!(flags & FLAG_FOO);

Yes, once I had been alerted to the danger, I went through all of
my source-code looking for such things, and did that to them.
 