Size of bool unspecified: why?


JBarleycorn

James Kanze said:
James Kanze wrote:
[...]
Their use is fairly limited, and I can't
think of a case where I'd use them in a struct. (They are
convenient for certain operations, however, when their
restricted portability isn't an issue.)
Your view of the concept seems to be exactly the opposite of mine. Your
view seems more appropriate, to me, for a 4GL rather than for C/C++.

My view is based on Kernighan and Ritchie. It was current thirty
years ago, and dominates things like Unix kernel code (which is
hardly written in the spirit of a 4GL).
It depends on how you program. When I say that, I mean for the way I
program and use the language.

I'd still be interested in seeing an actual example where the
size of bool makes a difference. Suppose it was required to be
1: XDR requires it to be four bytes, so if you're serializing in
XDR format, you write:

void
writeBool( std::ostream& dest, bool value )
{
writeInt( dest, value ? 1 : 0 );
}

or something along those lines. Code which, of course, works
equally well if the size of bool is something other than 1.
If I had to be continually faced with having to think about what the
effect of integral type size changes would be in C++, I would not use
the language. Though maybe I wouldn't program at all then!

Who thinks of it? Ever. About the only thing one might
consider is whether the type is large enough: I've seen a lot of
code which uses long, rather than int, because INT_MAX is only
guaranteed to be 32767. This even in code that will clearly
never run on a 16 bit machine. *IF* I felt I had to take this
into consideration, I'd use things like int_least32_t or
int_fast32_t. (But if the code clearly will never be used on a
16 bit machine, I'll just go with int.)
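
The &lt;cstdint&gt; alternative mentioned here takes only a couple of lines; a sketch, assuming C++11's &lt;cstdint&gt; is available:

```cpp
#include <climits>
#include <cstdint>

// int_least32_t: the smallest integer type with at least 32 bits.
// int_fast32_t: the (nominally) fastest integer type with at least
// 32 bits. Either one states the real requirement directly, instead
// of using long just because INT_MAX might be only 32767 somewhere.
std::int_least32_t population = 2000000000;
std::int_fast32_t  counter    = 2000000000;

// Both are guaranteed to hold the value on every conforming
// implementation; plain int is not.
static_assert(sizeof(std::int_least32_t) * CHAR_BIT >= 32,
              "int_least32_t always has at least 32 bits");
```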
int and bool are in the same boat in that respect. The *only* time I
use int, or its unsigned counterpart, is when I consciously *want* a
type to expand across 32-bit and 64-bit platforms, and yes, I test in
a header if my assumptions are valid on a given platform. I view
fixed-width integral types, even if I have to synthesize them as I did
before there was stdint.h, as the "normal" ones and the "expanding"
types as the special ones. Try it sometime and you'll probably never
go back to the other way of thinking.

Why? It sounds like a major step backwards to me, and it
certainly isn't how the language was designed to be used.

I'm not knocking it. If you and others want to use 30 year old paradigms,
go for it. Just don't expect me to adopt and be enslaved by such
limitations.
 

Ian Collins

Contraire: you are limited by your paradigm (or choose to promote it for
ulterior reasons). (Or I'm so far ahead of you that you can't even see
the dust anymore... Beep! Beep!).

Come on, enough piss and wind, let's see some real examples. If you
have any that is...
 

JBarleycorn

James Kanze said:
James said:
"Ian Collins" <[email protected]> wrote in message
[...]
Not in terms of the standard. Implementation defined things have
to be defined by the implementation.
Why is "unspecified" necessary? Does anyone think that not
specifying the size of a bool is a good thing?
First, I don't think it's unspecified.
This "unspecified" designation is something new to me. I don't really
understand the point of it, or how to determine whether something is
"implementation defined" or "unspecified".

How you determine is by reading the standard.

Ha! It's not easy to find what you're looking for in there. That's more
(only?) for implementers than users.
The distinction
is formally important: something that is implementation defined
must be fully defined and specified by the implementation, so if
you are writing for a specific implementation, you can count on
it. The size of an int is a good example: it may vary between
different implementations, but for a given implementation, it
will be the same every time you compile your code (at least with
the same options), and (at least in theory), you can find out
exactly what it is in the documentation. Something which is
unspecified need not be documented, and can vary. The classical
example is order of evaluation: if you write something like 'f()
+ g()', and f() and g() have side effects, the order those side
effects occur is unspecified, and might be different from one
compilation to the next (and often will be different if you
change the optimization options).
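
The 'f() + g()' example can be made concrete; a sketch (the order of evaluation of the operands of '+' is unspecified):

```cpp
// Both functions record the order in which they run. Because the
// standard leaves the evaluation order of the operands of '+'
// unspecified, 'trace' may end up as 12 (f first) or 21 (g first),
// and the choice can change with compiler or optimization options.
int trace = 0;

int f() { trace = trace * 10 + 1; return 1; }
int g() { trace = trace * 10 + 2; return 2; }

int sum() { return f() + g(); } // the result is always 3; 'trace' varies
```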



'bool' is an integral type. In the rare cases where you need
fixed-width types, you can specify int32_t, and assign a bool to
it, with no problem.
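
As a sketch of that point: the bool-to-integral conversion is fully specified, so a fixed-width type carries the value portably.

```cpp
#include <cstdint>

// bool converts to any integral type: true becomes exactly 1 and
// false exactly 0, whatever sizeof(bool) happens to be.
std::int32_t fromTrue  = true;
std::int32_t fromFalse = false;
```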

You're not listening.
So what's the other half?

Data. (It's obvious now that I said it, huh).
[...]
Right, they didn't, but had they, they would have stood at
least some chance at not implementing a boolean type that is
broken.

Well, the C++ definition of bool isn't broken.

That's not just an opinion, it's a C++er's opinion. The qualification
*is* needed. You know, "consider the source".

In your C++er's opinion...
the C
definition is, but the C committee obviously doesn't agree with
me on that:). In both cases, the type has been "weakened"

"weakened", LOL. A lame attempt at alleviating the issue: that C++
bool is *broken* beyond repair. Replacement, where possible, is the
only option.
by
considerations of backwards compatibility: implicit conversions
in C++ (so that e.g. 'if (p)', where 'p' is a pointer, still
works---the original proposal would have deprecated this, but
the final version didn't), and in C, the need to include a
special header to "activate" bool, and the fact that it is still
a typedef to another integral type.
[...]
Are not 64-bit ints indeed more
efficient (in time) than 32-bit ints, say on x86 vs. x64? Anyone
have a good simple test for this?
It depends on how you use them, but I'd be very surprised if
there were any use cases where 64 bit ints would be faster than
32 bit ints.
I was thinking that the register size and instructions on a 64-bit
machine would somehow favor 64-bit operations.

I can't speak for all 64 bit machines, of course, but at least
on an Intel or a Sparc, there should be no speed difference for
an isolated instance.

I suspect that the case between 16-bit and 32-bit types on a 32-bit
platform will behave similarly, so I can profile that and "extrapolate".
The most important one is probably the cost of testing one, i.e.
the execution time for 'if ( aBooleanValue )'. In some
contexts, the cost of handling large arrays (where the number of
instances you can fit in a cache line is important) may be
significant.

And the other 10?
Note that on most of the systems I currently use, the effective
size of a bool will vary: although the value is effectively
maintained on a single byte, they will pass a bool into a
function as a 4 or 8 byte value, and in a struct, the effective
size (including padding) will depend on what comes after the
bool in the struct, and may be one, two, four or eight bytes.
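
That padding effect is easy to observe; a sketch (the exact sizes are implementation-specific, which is the point):

```cpp
#include <cstdint>

// The bool member itself is typically one byte, but the struct's
// total size depends on what follows it, because the compiler
// inserts padding to align the next member.
struct BoolAlone     { bool b; };                  // often 1 byte
struct BoolThenChar  { bool b; char c; };          // often 2 bytes
struct BoolThenInt32 { bool b; std::int32_t i; };  // often 8: padding after b
```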



No. In practice, it tends to vary, and even depend on the exact
model of the chip being used. Still, in my current work (which
includes a lot of high performance number crunching), we
consistently find that reducing the size of each element in an
array improves performance across the board, on all of our
different machines.

That's not testing the things I mentioned and was immediately concerned
with though. I was looking for insight about the performance of the
common mechanisms (passing back and forth and arith ops, mostly the
former though).
 

JBarleycorn

James Kanze said:
I suspect the reason Nick argues against it is that he has tried
it. I can remember when Microsoft changed the layout of a long
between versions of a compiler. Update the compiler, and none
of your data was readable. Unless, of course, you'd written the
serialization code correctly. And of course, a lot of companies
today are migrating from Solaris Sparc or IBM AIX to Linux on a
PC.

Is the practice bad or the practitioner? (rhetorical). The dangers of
making assumptions and then not controlling/testing them.
 

Ian Collins

I'm not knocking it. If you and others want to use 30 year old paradigms,
go for it. Just don't expect me to adopt and be enslaved by such
limitations.

More vacuous noise...
 

JBarleycorn

James Kanze said:
Most compilers do insert padding,

VC++ 2010 zero-extends 8-bit unsigned ints before pushing them onto
the stack as an arg. I just tested it because the documentation just
says it "widens", and I had taken that, for some reason, to mean "by
zero or sign extension", and wanted to know if that was indeed the
case, and it is so. Here's the doc:
http://msdn.microsoft.com/en-us/library/984x0h58.aspx
since not doing so will either
slow the code down significantly (Intel) or cause the program to
crash (most other processors) because of misaligned data.
Inserting padding is not expanding an argument to word size;
when a compiler inserts padding, the bytes in the padding have
unspecified (and often random) values. If I have something
like:

void f( char ch );

char aChar;
f( aChar );

I expect a compiler on an Intel to generate either:

push aChar

(if `aChar` is correctly aligned), or

mov al, aChar
push eax

to pass the argument.

No, it (VC++) does zero or sign extension before pushing the arg on the
stack. Here is the relevant portion of the result of my test code:

; 26 : pass8arg = Pass8(pass8arg);

movzx ecx, BYTE PTR _pass8arg$[ebp]
push ecx
call ?Pass8@@YGEE@Z ; Pass8
mov BYTE PTR _pass8arg$[ebp], al
 

James Kanze

[...]
Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header. I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.
Now we're getting somewhere! You saying that you code against a
*"philosophical"* paradigm.

I'm saying that I use data abstraction, as a tool to manage
complexity.
The word "religion" will surface soon, I'm
sure. Then you can say that my designs are unholy and yours are holy and
then the congregation you belong to will ... something.

There's no religion about it: the higher the level of
abstraction, the more efficient the programmer. Given a good
compiler (something rare), the more efficient the generated
code, too. Against higher levels of abstraction: it's often
more work (defining a class, where just using int will do), and
some particular interfaces (including most serialization) are
defined in terms of a low level of abstraction.
Like I *said*: there is no *similar* type.

Except all signed integral types.

[...]
I'm not going to repeat myself.

Because there's nothing to repeat? To date, you've yet to give
any concrete reason why there's a problem with bool. Just vague
assertions which aren't supported by actual facts.
1. The size of C++ bool can't be relied on.

So what? If you're worried about the size of bool, you're
worried about the wrong things.
2. C++ bool has brain-damaged behavior (promotion to int).

I wouldn't go so far as "brain-damaged" (unless you want to
argue that C++ is brain-damaged everywhere, because it is based
on C). It's regrettable. It's also necessary, for reasons of
backward compatibility. Like the original authors of the
proposal, I would have preferred that the implicit conversions
be deprecated, but there was no consensus in the committee for
this, so it wasn't.
Now it is, but historically, I have an inkling that on all those
platforms, bool was synonymous with int. In that case, you have just
different versions (old and new) of the compiler and encounter a size
change of type (or synonym) bool.

Historically, there wasn't a type bool, and a lot of programmers
used:
typedef int bool;
(I wouldn't be surprised if <windows.h> still has a typedef for
BOOL. Or maybe a macro.) I don't think that any implementation
on a byte addressable machine has ever made it more than a byte.
(But since I've never worried about it, I could be wrong.)
 

Alf P. Steinbach

Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header.

I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.

*hark*


Cheers & hth.,

- Alf
 

JBarleycorn

James said:
[...]
Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header. I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.
Now we're getting somewhere! You saying that you code against a
*"philosophical"* paradigm.

I'm saying that I use data abstraction, as a tool to manage
complexity.

No, you did not say that.
There's no religion about it

scuuze me, "philosophy", as you put it then.
: the higher the level of
abstraction,

Whoa, what would a C programmer like you know about that?
the more efficient the programmer. Given a good
compiler (something rare),

Why don't you make one then? Not up to the task, Mr. Efficiency?
the more efficient the generated
code, too.

Truly a C mantra: performance and portability at all costs, just a tad
above assembly language.
Against higher levels of abstraction: it's often
more work (defining a class, where just using int will do), and
some particular interfaces (including most serialization) are
defined in terms of a low level of abstraction.

Don't argue my point! (I mean if you want to win!).
Except all signed integral types.

Don't mock me. I *said*, there is no similar type.
[...]
I'm not going to repeat myself.

Because there's nothing to repeat?

Because it would fall to your "deaf" ears.
To date, you've yet to give
any concrete reason why there's a problem with bool.

Fine, you don't "get it", but now you want a "concrete reason". Do you
ever look outside? There is a world beyond bits and bytes, I assure you.
Just vague
assertions which aren't supported by actual facts.

Thanks for admitting that. You almost make me want to admit being sober
is good sometimes too. I can't take much more of this, and I *am*
leaving. You suck cuz I know your name, and I came here for technical
info only, Fred.
So what? If you're worried about the size of bool, you're
worried about the wrong things.

I do worry, but not about such things. Tell me something "life-changing"
like bool is not what I think it is. I mean it's just 2 PD values, for
Pete's sake.
I wouldn't go so far as "brain-damaged"

OK, at least you recognize and admit it (I must feel that you are
defending the language... a whole new area, or is that what "language
lawyer" means?).
(unless you want to
argue that C++ is brain-damaged everywhere, because it is based
on C).

You have a (my) point: C++ is brain-damaged. But you say, WTF, let all
else and "new" be similarly afflicted.
It's regrettable.

It's minor, kinda. When I need a boolean, I use "my" class (I still have
a few drinks because it converts to uint8, yeah I lose sleep over it, I
don't get out much).
It's also necessary, for reasons of
backward compatibility.

I know. I know, really. I use the word "paradigm" a lot, you know, I'm
not devoid of them, it's just that ... I dunno.
Like the original authors of the
proposal, I would have preferred that the implicit conversions
be deprecated, but there was no consensus in the committee for
this, so it wasn't.

So you are fired. ;)
Historically, there wasn't a type bool, and a lot of programmers
used:
typedef int bool;

And here I am "whining" about broken bool! But wait, it's just as bad!
bool is like throwing a dog a bone, you know he wants the steak, but
he'll gnaw on that bone for... well, till it's gone! (Good thing she
has the umbrella now, cuz that "safety deposit box" is surely just
worthy of deposits!)
(I wouldn't be surprised if <windows.h> still has a typedef for
BOOL. Or maybe a macro.)

"To be honest", I am surprised you don't know that.
I don't think that any implementation
on a byte addressable machine has ever made it more than a byte.

Hmm, I wouldn't know either, cuz I have never used bool directly. I
think "typedef int bool" was "all the rage" "way" back when, but I'm
not the historian (you are! Na na! :p ).
(But since I've never worried about it, I could be wrong.)

You "could" be? You usually are!
 

JBarleycorn

Robert said:
[...]
Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header. I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.
Now we're getting somewhere! You saying that you code against a
*"philosophical"* paradigm.

I'm saying that I use data abstraction, as a tool to manage
complexity.
The word "religion" will surface soon, I'm
sure. Then you can say that my designs are unholy and yours are
holy and then the congregation you belong to will ... something.

There's no religion about it: the higher the level of
abstraction, the more efficient the programmer. Given a good
compiler (something rare), the more efficient the generated
code, too. Against higher levels of abstraction: it's often
more work (defining a class, where just using int will do), and
some particular interfaces (including most serialization) are
defined in terms of a low level of abstraction.
How is bool different?
If bool doesn't meet your expectations, you don't have the option
to change to a similar type of different width, because there is
no similar
type of different width.
Sure there is: signed char, short, int and long are all similar
types, with potentially different widths. (Don't ask me why
bool is a signed integral type, rather than an unsigned one.)
Like I *said*: there is no *similar* type.

Except all signed integral types.

[...]
Exactly like every other integral type.
I'm not going to repeat myself.

Because there's nothing to repeat? To date, you've yet to give
any concrete reason why there's a problem with bool. Just vague
assertions which aren't supported by actual facts.
That sounds like the most rational decision. Or rather, you use
the bool the implementation gives you, until you need something
faster for your application.
1. The size of C++ bool can't be relied on.

So what? If you're worried about the size of bool, you're
worried about the wrong things.
2. C++ bool has brain-damaged behavior (promotion to int).

I wouldn't go so far as "brain-damaged" (unless you want to
argue that C++ is brain-damaged everywhere, because it is based
on C). It's regrettable. It's also necessary, for reasons of
backward compatibility. Like the original authors of the
proposal, I would have preferred that the implicit conversions
be deprecated, but there was no consensus in the committee for
this, so it wasn't.
(As I've mentioned earlier, I
expect that bool is a single byte on most, if not all, byte
addressed platforms.)
Now it is, but historically, I have an inkling that on all those
platforms, bool was synonymous with int. In that case, you have just
different versions (old and new) of the compiler and encounter a
size change of type (or synonym) bool.

Historically, there wasn't a type bool, and a lot of programmers
used:
typedef int bool;
(I wouldn't be surprised if <windows.h> still has a typedef for
BOOL. Or maybe a macro.) I don't think that any implementation
on a byte addressable machine has ever made it more than a byte.
(But since I've never worried about it, I could be wrong.)


Just for variety, the Windows API has BOOL, which typedefs as an int
(and is a very old definition in Windows - back to at least Win2.x),
plus the newer BOOLEAN, which typedefs as a char (byte).

Speak when you are spoken to, bitch.
 

JBarleycorn

Alf said:

See, Fred Kanya, you are... umm, what is it.. are you ... well you
know... you go to church but, naughty little you... are you... you like
to play with your tingly bits.... you are... umm, let me think about
this.... could you perhaps be
 

James Kanze

[...]
Historically, there wasn't a type bool, and a lot of programmers
used:
typedef int bool;
(I wouldn't be surprised if <windows.h> still has a typedef for
BOOL. Or maybe a macro.) I don't think that any implementation
on a byte addressable machine has ever made it more than a byte.
(But since I've never worried about it, I could be wrong.)
Just for variety, the Windows API has BOOL, which typedefs as an int
(and is a very old definition in Windows - back to at least Win2.x),
plus the newer BOOLEAN, which typedefs as a char (byte).

And if I recall correctly, some of the functions which return
BOOL return 1, 0 or -1 (yes, no or maybe? true, false, or I
don't know?).
 

James Kanze

James said:
news:2ba060fd-d11e-4cda-b0c8-1038a58b6cf6@a31g2000vbt.googlegroups.com...
[...]
Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header. I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.
Now we're getting somewhere! You saying that you code against a
*"philosophical"* paradigm.
I'm saying that I use data abstraction, as a tool to manage
complexity.
No, you did not say that.

I said enough that any competent programmer would have
understood it that way.
scuuze me, "philosophy", as you put it then.

It's neither philosophy nor religion. It's effective, pragmatic
programming,
Whoa, what would a C programmer like you know about that?

I'm a professional, that's all. Regardless of the language I
use (and I haven't used C in over 20 years), I write code
professionally. And you can obtain a pretty high level of
abstraction in C (even if a lot of programmers don't).

One of the first levels of abstraction is ignoring the number of
bits in an int.

[...]
Don't argue my point! (I mean if you want to win!).

Who cares about "winning"? I'm interested in communicating and
understanding. And in an honest presentation of the facts.

[...]
Don't mock me. I *said*, there is no similar type.

I know what you said. And you're wrong. (If you were honest,
you'd admit it, but that seems to be asking too much.)
And here I am "whining" about broken bool! But wait, it's just as bad!
bool is like throwing a dog a bone, you know he wants the steak, but
he'll gnaw on that bone for...

:) I like the image. In a lot of ways, C++ is just a bone.
The problem is that there isn't any steak, and in most cases,
the other languages I've seen are empty dishes. Or as I usually
put it: C++ is about the worst language you can imagine. Except
for all of the others.
well, till it's gone! (Good thing she has the umbrella now, cuz that
"safety deposit box" is surely just worthy of deposits!)
"To be honest", I am surprised you don't know that.

Why should I? I know that at one point in time, they did, but
it doesn't seem relevant today. I do a lot of Windows
programming, and I've never used it.
 

Alf P. Steinbach

news:2ba060fd-d11e-4cda-b0c8-1038a58b6cf6@a31g2000vbt.googlegroups.com...
[...]
Historically, there wasn't a type bool, and a lot of programmers
used:
typedef int bool;
(I wouldn't be surprised if<windows.h> still has a typedef for
BOOL. Or maybe a macro.) I don't think that any implementation
on a byte addressable machine has ever made it more than a byte.
(But since I've never worried about it, I could be wrong.)
Just for variety, the Windows API has BOOL, which typedefs as an int
(and is a very old definition in Windows - back to at least Win2.x),
plus the newer BOOLEAN, which typedefs as a char (byte).

And if I recall correctly, some of the functions which return
BOOL return 1, 0 or -1 (yes, no or maybe? true, false, or I
don't know?).

BOOL was used as an all-round x != 0 boolean type, yes, in particular
as the result of a dialog proc.

It's one of the reasons that writing x == true is a very bad habit.
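
A sketch of why, assuming the Windows-style 'typedef int BOOL' with TRUE defined as 1 (the API function here is made up for illustration):

```cpp
// With an int-typed BOOL, any nonzero value means "true", but only
// the exact value 1 compares equal to TRUE. An API returning -1
// (as some BOOL-returning functions did) is logically true yet
// fails an '== TRUE' test.
typedef int BOOL;
const BOOL TRUE_VALUE = 1; // stand-in for the TRUE macro

BOOL hypotheticalApi() { return -1; } // nonzero, i.e. "true"

bool testedCorrectly()   { return hypotheticalApi() != 0; }          // right
bool testedAgainstTrue() { return hypotheticalApi() == TRUE_VALUE; } // wrong: false!
```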

Then there was/is VARIANT_BOOL, where true is -1. It's for the script
language interface, which was originally the Visual Basic interface.

Cheers,

- Alf
 

JBarleycorn

James said:
James said:
[...]
Who does? In over thirty years of experience in C and C++, I've
yet to see any application which uses such a header. I'm
tempted to say that someone who uses such a header doesn't
understand the basic philosophy behind C and C++.
Now we're getting somewhere! You saying that you code against a
*"philosophical"* paradigm.
I'm saying that I use data abstraction, as a tool to manage
complexity.
No, you did not say that.

I said enough that any competent programmer would have
understood it that way.

You are HOW old and still trying to instigate pissing contests? Ha! You
weren't doing that, huh, you were spewing your propaganda. I got your
"competent programmer", "right here, asshat".
It's neither philosophy nor religion. It's effective, pragmatic
programming,

"Hello". Apparently, you don't know which way is up. Enlighten us all: Is
it a philosophy by which you propose ... WTF are you doing? Do you WANT
something? What do you want?
I'm a professional

A professional WHAT?
, that's all.

"that's all" = weasel words.
Regardless of the language I
use

Yeah, regardless of "the language I use", that is good, but good for you
and not so much for me, and certainly a key exploitation of task roles.
(and I haven't used C in over 20 years)

How much of a lie is that? Did you know C before C++ came about? You
simply CANNOT be JUST a C++ programmer if you knew C prior. So if you
did, I'm calling you "a liar".
, I write code
professionally.

I believe that. It's a good job for you, and has been. I'm not knockin'
it in the least. "You go girl!".
And you can obtain a pretty high level of
abstraction in C (even if a lot of programmers don't).

It's irrelevant... you are rambling.
One of the first levels of abstraction is ignoring the number of
bits in an int.

So you are having a hard time with enlightenment. Don't give it another
thought. You did good: you milked it for all it was worth as a blind
follower, and you got things done. Someone has got to do the work, and
you're good at it, right? If you didn't like digging the hole all the
time you were digging it, it's not my problem, don't make it so. Don't
become an imposer/oppressor just because your candle is about to
flameout. You didn't do bad before (professionally, or maybe...
nevermind), so don't start doing (really?) bad now.
[...]
Don't argue my point! (I mean if you want to win!).

Who cares about "winning"?

You do.
I'm interested in communicating and
understanding.

No you're not. You just got a drift of my line of thinking and are
trying to recover from your blatant hypocrisy. ( ;) :p )
And in an honest presentation of the facts.

You ARE "a programmer", that's for sure. It's all black or white huh. You
and Patricia in the Java NG seem to have not found each other. (I mean,
if you believe that like people attract each other).
[...]
Don't mock me. I *said*, there is no similar type.

I know what you said.

No you don't.
And you're wrong.

No I'm not. And it's an easy construct. 1st grade. I would "call you"
"paradigmically blind", but I don't understand people who act smart, but
aren't. I'm not saying there isn't value in that. I just want to
understand it.
(If you were honest,
you'd admit it, but that seems to be asking too much.)

I am honest, that's why you can't win. I mean why you can't win here. You
will win. Because you have more ICBMs. It's not like I don't get it/you.
But it's not that I'm here just observing you "tards".... <more thoughts,
but out of fuel>.

(I'm NOT picking on you. I wouldn't do that. Do you know why?).
And here I am "whining" about broken bool! But wait, it's just as
bad! bool is like throwing a dog a bone, you know he wants the
steak, but he'll gnaw on that bone for...

:) I like the image.

"I have my moments".
In a lot of ways, C++ is just a bone.

Oh shut up and go milk it. (enjoy).
The problem is that there isn't any steak, and in most cases,
the other languages I've seen are empty dishes. Or as I usually
put it: C++ is about the worst language you can imagine. Except
for all of the others.

So it's MY fault now?!
Why should I? I know that at one point in time, they did, but
it doesn't seem relevant today. I do a lot of Windows
programming, and I've never used it.

I remember you when you were just a SPARC. "You have come a long way
grasshopper".
 
