which tutorial to use?


santosh

user923005 said:
The same arguments could have been made for 16 bit ints changing into
32 bit ints 'back in the day'.

Aren't you glad that they did not take that nonsense stance back then?

Ten years from now, 32 bit ints are going to look very foolish.
Especially when stdint.h gives you 32 bit sizes if you want them.

IMO-YMMV.

I must admit you are right though. The way the real world works *is*
broken.

The real world is practical. IMHO, int should reflect the most commonly
used integer ranges in most programs. A 32 bit int suits this. A 64 bit
int is perhaps overkill. Just because the underlying processor's
registers have expanded doesn't mean that the most commonly used
integer type should also do so. The compiler can always put 32 bit
values into 64 bit registers for temporary calculations.

Ultimately it seems to me to be an elegance vs. space trade-off. I admit
that on a 64 bit system a 64 bit int type is very elegant and in
keeping with what Ritchie desired. But that doesn't mean that int
absolutely *has* to mirror the processor's GPR width. If other concerns
are greater then we could look at other models. After all, even if an
int isn't 64 bits we can still use int64_t from stdint.h.
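
For instance, assuming an implementation that provides the optional
exact-width types of C99's <stdint.h> and <inttypes.h>, code can
decouple itself from whatever width plain int happens to be:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    /* the same source works whether int is 16, 32 or 64 bits */
    int64_t big = INT64_C(9000000000);  /* beyond any 32 bit range */
    int32_t small = INT32_MAX;
    printf("big = %" PRId64 ", small = %" PRId32 "\n", big, small);
    return 0;
}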
P.S.
From K&R2, p36:
"The intent is that short and long should provide different lengths of
integers where practical; int will normally be the natural size for a
particular machine. short is often 16 bits long, and int either 16 or
32 bits. Each compiler is free to choose appropriate sizes for its own
hardware, subject only to the restriction that shorts and ints are
at least 16 bits, longs are at least 32 bits, and short is no longer
than int, which is no longer than long."

I think at the time Ritchie made this decision, he could hardly have
imagined that one day 64 bit systems would become the norm.
Particularly, one popular architecture supports a range of sizes from 8
to 16 to 32 to 64 bits, and a lot of code and compilers have taken
advantage of this availability. Ritchie probably would never have
imagined the long long type, and I doubt he liked its eventual
creation, but it was a practical necessity, as a very popular platform
tied long to 32 bits.
 

santosh

Ioannis said:
I am not sure what you are saying here. One can find the
maximum/minimum values of his system types by checking limits.h and
float.h

So why do we need long long?

For LP32 systems and L32P64 systems.
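
As for Ioannis's limits.h point, those ranges really are just a printf
away; a minimal sketch, assuming nothing beyond C90's <limits.h> and
<float.h>:

#include <limits.h>
#include <float.h>
#include <stdio.h>

int main(void)
{
    /* the implementation's actual ranges, straight from the headers */
    printf("int:  %d .. %d\n", INT_MIN, INT_MAX);
    printf("long: %ld .. %ld\n", LONG_MIN, LONG_MAX);
    printf("double: up to %g\n", DBL_MAX);
    return 0;
}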
 

Ian Collins

user923005 said:
A quote from that link:
"The world is currently dominated by 32-bit computers, a situation
that is likely to exist for the foreseeable future."

Ummm.... When was *that* written?

1997.
 

user923005

santosh said:
The real world is practical. IMHO, int should reflect the most commonly
used integer ranges in most programs. A 32 bit int suits this.

At the time the 80386 CPU came out, it was a 16 bit int that suited
that exact same thing.
A 64 bit
int is perhaps overkill. Just because the underlying processor's
registers have expanded doesn't mean that the most commonly used
integer type should also do so. The compiler can always put 32 bit
values into 64 bit registers for temporary calculations.

At 4 GHz, how long does it take to exhaust a 32 bit integer? Not very
long. 50 years from now, imagine the pain that will result from
fixing all the code broken by short-sighted decisions.
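
(For concreteness: incrementing a counter once per cycle at 4 GHz wraps
an unsigned 32 bit integer in 2^32 / 4e9 ~= 1.07 seconds, while a 64 bit
counter at the same rate lasts about 146 years.)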
Ultimately it seems to me to be an elegance vs. space trade-off. I admit
that on a 64 bit system a 64 bit int type is very elegant and in
keeping with what Ritchie desired. But that doesn't mean that int
absolutely *has* to mirror the processor's GPR width. If other concerns
are greater then we could look at other models. After all, even if an
int isn't 64 bits we can still use int64_t from stdint.h.

Touché. However, I would like code that uses 'int' to get the fastest
integer type the hardware offers.
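
As it happens, C99's <stdint.h> already names such a type: int_fast32_t
is whatever the implementation considers (usually) fastest among types
of at least 32 bits. A small sketch:

#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* int_fast32_t may well be 64 bits on a 64 bit machine,
       even where plain int stays at 32 */
    printf("int:          %u bits\n", (unsigned)(sizeof(int) * CHAR_BIT));
    printf("int_fast32_t: %u bits\n",
           (unsigned)(sizeof(int_fast32_t) * CHAR_BIT));
    return 0;
}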
I think at the time Ritchie made this decision, he could hardly have
imagined that one day 64 bit systems would become the norm.

I guess that you are wrong about that. That quote was written in 1988
and I already had used 16, 32 and 60 bit systems and imagined that 64
bit systems were going to be popular soon (though my 'soon' was a
little optimistic).
Particularly, one popular architecture supports a range of sizes from 8
to 16 to 32 to 64 bits, and a lot of code and compilers have taken
advantage of this availability. Ritchie probably would never have
imagined the long long type, and I doubt he liked its eventual
creation, but it was a practical necessity, as a very popular platform
tied long to 32 bits.

It may have been a practical necessity, but that does not make it the
best possible solution.
Often, we make pragmatic decisions simply because we're lazy.
 

Ian Collins

user923005 said:
Touché. However, I would like code that uses 'int' to get the fastest
integer type the hardware offers.

Which may well be 32 bits on a 64 bit machine.
 

Richard Heathfield

jacob navia said:

I think C is a very good language, and it is popular because a simple
language is always better than a bloated one.

I agree entirely.

(This hardly ever happens, which makes it all the more noteworthy.)
 

Richard Heathfield

santosh said:

But if int is
made 64 bits then we have to sacrifice either a 16 bit or 32 bit type,
mostly the former.

No, we don't. At least one Cray system uses C8SILP64 (i.e. char 8 bits;
short, int, long and pointer all 64). Why not do that?

But a 32 bit short just feels weird!

Not to me. I've never used a system with 32-bit short, but I have no
particular objection to doing so. C8S32IP64L128 is perfectly fine by me.
Also if int is
64 bits what is long supposed to be?

At least 32 bits. 64 is fine, 128 is fine too.
And what about long long?

What about it? But if you must have one, make it twice as long as long.
That's why I agree with Jacob that even under 64 bit systems int is best
left at 32 bits, and long made 64 bits.

And that makes long long unnecessary, which was Ioannis's point in the
first place.
 

Ben Pfaff

Richard Heathfield said:
santosh said:

No, we don't. At least one Cray system uses C8SILP64. Why not do that?

That sacrifices *both* 16-bit and 32-bit types. Sacrificing both
is not an improvement over sacrificing one or the other.
 

Richard Heathfield

user923005 said:
A quote from that link:
"The world is currently dominated by 32-bit computers, a situation
that is likely to exist for the foreseeable future."

Ummm.... When was *that* written?

In computing, "the foreseeable future" is about 20 minutes.
 

Richard Heathfield

Ben Pfaff said:
That sacrifices *both* 16-bit and 32-bit types. Sacrificing both
is not an improvement over sacrificing one or the other.

Oh, I see - my "No, we don't" is quite wrong, isn't it? Fair enough. But
what I was really trying (and obviously failing) to get across was "who
cares?" - I know I don't. Personally, I wouldn't mind if they made
everything from short upwards 256 bits right now, which would save an
awful lot of messing about over the years.

The long long type represents a failure of nerve.
 

santosh

Richard said:
santosh said:

What has the size of a pointer to do with the size of a long int?

Probably not much. I just mentioned two common models. I should have
said L32 systems.
 

santosh

Richard said:
santosh said:

No, we don't. At least one Cray system uses C8SILP64. Why not do that?

Not to me. I've never used a system with 32-bit short, but I have no
particular objection to doing so. C8S32IP64L128 is perfectly fine by
me.

At least 32 bits. 64 is fine, 128 is fine too.

If int is 64 bits I don't think long could be 32 bits unless 32 bits of
int are padding bits.
What about it? But if you must have one, make it twice as long as
long.

And that makes long long unnecessary, which was Ioannis's point in the
first place.

long long was necessary probably because long is not guaranteed to be
greater than 32 bits. Until long long there was no standard 64 bit
integer.
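
And that guarantee is easy to lean on; a minimal sketch, assuming a C99
<limits.h> (which must define LLONG_MAX as at least 2^63 - 1):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* long long must span at least -(2^63 - 1) .. 2^63 - 1, so a
       40 bit shift is portable here; with plain long, anything
       past 1L << 30 would be unsafe to assume */
    long long big = 1LL << 40;
    printf("2^40 = %lld, LLONG_MAX = %lld\n", big, LLONG_MAX);
    return 0;
}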
 

santosh

Richard said:
Ben Pfaff said:

Oh, I see - my "No, we don't" is quite wrong, isn't it? Fair enough.
But what I was really trying (and obviously failing) to get across was
"who cares?" - I know I don't. Personally, I wouldn't mind if they
made everything from short upwards 256 bits right now, which would
save an awful lot of messing about over the years.

Oh right. Why not make all integers 1024 bits for the even more distant
future? C's goal was to be efficient to implement and run. Its type
system has consequently reflected what various systems have provided
over the years, along with other compromises.

If we want a clean, elegant language with unlimited width types we have
plenty to choose from right now. C was meant to be practical, not
infinitely flexible (though in practice it has turned out to be very
adaptable!) or elegant or whatever.
The long long type represents a failure of nerve.

Or maybe pragmatism. How else could WG14 have introduced a guaranteed 64
bit type without breaking existing code?
 

Philip Potter

santosh said:
Or maybe pragmatism. How else could WG14 have introduced a guaranteed 64
bit type without breaking existing code?

One way would be to guarantee that long was 64 bits. This is not
necessarily the best solution, but it is a solution.

Phil
 

Richard Heathfield

santosh said:
If int is 64 bits I don't think long could be 32 bits unless 32 bits of
int are padding bits.

Right. The point is that, provided that long int meets the requirements
imposed upon it by the Standard, it shouldn't matter how big it is.

long long was necessary probably because long is not guaranteed to be
greater than 32 bits. Until long long there was no standard 64 bit
integer.

The more impositions we make on implementations to provide data types of
certain minimum size, the harder it is to provide conforming,
high-performance implementations. So it could easily be argued that long
long int makes code less portable, not more so. If an implementation can
provide a 64-bit type, fine, let long int be 64 bits on that
implementation. But if it can't, why try to force it?
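
One way to act on that: ask for an exact 64 bit type only where one
exists, and fall back otherwise. A sketch under C99 rules (INT64_MAX is
defined only when int64_t exists; wide_t and WIDE_FMT are names invented
here purely for illustration):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

#ifdef INT64_MAX
typedef int64_t wide_t;            /* an exact 64 bit type exists */
#define WIDE_FMT PRId64
#else
typedef int_least64_t wide_t;      /* mandatory in C99; may be wider */
#define WIDE_FMT PRIdLEAST64
#endif

int main(void)
{
    wide_t x = (wide_t)1 << 40;
    printf("x = %" WIDE_FMT "\n", x);
    return 0;
}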

If the real world weren't important, ISO could simply have made every int
type, qualified or not, 1048576 bits wide and be done with the whole issue
forever. But the real world /does/ matter, and there are always (or, at
least, for some time to come) going to be some platforms that are going to
struggle to provide 64-bit integers.
 

Richard Heathfield

santosh said:
Richard Heathfield wrote:

Oh right. Why not make all integers 1024 bits for the even more distant
future? C's goal was to be efficient to implement and run. Its type
system has consequently reflected what various systems have provided
over the years, along with other compromises.

This is precisely my argument against an at-least-64-bit type.
If we want a clean, elegant language with unlimited width types we have
plenty to choose from right now. C was meant to be practical, not
infinitely flexible (though in practice it has turned out to be very
adaptable!) or elegant or whatever.

Right. One practical solution is to impose only a *lower* limit, not an
upper limit, on type sizes - which is what we already had.
Or maybe pragmatism. How else could WG14 have introduced a guaranteed 64
bit type without breaking existing code?

My whole point is that they didn't need to.
 

santosh

Philip said:
One way would be to guarantee that long was 64 bits.

Which would break a lot of currently conforming implementations.
This is not
necessarily the best solution, but it is a solution.

IMHO, it seems even worse than the solution represented by long long.
 

santosh

Richard said:
santosh said:

Right. The point is that, provided that long int meets the
requirements imposed upon it by the Standard, it shouldn't matter how
big it is.

The more impositions we make on implementations to provide data types
of certain minimum size, the harder it is to provide conforming,
high-performance implementations. So it could easily be argued that
long long int makes code less portable, not more so. If an
implementation can provide a 64-bit type, fine, let long int be 64
bits on that implementation. But if it can't, why try to force it?

This same argument could probably have been made during the
standardisation process of C89/C90 with regard to 32 bits for long.
If the real world weren't important, ISO could simply have made every
int type, qualified or not, 1048576 bits wide and be done with the
whole issue forever. But the real world /does/ matter, and there are
always (or, at least, for some time to come) going to be some
platforms that are going to struggle to provide 64-bit integers.

Even with emulation? If so, they could be non-conforming.

Not having a guaranteed 64 bits integer type in ISO C would mean that
programs using values of this size would either have to use a compiler
extension or a bignum library, both of which reduce portability, or be
open to the possibility of not compiling on some set of systems.

There is an argument to be made that 64 bit integers are not often
required by most computations. Indeed, by this line of reasoning an
even stronger case can be made against the inclusion of complex and
imaginary types, as well as the whole of fenv.h's facilities.

IOW you seem to be saying that C99 was essentially unnecessary. Would
you say the same for C1x?
 

santosh

Richard said:
santosh said:

This is precisely my argument against an at-least-64-bit type.

Right. One practical solution is to impose only a *lower* limit, not
an upper limit, on type sizes - which is what we already had.

Well, by this reasoning short and long would be unnecessary too. char
and int alone would do the trick with int being anything starting from
16 bits.

IMHO *too* much uncertainty makes it harder to program. Of course, too
many types have the same effect.
My whole point is that they didn't need to.

Okay. That's a fair position.
 
