Malcolm McLean wrote, On 21/07/07 19:51:
> They can also read the context in which you post it. It was a reply to
> "more integer types make the language richer, as do more looping
> constructs", the difference being that integer formats are a standard
> for passing data about whilst loops are not.
See my previous comment, which explains why people might think you
have not grasped the context.
> Now if you think I am stupid,
I did not call you stupid; I said you are saying something stupid. If
you can't see the distinction...
> rather than just disagreeing with me, you
> should take a reality check. Check the website and see whether this is
> consistent with a stupid person. Maybe the stupidity goes the other
> way.
Or perhaps YOU should check the websites that have been pointed out to
you, such as the Intel documentation saying that you should have a 32
bit int for C on the Itanium processor, or the sites posted some time
back which explained why the Unix people decided on a 32 bit int.
> I seem to be the only person who has cottoned on to the
Normally when one person is at odds with the majority of experts (I'm
talking about Intel, and those responsible for the Unix standard, not just a
semi-random collection of people on Usenet) then it turns out to be the
one person who is wrong. Occasionally it is the other way around, but
not normally. So perhaps it is YOU who should reconsider and actually
look at the things you have been pointed at.
> language-wrecking potential of not being able to use int as an arbitrary
> array index.
It can't be a very high potential, since the language has not died in
all the time since the C89 standard was released.
> Newbies will just see size_t i, with some sort of
> explanation why it has to be a size_t even though it is a counter, and
> think "this is a highly specialised language just for experts."
Most newbies are not dealing with arrays with more than 32767 elements,
and by the time they are they have normally learnt enough about
programming to be able to cope with the concept of different types for
different purposes.
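
For what it's worth, the loop a newbie actually writes looks much the
same either way. A minimal sketch of my own (assuming a C99 compiler
for the loop declaration and the %zu format):

#include <stdio.h>

int main(void)
{
    double data[10] = {0};

    /* size_t is simply "the type for sizes and indices"; the loop
       reads no differently than it would with int. */
    for (size_t i = 0; i < sizeof data / sizeof data[0]; i++)
        printf("data[%zu] = %f\n", i, data[i]);

    return 0;
}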
> You'll find I am right when C loses its status as the default language
> for specifying algorithms, but by then the damage will have been done
> and it will be too late.
Let's see, Fortran has continued to be the default language for some
types of algorithms (C having never supplanted it), maths has continued
to be the default language for others (I've used it myself)...
However, if it has not lost its status in all the years since 1989 when
the types you complain about were introduced (not forgetting all the
16/32 bit processors) then I don't see why those types should suddenly
kill the language now.
> So we must get size_t and ptrdiff_t out of the
> language now. If they were either necessary or a good idea such
> fundamental types would have been in C from day one.
There are a number of things that were in C from day one that DMR has
stated were, with hindsight, a bad idea. So something not being in C
from day one does not mean it is a bad idea. Also, DMR was involved in
the standardisation, as were lots of other people whose judgement on
what makes a good language we have more reason to trust than yours.
> They are neither,
> as long as int is the natural integer size of the machine. Lose that
> convention, and there is no choice.
It has been pointed out already that 32 bits is *also* a natural size of
64 bit processors.
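
A quick sketch to illustrate (the 4/8/8 figures are what a typical
LP64 platform such as x86-64 Linux reports; other ABIs differ, and
%zu again assumes C99):

#include <stdio.h>

int main(void)
{
    /* On a typical LP64 platform this prints 4, 8 and 8: int stays
       at 32 bits while long and size_t widen to 64 bits. */
    printf("sizeof(int)    = %zu\n", sizeof(int));
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(size_t) = %zu\n", sizeof(size_t));
    return 0;
}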
> You need ptrdiff_t and size_t, all
> over your code, wrecking its beauty and legibility.
A lot of people find that using specific types for specific purposes
*increases* legibility. This has also been pointed out to you. Of
course, legibility is subjective, so different people disagree about
what is legible.
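
For example (my sketch, not anything you posted), the dedicated types
document intent in a way plain int does not:

#include <stddef.h>

/* size_t announces "this value is an object size or index";
   ptrdiff_t announces "this value is a distance between two
   pointers".  Plain int announces neither. */
size_t count_before(const char *s, char c)
{
    const char *p = s;

    while (*p != '\0' && *p != c)
        p++;
    return (size_t)(p - s); /* p - s has type ptrdiff_t and is
                               known non-negative here */
}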
> There are always a myriad of little objections to any standard.
Actually, they are BIG objections. It's just that you don't consider
anything that disagrees with your preconceived notions as being a big
objection.
> "Are you
> going to throw away a brand new screw-making machine, Sir Joseph, purely
> because it produces thread with a pitch that is 5 degrees lower than
> your standard?" It's a good point. The machine will have to be scrapped.
> However the benefits massively outweighed that sort of consideration.
How about: you have to throw away the Oracle cluster you built at
tremendous expense because the language you are using no longer has
types that match the standard SQL types, and the SQL servers written
in C have to be rewritten in another language as well. Or you have to
switch to another language for writing the JVM because C no longer
supports the integer types you need. Image processing is a pretty
large domain as well, not just for home use but commercial too. Not
to mention all those big server applications which have to be ported
to another language because suddenly the data is too bloated if it is
written in C (yes, there are modern servers which are fully populated
in terms of RAM, and they make use of all of it).
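
And since C99 such code can at least pin the widths down explicitly.
A sketch (the record layout below is invented for illustration; SQL
width requirements vary by implementation, though the Java widths
are fixed by its specification):

#include <stdint.h>

/* Hypothetical record mirroring an external SQL/JVM contract: the
   field widths are part of that contract, so the exact-width types
   are the honest way to declare them in C. */
struct account_row {
    int16_t branch_code;   /* SQL SMALLINT / Java short: 16 bits */
    int32_t account_id;    /* SQL INTEGER  / Java int:   32 bits */
    int64_t balance_cents; /* SQL BIGINT   / Java long:  64 bits */
};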