return the start of a substring in a string in c


Flash Gordon

Malcolm McLean wrote, On 14/07/07 20:22:
That's a very common fallacy. I can find some objection to your
evidence, therefore you have offered a "no evidence" position.
E.g. Martha saw Fred do the murder. But Martha is Fred's ex-mistress.
Therefore there is no evidence against Fred. No. It's plausible that an
ex-mistress would want to frame someone for murder, but not very likely
given the risks.

Your fallacy is that an unrepresentative sample proves something. Most
people on this group post using male names, therefore most people in the
world are male.

Several reasons why you have not proved your point have been pointed out
to you, and the experience of everyone on this group who expresses an
opinion disagrees with you. So my study, which is based on a sample of C
programmers in different fields (rather than one person doing a study on
a different point with Java), by your logic proves that you are wrong.

Alternatively, we have one study for and one against, with more people
disagreeing with you than agreeing, so the balance of evidence currently
available suggests it is more likely that you are wrong than correct.
I haven't seen anyone really demolish my claim that most processor
cycles are consumed in moving data from one place to another.

So your unrelated study and personal opinion are evidence, but the
experience of anyone else does not count. I think you have a very
inflated opinion of yourself.

Had more people on the group agreed with you I would have accepted that
was supporting evidence (not proof), but I don't think anyone has posted
supporting you.
> Generally what
is offered is "I can write a program where that isn't true" or "my
subjective opinion is otherwise because I do X, which involves a lot of
integer calculation".

Misrepresenting what others say does not prove your point either. Others
are saying that many years of experience of real programming in the real
world, on real-world applications in a variety of application domains,
disagrees with your opinion.
> That is weak because we naturally say "the
spreadsheet is calculating an average". It is, and that is the point of the
operation. However really it is updating a video display.

So you don't count the 50 or 100 integer calculations used to generate
a value, only the dozen used to calculate where to get and put data?

Your experience and opinion prove your point but the experience and
opinion of everyone else does not count? Why should anyone take account
of your experience or opinion if you don't consider the experience of
others relevant?
 

Malcolm McLean

Malcolm McLean said:
It is an idea. I could modify the GNU front end, and it would probably be
quite trivial, though quite hard to understand the code structure.
However the vast majority of my programming has to work on anything,
which is partly why having several integer types is such a nuisance.
On second thoughts, the snag is the linker. If integers are passed by
address, which is by no means uncommon, you have to rebuild every library.
This only needs doing once, but it means that to be practical the decision
cannot be taken by one person, because every library has to be run through
the new compiler. Or you've got to make major changes to gcc so that the 64
bit C compiler is effectively a new language written on top of existing
libraries, rather than an interface to those libraries, which could be done
but isn't as simple as switching the code for "int" and "long long" in the
front end.

(You've also got the problem of library code not written in C. There will
never be a good answer to that one.)
 

Malcolm McLean

Flash Gordon said:
So you don't count the 50 or 100 integer calculations used to generate a
value, only the dozen used to calculate where to get and put data?
That's right. If we say "most integers are used to count things" then what
matters is the final contribution of that integer to the program's output.
Normally it ends up being used as an array index. However there is almost
always some intermediate calculation, such as an increment and comparison to
step through a loop. If the intermediate calculations are very elaborate,
such as a cryptographical app that, ultimately, produces a list of indices
into a character table, then you could say that the analysis isn't too
useful because we are not really representing where the integers spend their
time. In the last example, the fact that the integer will eventually index
the character table isn't important and we are unlikely to care if it goes
through a conversion subroutine to put it into the range of the character
set. However generally the final destination will dominate our choice of
representation.
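
To illustrate the pattern being described - intermediate arithmetic whose
result finally lands as an index into a character table - here is a minimal
sketch (the program is purely illustrative, not code from the thread):

#include <stdio.h>

int main(void)
{
    const char table[] = "0123456789abcdef";
    unsigned value = 48879;           /* some previously computed quantity */
    while (value != 0) {
        unsigned digit = value % 16;  /* intermediate calculation... */
        putchar(table[digit]);        /* ...whose final destination is an index */
        value /= 16;                  /* digits come out least significant first */
    }
    putchar('\n');
    return 0;
}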
 

Flash Gordon

Malcolm McLean wrote, On 14/07/07 21:11:
That's right. If we say "most integers are used to count things" then
what matters is the final contribution of that integer to the program's
output. Normally it ends up being used as an array index.

That may be your experience, but it is not mine.
So you think that in
a[b+c] = d*e*f*g*h*h;
most stuff is to do with indexing? I suggest that almost everyone would
disagree with you. Only 3 operations in the above are to do with the
indexing; all the rest are calculating a value which has nothing to do
with indexing. Most of my work actually has simpler expressions for
indexing (if there is any) and more complex expressions for calculating
the value.

To take just one real example: starting with under a dozen numbers which
were not part of an array, the processor spent on average over 20ms doing
3-dimensional geometry, mostly in integer arithmetic for speed, and
probably about 2ms doing a completely separate task which involved a
1024-element array, but in that task the output was 2 numbers.
> However there is almost always some intermediate calculation, such as an
increment and comparison to step through a loop.

Ah, you think if there is one loop in a program then all integer
arithmetic is to do with counting, even if it has nothing to do with the
loop variable.
> If the intermediate calculations are
very elaborate, such as a cryptographical app that, ultimately, produces
a list of indices into a character table, then you could say that the
analysis isn't too useful because we are not really representing where
the integers spend their time. In the last example, the fact that the
integer will eventually index the character table isn't important and we
are unlikely to care if it goes through a conversion subroutine to put
it into the range of the character set. However generally the final
destination will dominate our choice of representation.

I will leave someone who knows about cryptography to answer that.

Personally I'm starting to think you are using a very strange definition
of what counting and array indexing are to come up with your figures. I
don't consider calculating the value to put into an array to have
anything to do with counting or indexing; I only count the calculation
of where to put it as indexing, which a lot of the time is just one or
two additions, maybe a multiplication. Calculating the value, on the
other hand, is often dozens or more operations.
 

Flash Gordon

Malcolm McLean wrote, On 14/07/07 20:27:

However the vast majority of my programming has to work on anything,
which is partly why having several integer types is such a nuisance.

So use the types defined by the standard for the purposes for which they
were defined, and then it will work. That is why different integer types are
provided for different purposes! You can even <gasp> use the values in
limits.h to find out what the limits are at compile time.
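
A minimal sketch of the sort of thing Flash means - picking a type by
checking limits.h at compile time (the typedef name and threshold are only
illustrative):

#include <limits.h>
#include <stdio.h>

#if INT_MAX >= 1000000000       /* is plain int wide enough for our counts? */
typedef int count_t;            /* yes: use plain int */
#else
typedef long count_t;           /* no: fall back to long */
#endif

int main(void)
{
    count_t n = 1000000;
    printf("INT_MAX here is %d\n", INT_MAX);
    printf("n = %ld\n", (long)n);
    return 0;
}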

Alternatively move to Java where everything is nailed down to be the
same on all implementations (apart from my Java code that will not run
on SCO despite doing nothing exotic) however inefficient it is.
ints have been able to address all of the memory space, with the unimportant
exception of strings or char arrays which occupy more than 50% of
memory, on the vast majority of machines up till now. The advent of 64
bits onto the desktop will change that.

Hold your breath whilst you wait. Of course, this will require holding
your breath at least until 128 bit processors are used for desktops, but
that is your problem not mine.
 

Army1987

As Flash says, we've been through all this before. I even pulled up some
Java stats that showed pretty clearly that there were about as many indexed
array accesses as integer operations in a sample of Java programs. I
couldn't find a similar study for C, though I didn't try too hard, but there
is no reason to suppose that C programs are radically different from Java
ones.
Yes, I'm trying to make a point, but...
What fraction of integers is used to index arrays in this program?
#include <stdio.h>
#include <limits.h>
#define MAX 108
#define DIGITS (((CHAR_BIT * (int)sizeof(int) - 1) * 28 - 1)/ 93 + 1)
#define COLUMNS (79 / (DIGITS + 1))
#define SQR(x) ((x) * (x))
int main(void)
{
    int primes[MAX] = { 2, 3, 5, 7 };
    int *current = &primes[4];
    int k1, k2;
    size_t column = 0;
    for (k1 = 11, k2 = 13; current < primes + MAX; k1 += 6, k2 += 6) {
        int *cursor;
        int flag = 1;
        for (cursor = &primes[2]; SQR(*cursor) <= k1; cursor++)
            if (k1 % *cursor == 0) {
                flag = 0;
                break;
            }
        if (flag)
            *current++ = k1;
        if (current == primes + MAX)
            break;
        flag = 1;
        for (cursor = &primes[2]; SQR(*cursor) <= k2; cursor++)
            if (k2 % *cursor == 0) {
                flag = 0;
                break;
            }
        if (flag)
            *current++ = k2;
        if (k2 > INT_MAX - 6)
            break;
    }
    for (current = primes; current < primes + MAX; current++) {
        printf("%*d ", DIGITS, *current);
        if (++column >= COLUMNS) {
            putchar('\n');
            column = 0;
        }
    }
    if (column)
        putchar('\n');
    return 0;
}
 

Malcolm McLean

Army1987 said:
As Flash says, we've been through all this before. I even pulled up some
Java stats that showed pretty clearly that there were about as many indexed
array accesses as integer operations in a sample of Java programs. I
couldn't find a similar study for C, though I didn't try too hard, but there
is no reason to suppose that C programs are radically different from Java
ones.
Yes, I'm trying to make a point, but...
What fraction of integers is used to index arrays in this program?
[program snipped; quoted in full in Army1987's post above]
Malcolm (upthread)
 

Army1987

Army1987 said:
As Flash says, we've been through all this before. I even pulled up some
Java stats that showed pretty clearly that there were about as many indexed
array accesses as integer operations in a sample of Java programs. I
couldn't find a similar study for C, though I didn't try too hard, but there
is no reason to suppose that C programs are radically different from Java
ones.
Yes, I'm trying to make a point, but...
What fraction of integers is used to index arrays in this program? [snip]
int main(void)
{
    int primes[MAX] = { 2, 3, 5, 7 };
    int *current = &primes[4];
    int k1, k2;
    size_t column = 0;
    for (k1 = 11, k2 = 13; current < primes + MAX; k1 += 6, k2 += 6) {
        int *cursor;
        int flag = 1;
        for (cursor = &primes[2]; SQR(*cursor) <= k1; cursor++)
            if (k1 % *cursor == 0) {
                flag = 0;
                break;
            }
        if (flag)
            *current++ = k1;
        if (current == primes + MAX)
            break;
[snip]
Malcolm (upthread)
I don't know about Java, but if it doesn't have pointer arithmetic
(*cursor++ or equivalent), the above program would need far more
array indexing than the C version does.
So it is very dubious, at least.
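
For what it's worth, here is the difference in miniature: the same traversal
written with an index and with a pointer (function names are illustrative):

#include <stddef.h>

long sum_by_index(const int *a, size_t n)
{
    long total = 0;
    size_t i;
    for (i = 0; i < n; i++)      /* explicit array indexing */
        total += a[i];
    return total;
}

long sum_by_pointer(const int *a, size_t n)
{
    long total = 0;
    const int *p;
    for (p = a; p < a + n; p++)  /* pointer walk, no index at all */
        total += *p;
    return total;
}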
 

Ben Bacarisse

Malcolm McLean said:
As Flash says, we've been through all this before. I even pulled up
some Java stats that showed pretty clearly that there were about as
many indexed array accesses as integer operations in a sample of Java
programs. I couldn't find a similar study for C, though I didn't try
too hard, but there is no reason to suppose that C programs are
radically different from Java ones.

I don't want to kick this off again (I missed the last one, must have
been during a break in reading c.l.c) but until Java is the language
of choice for either OS development or embedded systems there are
obvious reasons to suppose that C programs are (statistically)
different from Java ones.

That aside, I just don't know where you are going with your argument.
I was initially correcting the logical fallacy that "most integers are
used to count" is not the same as "most counting is done with
integers". If you *are* saying that "most integers are used to count"
and let us suppose that that turns out to be true, where does that
take you? How does that advance the cause for denuding C of its
(other) integer types? To put it another way, what is the problem, to
which universal 64 bit integers is the solution?

As I said, I don't want to kick this off again, so please feel free to
leave this unanswered if it was dealt with last time. I'll look for the
old thread if I feel I have to know.
 

Malcolm McLean

Ben Bacarisse said:
I don't want to kick this off again (I missed the last one, must have
been during a break in reading c.l.c) but until Java is the language
of choice for either OS development or embedded systems there are
obvious reasons to suppose that C programs are (statistically)
different from Java ones.
You can count the number of array accesses in a Java program by looking at the
bytecode. It's a bit more difficult with a C compiler. However the two
languages are very similar - yes occasionally you might want to use a
pointer rather than an index variable to walk a C array for some reason, and
there will be a difference between embedded apps and business logic. A
difference can be statistically highly significant without being either
large or important. For instance slightly more boys than girls are born, but
for most practical purposes we can regard the ratio as 50%.
That aside, I just don't know where you are going with your argument.
I was initially correcting the logical fallacy that "most integers are
used to count" is not the same as "most counting is done with
integers".
Virtually all counting is done with integers, and most integers are used to
count things. Both are true, but one doesn't imply the other. Both are
important in deciding what representation of integers we need, but the
second is more important.
If you *are* saying that "most integers are used to count"
and let us suppose that that turns out to be true, where does that
take you? How does that advance the cause for denuding C of its
(other) integer types? To put it another way, what is the problem, to
which universal 64 bit integers is the solution?
Let's say we've got hammers in various calibres - sledgehammers, claw
hammers, stone hammers, and so forth. It transpires that almost always we
use these hammers for cracking hazelnuts. Whilst it is physically possible
to crack a nut with almost any hammer, in fact a small to medium hammer is
by far the most convenient. Armed with this knowledge, it might be a good
idea to get rid of all the hammers and just buy small to medium hammers.
If you want to index an array in C, your integer type needs to be big enough
to address the whole of memory. So 32 bits on a 4GB machine, 64 bits on a
larger machine. You'll notice a slight snag: you can't address an array of
2^63 chars with 64 bits. In practice this is unlikely to matter, though that
consideration dominated the definition of size_t.
Hence the campaign for 64 bits calls for int to be a 64 bit type, on 64 bit
architectures. The other types will essentially melt away because the need
for them will be so specialised that programmers will just forget they
exist. However, at least at present, there is no proposal to actually change
the C standard, merely to maintain the convention that int is the "natural
integer type" for the machine.
 

Old Wolf

You don't. You're already assuming that index will fit in an int, so you can
just change the declaration of index to match.

You didn't say this explicitly, so the OP may have missed
it: both versions cause implementation-defined behaviour
if the value is outside of the range of an int.

I would at least make 'index' an unsigned type, since it is meaningless
for an index into the string to be negative anyway; and then you avoid
the implementation-defined behaviour as well.
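
A sketch along the lines Old Wolf suggests - returning the start of the
substring as an unsigned offset (the function and its "not found" value are
illustrative, not the OP's code):

#include <string.h>
#include <stdio.h>

size_t substring_start(const char *haystack, const char *needle)
{
    const char *p = strstr(haystack, needle);
    if (p == NULL)
        return (size_t)-1;              /* illustrative "not found" value */
    return (size_t)(p - haystack);      /* offset is never negative */
}

int main(void)
{
    size_t pos = substring_start("hello world", "world");
    if (pos != (size_t)-1)
        printf("found at offset %zu\n", pos);
    return 0;
}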
 

Ben Bacarisse

Malcolm McLean said:
You can count the number of array accesses in a Java program by looking
at the bytecode. It's a bit more difficult with a C compiler. However
the two languages are very similar - yes occasionally you might want
to use a pointer rather than an index variable to walk a C array for
some reason, and there will be a difference between embedded apps and
business logic. A difference can be statistically highly significant
without being either large or important. For instance slightly more
boys than girls are born, but for most practical purposes we can
regard the ratio as 50%.

I don't see any supporting evidence or argument that the differences
can safely be ignored. I also think you know what I meant by C
programs being statistically different to Java ones. You chose to
correct my language rather than argue the point. Are you confident
that evidence drawn from Java programs applies to C programs? Frankly,
if you say "yes" I'll drop it because I certainly don't have any
evidence to the contrary.

Hence the campaign for 64 bits calls for int to be a 64 bit type, on
64 bit architectures.

The analogy did not help me. I am still stuck on this: "what is the
problem that you are campaigning to solve?". I thought you wanted all
integer types to be 64 bits, but it seems that all you want is for
undecorated "int" to be 64 bits on 64 bit processors. What problem does
that solve?
 

Flash Gordon

Malcolm McLean wrote, On 15/07/07 23:28:
You can count the number of array accesses in a Java program by looking at
the bytecode. It's a bit more difficult with a C compiler. However the
two languages are very similar

Yes, I can see how a language with objects (in the OO sense) is similar to
one without. Also how similar the usage is, with all those device
drivers and operating systems written in Java. Oh, wait a moment, those
things are not true, so perhaps they are significantly different after all.
> - yes occasionally you might want to use
a pointer rather than an index variable to walk a C array for some
reason,

Lots of reasons, and it is quite common in C programming.
> and there will be a difference between embedded apps and
business logic.

And all the other types of SW that people have pointed out to you.
> A difference can be statistically highly significant
without being either large or important. For instance slightly more boys
than girls are born, but for most practical purposes we can regard the
ratio as 50%.

Or a statistical difference can be extremely important. You have yet to
demonstrate that the statistical differences are not important, so the
above does nothing to forward your argument.
Virtually all counting is done with integers,

Probably true.
> and most integers are
used to count things.

Still unproven and people other than me have said it does not agree with
their experience. Don't state things as facts when you cannot prove them
and they disagree with other people's experience.
> Both are true, but one doesn't imply the other.

No, one is probably true and the other you keep claiming and claiming to
have proved even though others have shown why you did not succeed in
proving it and have provided evidence against it.
Both are important in deciding what representation of integers we need,
but the second more important.

Ah, so something unproven is more important than the inefficiencies that
switching to a 64 bit int would lead to. If you don't believe there are
inefficiencies look at the documentation on why the POSIX people decided
what they did.
Let's say we've got hammers in various calibres - sledgehammers, claw
hammers, stone hammers, and so forth. It transpires that almost always
we use these hammers for cracking hazelnuts. Whilst it is physically
possible to crack a nut with almost any hammer, in fact a small to
medium hammer is by far the most convenient. Armed with this knowledge,
it might be a good idea to get rid of all the hammers and just buy small
to medium hammers.

No, a sensible person decides to use the correct tool for the job and
keep the other tools around for when they need them. Otherwise breaking
up concrete and hammering in fence posts becomes a lot harder.
If you want to index an array in C, your integer type needs to be big
enough to address the whole of memory. So 32 bits on a 4GB machine, 64
bits on a larger machine. You'll notice a slight snag. You can't address
an array of 2^63 chars with 64 bits. In practice this is unlikely to
matter, though that consideration dominated the definition of size_t.

Where does it say this in the rationale of the C standard? Or were you
involved in designing some of the major C compilers? If not, where is
your evidence for this being the reason? Anyway, you have just found the
type large enough for your purpose so use it and stop complaining.
Hence the campaign for 64 bits calls for int to be a 64 bit type, on 64
bit architectures.

A campaign of one and unlikely to do more than mildly irritate a few people.
> The other types will essentially melt away because
the need for them will be so specialised that programmers will just
forget they exist.

Strange that they have not started melting away yet then, when some
people here have been using 64 bit processors for decades.
> However, at least at present, there is no proposal to
actually change the C standard, merely to maintain the convention that
int is the "natural integer type" for the machine.

On common desktop systems 32 bit ints are *also* natural because they
have registers and operations for them and they can often be handled faster.

Ben, I would not worry too much about it. I just find it amusing that
Malcolm's experience, and a study designed to prove something completely
unrelated which does not prove his point for many reasons, count as
evidence, but decades of experience of others in real-world programming
in a variety of fields does not count as evidence.

Anyone could make the opposite claim to Malcolm and it would be just as
justified if they just referenced this and the previous thread.
 

Malcolm McLean

Ben Bacarisse said:
The analogy did not help me. I am still stuck on this: "what is the
problem that you are campaigning to solve?". I thought you wanted all
integer types to be 64 bits, but it seems that all you want is for
undecorated "int" to 64 bits on 64 bit processors. What problem does
that solve?
It means that we almost never need anything other than an undecorated int.
In the nineteenth century Sir Joseph Whitworth standardised screw threads.
No longer did you need the matching nut for a bolt - all bolts and nuts
would match. Superficially that might seem a small change. In fact it was
one of the seminal events of the Industrial Revolution.
Standardising on 64 bit integers will have a similar effect on the
productivity of C programmers.

However we've got to move by stages. Just to ban "short" and "long" would
break too much code. So first they become rare, then deprecated, then they
map to 64 bits unless you compile with a separate flag. Finally they are
removed from the language, and it becomes simpler rather than more
complicated.
 

Flash Gordon

Malcolm McLean wrote, On 16/07/07 20:14:
It means that we almost never need anything other than an undecorated int.
In the nineteenth century Sir Joseph Whitworth standardised screw
threads.

He did not successfully standardise nuts and bolts though, only certain
aspects of them.
> No longer did you need the matching nut for a bolt - all bolts
and nuts would match.

So why does my M5 nut not fit my M6 bolt?
> Superficially that might seem a small change. In
fact it was one of the seminal events of the Industrial Revolution.
Standardising on 64 bit integers will have a similar effect on the
productivity of C programmers.

The int type not being 64 bits does not slow me down. Not having smaller
integer types than 64 bits, on the other hand, would significantly slow
down one of the pieces of SW I work on, SW which is already slow enough
on some tasks that people wait noticeable amounts of time. The reason it
is slow is because it is IO (specifically disk) bound, and that is why
going to a larger integer type would slow it down.
However we've got to move by stages. Just to ban "short" and "long"
would break too much code. So first they become rare, then deprecated,
then they map to 64 bits unless you compile with a separate flag.
Finally they are removed from the language, and it becomes simpler
rather than more complicated.

Alternatively you can change to a language that already meets your
requirements since there are plenty to choose from.

Oh, and if you ban having larger integer types than 64 bit (which you
seem to be proposing) you will also prevent people taking easy advantage
of newer HW that supports 128 bit or 256 bit operations, something that
would be very useful in some fields, such as cryptography, and most
people who use computers these days use some cryptographic SW whether
they know it or not.
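
A small sketch of the storage cost Flash is describing (the record layouts
are hypothetical):

#include <stdint.h>
#include <stdio.h>

struct sample_small { int16_t readings[1000]; };  /* data that fits in 16 bits */
struct sample_big   { int64_t readings[1000]; };  /* the same data forced to 64 bits */

int main(void)
{
    printf("16-bit records: %zu bytes each\n", sizeof(struct sample_small));
    printf("64-bit records: %zu bytes each\n", sizeof(struct sample_big));
    /* four times the bytes means roughly four times the disk traffic
       for the I/O-bound program described above */
    return 0;
}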
 

Malcolm McLean

Flash Gordon said:
So why does my M5 nut not fit my M6 bolt?


The int type not being 64 bits does not slow me down.
That was probably what the craftsmen said. The nut comes screwed onto the
bolt anyway. Sometimes you need a thick thread for a heavy duty one,
other times a shallow thread is cheaper. Then there's always one situation
where we need M5 bolts instead of M6.
In fact standardisation realised something. But it won't necessarily ossify
the language for all time. Having pared down C to three data types - reals,
characters and integers - it might make sense to build it up again, maybe by
adding complexes or symbols like PI, e and surds. If there are about twenty
integer types that becomes much more difficult. In practice what will happen
will be that the language will become unwieldy and be abandoned.
Oh, and if you ban having larger integer types than 64 bit (which you seem
to be proposing) you will also prevent people taking easy advantage of
newer HW that supports 128 bit or 256 bit operations, something that would
be very useful in some fields, such as cryptography, and most people who
use computers these days use some cryptographic SW whether they know it or
not.
You probably don't want to write your bignum arithmetical ops in C anyway.
If you are relying on particular machine instructions being available to get
needed performance, assembly is still the way to go. We are talking about
literally half a dozen functions, of which only div mod is likely to be
non-trivial.
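
That said, one of that half dozen can be written portably; a sketch of a
multiple-precision add in plain C (limb width and function name are
illustrative):

#include <stdint.h>
#include <stddef.h>

/* add two n-limb numbers stored little-endian in 32-bit limbs,
   using a 64-bit intermediate to carry */
uint32_t bignum_add(uint32_t *result, const uint32_t *a,
                    const uint32_t *b, size_t n)
{
    uint64_t carry = 0;
    size_t i;
    for (i = 0; i < n; i++) {
        uint64_t sum = (uint64_t)a[i] + b[i] + carry;
        result[i] = (uint32_t)sum;   /* low 32 bits */
        carry = sum >> 32;           /* carry into the next limb */
    }
    return (uint32_t)carry;          /* final carry out */
}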
 

Ben Bacarisse

Malcolm McLean said:
It means that we almost never need anything other than an undecorated int.
In the nineteenth century Sir Joseph Whitworth

Oh dear. I hate reasoning by analogy. It has never done anything for
me. I always wonder what parts are important and what parts are not.
standardised screw
threads. No longer did you need the matching nut for a bolt - all
bolts and nuts would match. Superficially that might seem a small
change. In fact it was one of the seminal events of the Industrial
Revolution.

And there is why I can't get on with it. Whitworth standardised the
thread profile but not the size. Different BSW sizes do *not* match.
Even C does better than that -- the different sizes can be assigned and
converted, often without any problems at all.

Whitworth sizing was very important, but why was it extended with
extra profiles? Because the BSW profile does not work equally well
for all materials in all situations.[1]

So from this analogy I conclude that C's plethora of integer sizes is
already better than the Whitworth ideal (there are fewer of them and
the sizes are all compatible for assignment and comparison, whereas
different BSW threads do not "match" in any useful way at all[2]) and
that we should expect to introduce more, even less compatible, integer
types in future C to handle new, as yet unforeseen situations.

I don't for a moment believe what either of us thinks we can draw from
this analogy to an engineering standard -- so don't take this as a
counter argument. It is, like yours, a non-argument.[3]
Standardising on 64 bit integers will have a similar effect on the
productivity of C programmers.

I don't see the problem holding back the C programmers, and I can only
imagine that you have no evidence that it is really there or you would
be pointing to it rather than drawing analogies.

[1] The wide pitch makes BSW threads more prone to vibration. A
narrow pitch was required to cope well with modern mechanical
situations (in particular the "new" automotive industry).

[2] The value of the standardisation was that it no longer mattered
who you bought a 1/4" nut from. It did not mean that the different
nut sizes matched.

[3] Analogies can, sometimes, clarify a confused situation, but they
rarely help to persuade.
 

Malcolm McLean

Ben Bacarisse said:
Oh dear. I hate reasoning by analogy. It has never done anything for
me. I always wonder what parts are important and what parts are not.

And there is why I can't get on with it. Whitworth standardised the
thread profile but not the size. Different BSW sizes do *not* match.
Even C does better than that -- the different sizes can be assigned and
converted, often without any problems at all.
Engineering is psychological as well as physical. When you are dealing with
human social behaviour, you don't get exactly the same situation twice.
For instance the Dutch tulip mania and the dot com bubble had some
similarities, but one wasn't just an exact replay of the other. Tulip mania
was confined to Holland whilst dot coms were international, for instance.
However the general rule that stocks can be wildly overbid still holds.

Similarly C standards and bolt standards have their differences. However it
is obvious that C has too many integer types: short, int, long, long long, in
signed and unsigned, plus size_t and ptrdiff_t. That's ten standards for
representing an integer. We also know that standardisation tends to work.

You can't just argue by analogy, of course, but don't ignore the lessons of
history.
 

Keith Thompson

Malcolm McLean said:
Similarly C standards and bolt standards have their
differences. However it is obvious that C has too many integer types:
short, int, long, long long, in signed and unsigned, plus size_t and
ptrdiff_t.
[...]

It's obvious only to you.
 
