Malcolm McLean wrote, On 14/07/07 19:29:
> As Flash says, we've been through all this before. I even pulled up
> some Java stats that showed pretty clearly that there were about as
> many indexed array accesses as integer operations in a sample of Java
> programs. I couldn't find a similar study for C, though I didn't try
> too hard, but there is no reason to suppose that C programs are
> radically different from Java ones.
As was pointed out, that study explicitly excluded major application
domains and had other major flaws, so it failed to prove your point.
For example, there was nothing in the study indicating that the
extremely small sample was representative.
> Computers spend most of their cycles moving data from one place to
> another,
In your opinion. The only code I've come across where that was even
close to true was on dedicated communications cards.
> and most integers are used to count things in the computer's
> memory.
In your opinion. It is untrue for all of the software I've worked on in
over 20 years.
In the one piece of SW I've come across where I think your claim might
be true, the original designer admitted when I confronted him that
with hindsight it was the wrong design, and that my suggestion, which
would have avoided at least three quarters of the moving, would have
been vastly more efficient. However, even then, once you add in the
other processors in the system, more processing power was spent
processing data than moving or counting it (probably by an order of
magnitude or more, since I'm talking about the other dozen or more
processors).
> That's not every cycle, of course, nor is every single integer a
> count of something - cryptography is an obvious exception,
As is a lot of image processing (done as integer arithmetic for
speed), a lot of financial processing (done as integer arithmetic on
pennies because you are not allowed to have rounding errors), and as
is almost all the software in almost every avionics system I've worked
on (imaging systems, radar systems, built-in test software, etc.).
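
For anyone wondering why financial code works in integer pennies, here
is a minimal sketch (a made-up example, not from any of the systems
mentioned above):

#include <stdio.h>
#include <stdint.h>

/* Illustrative only: holding money as integer pennies keeps sums
 * exact. 0.10 + 0.20 in binary floating point is not exactly 0.30,
 * but 10 + 20 pennies is exactly 30. */
int main(void)
{
    double f = 0.10 + 0.20;    /* prints as 0.30000000000000004 */
    int64_t pennies = 10 + 20; /* exactly 30 */

    printf("float:   %.17f\n", f);
    printf("pennies: %lld.%02lld\n",
           (long long)(pennies / 100), (long long)(pennies % 100));
    return 0;
}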
> as are
> intermediate results in frequency transforms, or pixel colours. The
> last leads us to another issue, in a typical image function
>
> #include <assert.h>
>
> void setpixel(long *img, int width, int height, int x, int y, long val)
> {
>     assert(x >= 0 && x < width);
>     assert(y >= 0 && y < height);
>     img[y*width + x] = val;
> }
> How many integers do we have? You could say width * height, plus a
> few, or you could say six, of which two are pixel values and four are
> intermediates in array calculations. I'm counting it as six, which
> isn't the only justifiable answer, but makes sense in the context of
> how best to define the types in a high-level language.
The image processing I've done involved more like several dozen
integer operations on pixel values, and only a very few on position.
Almost all of the algorithms I've worked on or come across are
designed so that the address calculations are simple increments,
because there is too much other integer work to do to waste cycles on
address arithmetic.
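
To make the "simple increments" point concrete, here is a minimal
sketch of that style (the function and its names are made up for
illustration):

#include <stddef.h>

/* Illustrative only: threshold every pixel of a width x height image.
 * The pointer is simply incremented across the whole buffer, so there
 * is no y*width + x multiply per pixel; virtually all of the integer
 * work is on the pixel values themselves. */
void threshold(long *img, int width, int height, long cutoff)
{
    long *p = img;
    long *end = img + (size_t)width * height;

    for (; p != end; p++)
        *p = (*p >= cutoff) ? 1 : 0;
}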
One made-up example is proof of nothing. Even one real example proves
nothing about what the major use of integers is.
If you want to say it is anything other than personal opinion, then
find some evidence; otherwise stop claiming your personal opinion is
reality.
I.e. either find a study designed to prove your point or get your
university to fund one and do it properly. I'm sure the stats department
can tell you how to design a proper study, including telling you that
you need to find a large enough *representative* sample.
I'm not the only one to have expressed a dissenting opinion, and I
don't think I've seen anyone agree with you. Have you ever considered
that if no one supports you, then the experience of most people here
disagrees with you, and that therefore it might be you who is wrong?