Randy said:
It's the model that pretty much all ISVs use. <insert list of 20,000
canned apps here>
That doesn't mean that it has any justification whatsoever, especially if we
are dealing with applications that need to run as efficiently as possible.
It's not barred from it, technically. It's cost-prohibitive to build
and test a lot of versions of the same program just to micro-optimize
for a bunch of slightly different processors, cache sizes, etc.
I have to call bullshit on that one. What is being discussed here is not
micro-optimisation but simply building and distributing binaries that are
specific to a certain platform instead of shipping a single binary that is
expected to run on multiple platforms. Nowadays we have compilers, even
F/LOSS compilers, where the only thing you need to do to get binaries
optimised for a certain platform is set a couple of flags.
For example, in order for GCC to produce optimised code for a
386/pentium/athlon/prescott/opteron/abacus, you only need to specify
the -mtune=cpu-type flag. GCC offers countless optimisation options,
but given the present state of proprietary, closed-source software
distribution, that change alone would be a huge step forward.
Moreover, please do keep in mind that there are quite a few free software
projects, even whole operating system distributions, that distribute
multiple versions of their binaries optimised for specific target platforms.
The people behind those projects distribute those binaries through public
repositories open to all, and we don't see them going bankrupt. How exactly
are the people behind all those free software distributions able to produce
optimised binaries and offer them to everyone interested, without charging a
single cent for their work, while, according to your statement, the cost for
proprietary software distributors to do a tiny fraction of that same work
would be "prohibitive", especially when they charge well over 50 euros for
each download/DVD?
Rui Maciel