Compiler optimizations


Rui Maciel

Randy said:
I think you missed what I was saying there.  You can optimize with
something like -march for a specific hardware type, but when you move
that binary to other machines it isn't a promise of anything, it may
not even run properly.

Where exactly is there a need to run a single, unique binary across a whole
range of different machines? Why should anyone want that?

Even setting open source software aside, I don't see how, in this day and
age, a distributor of closed-source, proprietary software is barred from
shipping multiple binaries on the install media, or even from making them
available over the internet. Software is already being distributed on DVD
packs and over the net. What exactly makes it impossible to pack a handful
of specialised binaries on the install media and then pick the most
appropriate one for the current system?
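
Just as a rough illustration (assuming an x86 target and GCC's <cpuid.h>;
the binary names are made up), the "pick the right binary" step can be a
few lines of C run by the installer:

/* Sketch only: ask CPUID whether SSE2 is available and report which
 * of the (hypothetical) bundled builds should be installed. */
#include <stdio.h>
#include <cpuid.h>   /* GCC's __get_cpuid() and bit_SSE2 */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && (edx & bit_SSE2))
        puts("install app-sse2");     /* optimised build       */
    else
        puts("install app-generic");  /* conservative fallback */
    return 0;
}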


Rui Maciel
 

Rui Maciel

jacob said:
Shipping the optimizer with your application?

There was a time when operating systems also came with a compiler. Heck, it
still happens to this very day: quite a few operating systems install a
compiler in their default configuration. Take all those Linux distributions,
for example. Thanks to that, we see companies like NVIDIA relying on the
system's compiler to install their software.


Rui Maciel
 

Randy Howard

Where exactly is there a need to run a single, unique binary across a whole
range of different machines? Why should anyone want that?

It's the model that pretty much all ISVs use.  <insert list of 20,000
canned apps here>
Assuming that open source software isn't even being considered here, I don't
see how, in this day and age, a distributor of closed-source, proprietary
software is barred from shipping multiple binaries in the install media or
even making it available over the internet.

It's not barred from it, technically. It's cost prohibitive to build
and test a lot of versions of the same program just to micro-optimize
for a bunch of slightly different processors, cache sizes, etc.
 

Kelsey Bjarnason

If not the compiler/optimizer, who else?

Suppose I write code optimized for, oh, Pentiums. There are something
like 197 different flavours of Pentium, with different cache sizes,
pipeline depths, and who knows what other differences.

How does the optimizer know which of those you're going to run the code
on? Perhaps you'll distribute the app and it will have to run on all of
them. Tomorrow's "mini Pentium" for embedded systems might have the full
instruction set, but almost no cache - shall the optimizer predict the
future to tell you that you might, someday, want to run the binary on
this chip?

All it can do is generate the code, as efficiently as possible... and let
_you_, the developer, see whether the code generated really was as
efficient as it should have been, then either accept it, or re-compile
with different optimization settings.
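
For what it's worth, a tiny kernel like the one below is enough to see
that effect: compile it with, say, "gcc -O2 -S" and again with
"gcc -O3 -march=native -S" (flags given purely as examples) and compare
the assembly emitted for each setting.

/* sum.c -- minimal loop for comparing the code GCC generates under
 * different optimization and -march/-mtune settings. */
long sum(const int *a, int n)
{
    long s = 0;
    int i;

    for (i = 0; i < n; i++)   /* candidate for unrolling/vectorization */
        s += a[i];
    return s;
}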
 

Gordon Burditt

Where exactly is there a need to run a single, unique binary across a whole
range of different machines? Why should anyone want that?

It simplifies virus-writing a lot.
 

Rui Maciel

Randy said:
It's the model that pretty much all ISVs use.  <insert list of 20,000
canned apps here>

That doesn't mean that it has any justification whatsoever, especially if
we are dealing with applications that need to run as efficiently as
possible.

It's not barred from it, technically.  It's cost prohibitive to build
and test a lot of versions of the same program just to micro-optimize
for a bunch of slightly different processors, cache sizes, etc.

I have to call bullshit on that one. What is being discussed here is not
micro-optimisation but simply building and distributing binaries that are
specific to a certain platform instead of shipping a single binary that is
expected to run on multiple platforms. Nowadays we have compilers, even
F/LOSS compilers, where producing optimised binaries for a given platform
is just a matter of setting a couple of flags.

For example, in order for GCC to produce optimised code for a
386/Pentium/Athlon/Prescott/Opteron/abacus, you only need to specify
the -mtune=cpu-type flag. GCC offers countless optimisation options but,
given the present state of proprietary, closed-source software
distribution, that change alone would be a huge step forward.
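
Just to illustrate (a sketch only; the exact -march/-mtune values accepted
depend on the GCC version at hand), producing a handful of specialised
builds is nothing more than re-running the compiler with different flags
on the same source:

/* app.c -- one source file, several specialised builds, e.g.:
 *
 *   gcc -O2 -mtune=generic  -o app.generic  app.c
 *   gcc -O2 -march=pentium4 -o app.pentium4 app.c
 *   gcc -O2 -march=opteron  -o app.opteron  app.c
 */
#include <stdio.h>

int main(void)
{
    puts("same program, different tuning");
    return 0;
}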

Moreover, please do keep in mind that there are quite a few free software
projects, even whole operating system distributions, that distribute
multiple versions of their binaries, optimised for specific target
platforms. The people behind those projects offer those binaries on public
repositories open to all, and we don't see them going bankrupt. How exactly
are the people behind all those free software distributions able to produce
optimised binaries and offer them to everyone interested without charging a
single cent for their work while, according to your statement, the cost for
proprietary software distributors to do a small fraction of that same work
would be "prohibitive", especially when they charge well over 50 euros for
each download/DVD?


Rui Maciel
 

Keith Thompson

Rui Maciel said:
Moreover, please do keep in mind that there are quite a few free software
projects, even whole operating system distributions, that distribute
multiple versions of their binaries, optimised for specific target
platforms. The people behind those projects offer those binaries on public
repositories open to all, and we don't see them going bankrupt. How exactly
are the people behind all those free software distributions able to produce
optimised binaries and offer them to everyone interested without charging a
single cent for their work while, according to your statement, the cost for
proprietary software distributors to do a small fraction of that same work
would be "prohibitive", especially when they charge well over 50 euros for
each download/DVD?

For a free software project, the cost of customized builds for
multiple systems is (a) building and testing the software for each
variant target and (b) providing multiple versions on the download
site. A user who downloaded the wrong variant can just go back and
download the right one.

To do the same for a commercial product, you'd have added production
costs (packaging, keeping track of different versions on DVDs, etc.)
-- and a customer who finds he bought the wrong variant is likely to
have more trouble straightening it out; in the worst case, he might
have to pay for the product again.

I'm sure there are ways to work around these issues. I'm also sure
there are better places for this discussion.
 

Gordon Burditt

That doesn't mean that it has any justification whatsoever, especially if
we are dealing with applications that need to run as efficiently as
possible.


I have to call bullshit on that one. What is being discussed here is not
micro-optimisation but simply building and distributing binaries that are
specific to a certain platform instead of shipping a single binary that is
expected to run on multiple platforms. Nowadays we have compilers, even
F/LOSS compilers, where producing optimised binaries for a given platform
is just a matter of setting a couple of flags.

How much do the tech support calls from people who don't know what
to download cost? If you're distributing an OS, this might not be
much of a problem, but if you're distributing an application, good
luck. Some people don't know how to tell if their desktop is a PC
or a Mac, much less whether it's got an Intel vs. AMD processor,
or whether it's dual-core. And a lot of people know they are
running Windows, but don't recognize such terms as "XP", "Vista",
"Windows 98", etc.
 

Rui Maciel

Gordon said:
How much do the tech support calls from people who don't know what
to download cost?  If you're distributing an OS, this might not be
much of a problem, but if you're distributing an application, good
luck.  Some people don't know how to tell if their desktop is a PC
or a Mac, much less whether it's got an Intel vs. AMD processor,
or whether it's dual-core.  And a lot of people know they are
running Windows, but don't recognize such terms as "XP", "Vista",
"Windows 98", etc.

The tech support reference is a bit absurd. We see free software supporting
a vast set of architectures, yet those projects don't suffer from any
particular support issues. Moreover, there is absolutely no need for the
user to know those details, as even Microsoft's Windows line of operating
systems benefits from automatic package deployment. It's just a matter of
checking the relevant system properties and deploying the right binary. So
what stops anyone from shipping specialised binaries?
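
On Windows the check itself can be a one-liner (a sketch assuming the
Win32 IsProcessorFeaturePresent() call; the package names are made up):

/* Installer-side sketch: pick a package based on a processor feature
 * the operating system already knows how to report. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    if (IsProcessorFeaturePresent(PF_XMMI64_INSTRUCTIONS_AVAILABLE))
        puts("deploy package-sse2");     /* hypothetical optimised package */
    else
        puts("deploy package-generic");  /* hypothetical fallback package  */
    return 0;
}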


Rui Maciel
 

Randy Howard

The tech support reference is a bit absurd. We see free software supporting
a vast set of architectures, yet those projects don't suffer from any
particular support issues.

Right, because for the bulk of the open source community, no formal
support is available at all. It's a non-issue. The rest of the world
does not behave identically though.
Moreover, there is absolutely no need for the user to know those details,
as even Microsoft's Windows line of operating systems benefits from
automatic package deployment.

When it works. It still doesn't make it free to develop 10 packages
instead of 1, test them, package them, then test the deployment via
package managers, in 10X the number of cases.
It's just a matter of checking the relevant system properties and
deploying the right binary.

It's not that simple.
So what stops anyone from shipping specialised binaries?

Reality.
 

Stephen Sprunk

Rui Maciel said:
The tech support reference is a bit absurd. We see free software supporting
a vast set of architectures, yet those projects don't suffer from any
particular support issues. Moreover, there is absolutely no need for the
user to know those details, as even Microsoft's Windows line of operating
systems benefits from automatic package deployment. It's just a matter of
checking the relevant system properties and deploying the right binary. So
what stops anyone from shipping specialised binaries?

Support costs. Even if you can automatically install the correct binary on
each machine, you still have to test all of the individual binaries in QA,
and you have to troubleshoot the correct ones when customers find problems.
This is why many companies are still shipping binaries that are compiled for
original Pentiums with low optimization settings and debug symbols -- it's
easier to support, and it works, if not optimally, on every system their
customers own.

The FOSS crowd has it a bit easier since they don't have to support
anything, so each user can compile the software however they want. Notice
that those for-profit companies that do support FOSS often require that
customers run the binaries _they_ compiled.

S
 

jacob navia

Rui said:
The tech support reference is a bit absurd. We see free software supporting
a vast set of architectures, yet those projects don't suffer from any
particular support issues. Moreover, there is absolutely no need for the
user to know those details, as even Microsoft's Windows line of operating
systems benefits from automatic package deployment. It's just a matter of
checking the relevant system properties and deploying the right binary. So
what stops anyone from shipping specialised binaries?

The show stopper is the fact that you would have to TEST each
of the binaries before you ship it.
 
