In said:
> As I understand it there is a good amount of link compatibility
> among C compilers. For example, I can compile main.c with GCC
> and func.c with Sun One and link the objects using either linker
> (GNU or Sun).
It's more than that: there's a good amount of link compatibility among
*all* compilers used on any given modern platform.
> What I'm curious about is why this compatibility exists in the
> absence of a standard C ABI?
Each platform defines its own ABI, in a language-independent way.
It also defines a common format for the object files. This is all that's
needed to have modules compiled with different compilers (not necessarily
for the same language) linked together in a working executable.
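To make this concrete, a trivial sketch (the compiler drivers below
are only examples; any two conforming compilers for the same platform
will do):

    /* func.c -- compile with, say, the Sun compiler: cc -c func.c */
    int add(int a, int b)
    {
        return a + b;
    }

    /* main.c -- compile with gcc: gcc -c main.c
       then link with either driver:
           gcc main.o func.o -o prog    or    cc main.o func.o -o prog */
    #include <stdio.h>

    int add(int a, int b);   /* the declaration both sides agree on */

    int main(void)
    {
        printf("%d\n", add(2, 3));
        return 0;
    }

Because both compilers follow the platform's calling convention and
emit the platform's object file format, either linker can combine
main.o and func.o into a working executable.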
> What encourages C compiler vendors to agree on implementation
> issues such as alignment, packing, etc., such that their object
> files are compatible?
The ABI specification of the target platform. gcc behaves differently
on different platforms, in order to comply with the local ABI.
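For example, the layout of a struct is dictated by the platform ABI,
not by the individual compiler, so separately compiled modules agree
on where each member lives. A small illustration:

    #include <stdio.h>
    #include <stddef.h>

    /* Every conforming compiler on the same platform must place
       these members at the offsets the platform ABI prescribes. */
    struct example {
        char   c;
        int    i;
        double d;
    };

    int main(void)
    {
        printf("sizeof(struct example) = %lu\n",
               (unsigned long) sizeof(struct example));
        printf("offsetof i = %lu\n",
               (unsigned long) offsetof(struct example, i));
        printf("offsetof d = %lu\n",
               (unsigned long) offsetof(struct example, d));
        return 0;
    }

Compile it with two different compilers on the same platform and you
should get the same numbers; compile it on another platform and the
numbers may change, because that platform's ABI is different.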
> I've heard it said that compiler vendors who don't want to
> reimplement the whole standard C library have to use the platform
> ABI to conform to the platform standard library, but I was always
> under the impression that most compiler vendors provided their
> own standard library.
There is no general rule. gcc always uses the local implementation of
the standard library; many other compilers come with their own libraries
but also use part of the local libraries: it is a royal pain in the ass
to implement the interface with the OS, so you simply use the low-level
stuff already existing as part of the local implementation of the standard
libraries.
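As a rough sketch of what I mean (purely hypothetical code, assuming
a POSIX-like platform; the real split between vendor code and system
code varies), a vendor's library can do its own buffering and
formatting while delegating the actual OS interface to the platform's
low-level primitives:

    #include <string.h>
    #include <unistd.h>   /* write(): provided by the platform, not by ISO C */

    /* hypothetical helper, not a real library entry point */
    int my_puts(const char *s)
    {
        /* the vendor-specific part ends here; the actual I/O is
           handed off to the platform's low-level interface */
        if (write(1, s, strlen(s)) < 0 || write(1, "\n", 1) < 0)
            return -1;
        return 0;
    }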
> I'd appreciate it if people here could share their insights as to
> why this apparently high degree of object compatibility exists
> among C compilers.
Imagine that you were a compiler writer. Would you design your
compiler in a manner requiring you to reimplement a lot of OS-dependent
stuff from scratch?
On bare-bones platforms, such as CP/M-80, that didn't provide any
common software infrastructure (except for a few OS primitives), it was
not uncommon for language implementors to define *everything* from
scratch, except the format of the executable files, of course. And by
everything I mean exactly that: the size and representation of the data
types, the calling conventions, the object file format, the linker, the
libraries. At best, you could combine modules produced by different
compilers of the same vendor. On platforms even more primitive than
CP/M-80 (many home computers of the early eighties) you did not even
have object files: you had to recompile all your sources into a single
executable file every time you made a change somewhere; the linker was
built into the compiler or the assembler, and the run-time support module
bundled with the generated code contained the whole language library.
> Also, are there any well-defined areas where C compilers are
> likely to be incompatible?
Not if they are targeting the same platform. They would all try to
implement the standard C features in the same way and, to a certain
extent, to be compatible even at the level of language extensions.
There is one exception, though: platforms that have moved from one
word size to another, e.g. from 32-bit to 64-bit. It is not uncommon to
have 32-bit and 64-bit compilers coexisting on such platforms, if they
continue to support applications built for the older format, or even
to have the same compiler work in both modes. In such cases, the linkers
usually check that all modules being linked together have been
compiled in the same mode and link them against the right set of
libraries. These are tricky environments: you may use, without realising
it, a 32-bit shell combined with native 64-bit commands and not understand
why some things work differently in different contexts: e.g. your 64-bit
application can easily generate files above 4 GB, but, if you redirect
its standard output to a file, that file is limited to 4 GB.
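If you want to see which mode a given compiler puts you in, and how
large a file offset that environment can represent, something like
this (assuming a POSIX-like system that provides off_t) is usually
enough:

    #include <stdio.h>
    #include <sys/types.h>   /* off_t on POSIX-like systems */

    int main(void)
    {
        /* 4 in 32-bit mode, 8 in 64-bit mode on the platforms above */
        printf("pointer size: %lu\n", (unsigned long) sizeof(void *));
        /* a 4-byte off_t cannot represent offsets beyond a few GB */
        printf("off_t size:   %lu\n", (unsigned long) sizeof(off_t));
        return 0;
    }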
Dan