namespaces and main()


poiuz24

You still seem to misunderstand what a using declaration does - it
*doesn't* modify the linkage of the original function in any way. e.g.

namespace foo
{
void f();
}

using foo::f;
//linker still expects foo::f, it's just that code below this point
//looks up f to be foo::f.

you're right: this was probably my core misunderstanding. please
let me rephrase to check if i got it now:

"using foo::f;" in above introduces a synonym ::f -> foo::f ..
but this is only for the compiler in the translation unit
it is working on. this synonym information is no longer present
in the object the compiler leaves for the linker. IOW: the symbol
table will NOT look like this:
....
00000018 T _foo_f
00000018 T _f
....

but instead will plainly look so:
....
00000018 T _foo_f
....
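that understanding can be checked with a couple of lines. a sketch (names and the mangled-symbol remark are illustrative): the using-declaration is purely compile-time aliasing, so both call paths below resolve to the one definition foo::f in the object file.

```cpp
#include <cassert>

// sketch: a using-declaration introduces no new linker symbol;
// both calls below resolve to the single definition foo::f
namespace foo {
    int f() { return 42; }
}

using foo::f;   // compile-time only: makes unqualified f refer to foo::f

int call_qualified()   { return foo::f(); }
int call_unqualified() { return f(); }   // same function, same mangled symbol
```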

from my point of view (hope i got it this time), this is a tribute
to the idiosyncrasies of traditional C++ build environments, in
particular the lack of adequate type information left by the
compiler for the linker. even the above (non-existing) symbol table
loses information. think of this (also non-existing) symbol table:
....
00000018 T _foo_f
_foo_f SYNONYM _f
....

this one would preserve which symbol is "real" and which is just
a synonym.

thx for pressing me so hard to get the point;)
 

poiuz24

another background was this: i do a lot of templated stuff which ..

You're not writing code twice, just declarations. The separation is
quite important in very large projects, since .h files should rarely
be modified.

with writing code i meant writing C++ surface syntax, be it
declarations or definitions. writing declarations twice all over
the place "just" because the build environment is lacking _is_ a pain
in the ass, at least for me ..

so you say stop whining and write a new C++ build environment? ugh;)

in an ideal world (from my POV), the compiler would generate
an object file for a translation unit which contains enough
information for the linker to check if any exported type information
changed. a recompiled library A would mean a new modification date
for the object file of library A. a linker would then see the new
mod date, but would have to look into the file to check if only
"internal information" changed or if exported type information
changed. this last step "checking if exported type info has changed"
is classically done via header files. but that's a hack around a
compiler/linker system unable to figure out stuff itself, which
it clearly could (in theory).
Not true if any of the following is true:
a) They are defined inside a class body.
b) They are declared inline.
c) They are in an anonymous namespace.
d) They are declared static.

i meant "namespace { .." scoped rather than namespace in the
sense namespace = {"namespace { ..", "class { ..", ..}

right, ok.
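the a)-d) cases can be collected in one sketch - each of these definitions may legally appear in every translation unit that needs it, without multiple-definition link errors:

```cpp
#include <cassert>

// sketch of the a)-d) cases above
struct S {
    int get() { return 1; }      // a) defined in the class body: implicitly inline
};
inline int f() { return 2; }     // b) explicitly inline
namespace {
    int g() { return 3; }        // c) anonymous namespace: internal linkage
}
static int h() { return 4; }     // d) static: internal linkage
```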
Actually it's quite a big problem for the linker to sort out - some
linkers can't do it (including one I have used on QNX - .exes ended up
containing multiple definitions of template functions, causing
horrendous bloat). Some compilers have complicated prelinkers that get
around this problem, by assigning each template instantiation to a
particular translation unit, but this requires multiple compilation
passes to find out all the required instantiations. This is what
Comeau C++ does by default.

again lack of adequate information passed from the compiler to
the linker. a recurring theme, it seems.

i'm in the comfortable position to have no requirement to
support such broken/incapable build envs

it becomes clearer and clearer to me that the real legacy is the
object file formats used to communicate between compiler and linker
If you have multiple definitions of a function, either the
compiler/linker has to compile them all, assume they are the same and
discard all but one, or it has to assign the definition to a single
translation unit and not compile it in any of the others. You can
imagine the number of passes that would require!

One way around the bloat issue is just to have a single compilation
unit for your whole program! This might suit you.

puh, yes that might be a practical workaround for the broken
object file format
const vars have static linkage, so you haven't got multiple
definitions, rather each translation unit gets a separate copy. For
static functions this will obviously lead to major code bloat.

pain everywhere.
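that linkage rule for const fits in a couple of lines (a sketch; `limit` and `exported` are made-up names):

```cpp
#include <cassert>

// sketch: at namespace scope, const objects have internal linkage by
// default in C++, so every translation unit gets its own private copy
const int limit = 10;            // internal linkage: one copy per TU
extern const int exported = 20;  // extern forces external linkage: one copy total

int headroom() { return exported - limit; }
```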
Well, I suggested some ways above, but none of them are recommended
since they all lead to bloated executables - the linker knows to merge
different instantations of template functions, but not normal ones.

probably i go the "one system in one translation unit" road until
it doesn't scale anymore.
It's not just compilation speed. It's the fact that any change at all
means you have to recompile absolutely everything.

so what? if that is fast enough (e.g. <2min), i don't care about
"wasted" CPU cycles computing stuff over and over again, if that helps.

Yes, I don't strongly disagree with that, but the biggest problem is
the dependency bottleneck. You will have major problems with circular
dependencies if you're not careful.

which dependency bottleneck? from my POV, there are logical dependencies
(class A needs class B). that's my design work. all other dependencies
i see as artificial and imposed by a specific build env.


Actually, they do. Look up posts by James Kanze on the subject, for
one.

It's best not to fight the language - in Java do what it wants, in C++
do what it wants.

i always try to keep an open mind and look out for nice stuff
available in other languages. e.g. javadoc triggered doxygen, which
is arguably a good thing. attributes in C# i find useful too. also,
i guess i'm not fighting C++ but its arcane build env. i hate it
as much as i love C++.
 

Karl Heinz Buchegger

poiuz24 said:
you're right: this was probably my core misunderstanding. please
let me rephrase to check if i got it now:

"using foo::f;" in above introduces a synonym ::f -> foo::f ..
but this is only for the compiler in the translation unit
it is working on. this synonym information is no longer present
in the object the compiler leaves for the linker. IOW: the symbol
table will NOT look like this:
...
00000018 T _foo_f
00000018 T _f
...

but instead will plainly look so:
...
00000018 T _foo_f
...

Yep.

from my point of view (hope i got it this time), this is a tribute
to the idiosyncrasies of traditional C++ build environments, in
particular the lack of adequate type information left by the
compiler for the linker.

Well. Yes. In a sense you are right. But even then: mangled names
contain all the information the linker needs to implement
the C++ requirements.
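For illustration (the encodings are from the widely used Itanium C++ ABI, not mandated by the standard): the mangled name folds scope and signature into the symbol itself, which is all a traditional linker needs to match calls to definitions.

```cpp
#include <cassert>

// sketch: mangling encodes namespace and parameter types, so the two
// overloads below are entirely distinct symbols to the linker
namespace foo {
    int f(int x)    { return x + 1; }   // Itanium ABI: _ZN3foo1fEi
    int f(double x) { return (int)x; }  // Itanium ABI: _ZN3foo1fEd
}
```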
even the above (non-existing) symbol table
loses information. think of this (also non-existing) symbol table:
...
00000018 T _foo_f
_foo_f SYNONYM _f
...

this one would preserve which symbol is "real" and which is just
a synonym.

That would be useless. It would completely destroy the use of namespaces.

You need to look at it the other way round.
When the compiler processes a translation unit, it has a global
namespace where all global names are collected. When you declare
a namespace of your own, the compiler creates a second table
for this namespace where all the names in that namespace are collected.
By writing a 'using namespace' you just instruct the compiler to copy
all the names from the specified namespace into the global one. The
result is: when the compiler searches for a name in the global namespace,
it will also find the names from the specified namespace.

PS: whether the compiler really copies all the names or just sets a marker
that another namespace is included is not really important. There would only
be a difference if you first 'use' the namespace and then later add names
to it.
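For that add-names-later case, the "marker" behaviour is in fact what a using-directive gives you: names added to the namespace after the directive are still found by later lookups. A small sketch (`n`, `a`, `b` are made-up names):

```cpp
#include <cassert>

namespace n { int a() { return 1; } }

using namespace n;   // directive: global lookup now also searches n

namespace n { int b() { return 2; } }  // added *after* the directive

// both are found: the directive is "live", not a one-time copy
int sum() { return a() + b(); }
```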
 

Karl Heinz Buchegger

poiuz24 said:
with writing code i meant writing C++ surface syntax, be it
declarations or definitions. writing declarations twice all over
the place "just" because the build environment is lacking _is_ a pain
in the ass, at least for me ..

so you say stop whining and write a new C++ build environment? ugh;)


in an ideal world (from my POV), the compiler would generate
an object file for a translation unit which contains enough
information for the linker to check if any exported type information
changed. a recompiled library A would mean a new modification date
for the object file of library A. a linker would then see the new
mod date, but would have to look into the file to check if only
"internal information" changed or if exported type information
changed. this last step "checking if exported type info has changed"
is classically done via header files. but that's a hack around a
compiler/linker system unable to figure out stuff itself, which
it clearly could (in theory).

There seems to be a misconception about what a compiler does and
what a linker does. You completely ignore that the main use of a header
file is for the *compiler* (not the linker) to check proper use
of argument passing in function calls.
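A minimal illustration (`myfun`/`mylib` are hypothetical names): the declaration - the thing a header carries - is what the compiler checks every call site against; the linker later sees only the mangled symbol.

```cpp
#include <cassert>

// the declaration, normally provided via mylib.h
double myfun(double x);

int use() {
    return static_cast<int>(myfun(2.0));  // OK: matches the declaration
    // myfun("oops");  // would be rejected by the *compiler*, long before linking
}

// the definition, normally in mylib.cpp
double myfun(double x) { return x * 2; }
```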
 

tom_usenet

you're right: this was probably my core misunderstanding. please
let me rephrase to check if i got it now:

"using foo::f;" in above introduces a synonym ::f -> foo::f ..
but this is only for the compiler in the translation unit
it is working on. this synonym information is no longer present
in the object the compiler leaves for the linker. IOW: the symbol
table will NOT look like this:
...
00000018 T _foo_f
00000018 T _f
...

but instead will plainly look so:
...
00000018 T _foo_f
...
Right.


from my point of view (hope i got it this time), this is a tribute
to the idiosyncrasies of traditional C++ build environments, in
particular the lack of adequate type information left by the
compiler for the linker.

Well, I think it was done by design - using declarations are only
meant to affect the translation unit in which they are used, I suppose
you could say that names introduced by using declarations have
internal linkage, but it's easier just to think of it as declaring a
local alias, just as a typedef does.
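The typedef analogy in a few lines (names are illustrative): both aliases are compile-time devices that leave no trace in the object file.

```cpp
#include <cassert>

namespace foo { int f() { return 7; } }

typedef int number;   // type alias: compile-time only, nothing for the linker
using foo::f;         // name alias: likewise purely a lookup matter

number call() { return f(); }   // lookup resolves f to foo::f
```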

even the above (non-existing) symbol table
loses information. think of this (also non-existing) symbol table:
...
00000018 T _foo_f
_foo_f SYNONYM _f
...

this one would preserve which symbol is "real" and which is just
a synonym.

Still, I don't see the benefit that having synonyms in the symbol
table would give, except that you might get lots of violations of the
one definition rule.

But you've basically got it - a using declaration just makes ordinary
name lookup, within that translation unit, behave as though the name
had been declared at the point of the using declaration.

Tom
 

tom_usenet

in an ideal world (from my POV), the compiler would generate
an object file for a translation unit which contains enough
information for the linker to check if any exported type information
changed. a recompiled library A would mean a new modification date
for the object file of library A. a linker would then see the new
mod date, but would have to look into the file to check if only
"internal information" changed or if exported type information
changed. this last step "checking if exported type info has changed"
is classically done via header files. but that's a hack around a
compiler/linker system unable to figure out stuff itself, which
it clearly could (in theory).

Yes, this would be a good solution - each .cpp file has an
accompanying symbol table, and other files can "import" that symbol
table. Basically every extern definition (but not declaration) in the
file is added to the symbol table.

It would be quite a simple thing to work out what changes require
compilation of dependent files - basically any change that actually
modifies the symbol table.

I for one would love something along the lines of:
#import "otherclass.cpp"
and
#import <vector>
etc.
I'm sure that something like this could be made to work, it's just a
matter of someone writing it and proving it.

Writing a good IDE would be much easier too, since it could use those
symbol table files to produce browse information, and refactorings,
etc.
again lack of adequate information passed from the compiler to
the linker. a recurring theme, it seems.

The problem is that many platforms only have a linker that was
designed for use with C and other similar languages (e.g. FORTRAN).
i'm in the comfortable position to have no requirement to
support such broken/incapable build envs

The build environment was actually fine - GCC + Dinkumware's C and C++
libraries. By now it may be about the most conforming combination that
ships by default with an OS (due to Dinkumware's libs being 100%
conforming).

But the problem was the linker, or perhaps the object file format.
it becomes clearer and clearer to me that the real legacy is the
object file formats used to communicate between compiler and linker

Indeed. C++ was specifically designed so that it could be linked using
a traditional linker. Templates slightly damaged that, and the best
solution would be to bite the bullet and either go down the pre-linker
route or use more intelligent linkers.
so what? if that is fast enough (e.g. <2min), i don't care about
"wasted" CPU cycles computing stuff over and over again, if that helps.

That's quite painful during a debugging cycle.
which dependency bottleneck? from my POV, there are logical dependencies
(class A needs class B). that's my design work. all other dependencies
i see as artificial and imposed by a specific build env.

No, I'm talking about the practical problem of programming using just
1 translation unit. You'll have to very precisely order everything
(e.g. definition of class A, then definition of class B, then A::foo,
etc.) and this will be a pain to maintain.
i always try to keep an open mind and look out for nice stuff
available in other languages. e.g. javadoc triggered doxygen, which
is arguably a good thing. attributes in C# i find useful too. also,
i guess i'm not fighting C++ but its arcane build env. i hate it
as much as i love C++.

Sadly C++ was designed with multiple preprocessed translation units
very firmly in mind. Changing it to use something more like "modules"
is not going to be easy, but I think it would be worth the effort.

Tom
 

poiuz24

Karl Heinz Buchegger said:
There seems to be a misconception about what a compiler does and
what a linker does. You completely ignore that the main use of a header
file is for the *compiler* (not the linker) to check proper use
of argument passing in function calls.

i was talking about a hypothetical, though from my POV desirable,
compiler/linker.

ok, my description was incomplete. if a compiler produced object
files containing sufficient information for all exported types, that
information would be used by the compiler _and_ linker.

the compiler would read type information directly from object
files instead of headers. e.g. compiling mylib.cpp would produce
mylib.o and that would contain the signature of an exported
function myfun(). then, when compiling myapp.cpp which uses myfun(),
the compiler would read the signature of myfun() directly from
mylib.o. in the end, when linking myapp.o and mylib.o, the linker
would read necessary information also from myapp.o and mylib.o.
instead of '#include "mylib.h"' in myapp.cpp, one would need to
say e.g. 'import "mylib"', instructing the compiler to read type
information from mylib.o.

in a sense, it's like having preparsed/precompiled headers embedded
into the object files.
 

poiuz24

I for one would love something along the lines of:
#import "otherclass.cpp"
and
#import <vector>
etc.
I'm sure that something like this could be made to work, it's just a
matter of someone writing it and proving it.

yes, i'd love that too .. actually, i thought along very
similar lines: import "otherclass"

that is, not the preprocessor, but the compiler would handle
'import', and no specification of a file extension, since object
file extensions may be platform specific (on a *nix system, the above
would make the compiler read type information for OtherClass
from "otherclass.o")
Indeed. C++ was specifically designed so that it could be linked using
a traditional linker. Templates slightly damaged that, and the best
solution would be to bite the bullet and either go down the pre-linker
route or use more intelligent linkers.

aeh, what is the "pre-linker route"? i haven't heard of that one yet ..
Sadly C++ was designed with multiple preprocessed translation units
very firmly in mind. Changing it to use something more like "modules"
is not going to be easy, but I think it would be worth the effort.

a module system that integrates with the build system .. absolutely!

i've spent some time trying to figure out what would be needed.
i came to the conclusion that one would need to bring the
object file format much closer to the compiler middle end in a computer
science sense, that is, closer to the Abstract Syntax Tree.

a new object file format containing full static type information for
exported types would be needed. what does "full static type inf." mean?

e.g. for exported non-inline, non-template, non-member functions:

- the fully namespace qualified function name
- the function signature

so far so good. this could be done pretty ad-hoc. but what is
"full static type inf." for an exported template function e.g.

template<typename T> T fun (const T& x);

i would argue that full static type information in this case
includes the function body. if so, that would require a new
object file format to be powerful enough to express function
bodies. that could be done at the Abstract Syntax Tree (AST)
level. in other words, a sufficiently powerful object file
format would be similar to a persistent file format for
C++ ASTs. when compiling a translation unit, the compiler would
produce an object file that contains native code + those parts
of the AST for the current translation unit, that describe exported
types. compiling another translation unit importing a module
would make the compiler not start with the empty AST for the
translation unit, but with an AST prefilled from the AST parts
embedded in and read from all imported modules.

in a sense, this is perhaps roughly similar to preparsed
headers but without the headers
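the claim that full static type information for a template must include the body is easy to motivate: the compiler needs the definition at every point of instantiation - which is exactly why template definitions live in headers today. a tiny sketch:

```cpp
#include <cassert>

// without this body being visible, neither instantiation below could
// be compiled - for templates, a declaration alone is not enough
template<typename T>
T fun(const T& x) { return x; }

int    use_int()    { return fun(5); }     // instantiates fun<int>
double use_double() { return fun(2.5); }   // instantiates fun<double>
```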

i'm sure this is no new idea but must have been proposed
already. what's the catch? would such a module system "deliver"?
 
