how to decrease link time

B

baibaichen

I know that the pImpl idiom can decrease compile time, but is there any
good practice/idiom to decrease link time?

Any reference or any idea is appreciated.

Thanks
 
R

red floyd

baibaichen said:
I know that the pImpl idiom can decrease compile time, but is there any
good practice/idiom to decrease link time?

Any reference or any idea is appreciated.

Link time is so minimal compared to compile time and totally negligible
compared to run time.

Design instead for maintainability. pImpl provides support for this, by
divorcing implementation and interface. The compile time benefits (if
any) are just icing on the cake.
 
S

Shezan Baig

baibaichen said:
I know that the pImpl idiom can decrease compile time, but is there any
good practice/idiom to decrease link time?

Any reference or any idea is appreciated.
The best advice I can give is not to create cyclic dependencies between
your modules.

Hope this helps,
-shez-
 
D

Dave Townsend

baibaichen said:
I know that the pImpl idiom can decrease compile time, but is there any
good practice/idiom to decrease link time?

Any reference or any idea is appreciated.

Thanks

This is a bit platform-dependent, but you could use DLLs/shared libraries
to a greater extent, so that linking the executable is minimized to just
the externals provided by the libraries. I've not noticed a problem with
links on Windows or Red Hat Linux; maybe if you use HP or IBM you might
have some long link jobs.

Perhaps you could use static functions where appropriate, but I suspect
this is like trying to lose weight by cutting your fingernails.

dave
 
M

Mike Wahler

red floyd said:
Link time is so minimal compared to compile time and totally negligible
compared to run time.

<OT>

I've found that this depends upon the project. I've
worked on projects which used several large third-party
libraries. Link time was considerably longer than
compile time. This ratio was made even larger because
I often would supply the compiler with several source files
at a time, which meant the compiler was only invoked once.

As always, YMMV.

</OT>

-Mike
 
R

Rolf Magnus

red said:
Link time is so minimal compared to compile time and totally negligible
compared to run time.

Design instead for maintainability. pImpl provides support for this, by
divorcing implementation and interface. The compile time benefits (if
any) are just icing on the cake.

I have a different view on that. I hate writing code for an hour and then
searching for the errors for several hours because I did so many changes at
once, so I usually change my code in many small steps instead.
Each time, I have to compile the program to check whether it's OK. If compile
time is long, it slows me down considerably.
 
D

Dietmar Kuehl

baibaichen said:
Any reference or any idea is appreciated.

In general, reducing the dependencies between modules improves both
compile and link time, sometimes removing the need to compile and/or
link certain parts of the application altogether (the latter is the
case when using shared libraries/DLLs, where only those objects need
to be linked). The pimpl idiom is just one approach to reducing
dependencies; there are several others. John Lakos' "Large-Scale C++
Software Design" (Addison-Wesley) presents several decoupling techniques.
 
J

JustBoo

I have a different view on that. I hate writing code for an hour and then
searching for the errors for several hours because I did so many changes at
once, so I usually change my code in many small steps instead.
Each time, I have to compile the program to check whether it's OK. If compile
time is long, it slows me down considerably.

This follows the refactoring methodology outlined in Martin Fowler's
book. Hey, they have a webpage:

http://www.refactoring.com/

Good Stuff.
 
T

Thomas Maier-Komor

baibaichen said:
i know that pImpl idiom can decrease compile time, but Is there any
good practice/idiom to decrease link time?

any reference or any idea is appreciated

thanks

If you are using the GNU toolchain, you might take an alternative
compiler/linker combination into consideration. The GNU linker is known
to be slooooooow...

Tom
 
R

red floyd

Rolf said:
I have a different view on that. I hate writing code for an hour and then
searching for the errors for several hours because I did so many changes at
once, so I usually change my code in many small steps instead.
Each time, I have to compile the program to check whether it's OK. If compile
time is long, it slows me down considerably.

And two years down the line, when you have to spend three days searching
for the bug your clever link optimization caused, or you're dealing with
a portability issue (you need to port to a different OS) caused by your
clever link optimization, you'll have wasted much more time.

Not to mention that to fix said bug, you'll probably have to refactor,
etc... Design for maintainability.
 
B

baibaichen

imagine that I am sitting at my desk fixing bugs all day long. When I
make a change and build, how many files are going to compile? 3? 10?
100? My guess is that over a whole day, each build will compile maybe
five files on average. Only rarely will I make a change to some header
file that causes more than 100 files to be recompiled. How long does it
take five files to compile? On my laptop, which is over a year old,
five files will compile in less than 30 seconds, and often in less than
10 seconds. Now consider link times: on my laptop, it takes about 12
minutes for DVDit to link, even if only one source file was changed.

Even if I were compiling 50 files, that's still less than five minutes,
and often less than two minutes. I wonder how much you could actually
reduce that by using relative paths.
 
A

Alf P. Steinbach

* baibaichen:
I know that the pImpl idiom can decrease compile time, but is there any
good practice/idiom to decrease link time?

Yes, the same as in general programming: divide and conquer.

If you're linking some hundreds of object files, you have a packaging
problem: repackage into smaller libraries (this may require some redesign;
packaging can affect design and vice versa). Use the appropriate options
and/or tools to optimize your libraries for fast access, e.g., under *nix,
run 'ranlib'.

If you're linking some hundreds of libraries, you also have a packaging
problem: repackage into smaller executables and dynamic libraries (this may
require some redesign; packaging can affect design and vice versa).

Of course, this is only tangentially related to current C++ as a language.

But to the degree that usage of the language is affected and hindered by
tool usage, I think at least this kind of high-level advice is on-topic.
However, be aware that concrete tool-usage issues are rarely if ever
on-topic in this group. This group is concerned with the C++ _language_.
 
?

Stephan Brönnimann

Are you using libraries (shared?) or do you just link all the object files?
Just a wild guess: I suspect the system starts thrashing when you link.
Monitor the disk I/O, memory, and swap-space usage.

regards, Stephan
 
