One Big (std::) Header File?


Steven T. Hatton

I recently came across the suggestion that it might be beneficial to create
a header file containing all the Standard Headers, and include that in all
the places where declarations from the Standard Library are used - as
opposed to including the specific header containing the declaration. It
was argued that this may improve compilation speeds. Opinions?
--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
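
For concreteness, a minimal sketch of the kind of "super header" being
suggested might look like the following; the file name and the particular
headers listed are only illustrative:

    // std_all.hh -- illustrative only; list whatever standard headers
    // the project actually uses
    #ifndef STD_ALL_HH
    #define STD_ALL_HH

    #include <algorithm>
    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>
    // ...and so on for the remaining standard headers the project touches

    #endif // STD_ALL_HH

Each source file would then contain a single #include "std_all.hh" instead
of picking the standard headers it needs individually.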
 

Catalin Pitis

Steven T. Hatton said:
I recently came across the suggestion that it might be beneficial to create
a header file containing all the Standard Headers, and include that in all
the places where declarations from the Standard Library are used - as
opposed to including the specific header containing the declaration. It
was argued that this may improve compilation speeds. Opinions?
--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell

I don't think so. All the headers will be compiled for each .cpp file that
includes that "super" header file, and not all of the code inside is useful.
Include only the necessary headers.

Catalin
 

Sharad Kala

Catalin Pitis said:
I don't think so. All the headers will be compiled for each .cpp file that
includes that "super" header file, and not all of the code inside is useful.
Include only the necessary headers.

And what happens in the case of precompiled headers?

Sharad
 

Steven T. Hatton

Catalin said:
I don't think so. All the headers will be compiled for each .cpp file that
includes that "super" header file, and not all of the code inside is useful.
Include only the necessary headers.

Catalin

That is the conventional wisdom. The counter argument has to do with the
use of precompiled headers. I don't understand all the idiosyncrasies of
that technology, but the way it was explained, the compiler that supports
pre-compiled headers can skip everything up to the first line that it
hasn't seen before (or something like that). So the usefulness of the
technique will be implementation dependent. I don't know how widespread
support for pre-compiled headers is. GCC just introduced it, and I am not
able to use the latest version for anything that needs to play nicely with
the KDE I'm running.
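
As far as I can tell, the GCC version works roughly like this; a sketch,
assuming the illustrative std_all.hh from earlier in the thread and GCC
3.4's precompiled-header support:

    # compile the shared header once; GCC writes std_all.hh.gch next to it
    g++ -x c++-header std_all.hh -o std_all.hh.gch

    # a translation unit whose *first* include is that header, compiled
    # with the same options, picks up std_all.hh.gch automatically instead
    # of reparsing all the standard headers:
    #     foo.cc:  #include "std_all.hh"
    g++ -c foo.cc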

I do know that some things, such as KDE, can currently take the better part
of 24 hours to build on a decent system. OSG can be a slow build as well.
Anything to improve the build times will be welcomed by me.
--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
 

Catalin Pitis

Steven T. Hatton said:
That is the conventional wisdom. The counter argument has to do with the
use of precompiled headers.

Precompiled headers still apply to only the necessary headers, so there
should be no difference between including one big "root" header file and
including only the necessary header files. However, since the headers
contain templates, I'm not sure how this influences the compiling speed.

Catalin
 

Steven T. Hatton

Catalin said:
Precompiled headers still apply to only the necessary headers, so there
should be no difference between including one big "root" header file and
including only the necessary header files.

That is another alternative that was suggested. If I understand what the
authors were saying, the idea is to take all the (standard) headers used
anywhere in the project and put them in a single shared header file.
However, since the headers
contain templates, I'm not sure how this influences the compiling speed.

Hence my comment about not fully understanding the notion of pre-compiled
headers.
--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
 

Chris Theis

Steven T. Hatton said:
That is the conventional wisdom. The counter argument has to do with the
use of precompiled headers. I don't understand all the idiosyncrasies of
that technology, but the way it was explained, the compiler that supports
pre-compiled headers can skip everything up to the first line that it
hasn't seen before (or something like that). So the usefulness of the
technique will be implementation dependent. I don't know how widespread
support for pre-compiled headers is. GCC just introduced it, and I am not
able to use the latest version for anything that needs to play nicely with
the KDE I'm running.

I do know that some things, such as KDE, can currently take the better part
of 24 hours to build on a decent system. OSG can be a slow build as well.
Anything to improve the build times will be welcomed by me.

In general you should only include the headers that are really required,
although with precompiled headers the story is a little different as you
say. If your build process is still too slow, have you thought about a
parallel build?

Cheers
Chris
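
For the purely local case, a parallel build is just a matter of telling
GNU make how many jobs to run at once; a minimal sketch (the job count is
only an example and is usually set to roughly the number of CPUs):

    # let make run up to four compile jobs in parallel
    make -j4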
 

Steven T. Hatton

Chris said:
In general you should only include the headers that are really required,
although with precompiled headers the story is a little different as you
say.

I'm inclined to take the approach of creating a project_std.hh that contains
the standard headers used by the entire project. I don't like the idea all
that much. I very much believe in isolating components as much as
possible. OTOH, if compile times are going to be sensitive to the order in
which headers are included, trying to manage that on a file-by-file basis
throughout a project is probably a bad idea.
If your build process is still too slow, have you thought about a
parallel build?

That depends on what you mean. I kick off multiple processes when I build, but
I haven't attempted sharing the load between systems. I don't know how
much getting my hyperthreading to work might impact the build times. As I
said in another post, I'm not able to really experiment with the recently
introduced pre-compiled header support in GCC, so I can't comment on
whether it would make a significant difference. I do believe the KDE folks
may have done some things to speed up the build process. But I haven't
been building the entire cvs image lately, so that's not an immediate
issue.

--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
 

JKop

Steven T. Hatton posted:
I recently came across the suggestion that it might be beneficial to
create a header file containing all the Standard Headers, and include
that in all the places where declarations from the Standard Library are
used - as opposed to including the specific header containing the
declaration. It was argued that this may improve compilation speeds.
Opinions?

With precompiled headers, yes.

Sounds like a good idea for when you're actually writing and testing the
program, but once I'd finished it, or if I was distributing the code,
I'd go back and just select the pertinent headers.


-JKop
 

Chris Theis

Steven T. Hatton said:
I'm inclined to take the approach of creating a project_std.hh that contains
the standard headers used by the entire project. I don't like the idea all
that much. I very much believe in isolating components as much as
possible. OTOH, if compile times are going to be sensitive to the order in
which headers are included, trying to manage that on a file-by-file basis
throughout a project is probably a bad idea.


That depends on what you mean. I kick off multiple processes when I build, but
I haven't attempted sharing the load between systems. I don't know how
much getting my hyperthreading to work might impact the build times. As I
said in another post, I'm not able to really experiment with the recently
introduced pre-compiled header support in GCC, so I can't comment on
whether it would make a significant difference. I do believe the KDE folks
may have done some things to speed up the build process. But I haven't
been building the entire cvs image lately, so that's not an immediate
issue.

What I meant by building in parallel is load-sharing. Unfortunately I only
know tools under windows that will do this for you in a comfortable way and
I have no experience doing this under linux. However, a quick google showed
that this seems to be a common approach at the companies providing linux
distributions. IMHO this is the only way to speed up your compilation
process if you cannot apply precompiled headers. In case you find another
solution (or a tool to perform comfortable parallel builds under linux) I'd
be happy if you let me know.

Cheers
Chris
 

Mike Wahler

Steven T. Hatton said:
I recently came across the suggestion that it might be beneficial to create
a header file containing all the Standard Headers, and include that in all
the places where declarations from the Standard Library are used - as
opposed to including the specific header containing the declaration. It
was argued that this may improve compilation speeds. Opinions?

I think that's an attempt to avoid (some) thinking while writing
a program. I don't like it at all.

-Mike
 

Steven T. Hatton

Mike said:
I think that's an attempt to avoid (some) thinking while writing
a program. I don't like it at all.

-Mike

I won't say they sold me on the idea yet, but I won't accuse these guys of
trying to duck too much thinking. See §6.5:

http://www.josuttis.com/tmplbook/

I was rather surprised when I read the discussion. When I reread it just
now, I realized the recommendation seems stronger than I had first taken it
to be.
--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
 

Steven T. Hatton

Chris Theis wrote:

What I meant by building in parallel is load-sharing. Unfortunately I only
know tools under windows that will do this for you in a comfortable way
and I have no experience doing this under linux. However, a quick google
showed that this seems to be a common approach at the companies providing
linux distributions. IMHO this is the only way to speed up your
compilation process if you cannot apply precompiled headers. In case you
find another solution (or a tool to perform comfortable parallel builds
under linux) I'd be happy if you let me know.

Cheers
Chris

As is typical of me, I had the software installed and ready to go; I just
never took the time to read the documentation. I've also been reluctant to
mess with trying to get all my versions synchronized between systems. For
various reasons I tend to want different configurations on different boxes.

This is from the people who implemented SMBFS for Linux:

http://distcc.samba.org/
1. For each machine, download distcc, unpack, and do
       ./configure && make && sudo make install
2. On each of the servers, run distccd --daemon, with --allow options to
   restrict access.
3. Put the names of the servers in your environment:
       export DISTCC_HOSTS='localhost red green blue'
4. Build!
       cd ~/work/linux-2.4.19; make -j8 CC=distcc

The install and config were actually _easier_ than what's described above.

I can't comment on the ROI yet. I can say my old klunker hasn't worked so
hard in years. I'm not really sure why it is spending so much time hitting
the hard drive, but I have an ancient 300 meg Western Digital I use for
extra swap space, and it is going crazy. It may turn out that it's so slow
it actually slows down other build processes if it blocks. I have another
system I'm upgrading so I can tie it in as well. It's also not top of the
line.

I've been using this for a while:
http://ccache.samba.org/

It doesn't work magic, but it does seem to help.
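
For what it's worth, the two are commonly combined; a sketch, assuming
ccache's documented CCACHE_PREFIX variable and the DISTCC_HOSTS setting
shown above:

    # have ccache hand its cache misses to distcc for remote compilation
    export CCACHE_PREFIX=distcc
    export DISTCC_HOSTS='localhost red green blue'
    make -j8 CC="ccache gcc"      # or CXX="ccache g++" for C++ code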

--
"If our hypothesis is about anything and not about some one or more
particular things, then our deductions constitute mathematics. Thus
mathematics may be defined as the subject in which we never know what we
are talking about, nor whether what we are saying is true." - Bertrand
Russell
 
