Alternatives to make


Ian Collins

James said:
Implementing a simplified make is a good exercise for someone
just learning programming---it's one of the simplest practical
examples I know of something requiring dynamic allocation and
pointers, along with some parsing (to read the dependency file).

Implementing a full featured make, along the lines of GNU make,
is a different question, and if I needed something GNU make
didn't provide (distributed builds, for example), I'd look for
something already implemented before I'd attack it myself.

Sun's dmake for example.
 

Fred Zwarts

James Kanze said:
That's the granularity problem. The granularity of make is a
file; a target can only depend on a file, not something smaller,
and any change in that file will provoke the rebuilding of the
target.

That is not completely true. There is a somewhat smaller granularity.
A target can depend on an object in an archive. E.g.:

Example.exe: Library.a(Object.o)

(But this may be different for other versions of make.)
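Spelled out a bit more (a minimal sketch with made-up file names; note
that GNU Make also has a built-in implicit rule for updating archive
members, so the explicit ar recipe is only for illustration):

# The executable depends on one member inside the archive,
# not on the archive file as a whole.
Example.exe: Library.a(Object.o)
	g++ -o Example.exe Main.o Library.a

# Update the member in place when its source changes.
Library.a(Object.o): Object.cpp
	g++ -c Object.cpp
	ar r Library.a Object.o
	rm -f Object.o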

I have a script to parse the linker map to create these dependencies.
I hope that someone will point to a linker flag that does it for me.
 

joshuamaurice

It takes a little more than one line.  In fact, it takes some
collaboration from the compiler, and some shell behind it.  But
it does work, and it works correctly (unlike most other tools
I've seen), because it depends on the compiler to find the
dependencies, rather than trying to reimplement the compiler's
logic internally.

Yes, it actually takes ~4 lines of makefile, not just 1. I
exaggerated, but not by much. And yes, it does require using compiler
support to do it in so few lines. (This is using gcc 3.4.3.)

Here's a snippet from such a make system I've been playing around with
on the side as a demonstration to my company that GNU Make is superior
to Maven. I'm sure this is not the best, just a demonstration that I
whipped up mostly myself. (I would first attempt to cache calls to
shell and use Make built-ins for stuff like realpath, except those
appear to be broken in my version of GNU Make.)

(Yes I pass -fno-strict-aliasing to gcc. I'm aware that code should be
written to not require that. Some of my company's code does.)

(Yes, the makefile will be munged from line wrapping that the
submission process does automatically. Hopefully it'll still be
readable.)



#Input is:
# CPP_FILENAMES        a list of file names (without directory) to
#                      compile into object files
# INCLUDE_PATHS        a list of include paths
# CPP_DIR              folder containing *.cpp files
# OBJ_DIR              folder where to put the .o files and other
#                      temp files like .d files
# SHARED_LIBRARY_BUILD if "yes", then will build with -fPIC or
#                      similar option to allow use of .o files in
#                      shared libraries.
# RELEASE_BUILD        if "yes", then will build with release
#                      options, otherwise no


ifneq ($(filter-out %.cpp,$(CPP_FILENAMES)),)
$(error $(filter-out %.cpp,$(CPP_FILENAMES)) do not have .cpp as their extension)
endif


.PHONY : all
all :


UNAME := $(shell uname)


build_scripts_dir := $(dir $(word $(words $(MAKEFILE_LIST)),$(MAKEFILE_LIST)))


#create the directories if they don't exist
FORCE := $(foreach x,$(INCLUDE_PATHS),$(shell $(build_scripts_dir)/mkdir_p.sh $(x)))
FORCE := $(shell $(build_scripts_dir)/mkdir_p.sh $(CPP_DIR))
FORCE := $(shell $(build_scripts_dir)/mkdir_p.sh $(OBJ_DIR))


# basically realpath, which doesn't work for some reason
INCLUDE_PATHS := $(foreach x,$(INCLUDE_PATHS),$(shell cd $(dir $(x)) && pwd)/$(notdir $(x)))
CPP_DIR := $(shell cd $(dir $(CPP_DIR)) && pwd)/$(notdir $(CPP_DIR))
OBJ_DIR := $(shell cd $(dir $(OBJ_DIR)) && pwd)/$(notdir $(OBJ_DIR))


# Get absolute path names to use instead of file names
OBJ_FILEPATHS := $(patsubst %.cpp,$(OBJ_DIR)/%.o,$(CPP_FILENAMES))
DEP_FILEPATHS := $(patsubst %.cpp,$(OBJ_DIR)/%.d,$(CPP_FILENAMES))


####
# compile rules
ifeq ($(UNAME),Linux)
# The compile-rule generates makefile fragments for each .cpp file.
# These makefile fragments add a dependency of every included header
# on the object file. Include these dependencies here.
# (If the file is not yet generated, for example a clean build, the
# "-" in "-include" means ignore it if the file doesn't exist.)
ifeq ($(DEP_FILEPATHS),)
$(error Must compile at least one cpp file)
endif
-include $(DEP_FILEPATHS)

# Target specific variable definitions, to not conflict if this
# makefile is included again
$(OBJ_FILEPATHS) : OBJ_DIR := $(OBJ_DIR)

# Target specific variable definitions, to not conflict if this
# makefile is included again
$(OBJ_FILEPATHS) : INCLUDE_OPTIONS := $(addprefix -I,$(INCLUDE_PATHS))

# Target specific variable definitions, to not conflict if this
# makefile is included again
ifeq ($(RELEASE_BUILD),yes)
$(OBJ_FILEPATHS) : OTHER_COMPILE_OPTIONS := -O3 -finline-limit=10000 -Wuninitialized
else
$(OBJ_FILEPATHS) : OTHER_COMPILE_OPTIONS := -g
endif
ifeq ($(SHARED_LIBRARY_BUILD),yes)
$(OBJ_FILEPATHS) : OTHER_COMPILE_OPTIONS += -fPIC
endif


$(OBJ_FILEPATHS) : $(OBJ_DIR)/%.o : $(CPP_DIR)/%.cpp
	rm -f $@ $(OBJ_DIR)/$*.d
	g++ -c \
	  -std=c++98 -pedantic-errors -Wall -Wabi -Wtrigraphs -Wpointer-arith -Wdisabled-optimization \
	  $(OTHER_COMPILE_OPTIONS) \
	  -fno-strict-aliasing \
	  -MP -MMD -MF $(OBJ_DIR)/$*.d \
	  -o $@ \
	  $< \
	  $(INCLUDE_OPTIONS)
	@echo
else
$(error Platform $(UNAME) not supported in makefile.)
endif


####
# all rule
.PHONY : all
all : $(OBJ_FILEPATHS)


####
# clean rules
FAKE_CLEAN_TARGETS := $(addsuffix .clean,$(OBJ_FILEPATHS) $(DEP_FILEPATHS))
.PHONY : clean
clean : $(FAKE_CLEAN_TARGETS)
.PHONY : $(FAKE_CLEAN_TARGETS)
$(FAKE_CLEAN_TARGETS) : %.clean : ; rm -f $*
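
For context, a hypothetical invocation of the above (the file name
compile.mk and all the variable values here are examples only, and it
assumes the mkdir_p.sh helper sits next to the makefile):

make -f compile.mk \
    CPP_FILENAMES="foo.cpp bar.cpp" \
    INCLUDE_PATHS=include \
    CPP_DIR=src \
    OBJ_DIR=obj \
    RELEASE_BUILD=yes \
    SHARED_LIBRARY_BUILD=no \
    all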
 

joshuamaurice

Even GNU make only works at the file level.  In practice,
however, not all modifications of a header require the
recompilation of all sources which include it.  (And I don't see
how .PRECIOUS is relevant to this.)

Suppose you have an archive, and a build step is to add something to
this archive, not to rebuild the whole file. .PRECIOUS, from my
understanding, means that if make is killed, it will not delete the
file as it normally would. Thus GNU Make has some minimal support for
dependency granularity smaller than whole files. Yes, I admit this is
quite minimal support, and I guess in my earlier post I was being
anal. I apologize.
 

joshuamaurice

I agree with this. The problem with using a separate script/program is to
get all predefined macros and include directories correct. Some compilers
also use headers that cannot be found as normal files, to complicate things.

Two problems. First is that it requires duplicating the logic of the
compiler, including stuff like implicit defines, and this is tedious,
error prone, and platform specific. Not much else to do besides suck
it up and do it if the compiler won't already do it for you. (Luckily
most do, though some sed and grep hackery may be required to get the
output into a usable form.) Then there's the question of compiler
magic for compilers whose standard library includes are not actual
files. In that case, the programmer probably isn't modifying those
headers, so they can be safely ignored by the make build system and
not made a prerequisite of anything, as they won't change.
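
For illustration, the classic recipe from the GNU Make manual for
getting such output into usable form (a minimal sketch, adapted from
cc/.c to g++/.cpp; it assumes each .d file lives next to its source):

%.d: %.cpp
	g++ -MM $(CPPFLAGS) $< | sed 's,\($*\)\.o[ :]*,\1.o $@ : ,g' > $@

The sed program rewrites "foo.o:" into "foo.o foo.d :", so the
dependency file is itself remade whenever anything it lists changes.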
 

coal

On Apr 21, 10:18 pm, (e-mail address removed) wrote:
    [...]
I want to be able to return 1 from main() to indicate success
and 0 to indicate failure.

The C and C++ standards say that 0 must indicate success.  It's
not a question of make, or anything else for that matter---it's
what the standard requires.

I don't believe in blindly following the standard.
With this particular aspect I'll probably use ifdefs
so as to make it easy for those who use make.

Besides make being inflexible in this regard, it's
also another large C program. I think further
development of make in C is a mistake. As far as
long term strategies, I suggest people begin to
reduce their dependence on C-based make programs.
Make normally adopts the conventions of the platform it is on.
Under Unix, 0 is success, and anything else is failure, and
there's no way you can change this, other than rewriting every
utility program on the system, all of the shells, the compilers,
etc., etc.


I don't have to change those. I just need a build
tool that's more flexible in this regard. Between
this matter and make being C-based I'm convinced
it's worth investigating alternatives.


Brian Wood
Ebenezer Enterprises
www.webEbenezer.net
 

joshuamaurice

On Apr 21, 10:18 pm, (e-mail address removed) wrote:
    [...]
I want to be able to return 1 from main() to indicate success
and 0 to indicate failure.
The C and C++ standards say that 0 must indicate success.  It's
not a question of make, or anything else for that matter---it's
what the standard requires.

I don't believe in blindly following the standard.
With this particular aspect I'll probably use ifdefs
so as to make it easy for those who use make.

Besides make being inflexible in this regard, it's
also another large C program.  I think further
development of make in C is a mistake.  As far as
long term strategies, I suggest people begin to
reduce their dependence on C-based make programs.
Make normally adopts the conventions of the platform it is on.
Under Unix, 0 is success, and anything else is failure, and
there's no way you can change this, other than rewriting every
utility program on the system, all of the shells, the compilers,
etc., etc.

I don't have to change those.  I just need a build
tool that's more flexible in this regard.  

Now, you can do one of two things.
1- You can accept the universal convention that a process error return
of 0 means success and nonzero is an error code, or
2- you can cause yourself great pain.

Why do you need this flexibility?

You do realize that if you have a silly build tool which breaks this
convention, and you need to invert the return code to appease make,
then it's as easy as the following with the Bourne shell:
    if silly_nonconforming_command; then false; else true; fi
Equally obvious solutions exist for all shells.
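(In a POSIX shell there's also the ! reserved word, which inverts a
command's exit status directly:

    ! silly_nonconforming_command

though the if form above works even in pre-POSIX Bourne shells.)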
Between
this matter and make being C-based I'm convinced
it's worth investigating alternatives.

You say "C-based" as though it's a bad thing. You obviously have a
strong bias against C, and thus presumably you have a strong bias
against C++, so why post in a C++ forum?

Moreover, what does "C-based" mean? Sure GNU Make is probably (??)
written in C. Does that make it C-based? (That it's written in C
should be irrelevant as long as it works, and it does. It doesn't
force you to write C code just because it was written in C.)

Or maybe that it was initially tailored to work with C compilation.
Does that make it C-based? (Again irrelevant. Does it solve your
problem satisfactorily? And are there better alternatives, preferably
better portable alternatives?)

Specifically, what is "C-based", and what makes that bad?
 

coal

On Apr 21, 10:18 pm, (e-mail address removed) wrote:
    [...]
I want to be able to return 1 from main() to indicate success
and 0 to indicate failure.
The C and C++ standards say that 0 must indicate success.  It's
not a question of make, or anything else for that matter---it's
what the standard requires.
I don't believe in blindly following the standard.
With this particular aspect I'll probably use ifdefs
so as to make it easy for those who use make.
Besides make being inflexible in this regard, it's
also another large C program.  I think further
development of make in C is a mistake.  As far as
long term strategies, I suggest people begin to
reduce their dependence on C-based make programs.
I don't have to change those.  I just need a build
tool that's more flexible in this regard.  

Now, you can do one of two things.
1- You can accept the universal convention that a process error return
of 0 means success and nonzero is an error code, or
2- you can cause yourself great pain.

Why do you need this flexibility?

To write the software as best I'm able. I'm not going
to let a tool dictate that up is down and down is up.
You do realize that if you have a silly build tool which breaks this
convention, and you need to invert the return code to appease make,
then it's as easy as the following with the bourne shell:
    if silly_nonconforming_command; then false; else true; fi
Equally obvious solutions exist for all shells.


You say "C-based" as though it's a bad thing. You obviously have a
strong bias against C, and thus presumably you have a strong bias
against C++, so why post in a C++ forum?

I do have a bias against C in some cases. On the other
hand I think C++ is pretty good. Using C in the 80s or
90s made more sense than it does today.


Brian Wood
Ebenezer Enterprises
www.webEbenezer.net
 

joshuamaurice

On Apr 22, 1:40 pm, (e-mail address removed) wrote:
On Apr 21, 10:18 pm, (e-mail address removed) wrote:
    [...]
I want to be able to return 1 from main() to indicate success
and 0 to indicate failure.
The C and C++ standards say that 0 must indicate success.  It's
not a question of make, or anything else for that matter---it's
what the standard requires.
I don't believe in blindly following the standard.
With this particular aspect I'll probably use ifdefs
so as to make it easy for those who use make.
Besides make being inflexible in this regard, it's
also another large C program.  I think further
development of make in C is a mistake.  As far as
long term strategies, I suggest people begin to
reduce their dependence on C-based make programs.
Make incorrectly, in my opinion, interprets 1 to be a failure
and zero to be success.
Make normally adopts the conventions of the platform it is on.
Under Unix, 0 is success, and anything else is failure, and
there's no way you can change this, other than rewriting every
utility program on the system, all of the shells, the compilers,
etc., etc.
I don't have to change those.  I just need a build
tool that's more flexible in this regard.  
Now, you can do one of two things.
1- You can accept the universal convention that a process error return
of 0 means success and nonzero is an error code, or
2- you can cause yourself great pain.
Why do you need this flexibility?

To write the software as best I'm able.  I'm not going
to let a tool dictate that up is down and down is up.

You still claim that 0 is an unnatural return code for success. It's
not just make. It's everything. In Linux there are two utility
executables (or shell built-ins) called "true" and "false". "true"
returns code 0, and "false" returns code nonzero. The shell operators
&& and || on all shells I know, including Windows shells, operate with
these definitions of true and false. true && echo 1 will print 1, and
false && echo 1 will not. Almost all real executables are written with
this convention. It's also the sensible thing to do. If you made 0
failure and nonzero success, then you would be limited to 1 error
return code. Generally there are many ways of failing but a single way
of succeeding. That's (probably ??) why 0 was picked for success and
the rest of the return codes were reserved for failure.

Yes, this is in direct contradiction with the C rule that nonzero is
convertible to true and zero is convertible to false. I suggest you
accept this lack of consistency between automatic conversion rules of
C and process exit code convention.

Also, I'm still waiting for your explanation what "C-based" is, and
why this warrants not using GNU Make.
 

red floyd

You still claim that 0 is an unnatural return code for success. It's
not just make. It's everything. In Linux there are two utility
executables (or shell built-ins) called "true" and "false". "true"
returns code 0, and "false" returns code nonzero. The shell operators
&& and || on all shells I know, including Windows shells, operate with
these definitions of true and false. true && echo 1 will print 1, and
false && echo 1 will not.

Old style C shell had it ass-backward; I wonder if the OP is using a
derivative thereof?
 

James Kanze

That is not completely true. There is a somewhat smaller
granularity. A target can depend on an object in an archive.
E.g.:
Example.exe: Library.a(Object.o)

Yes, but the objects in an archive are files.

Most makes also have some sort of means for getting files from
the source code control system, as well.
(But this may be different for other versions of make.)

IIRC, it was present in the first version of make I saw (Unix
version 7).
I have a script to parse the linker map to create these
dependencies. I hope that someone will point to a linker flag
that does it for me.

I'm not aware of one. But you're right, it would allow a lot
less rebuilding, if you have access to the information.
 

joshuamaurice

The paper is a good example of someone writing about something
he doesn't know or understand.  But that's not the point.  The
smallest granularity make can deal with is the file---a file has
been modified, or it hasn't.  It can't determine whether the
modification will require recompilation or not, because it
doesn't know anything about the files contents.  So if you
modify an inline function in a header, all sources which include
that header (directly or indirectly) will be recompiled, and not
just the sources which use the inline function which was
modified.

I agree that it may be possible to construct a better build system
that knows the C++ language to give better incremental builds than GNU
Make. However, what's wrong with that paper?
http://aegis.sourceforge.net/auug97.pdf
(Well, besides the "consider harmful" in the title.) Perhaps my
understanding is lacking as well, and I would like to hear differing
opinions.
 

James Kanze

Can you elaborate on that? That looks interesting.

It would take a bit more than would be acceptable here
(especially as it's off topic), but see section 8.8 in the GNU make
manual (http://www.gnu.org/software/make/manual/make.html). For
a bit of an example (but with a lot cut) from my makefiles:

define makeLocalBinary
$(call derivedNameList,$(6),component,binary,$(1)) : $(call derivedNameList,$(6),component,object,$(2)) $(3)
	mkdir -p $$(dir $$@)
	$$(call buildExec,$$@,$(call derivedNameList,$(6),component,object,$(2)) $(3),$$(call cppFlags,$(6)) $(cppLinkOptions),$(4),$(5),$(call derivedNameList,$(6),component,map,$(1)))
endef

# ...

makeBinaryFromSources = \
    $(if $(filter export,$(3)), \
        $(eval $(call makeExportedBinary,$(1),$(2)))) \
    $(eval $(call makeLocalBinary, \
        $(1), \
        $(call objectNames,$($(strip $(1))Sources)), \
        $(call librariesForComponentBinary,$(1),$(2),$(3)), \
        , \
        , \
        $(2))) \
    $(foreach s, \
        $($(strip $(1))Sources), \
        $(eval $(call makeLocalObject,$(2),$(s),Off)))

# ...

$(foreach b, \
    $(binaries), \
    $(call makeBinaryFromSources,$(b),$(targetVariant),export useLibraries))

It's still pretty experimental; there's a total of about 2300
lines of makefile. The above is in makefiles included from
component.mk---the user writes something like:

binaries = bina binb
binaSources = a1.cc a2.cc a3.cc
binbSources = b1.cc b2.cc b3.cc

include $(buildRoot)/Makefiles/component.mk

component.mk provides the standard targets install, run-test,
doc, and the usual variants of clean, and generates all of the
dependencies necessary to build bina and binb for the given
architecture, system and compiler.

For me, it's been an interesting excursion into functional
programming. But as you can see, make doesn't result in the
most readable code.
[snip]
The problem, probably, is that most linkers don't support
this. But I see you've found your work-around, the same way
I did for headers and VC++. (Now, of course, someone will
point out to me a simple flag I missed, which will cause
VC++ to output the dependencies like I want:). After all,
if the IDE can do it, the compiler must support it somehow.)
Something like /showincludes grepping on "Note: including
file:" on standard error output :)

Something that will generate at least a list of the included
files without outputting the complete preprocessor expansion or
compiling the code.

There are two philosophies here: always update the dependencies,
each time you compile, or have a special target for them. I
choose the second, because with g++, you can't generate the
dependencies in the same run as you compile: -M (or -MM) implies
-E. My dependencies don't change that often, and invoking the
compiler twice for each compilation seemed too expensive. At least,
that used to be
the case, or that's the way I remember it being. I see that
there is a -MD option now which doesn't behave like this. And
IIRC, there's also a way to get Sun CC to generate its
dependencies on the fly, so I'll have to look into this
possibility; it's obviously far safer to regenerate the
dependencies each time you recompile. (Separating the output of
VC++ might be a bit complicated. Obviously, I want the usual
messages to appear in the usual place, and only redirect the
ones used for dependencies. I know how to do this with AWK
under Unix, but rather obviously, Unix based tricks aren't much
help with VC++.)
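
For the record, a hedged sketch of that separation (/showIncludes and
/Zs are documented cl options; the sed pipeline is illustrative only
and assumes a Bourne shell with the usual Unix tools available; the
real cl-M.sh mentioned else-thread is more involved):

    cl /nologo /Zs /showIncludes foo.cpp 2>&1 |
        sed -n 's/^Note: including file: *//p' |
        sort -u

/Zs requests a syntax check only, so the include list is captured
without producing an object file.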
 

James Kanze

On Apr 22, 3:00 am, James Kanze <[email protected]> wrote:

[...]
Yes, it actually takes ~4 lines of makefile, not just 1. I
exaggerated, but not by much. And yes, it does require using
compiler support to do it in so few lines. (This is using
gcc 3.4.3.)

If you're only targeting one compiler, and that compiler
supports the collaboration, it can be very simple; in the case
of Sun CC and Sun's make (back when I was using it), a single
line of about 10 characters sufficed. If you're targeting
several compilers, some of which support it, and others not,
and those that support it do so in subtly different ways, you
need a bit more---both in the makefile, and in terms of scripts
which the makefile invokes. Especially in terms of scripts
which the makefile invokes. My target "depends" ends up
invoking the function compileToDependency. Which is just
compileToDependency = $(compile) -MM $(2) $(1)
for g++, but even for VC++:
compileToDependency = $(SHELL) $(makefilesDir)/$(configPath)/cl-M.sh '$(2)' $(1)
Of course, cl-M.sh is about 30 lines of Bourne shell (which
supposes that $(SHELL) is set to something which corresponds to
a shell capable of handling Bourne shell---bash or ksh work
fine---and that this shell has access to some of the usual Unix
tools: sed, egrep and sort, at least).
Here's a snippet from such a make system I've been playing
around with on the side as a demonstration to my company that
GNU Make is superior to Maven.

Not knowing Maven, I can't say, but GNU make is very, very
powerful. Like C++, however, it suffers from its
source---neither C nor the original make are very good bases for
evolution.
 

James Kanze

On Apr 21, 10:18 pm, (e-mail address removed) wrote:
[...]
I want to be able to return 1 from main() to indicate
success and 0 to indicate failure.
The C and C++ standards say that 0 must indicate success.
It's not a question of make, or anything else for that
matter---it's what the standard requires.
I don't believe in blindly following the standard.

The problem here is that you don't really have a choice, unless
you reimplement all of the tools in the OS (shell, make, etc.).
They all use this convention.
 

Fred Zwarts

James Kanze said:
There are two philosophies here: always update the dependencies,
each time you compile, or have a special target for them. I
choose the second, because with g++, you can't generate the
dependencies in the same run as you compile: -M (or -MM) implies
-E.

I use -MD, which does not imply -E. In this way objects and dependency
files are created in one run of the compiler.
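
To make that concrete, a minimal sketch (assuming a flat directory of
.cpp files; -MD and -MP are documented gcc options):

SRCS := foo.cpp bar.cpp
OBJS := $(SRCS:.cpp=.o)

all: $(OBJS)

# One compiler run produces both foo.o and foo.d.
%.o: %.cpp
	g++ -MD -MP -c -o $@ $<

# Pull in the generated dependency files; the leading "-" tells make
# to ignore any that don't exist yet (e.g. on a clean build).
-include $(SRCS:.cpp=.d)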
 

James Kanze

On Apr 22, 3:39 pm, (e-mail address removed) wrote:

[...]
You still claim that 0 is an unnatural return code for success.

Success and failure aren't numeric values, so in a certain
sense, no numeric value is natural for them. Given that we're
using a numeric value, however (and don't have a choice, since
main returns an int), any choice is purely arbitrary---there's
no "natural" involved.
It's not just make. It's everything. In Linux there are two
utility executables (or shell built-ins) called "true" and
"false". "true" returns code 0, and "false" returns code
nonzero. The shell operators && and || on all shells I know,
including Windows shells, operate with these definitions of
true and false. true && echo 1 will print 1, and false && echo
1 will not. Almost all real executables are written with this
convention. It's also the sensible thing to do. If you made 0
failure and nonzero success, then you are limited to 1 error
return code.

And what if you make odd success and even failure (as did VMS, I
think)? (In this case, of course, if you return 0 from main,
the runtime library must map it to an odd value before returning
it to the system.)
Generally there are many ways of failure but a single way of
success. That's (probably ??) why 0 was picked for success and
the rest of the return codes were reserved for failure.

There's often more than one way of success, as well.
Yes this is in direct contradiction with the C rule that
nonzero is convertible to true and zero is convertible to
false.

Interestingly enough, I've also used languages where odd was
true, and even false. (In other words, only the least
significant bit was used for boolean operations.) There's
something satisfying in the idea a random value has an equal
chance of being true or false.
I suggest you accept this lack of consistency between
automatic conversion rules of C and process exit code
convention.

I fail to see the lack of consistency. In one case, we're
mapping ("converting") an integral value to an enum { success,
failure }. In the other we're converting it to a boolean {
false, true }. In both cases, rather than truncating, or
treating the value modulo n, we just convert any out of range
value to the last element. It seems very, very consistent to
me. What isn't consistent is the shell's mapping of success to
true, rather than false. But that's really only inconsistent
*if* you consider the underlying representation, which is really
irrelevant---the mapping is arbitrary, since you are mapping
between two unrelated concepts.

Of course, if you're writing portable production code, you don't
return 0 and 1 from main, you return EXIT_SUCCESS and
EXIT_FAILURE. (I actually use Gabi::success, Gabi::warning,
Gabi::error and Gabi::fatal. Which are mapped in a system
dependent way, but by default, if the system is unknown, the
first maps to EXIT_SUCCESS, and the others to EXIT_FAILURE.)
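
To spell out the portable form in C++ (a minimal sketch; the ok flag
is just a stand-in for whatever result the program really computes):

#include <cstdlib>

int main()
{
    // Stand-in for the program's real work.
    bool ok = true;
    return ok ? EXIT_SUCCESS : EXIT_FAILURE;
}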
 

James Kanze

I agree that it may be possible to construct a better build
system that knows the C++ language to give better incremental
builds than GNU Make. However, what's wrong with that
paper? http://aegis.sourceforge.net/auug97.pdf (Well, besides
the "consider harmful" in the title.) Perhaps my understanding
is lacking as well, and I would like to hear differing
opinions.

Just that the author doesn't know anything about how to use
make. Or software engineering, for that matter:

 -- It is very hard to get the order of the recursion into
    the sub-directories correct. This order is very unstable
    and frequently needs to be manually "tweaked."
    Increasing the number of directories, or increasing the
    depth in the directory tree, cause this order to be
    increasingly unstable.

This is a basic software engineering issue. If you don't manage
your dependencies at this level, you're screwed. Recursive make
or not. But I've never seen an organisation so badly run that
this is a problem (and I've seen some pretty bad organisations
in my time).

 -- It is often necessary to do more than one pass over the
    sub-directories to build the whole system. This,
    naturally, leads to extended build times.

This is related to the above. Again, I've never seen any
software so badly organized where this was the case. (Take a
look at any of the open source projects. They all use recursive
make, and none of them have this problem.)

 -- Because the builds take so long, some dependency
    information is omitted, otherwise development builds
    take unreasonable lengths of time, and the developers
    are unproductive. This usually leads to things not
    being updated when they need to be, requiring frequent
    "clean" builds from scratch, to ensure everything has
    actually been built.

If you're using make correctly, dependencies are generated
automatically (see else thread). So nothing will ever be
missing. And the only times I've seen the need for clean builds
due to mishandling of dependencies was due to template
instantiations in the repositories not getting rebuilt when
needed---a problem with the compiler/linker, not with the make
and its dependencies.

Basically, the paper argues that if you screw up your project
management beyond belief, recursive make doesn't work. Which is
true, but then, nor does anything else.
 

James Kanze

Sun's dmake for example.

For example :). For that matter, it shouldn't be too difficult
to add to GNU make. (The real problem isn't so much adding the
logic to distribute the make---fundamentally, once the
asynchronism is present, even if only to execute commands in
different threads, the necessary program logic is present. The
problem is to define and implement the servers for the other
processors, and the protocol to converse with them.)
 

joshuamaurice

Just that the author doesn't know anything about how to use
make.  Or software engineering, for that matter:

 -- It is very hard to get the order of the recursion into
    the sub-directories correct. This order is very unstable
    and frequently needs to be manually "tweaked."
    Increasing the number of directories, or increasing the
    depth in the directory tree, cause this order to be
    increasingly unstable.

This is a basic software engineering issue.  If you don't manage
your dependencies at this level, you're screwed.  Recursive make
or not.  But I've never seen an organisation so badly run that
this is a problem (and I've seen some pretty bad organisations
in my time).

It's a problem at a company I've worked for. A large source tree of
~21,000 source files, and lots of levels of recursive make. They
didn't do any header file dependencies. Change only a header file and
hit make and nothing would happen.

Let's consider the root makefile which includes or invokes make on 50
sub-directories, each making a dll, something I saw at that company.
Now, suppose I modify some source of one of those dlls to use
something from one of the utility dlls. I've introduced a dependency
between the two. However, they were being built in the wrong order.
When I did a build, it worked, but when someone else did a build from
clean on this, the build failed because the order of building the dlls
was wrong.

I suppose you might argue that I should have "managed my dependencies
better". I argue that I shouldn't have to manage dependencies at such
a low level, so manually, and with no diagnostic that I'm doing it
wrong other than a failed build from clean.
 -- It is often necessary to do more than one pass over the
    sub-directories to build the whole system.  This,
    naturally, leads to extended build times.

This is related to the above.  Again, I've never seen any
software so badly organized where this was the case.  (Take a
look at any of the open source projects.  They all use recursive
make, and none of them have this problem.)

I have. See above.
 -- Because the builds take so long, some dependency
    information is omitted, otherwise development builds
    take unreasonable lengths of time, and the developers
    are unproductive.  This usually leads to things not
    being updated when they need to be, requiring frequent
    "clean" builds from scratch, to ensure everything has
    actually been built.

If you're using make correctly, dependencies are generated
automatically (see else thread).  So nothing will ever be
missing.  And the only times I've seen the need for clean builds
due to mishandling of dependencies was due to template
instantiations in the repositories not getting rebuilt when
needed---a problem with the compiler/linker, not with the make
and its dependencies.

Basically, the paper argues that if you screw up your project
management beyond belief, recursive make doesn't work.  Which is
true, but then, nor does anything else.

You exaggerate the competency of companies and their knowledge of
make.

At that company (a very successful large company, might I add), I
regularly saw build breakages because someone did a build not from
root attempting to bypass the slow build, and missing a vital piece
which depended upon their change. It was checked into source control,
and broke the streaming build.

Any sort of header file change resulted in the developer guessing
which source folders would be affected, going there, cleaning and then
building. Hopefully they didn't miss any, otherwise you just compiled
an ill-formed program which violates the ODR, and playing with that in
the debugger is most fun.

It's quite liberating to be able to just hit "make" at the root level
of a source directory structure with ~20,000 source files and have
make do it all with only ~15 seconds of overhead.
 
