
  • Thread starter Alf P. Steinbach /Usenet

Ian Collins

Yes, I guess not /everyone/ in the project has to know /all/ about the
build system. But I also think it's dangerous to delegate build and
SCM to someone, especially someone who's not dedicated 100% to the
project, and who doesn't know the code itself.

I guess you've never worked on a multi-site project using Clear Case!
You get problems like people not arranging their code to support
incremental builds. For example, if all object files have a dependency
on version.h or a config.h which is always touched, incremental builds
don't exist.

A good slap round the head normally solves that problem.
 

Jorgen Grahn

Second, make's model is fundamentally fubar. You cannot have a fully
correct incremental build written in idiomatic make ala Recursive Make
Considered Harmful. See else-thread, or the end of this post for a
synopsis. Make was good back in the day when a single project did fit
into a single directory and a single developer knew all of the code,
but when a developer does not know all of the code, make's model no
longer works.

Simply put, this is my use case which make will not handle. I'm
working in a company on a project with over 25,000 source files in a
single build. The compile / link portion takes over an hour on a
developer machine, assuming no random errors, which there frequently
are on an incremental build.

There's something suspect in that sentence, since incremental builds
vary in the time they take, from a few seconds (to check file
timestamps) and up.
I work on one of the root components, a
reusable module which is used by several services (also part of the
same build). It is my explicit responsibility to do a decent effort at
not breaking the build from any checkin. As the closest thing my
company has to a build expert, I know that the build is not
incrementally correct. I hacked a large portion of it together. I can
do an incremental build most of the time, and just cross my fingers
and hope that it's correct, but I have no way of knowing it.

So why not fix the Makefile? You cannot expect Make to do something
sensible when given incorrect instructions. Unless you refer to the
loopholes you list below (see there).
On the bright side, I manage to not break it most of the time.
However, with a large number of developers working on it, the last
build on the automated build machine is almost always broken. On an
almost weekly basis checkin freezes are enacted in an attempt to
"stabilize" the build. The problem is that most other developers are
not as thorough in their own testing as I, and the automated build
machine takes hours to do the full clean recompile. The time from a
checkin to a confirmed bug is quite large, and as the number of build
breakages goes up, so does this turnaround time as compile failures
hide compile failures.

Yes, I know the standard solution is to break up the project into
smaller projects. I would like that too. However, I'm not in a
position of power to make that happen, and no one else seems
interested in changing the status quo there.

Been there, done that. I think it would be harmful. If you have
informal sub-projects which break each other today, actually splitting
them would just force you to handle dependencies manually. "Let's
see, if I do this change in Foo, Bar and Baz need to be updated. So
Bar 1.5 now needs Foo 1.4; I must remember to tell everyone ..."

You can e.g. look at what the Debian Linux project does: they spend
much of their time managing such dependencies, and it doesn't look
like a lot of fun.

If the root components were stable (their interface rarely changed) it
would work, but then it would work in your current setup too.

....
Pretty cool system. I would still argue no, that there is a difference
between what I want and what your system handles. As I mentioned else-
thread, no build system is perfect. The line must be drawn somewhere.
At the very least, the correctness of an incremental build system is
conditional on the correctness of the code of the build system itself.
Moreover, if the developer installs the wrong version of the build
system, then he's also fubar.

Now you're talking sense again!

....
The discussion at hand is that make is broken. Not being
able to handle multiple outputs from a single step is annoying, but
it's relatively minor. Its major problems are:
1- Its core philosophy of a file dependency graph with cascading
rebuilds without termination, combined with its idiomatic usage, is
inherently limited and broken without further enhancement.
- a- It will not catch new nodes hiding other nodes in search path
results (such as an include path).

Solution: don't feed multiple -I (include path root) statements to the
compiler. #include <my_components/foo/bar.h> is better than #include
<bar.h>. Or you can do -Imy_components and #include <foo/bar.h> if you
want to -- just try to make the #include statements unambiguous.

Fixing that in an existing code base can be very time consuming.
I've done it once in a 1000+ file code base, and it took a few days.
Hard to explain to people, too.
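
For concreteness, a minimal sketch of that convention in a makefile
($(SRC_ROOT) and the paths are invented for illustration):

  # One include root; every #include spells the full path below it.
  CPPFLAGS = -I$(SRC_ROOT)/include   # the only -I flag
  #
  #   #include <my_components/foo/bar.h>   <-- unambiguous
  #   #include <bar.h>                     <-- depends on -I search order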

Or, issue "make clean" whenever a new include file shows up. Either
you add the file yourself, or you see it show up when you update from
revision control. (If things can change without you knowing it, you
have even bigger problems and need to adjust how you work with
revision control too).
- b- It will not catch removed nodes nor edges which should trigger
rebuilds (such as removing a cpp file will not relink the library).

Solution: issue "make clean" when files are removed; detection as above.
- c- It will not catch when the rule to build a node has changed which
should trigger a rebuild (such as adding a command line processor
define).

Solution: issue "make clean" when the Makefile is changed and you
haven't reviewed the change and seen that it's obviously harmless.
- d- It will not get anywhere close to a good incremental build for
other compilation models, specifically Java. A good incremental Java
build cannot be based on a file dependency graph. [...]

Well, I don't do Java ;-) Everything I use fits the make model. And
that's not just C and C++ compilers; I have Make control dozens of
different tools.

Like you said above, "the line has to be drawn somewhere". Or, as the
saying goes: "all software sucks". I claim that while make has
loopholes, (a) you can easily see when an incremental build may be
incorrect and fix it with a 'make clean'; (b) this doesn't happen very
often, and the vast majority of builds can be done incrementally.

Since I don't see any better alternatives (and you seem to think the
existing "make replacements" share its flaws) I am relatively happy.
2- It exposes a full Turing-complete language to common developer
edits, and these common developer edits wind up in source control.

Just like C++. I commented on that above.

But I think you exaggerate the changes a "common developer" makes.
Those are almost always of the form of adding or removing lines like

libfoo.a: fred.o

and if someone doesn't get that, maybe he shouldn't edit the C++ code
either.

I am assuming here that someone designed the Makefile correctly
originally; set up automatic dependency generation and so on. Lots of
projects fail to do that -- but lots of projects fail to use C++
properly too.
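
For reference, the usual way to set up automatic dependency generation
with GNU make and a gcc-compatible compiler (a sketch, not anyone's
actual makefile; recipe lines must be tab-indented):

  # Each compile also writes a .d fragment listing the headers it
  # actually included; -MP adds phony targets so a deleted header
  # doesn't break the build. OBJS is assumed to list the objects.
  %.o: %.cpp
          $(CXX) $(CXXFLAGS) -MMD -MP -c -o $@ $<

  # Pull in the generated fragments; missing ones are ignored.
  -include $(OBJS:.o=.d)
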
The
immediate conclusion is that a build system based on idiomatic make
can never be incrementally correct over all possible changes in source
control. That is, a developer will inevitably make a change which
breaks the idiomatic usage (what little there is) and will result in
incremental incorrectness.

So review the changes and reprimand him. Same as with the C++ code,
but *a lot* easier.
False negatives are quite annoying, but
perhaps somewhat acceptable. False positives are the bane of a build
system's existence, but they are possible when the build system itself
is being constantly modified \without any tests whatsoever\ as is
common with idiomatic make.

That *is* scary, but I see no way around it.

/Jorgen
 

Keith H Duggar

I must say this is somewhat surprising to hear in a C++ forum. I
expected most people to still be under the incremental spell.
Incremental really is not that hard to achieve. It's just that no one
has really tried as far as I can tell. (Specifically under my
definition of "incremental correctness" which by definition includes
the associated build scripts as source, though I think even ignoring
build script changes, all common, purported incremental, build systems
are not incrementally correct under idiomatic usage.)

Yes people have tried and have succeeded and it was not hard.
A few simple scripts and conventions to augment make suffice.
I've already told you this (and given you one example that you
could/should have expanded on).

Your problem is not one of intellect rather it is an attitude
problem and a problem of self-delusion. You are operating on
several false assumptions:

1) That you are a master of build /systems/ (make is a tool
not a system by the way).

2) That make is a fundamentally fubar, flawed, horrific,
broken, etc /tool/ that cannot serve as a core component
of an incrementally correct build /system/.

3) That the problems make does have are insurmountable
without major overhaul.

4) That the incompetence and social problems of your
workplace are relevant to the correctness of make.

All of the above are false but you labor with them as truths.
They are holding you back! If you would stop ranting and whining
and start thinking and scripting your problems would start to
evaporate.

Even the fact that I'm telling you it is /possible/ to solve ALL
the problems you have outlined and indeed can be done with simple
scripts + make (at least with C++ projects, I can't comment on
Java), that alone should be a boon to you if you could only get
past those flawed preconceptions.

KHD
 

Maxim Yegorushkin

On Jul 20, 7:39 am, Jorgen Grahn<[email protected]> wrote:
[]
In short, make is /one/ tool, a dependency analysis tool,
that is /part/ of a build system (called Unix). Learn to use
the full suite of tools instead of searching for a "One True
Uber Tool" monolith. Remember the Unix paradigm of "many small
tools working together to solve big problems".

Of course, there are some actual problems with make. A few
have been mentioned in other posts. Another is proper handling
of a single command that outputs multiple targets which is,
well let's say annoying ;-), with make.

Interesting. I'm sure there's some logical fallacy in here somewhere,
but I don't know the name(s). In short, you assert that the Unix way
works, and is better than other ways, especially better than "One True
Uber Tool monolith". I'd rather not get into this discussion, as it's
mostly tangential. The discussion at hand is that make is broken. Not
being able to handle multiple outputs from a single step is annoying,
but it's relatively minor.

I find it little known, but make does handle the case of multiple output
files when using pattern rules. See example 3 on
http://www.gnu.org/software/make/manual/make.html.gz#Pattern-Examples
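
The idea, roughly as in that manual example (a sketch):

  # A pattern rule with several targets tells make that a single
  # run of the recipe produces all of them.
  %.tab.c %.tab.h: %.y
          bison -d $<
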
> Its major problems are:
1- Its core philosophy of a file dependency graph with cascading
rebuilds without termination, combined with its idiomatic usage, is
inherently limited and broken without further enhancement.

I've read the thread but could not find anything that proves the above
statement.
- a- It will not catch new nodes hiding other nodes in search path
results (such as an include path).

True. Not sure if it is good practice, though.
- b- It will not catch removed nodes nor edges which should trigger
rebuilds (such as removing a cpp file will not relink the library).

When you remove a source file you end up updating a makefile. In a
robust system, changes to makefiles trigger a rebuild. In an in-house
system I built, different aspects of building (compiling, linking) are
put in different makefiles, so that changes to one makefile trigger
only a relink, and changes to others a recompilation.
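
A minimal sketch of that arrangement (file and variable names
invented): objects depend on the makefile holding the compile flags,
the link target on the makefile holding the link flags, so touching
one triggers only the corresponding work.

  include compile.mk   # CXXFLAGS and such live here
  include link.mk      # LDLIBS and such live here

  %.o: %.cpp compile.mk
          $(CXX) $(CXXFLAGS) -c -o $@ $<

  app: $(OBJS) link.mk
          $(CXX) -o $@ $(OBJS) $(LDLIBS)
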
- c- It will not catch when the rule to build a node has changed which
should trigger a rebuild (such as adding a command line processor
define).

It will if your .o files depend on the makefiles, which they should.
- d- It will not get anywhere close to a good incremental build for
other compilation models, specifically Java. A good incremental Java
build cannot be based on a file dependency graph. In my solution, file
timestamps are involved yes, but the core is not a file dependency
graph with cascading rebuilds without termination conditions.

Could you elaborate please?
2- It exposes a full Turing-complete language to common developer
edits, and these common developer edits wind up in source control. The
immediate conclusion is that a build system based on idiomatic make
can never be incrementally correct over all possible changes in source
control. That is, a developer will inevitably make a change which
breaks the idiomatic usage (what little there is) and will result in
incremental incorrectness. False negatives are quite annoying, but
perhaps somewhat acceptable. False positives are the bane of a build
system's existence, but they are possible when the build system itself
is being constantly modified \without any tests whatsoever\ as is
common with idiomatic make.

One often-used tool in GNU make is macro calls, $(eval $(call ...)),
with which you can hide all the complexity from the average developer.
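
A sketch of the idiom (the macro name is invented): the macro is
written and vetted once, and developer makefiles only instantiate it.

  # Vetted once by the build maintainer:
  define add-static-lib
  $(1): $(2)
          $$(AR) rcs $$@ $(2)
  endef

  # All a developer ever writes:
  $(eval $(call add-static-lib,libfoo.a,foo.o bar.o))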
 

Keith H Duggar

I find it little known, but make does handle the case of multiple output
files when using pattern rules. See example 3 on http://www.gnu.org/software/make/manual/make.html.gz#Pattern-Examples

Yeah, the only problem is that the targets must match a pattern.
Often they don't. There are other "solutions" too, but all of
them have annoying limitations. This is one thing that is tough
(in the general case) to get around in make. (Or at least I have
not figured out a nice way.) These other "problems" the OP brings
up are easier to eliminate in a nice and useful way.
 > Its major problems are:


I've read the thread but could not find anything that proves the above
statement.

Because it is false.
True. Not sure if it is good practice, though.

Handled with a very simple script.
When you remove a source file you end up updating a makefile. In a

Not necessarily. My make files do not change when a file is
removed (unless those files are involved in special "override"
functionality we have). Furthermore, a simple script (closely
related to the script that solves -a-) triggers the necessary
updates.
robust system changes to makefiles trigger a rebuild. In an in-house
system I built different aspects of building (compiling, linking) are
put in different makefiles, so that changes to one makefile only trigger
a relink, to others - a recompilation.

A more sophisticated system (still make based!) can be smarter.
In addition, as mentioned, the makefiles of a sophisticated make
system might change very rarely (if ever) anyhow. Simple actions
like adding/removing source files will not change the make files.
It will if your .o files depend on the makefiles, which they should.

And factoring make files and/or using environment variables
together with a smart script which examines the environment
helps to minimize impact.
Could you elaborate please?


One often-used tool in GNU make is macro calls, $(eval $(call ...)),
with which you can hide all the complexity from the average developer.

There are many additional ways to simplify the makefiles that are
rarely (if ever) touched by "common" users. Besides, such changes must
be reviewed by a senior anyhow (unless your shop sucks or is in
too much of a hurry to practice software /engineering/).

KHD
 

Öö Tiib

First, let me note that the post to which you are replying only argued
that he basically did not test the incremental build system. I was

Sure. Sorry. He seemingly did use clean builds as well. With clean
builds there may also be issues, but these are less common and false
positives are rare.
quite explicit in this. However, else-thread, I made arguments in
favor of incremental. Let me add a couple new replies here.
Some companies don't have those build farms. For starters, they're
expensive. I also disagree that build farms are the "simpler"
solution. Maintaining a build farm is a huge PITA. Just maintaining
automated build machines for the dozen or so platforms my company
supports for emergency bug fixes, hot fixes, and daily ML builds takes
the full time employment of 3+ people.

Experiences differ; I have not observed such a PITA. The software for
distributed building is perhaps different, the maintainers are
different, the sizes are different, and so the budgets are different.
Bigger problems need a bigger baseball bat to deal with them. Usually
solving bigger problems also raises the money to buy bigger sticks.
Finesse and clever details are the wrong way to go there. The bigger
it is, the more robust and the more oriented to power (not dexterity)
it should be.
The simpler solution is to have
the developer build everything himself. A lot less moving parts. A lot
less different versions running around. A lot less active Cruise
Control instances. Is building it all in one giant build more
sensible? Perhaps not; it depends on the situation, but a build farm
is definitely not simpler than building everything yourself in a giant
build.

Individual developers (or sub-teams) work on a single module. He/they
can build it separately using a diagnose/build/test/deploy system
dedicated to the module. Only when he/they move work results into
some main repository branch does it need to be integrated with the
efforts of other developers/teams. Other teams can write protective
tests against alien modules to give fast feedback when they are unable
to integrate with things thrown at them.
Now, correct incremental. It's relatively clear to me that this is
more complex than a full clean build every time, aka not simpler. Is
it simpler or more complex than a build farm? I don't know. I would
think that a correct incremental build system is actually simpler than
a build farm. A build farm really is complex and takes effort to
maintain, but a correct incremental build system would only have to be
written once,  or at least each different kind of task would only have
to be written once, amortized over all companies, whereas a build farm
would probably have a lot more per-company setup cost.

I am not even sure why you say it. Building modules is only a tiny bit
of the computer time that a full build takes. If incremental versus
clean means 5 times quicker, then just that tiny thing gets tinier.
The argument made was that he did not test the incremental build
system to any significant degree. He argued that he did test the
incremental build system with "continuous clean builds". He is in
error, and you are arguing a non sequitur.

Ok. He did test the clean build system, so he did not test the
incremental build system.
Also, no one uses incremental builds? Why do we even use Make anymore
then? I suppose that Make has some nifty parallelization features, so
I guess it has some use if we ignore incremental.

Yes. A makefile is a script in a flexible programming language. It is
a sort of tradition, I suppose, to use that language for scripts that
analyze dependencies and build software.
I must say this is somewhat surprising to hear in a C++ forum. I
expected most people to still be under the incremental spell.
Incremental really is not that hard to achieve. It's just that no one
has really tried as far as I can tell. (Specifically under my
definition of "incremental correctness" which by definition includes
the associated build scripts as source, though I think even ignoring
build script changes, all common, purported incremental, build systems
are not incrementally correct under idiomatic usage.)

Sure, I think I explained. Incremental makes sense when the time you
win is something that matters. Then it is good. That is usually so
when debugging the product of a small project. "Small" may be a
project for building a tool for a bigger project, which can then be
integrated into the bigger project as an external dependency or
satellite co-product.

However, now consider such a build: let's say the target of one (of
several) build systems is Win32. Production binaries are compiled with
Intel's compiler, but the build system also compiles the same code
with g++ and msvc just to collect the diagnostics produced by those
compilers. An incremental build will then defeat the purpose of
compiling with the other two compilers, because you do not get the
full set of diagnostics. You also do not get the full set of
diagnostics from Intel (so part of the reason for building with it is
lost as well), but from there you at least get modules.
Again, arguing a non sequitur. I would love to have a discussion of if
we should have clean builds, but you reply as though I was making the
argument in that quote that we should have incremental build systems.
I was not. I was very clear and explicit in that post that I was
arguing solely that he did not test the incremental build system for
correctness in any significant way.

Sorry there, then.
Again, to change topics to your new point, why should we have correct
incremental builds? Because it's faster, and componentizing components
might not make sense, and it might be more costly to the company than
a correct incremental build system, especially when the cost of the
incremental build system can be amortized over all companies.

Componentizing always makes such great sense to me, from so many
different angles, that it is a somewhat holy thing for me. I believe
in it: testability, reusability, maintainability, etc. Whether you use
components statically or dynamically in your end product is an
entirely different issue.
Think about it like this. It's all incremental. Splitting it up into
components is one kind of incremental. It's incremental at the
component level. However, the benefit of this can only go so far.
Eventually there would be too many different components, and we're
right in the situation described in Recursive Make Considered Harmful,
the situation without automated dependency analysis. Yes, we do need
to break it down at the component level at some point. It's not
practical to rebuild all of the linux kernel whenever I compile a
Hello World! app, but nor is it practical to say componentization
solves all problems perfectly without need of other solutions like a
parallel build, a distributed build, build farms, faster compilers,
pImpl, and/or incremental builds.

I am not saying components solve everything ... just that they help
greatly and simplify life from numerous angles. As for Make ... is it
not itself fully automated dependency analysis?
Yes, a build system does everything "automatically" if you do a full
clean build every time, then it is handled automatically. Well, except
it's slow.

It is slow because it runs all the static tools on the code, runs
class-level unit tests, (X) builds modules, runs module-level unit
tests, then builds the product (and possibly its installers), deploys
it to places, and runs product-level automatic tests there. It also
produces reports about everything. Why say that it is slow because it
does not use an incremental build at spot (X)? Does an incremental
build speed it up so much that it matters?
And if there's a lot of dependency components which are
frequently changing, and you have to manually get these extra-project
dependencies, then we're in Recursive Make Considered Harmful. If
instead you use some automated tool like Maven to download
dependencies, and you do a full clean build, aka redownload, of those
every time, then it's really really slow. (I'm in the situation now at
work where we use Maven to handle downloading a bazillion different
kinds of dependencies. As Maven has this nasty habit of automatically
downloading newer "versions" of the same snapshot version, it's quite
easy to get inconsistent versions of other in-house components. It's
quite inconvenient and annoying. I've managed to deal with it, and
work around several bugs, to avoid this unfortunate default. Did I
mention I hate Maven as a build system?)

Isn't Maven for Java? Sorry, I have no experience; our Java teams have
their own processes, tools, and procedures.
Also, an automated build machine polling source control for checkins
can only tell you which checkin broke the automated build (and tests)
if your full clean build runs faster than the average checkin
interval. At my company, the build of the ~25,000 source file project
can take 2-3 hours on some supported systems, and that's without any
tests. The basic test suite add another 5-6 hours. As a rough guess, I
would imagine we have 100s of checkins a day.

~25,000 *files*, and splitting into components (and even fully
autonomous, useful subsystems) does not make sense? We are from
different universes, apparently. There should be a distributed version
control system, several repositories, and so on, I feel. Nothing to
say of modules and components. It also feels like you should have the
budget to grow such a build farm (if a build takes 2 hours) until it
takes 10 minutes. Of one thing I am sure: the problems of the
incremental build system are the smallest cause of issues there.

Probably I am not qualified enough to discuss such an ill situation
and good ways out of it. Continuously integrating millions of SLOC to
keep it as one big blob? Rewrite it, split it up into components, and
drop the integration cycle of components to 2 weeks at minimum?
 

Joshua Maurice

I guess you've never worked on a multi-site project using Clear Case!


A good slap round the head normally solves that problem.

Can you come slap mine please? We have two parallel build machines:
one full clean, one the hacked version of incremental I set up. Every
such incremental build changes version.h (and a couple of other
version files like version.java), updating the build number, with the
result that my incremental build tends to rebuild about 40% of all of
the code on every streaming incremental build because these version
files were changed. This was noted to management, but no time was
allocated to fix this.
 

Joshua Maurice

There's something suspect in that sentence, since incremental builds
vary in the time they take, from a few seconds (to check file
timestamps) and up.

Yes, looking over that, it is quite nonsensical. Sorry. I think I was
trying to say that the full clean build takes forever, and I
frequently hit errors during it because ours is especially a POS.
So why not fix the Makefile?  You cannot expect Make to do something
sensible when given incorrect instructions.  Unless you refer to the
loopholes you list below (see there).



Been there, done that.  I think it would be harmful. If you have
informal sub-projects which break each other today, actually splitting
them would just force you to handle dependencies manually. "Let's
see, if I do this change in Foo, Bar and Baz need to be updated. So
Bar 1.5 now needs Foo 1.4; I must remember to tell everyone ..."

  You can e.g. look at what the Debian Linux project does: they spend
  much of their time managing such dependencies, and it doesn't look
  like a lot of fun.

If the root components were stable (their interface rarely changed) it
would work, but then it would work in your current setup too.

Yes, I agree. Simply breaking up the components by fiat would, I
think, be harmful to the process. Instead, clearly defined interfaces
need to be put in place first, preferably with some decent acceptance
tests at every component level. Some people in my company just want to
componentize by fiat as though this will fix anything. Unfortunately,
such well-defined interfaces would be a paradigm shift. People would
actually have to plan ahead, get requirements earlier on, write
generic reusable interfaces instead of the feature-driven approach we
currently have, etc.
Solution: don't feed multiple -I (include path root) statements to the
compiler. #include <my_components/foo/bar.h> is better than #include
<bar.h>. Or you can do -Imy_components and #include <foo/bar.h> if you
want to -- just try to make the #include statements unambiguous.

  Fixing that in an existing code base can be very time consuming.
  I've done it once in a 1000+ file code base, and it took a few days.
  Hard to explain to people, too.

Or, issue "make clean" whenever a new include file shows up. Either
you add the file yourself, or you see it show up when you update from
revision control. (If things can change without you knowing it, you
have even bigger problems and need to adjust how you work with
revision control too).


Solution: issue "make clean" when files are removed; detection as above.


Solution: issue "make clean" when the Makefile is changed and you
haven't reviewed the change and seen that it's obviously harmless.

I might be amenable to this in practice. As mentioned else-thread,
"good" GNU Make usage has a lot of $(eval $(value ...)) stuff in it
(or $(eval $(call ...)) where the call directly contains a
$(value ...)). This was my first prototype at attempting incremental
correctness, but man it was slow. The file IO for all of the build
state files, the cat and echo calls, and all of the interpreted string
manipulation were killing it, especially on Windows. It was spending
over half of its time on an up-to-date build just reading and parsing
the saved state files. (The .d files, one might say.)
- d- It will not get anywhere close to a good incremental build for
other compilation models, specifically Java. A good incremental Java
build cannot be based on a file dependency graph. [...]

Well, I don't do Java ;-) Everything I use fits the make model. And
that's not just C and C++ compilers; I have Make control dozens of
different tools.

Unfortunately, my shop is moving towards Java, but we have a lot of
legacy C and C++ code which provides the engine for the Java tools, so
any usable incremental solution must handle Java. I think I hacked
together a solution which did at one point, but again the file IO and
interpreted string manipulation really killed the performance. I've
achieved much better performance just writing all of the logic in
actual compiled code instead of makefiles.
Like you said above, "the line has to be drawn somewhere". Or, as the
saying goes: "all software sucks". I claim that while make has
loopholes, (a) you can easily see when an incremental build may be
incorrect and fix it with a 'make clean'; (b) this doesn't happen very
often, and the vast majority of builds can be done incrementally.

Since I don't see any better alternatives (and you seem to think the
existing "make replacements" share its flaws) I am relatively happy.

I'm writing my own. See my first post in this thread, or one of the
earlier ones.
Just like C++. I commented on that above.

But I think you exaggerate the changes a "common developer" makes.
Those are almost always of the form of adding or removing lines like

  libfoo.a: fred.o

and if someone doesn't get that, maybe he shouldn't edit the C++ code
either.

I am assuming here that someone designed the Makefile correctly
originally; set up automatic dependency generation and so on. Lots of
projects fail to do that -- but lots of projects fail to use C++
properly too.

Well, mine is one of those projects which failed to do that. It also
uses recursive make.

However, it's not just a simple C++ project. We have Java, code
generation from a simple Java-like language to Java and C++ to support
serialization between the two, C++, other code generation for message
files, JNI, some Eclipse plugin build thingy, ?AWT?, and at least
half a dozen other kinds of build steps in the build. It also somehow
manages to throw Maven in. It really is a mess.

The unfortunate problem of this thread is everyone has their own
preferred solution. Some suggest $(eval $(value ...)) to deal with my
corner cases. Some suggest doing a make clean on every such corner
case. I think there was a serious suggestion to just do full clean
builds every time as well, and people are getting confused by my
replies, where I reply to idea A but they argue I'm wrong when
applied to idea B. Thus I'm trying to be overly pedantic in this
thread.

So, if edits are always like

  libfoo.a : fred.o

then we have my corner cases, which must be handled by email or some
other word-of-mouth or manual step, and this is a relatively horrible
state of affairs.

If the makefiles use file system wildcards, then we're looking a
little better. However, another corner case of mine sneaks in, which
again can only be handled by email, word of mouth, or manually
checking for it.
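
For instance, a sketch of the wildcard style (GNU make; names
invented), where adding or removing a .cpp file needs no makefile
edit:

  # Sources are discovered, not listed; stale outputs from removed
  # sources can still linger, which is one of those corner cases.
  SRCS := $(wildcard *.cpp)
  OBJS := $(SRCS:.cpp=.o)

  libfoo.a: $(OBJS)
          $(AR) rcs $@ $(OBJS)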

You could go the distance and abandon idiomatic make usage entirely
and use $(eval $(value ...)) to guarantee incremental correctness. All
developer makefiles would consist only of $(eval $(value ...)) and
would never specify a make rule. This was my initial route. It
could work, but man it was slow with GNU Make 3.81 on Windows. It's
also somewhat ugly, if I may say so, and I fear there would be a great
desire for a developer to just write his own one-off rule if the
situation arose, and anyone with sufficient power over him wouldn't
care. At least, that's how my company operates.

So, I took the approach where I wrote my own replacement system,
which is kind of like a bastard mix of Maven and Ant with the
implementation details of Make, to put all of the build logic in a
single place; developers can only instantiate build macros which have
otherwise been thoroughly reviewed and tested before being deployed
to developers.
So review the changes and reprimand him. Same as with the C++ code,
but *a lot* easier.

They'll argue that 95% incremental correctness is acceptable, just as
someone else has else-thread. If you allow a build system where the
developer can incorrectly specify a build script, but it works most of
the time, management will not see a need to spend developer time
fixing it. That's why I want it near impossible for a developer to be
able to break incremental correctness short of maliciousness.
That *is* scary, but I see no way around it.

I'm working on it. See else-thread for a description of my build
system.
 

Joshua Maurice

Yes people have tried and have succeeded and it was not hard.
A few simple scripts and conventions to augment make suffice.
I've already told you this (and given you one example that you
could/should have expanded on).

As I just mentioned in the previous post, conventions are insufficient
IMHO. If a system allows someone to add a new rule to make, such as
for a one-off build step, it's very hard to convince managers that
it's worth developer time to do it the right way if it'll work 99% of
the time. However, those 1% "add up", and the end result is my current
build system, which is horribly broken for incremental, and I suspect
it's the same for most other build systems of equivalent size.

Also, could you point me to publicly available implementations?
Your problem is not one of intellect rather it is an attitude
problem and a problem of self-delusion. You are operating on
several false assumptions:

   1) That you are a master of build /systems/ (make is a tool
      not a system by the way).

I'm not operating on that assumption. If I was, I wouldn't be posting
here asking for feedback and advice. I merely said I was above the
pack, which is quite evident at my company, but less so here. And
yes, make is a build system framework. You can implement many
different kinds of build systems with Make. I was trying to be clear
when I emphasized "idiomatic GNU Make usage ala Recursive Make
Considered Harmful", though I may have let that slip several times to
simply "make".
   2) That make is a fundamentally fubar, flawed, horrific,
      broken, etc /tool/ that cannot serve as a core component
      of an incrementally correct build /system/.

3) That the problems make does have are insurmountable
without major overhaul.

Can you elaborate further? Is it the existence of these scripts and
conventions that makes it not broken? I think that's still under
dispute. Moreover, this isn't an assumption of mine. I have spent a
great deal of text-space here in this thread clarifying the problems
and arguing that they are real problems.
4) That the incompetence and social problems of your
workplace are relevant to the correctness of make.

They're not? I live in a world where practicality matters. I live in a
world where politics and social pressures matter. I live in a world
where we like type safety and const correctness because developers are
not perfect, myself included. I live in a world where we use C++
instead of assembly.

I could not write an incrementally correct build on my first try. I
would need several iterations, a large test suite, etc. (which I'm in
the process of doing). If you have an academic solution, but it
doesn't work in practice, then it simply does not work.
All of the above are false but you labor with them as truths.
They are holding you back! If you would stop ranting and whining
and start thinking and scripting your problems would start to
evaporate.

Even the fact that I'm telling you it is /possible/ to solve ALL
the problems you have outlined and indeed can be done with simple
scripts + make (at least with C++ projects, I can't comment on
Java), that alone should be a boon to you if you could only get
past those flawed preconceptions.

So yes, you're advocating the $(eval $(value ...))-like approach where
all developers work in terms of the predefined, vetted GNU Make
macros. As I mentioned, this was my initial idea, my prototype, but I
threw it out for the reasons already mentioned.

Also, unfortunately my solution requires Java support.
 

Joshua Maurice

There are many additional ways to simplify the makefiles that are
rarely (if ever) touched by "common" users. Besides, such changes must
be reviewed by a senior anyhow (unless your shop sucks or is in
too much of a hurry to practice software /engineering/).

As already mentioned, mine is. Mine is in the business of making
money, not perfecting a GNU Make incremental build system. If I go to
a manager and say "Here's this problem. In 1% of incremental builds,
the result will be incorrect, and we'll lose X developer time over
it." Manager will say "Ok, how long will it take to fix? I can't just
send you on this academic exercise. If we do that, we'll lose
developer time overall, and we'll let feature Y miss the release." And
the manager would be right. The problem is that all of these little
problems in the build system pile up, each one not worth fixing, but
in the end the incremental build system becomes quite broken.

That's why I argue it's important to make it right by default, and
make it exceptionally hard to break the incremental build system.
 

Öö Tiib

I don't agree with that, 99% of my builds are incremental, often just
the one file I'm editing.

Yes, I meant producing the full fifty-module, thousand-file product
after pulling from the repository changes made by who knows whom and
who knows where. As for one file or even a module and its unit tests
and tools (what I usually work with), it usually takes a minute or so
for my PC to compile and run them all, so I can barely run for coffee
during such a build ... no difference whether it is incremental or
not. Incrementally building and linking the full product all day just
to see if a tiny change you made in one file did the trick in the
context of the full product feels a bit like voodoo programming and
hacking, and may throw all productivity out the window, I believe.
 

Keith H Duggar

As already mentioned, mine is. Mine is in the business of making
money, not perfecting a GNU Make incremental build system. If I go to
a manager and say "Here's this problem. In 1% of incremental builds,
the result will be incorrect, and we'll lose X developer time over
it." Manager will say "Ok, how long will it take to fix? I can't just
send you on this academic exercise. If we do that, we'll lose
developer time overall, and we'll let feature Y miss the release." And
the manager would be right. The problem is that all of these little
problems in the build system pile up, each one not worth fixing, but
in the end the incremental build system becomes quite broken.

That's why I argue it's important to make it right by default, and
make it exceptionally hard to break the incremental build system.

I'm not sure I understand what you meant by "mine is". Did you
mean "my (Joshua's) shop sucks"? Or "my (Joshua's) shop is in too
much of a hurry to practice software engineering"?

Anyhow, our goal is making money as well. Luckily we know (to
some extent at least) that haste can make waste and that cutting
corners eventually bites your ass. We learned the hard way.

Also, haven't you heard of working on the weekends and at night?
What does your employer think of all the time you are spending
whinging here in the newsgroup about make? Would they allow you
to spend such whining time on improving the build system instead?

On more than one occasion I've had a conversion like this with
my director:

KHD : By the way, I went ahead and implemented a solution this
last two weekends to problem XYZ that I've been complaining
about for the last month.
DIR : Hehe. Why did you do that?
KHD : I just couldn't stand screwing around with the hacks
any longer. They were breaking and wasting my time.
DIR : Ok. Cool. When will it be online?
KHD : Well, I have to wait until ABC has time to review it.
It's simple so it will only take him an hour or two.
DIR : Ok. Well tell him he can review it after he finishes
project PQR unless something else comes up.
KHD : Thanks. In the meantime I'll just keep hacking shit.

Step up to the plate, man. Be a leader and problem solver. Work
overtime to implement your ideas. Maybe this will help you out:


KHD
 

Ian Collins

So yes, you're advocating the $(eval $(value ...))-like approach where
all developers work in terms of the predefined, vetted GNU Make
macros. As I mentioned, this was my initial idea, my prototype, but I
threw it out for the reasons already mentioned.

There's more to make than GNU make. In some ways the "extensions" in
GNU make simply give the unwary more rope to hang themselves.
 

Ian Collins

Ok. He did test the clean build system, so he did not test the
incremental build system.

I thought I had explained clearly that incremental development tests
incremental builds!

You add a test or some code to pass a test, build, test. If you don't
get the expected result, something is broken. That something can either
be the new code, or the build (or on a really bad day, the compiler!).
 

Joshua Maurice

I thought I had explained clearly that incremental development tests
incremental builds!

You add a test or some code to pass a test, build, test.  If you don't
get the expected result, something is broken.  That something can either
be the new code, or the build (or on a really bad day, the compiler!).

If you restrict developers to only $(eval $(value ...)) of predefined,
vetted macros, then at least you might be able to thoroughly test each
macro for correctness. I presumed you were not using such a scheme. I
apologize if that assumption was incorrect. If my assumption was
correct, and there are a lot of explicit rules commonly modified by
developers, then my claims hold.

To reiterate: My first claim was that you did not thoroughly test the
incremental build system at all before deploying it to developers. The
fact remains that the build system is under constant modification
without even a basic sanity test before being deployed to developers.

Moreover, I have a new claim as well, that even the everyday use by
developers will not test all possible source code deltas, that is
developers using it over the next day will not constitute a
comprehensive test either. In addition to being forced to be the
guinea pig testers for the new incremental build system, they still
are not thoroughly testing it. Their goal isn't to test it and its
corner cases. They're just trying to use it to get another job done. A
single missing dependency could remain uncaught for months of usage,
or never caught.
 

Joshua Maurice

I'm not sure I understand what you meant by "mine is". Did you
mean "my (Joshua's) shop sucks"? Or "my (Joshua's) shop is in too
much of a hurry to practice software engineering"?

Anyhow, our goal is making money as well. Luckily we know (to
some extent at least) that haste can make waste and that cutting
corners eventually bites your ass. We learned the hard way.

Possibly both? If you define "software engineering" appropriately,
then we do not do it. We decide that we cannot spend developer time on
an investment which would cost more than it would return before the
next release. At least, that's how most of the decisions go. If you
want to define a short time horizon as not software engineering, then
yes.
Also, haven't you heard of working on the weekends and at night?
What does your employer think of all the time you are spending
whinging here in the newsgroup about make? Would they allow you
to spend such whining time on improving the build system instead?

This is my own time. I have rather flexible hours, but I put in more
than my expected time most days. I don't appreciate the insinuations
and personal attacks either.

And no, they're not terribly interested in fixing the build, whether
by componentizing or by any of my other suggestions involving
incremental correctness. I can't really blame them either, for the
aforementioned reasons, such as that it might not be a wise investment
for the next release.
On more than one occasion I've had a conversion like this with
my director:

   KHD : By the way, I went ahead and implemented a solution this
      last two weekends to problem XYZ that I've been complaining
      about for the last month.
   DIR : Hehe. Why did you do that?
   KHD : I just couldn't stand screwing around with the hacks
      any longer. They were breaking and wasting my time.
   DIR : Ok. Cool. When will it be online?
   KHD : Well, I have to wait until ABC has time to review it.
      It's simple so it will only take him an hour or two.
   DIR : Ok. Well tell him he can review it after he finishes
      project PQR unless something else comes up.
   KHD : Thanks. In the meantime I'll just keep hacking shit.

Step up to the plate, man. Be a leader and problem solver. Work
overtime to implement your ideas.

What do you think this is? Whining? I specifically asked where I
should post the code in order to get reviews and such, and possibly
widespread public adoption. Since then, I have merely participated in
discussions of my claims, and I have defended my claims where I think
I am right based on evidence and argument. However, this has been very
beneficial to me, as I now know where to further explore my ideas.
This thread has led to several novel claims and ideas reinforcing my
beliefs and presented several new ones against, and for that I thank
you.

My build system, on which I do work weekends and nights off the
clock, is not complete enough for use in my company. I have many
macros to implement before it's in a usable state, and even then
there's the cost to move over from the recursive make system + Maven
nonsense to my system. As my system is not publicly adopted (it's an
in-house system), they're hesitant. They also don't believe that the
costs of the current system are unreasonable - I just wish they would
actually do some coding sometime to see how bad it actually is.

Then we also have developers in high-level meetings take the weasel
way out, much like this thread, so that's not helping. I was just
recently involved in a team to help "fix" the build. As far as I can
tell, this is basically how it went:
- Developers: Yo managers! The build is like, way slow, and very
fragile.
- Managers: Ok. So, we agree. In a meeting which Joshua was not
involved in, nor anyone else from the "build specialist" team, we
pulled numbers out of a hat for acceptable build times. None of
these targets includes anything about fragility. How bad is our
current build system? What can we do to meet these targets?
- Developer Teams Representatives (of which I was one): Well, if a
developer chooses to not do a full rebuild but only a rebuild of my
own unofficial subcomponent, and I skipped running 80% of the tests,
then we actually already meet these goals without any changes. (Insert
other weaseling which shows we already meet build time targets.)
- Managers: Excellent! So, issue closed?

I've had discussions similar to yours with my manager, other managers,
and higher-level managers, but they don't end the same way. In the
end, they don't understand the technical details, they don't
understand the lost developer time, and they defer to the other senior
developers and/or developer-managers who take the weasel way out
because they have looming deadlines, and to some extent because they
also don't fully understand the technical details. When I'm in such
meetings, I have to explain to the higher-level technical managers,
such as the manager of the manager of the build specialist team, what
a pom.xml is ("Oh, like a makefile"), and we've had Maven as our core
build tool for the last 2 years.

At least, this is my impression. I believe a correct incremental build
system is a doable proposition, and I believe that switching to it
would be a very worthwhile investment if such a system already
existed. However, if the system has to be written from scratch, like
what I'm doing, then it may not be worth it if our time horizon is
only the next release. And since it will never fit in one release, and
the investment return spans several releases, I suspect it'll never
get done at my company. Well, at least not for many more releases
from now.

PS: I feel as though I'm being given the runaround by some people
here. I have made very clear points, and some people continue to
misinterpret or misrepresent my arguments, and others bring up
tangentially related arguments as though they were rebuttals to my
mostly unrelated points. Now I'm replying to mostly personal attacks
and other non sequiturs, like "Stop complaining and fix it", despite
my sincere belief that this thread is exactly that, an important and
crucial step towards fixing it (initially I asked where I could post
the source code if I could get it open-sourced), and I have explained
else-thread that I have been working (in my own time) on such a fix.
 

Ian Collins

If you restrict developers to only $(eval $(value ...)) of predefined,
vetted macros, then at least you might be able to thoroughly test each
macro for correctness.

If I knew what $(eval $(value ...)) did, I'd be able to answer. We
didn't use GNU make.
I presumed you were not using such a scheme. I
apologize if that assumption was incorrect. If my assumption was
correct, and there are a lot of explicit rules commonly modified by
developers, then my claims hold.

The *only* changes made to makefiles by developers were the addition
and removal of targets (source files).
To reiterate: My first claim was that you did not thoroughly test the
incremental build system at all before deploying it to developers. The
fact remains that the build system is under constant modification
without even a basic sanity test before being deployed to developers.

I'm sorry, but that's bollocks.
Moreover, I have a new claim as well, that even the everyday use by
developers will not test all possible source code deltas, that is
developers using it over the next day will not constitute a
comprehensive test either. In addition to being forced to be the
guinea pig testers for the new incremental build system, they still
are not thoroughly testing it. Their goal isn't to test it and its
corner cases. They're just trying to use it to get another job done. A
single missing dependency could remain uncaught for months of usage,
or never caught.

The world isn't perfect, get over it.
 

Joshua Maurice

The *only* changes made to makefiles by developers were the addition
and removal of targets (source files).

So, developers never specified new rules? What if the developer added
new source code which was to be in a new library? I'm confused.
Presumably the developers had to define new make rules. I can only
assume that's what you meant by the addition of new targets. In which
case, do you even track header dependencies? If not, your system is
laughably not incremental. If you do track header file dependencies,
and the developer has to add the rules to track header file
dependencies every time he adds a new library, then there's plenty of
room for error. (In addition to all of my other points.) Also typos.
I'm sorry, but that's bollocks.

Can you explain? It sounds to me still that the makefile is a
Turing-complete programming language, and this file is being modified
and deployed without testing all aspects of it. I think your argument
is "It's so simple they can't break it." Fine, I guess, for a
sufficiently small makefile and project working entirely on C++ code,
which never adds nor removes source files, which was written the right
way to start with, and which is maintained by people knowledgeable
about make and your conventions. That doesn't sound reasonable. Also
typos.
The world isn't perfect, get over it.

That's not a rebuttal. I'm sorry I am pedantic, it's just the way I
am. I originally claimed that incremental build systems are basically
deployed entirely untested, including yours. You contested this claim,
and based on the available evidence and argument, you are wrong, and I
will continue arguing this until presented with some new evidence or
argument. You're welcome to stop contesting it.
 

Ian Collins

So, developers never specified new rules? What if the developer added
new source code which was to be in a new library?

Adding a new library would have been a team decision, it didn't happen
very often.
I'm confused.
Presumably the developers had to define new make rules. I can only
assume that's what you meant by the addition of new targets. In which
case, do you even track header dependencies? If not, your system is
laughably not incremental. If you do track header file dependencies,
and the developer has to add the rules to track header file
dependencies every time he adds a new library, then there's plenty of
room for error. (In addition to all of my other points.) Also typos.

*make* tracks header dependencies, that's why people use it! Google
"make .KEEP_STATE". As I said, there's more to make than GNU make. We
only listed source files, not headers. The source file dependencies are
in the files generated by make. So I/we only write the higher-level
dependencies:

         Executable
             |
Libraries and Object files
             |
        Source files

My current personal project space makefile is 1055 lines long, with 180
targets, in 11 libraries, the generated dependency file is almost 6000
lines.
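
For readers unfamiliar with it, a minimal Sun make sketch (target and
file names invented): with .KEEP_STATE, the compiler reports the
headers each compile actually read, and make records those hidden
dependencies, along with the command lines used, in .make.state.

  # Sun (Solaris) make, not GNU make.
  .KEEP_STATE:

  OBJS = foo.o bar.o

  prog: $(OBJS)
          $(CC) -o $@ $(OBJS)
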
Can you explain? It sounds to me still that the makefile is a
Turing-complete programming language, and this file is being modified
and deployed without testing all aspects of it. I think your argument
is "It's so simple they can't break it." Fine, I guess, for a
sufficiently small makefile and project working entirely on C++ code,
which never adds nor removes source files, which was written the right
way to start with, and which is maintained by people knowledgeable
about make and your conventions. That doesn't sound reasonable. Also
typos.

I think you are assuming too much about how we used our makefiles. All
we had in makefiles were lists of tools, options, and targets. No fancy
stuff, no variable evaluations, just lists. There was very little to
break; if a source was missing, the application wouldn't link.

I guess with hindsight, we were lucky to start out with makefiles
generated by an IDE. IDEs being simple beasts generate simple makefiles!
 

Joshua Maurice

Adding a new library would have been a team decision, it didn't happen
very often.


*make* tracks header dependencies, that's why people use it!  Google
"make .KEEP_STATE".  As I said, there's more to make than GNU make.  We
only listed source files, not headers.  The source file dependencies are
in the files generated by make.  So I/we only write the higher-level
dependencies:

         Executable
             |
Libraries and Object files
             |
        Source files

My current personal project space makefile is 1055 lines long, with 180
targets, in 11 libraries, the generated dependency file is almost 6000
lines.



I think you are assuming too much about how we used our makefiles.  All
we had in makefiles were lists of tools, options, and targets.  No fancy
stuff, no variable evaluations, just lists.  There was very little to
break; if a source was missing, the application wouldn't link.

I guess with hindsight, we were lucky to start out with makefiles
generated by an IDE.  IDEs being simple beasts generate simple makefiles!

Ok, I give. I misunderstood, or you were unclear; either way, my
fault, and I persisted in this without further clarifying exactly what
you had done.

In effect, it sounds like in your system there are some rules for
building certain kinds of code, like C++, which have been pre-vetted
and rarely change, very much like the $(eval $(value ...)) approach.
As I said earlier, I'm much more partial to this approach. In fact, my
initial prototype was very much something like this. However, at least
my implementation on GNU Make was quite slow. Some profiling showed
that it spent a large portion of its time in string manipulation,
process creation for the cat and echo processes, and IO, on Windows.
It was an order of magnitude or two slower than the solution written
in compiled code I have now. My compiled-code solution determines that
~4,000 Java files of an unofficial component of my company's project
are up to date in about 2 seconds. The GNU Make based approach with
pre-vetted complex rules ala $(eval $(value ...)) took much longer to
determine that the ~4,000 Java files were up to date.

I think my complaints about such a scheme are simply that it's slower
than a compiled-code solution, and that it will not work for Java,
both of which are sticking points for my company's very large project.
However, it sounds like it actually works for you. I guess I'll have
to retract my points as made in ignorance, based on assumption. My
bad.

Still, though, I'm curious exactly how yours is implemented. Is it
portable across operating systems, like HPUX, z linux, win itanium,
and more? Does it support cross compilation, which is required for
msvc on win itanium? Which make does it use? What other tools does it
use? Anything fancy, or just cat, echo, gcc (or whatever C++
compiler), etc.? You said it was created by an IDE initially; which?
 
