[PVFS-developers] Recursive make considered harmful revisited
Thu, 10 Jan 2002 13:53:59 +0200
I got a request to better explain the single Makefile approach and recursive
make drawbacks. I will give a short summary for those of you who do not
wish to read the entire paper linked at the end.
A hard-core recap of what make is:
Make is a program which receives a DAG (Directed Acyclic Graph) in which
the nodes of the graph are files and the edges are dependency relations.
Make also receives a single node that needs to be built.
Make, in a recursive manner (recursive on the graph, not the directories),
tries to find out whether any prerequisites of the requested node need
building on their own. This way the node gets built.
It doesn't really matter whether the node is a .o, .c, .so, .a, .tar.gz or whatever.
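As a sketch of that DAG, here is a minimal makefile for a hypothetical two-file project (all file names here are illustrative, not from any real project):

```makefile
# DAG: prog <- main.o <- main.c, util.h
#      prog <- util.o <- util.c, util.h
prog: main.o util.o        # edges: prog depends on both .o nodes
	cc -o prog main.o util.o

main.o: main.c util.h      # main.c includes util.h
	cc -c main.c

util.o: util.c util.h
	cc -c util.c
```

Asking make for `prog` walks this graph recursively: it first checks whether each .o is older than its .c/.h prerequisites, rebuilds those that are, and only then relinks.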
What is wrong with recursive make?
The most basic fact is that when make is applied recursively it DOES NOT SEE
THE WHOLE GRAPH. This means that if I have two subdirs, one and two,
and files in one include files in two, then make, when visiting one in a
directory-recursive configuration, does not know about two! It may be that
things in two need to be built! It may be that there are prerequisites in two
that are newer than files in one and should force things in one to be rebuilt!
make will never know about those. This way make will make mistakes...:)
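To make the missing-edge problem concrete, here is a sketch of a typical recursive setup (the layout with subdirs one and two is hypothetical):

```makefile
# Top-level Makefile: just recurses, sees no file-level edges at all
SUBDIRS = one two
all:
	for d in $(SUBDIRS); do $(MAKE) -C $$d || exit 1; done

# one/Makefile: describes only its own subgraph.
# foo.o really also depends on ../two/bar.h, but that edge is
# missing here, so touching two/bar.h never rebuilds one/foo.o.
foo.o: foo.c
	cc -I../two -c foo.c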
Why do developers use recursive make?
Because each subdir developer can hack their own makefile.
Each developer could also hack their own utility library, but most projects
don't give each directory its own utility library.
What actually happens is that each developer describes the graph for his part
of the project in his makefile, but the edges that connect nodes from different
subgraphs are entirely missing.
What happens?
A LOT OF BAD THINGS.
1. Files that need to get built don't.
This creates very strange bugs, especially in languages like C and C++,
which are very close to the machine.
2. As a result of 1, people mistrust their build system. This creates the
popular rule of thumb: "clean rebuild on segmentation fault". This is
bad. If you don't trust your build system, then what good is it? Why not
use a bash script to rebuild from scratch every time? Oh, you do trust
it most of the time? But why don't you trust it when it counts? (i.e.
when you receive SIGSEGV...:)
3. As a result of 2, people write scripts which "over-make", meaning they
rebuild things just to make sure. This causes long, unnecessary
compilation times and makes people mistrust their build system
even more.
If you want a more precise description of this problem, here is the link to
Peter Miller's excellent paper on the subject:
This is indeed a gem.
Is there a solution?
YES YES YES YES YES
Just write a single makefile which works from the top directory.
make is so powerful these days that if you use it right you can combine
automatic dependency generation, compilation and automatic source location
with correct dependency handling in a 100-line makefile, even for a
moderate-size project. All the solutions are in the paper.
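As one possible shape for such a makefile (a minimal sketch, assuming a GCC-style compiler that supports -MMD/-MP for dependency generation; the directory names one and two are illustrative):

```makefile
# Single top-level Makefile: one invocation sees the whole DAG.
SRC := $(wildcard one/*.c two/*.c)
OBJ := $(SRC:.c=.o)
DEP := $(OBJ:.o=.d)

prog: $(OBJ)
	cc -o $@ $^

# -MMD writes a .d file listing each source's header dependencies;
# -MP adds phony targets so deleted headers don't break the build.
%.o: %.c
	cc -MMD -MP -Ione -Itwo -c $< -o $@

-include $(DEP)    # pull the generated dependency edges into the graph
```

Because all the edges, including the generated header dependencies across one and two, live in a single graph, touching a header in two correctly rebuilds the objects in one that include it.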
If you are bolder, you can use a tool other than make (there are many
out there), and none of them promotes the recursive build methodology; they all
promote the full-project methodology.
Hope this helps.