The GNU Configure and Build System

The GNU Configure and Build System consists of autoconf, automake, and libtool. I wrote an essay about them a long time ago. Slightly more recently I was a co-author of a book about them.

David MacKenzie started writing autoconf way back in 1991. I was an early beta-tester and contributed some early features. autoconf generates a configure script written in the portable Bourne shell language. That makes sense. autoconf itself was originally essentially an M4 script. That made sense at the time, but no longer does. Today autoconf is essentially a perl script that invokes M4. That doesn’t make sense either.

automake is a completely separate program. It essentially provides a simplified Makefile language. David MacKenzie started writing it in 1994 as a portable shell script plus Makefile fragments. In 1995 Tom Tromey rewrote it in perl. I first used it in 1997, and added support for conditionals. automake works with autoconf: it translates a file written in the automake language (Makefile.am) into a file written in the make language (Makefile.in). The configure script generated by autoconf then applies substitutions to Makefile.in to generate a Makefile.

libtool is essentially a complicated portable shell script which creates shared libraries. It was written by Gordon Matzigkeit starting in 1996. libtool provides a standard set of commands for compiling and linking to create shared libraries, and for installing them. It does this in a rather baroque manner in which what appears to be a library is actually a shell script pointing to the real library, which libtool reads when doing a link to decide just what to do. automake has support for using libtool.

This build system is used by many different tools, including pretty much all GNU tools. However, it has a major problem: it is much too complicated. It is written in portable shell, perl, and m4. It does not work in an intuitive or transparent manner. autoconf input files mix shell and m4 code. automake input files mix the automake and make languages. libtool hides its files in a hidden directory named .libs. The system does work fairly well. But it is hard to understand and hard to debug. And configure scripts generated by autoconf are slow. In a distcc environment, it is not uncommon for it to take longer to run the configure script than it takes to build the program.

We have to get rid of these tools. There have been many alternative build systems. None of them have caught on, at least not in the free software world, because they have to be installed before they can be used. The great advantage of the GNU system is that it requires nothing more than portable shell and portable make.

However, we can do better today. The GNU make program is very powerful, much more powerful than the original make program. GNU make is installed on every popular development platform. Let’s take advantage of it by assuming that it is there.

The goal would be to have people write a single Makefile, instead of configure.ac and Makefile.am. (We would probably keep a configure script for compatibility, and to record options like --target.) The Makefile would start with a standard include directive which provided all the default rules. The rest would just be a standard GNU Makefile with some restrictions, like no %.o: %.c rule of your own, and with some standard variables to describe compilation options and the like.
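A hypothetical sketch of what such a Makefile might look like; the name of the standard include (gnu-build.mk) and the variable conventions below are invented for illustration, not an existing standard:

```make
# Hypothetical: gnu-build.mk would supply all default rules,
# including the %.o: %.c pattern rule and install/clean targets.
include gnu-build.mk

# Standard variables describing what to build and how.
PROGRAMS = frob
frob_SOURCES = frob.c util.c
frob_CFLAGS = -O2 -DFROB_FAST

# Anything else is ordinary GNU make, subject to the restrictions
# (for example, no pattern rules of your own).
```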

The standard include would ensure that every compilation depended on config.h. config.h could be built by grepping the source code for HAVE_xxx strings and the like. Each string would map to a test which would have an appropriate rule. These rules could be run in parallel to generate tiny header files with the appropriate #define. config.h would include all the tiny header files.
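The idea in the paragraph above can be sketched in portable shell. This is a toy illustration, not a real implementation: the probe is a crude file-existence check standing in for a proper compile test, and all file names are invented.

```shell
#!/bin/sh
# Sketch: scan the sources for HAVE_xxx macros, run one probe per
# macro to generate a tiny header, then assemble config.h from the
# per-macro fragments.  (In the real scheme each probe would be a
# make rule, so the probes could run in parallel.)

mkdir -p confdir
cat > confdir/demo.c <<'EOF'
#ifdef HAVE_UNISTD_H
#include <unistd.h>
#endif
EOF

# Find every HAVE_xxx macro mentioned in the sources.
macros=$(grep -oh 'HAVE_[A-Z0-9_]*' confdir/*.c | sort -u)

for m in $macros; do
  case $m in
    HAVE_UNISTD_H)
      # Crude probe; a real test would try compiling a snippet.
      if [ -e /usr/include/unistd.h ]; then
        echo "#define $m 1" > "confdir/$m.h"
      else
        echo "/* #undef $m */" > "confdir/$m.h"
      fi
      ;;
  esac
done

# config.h just concatenates all the tiny per-test headers.
cat confdir/HAVE_*.h > confdir/config.h
```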

This would provide an interface similar to autoconf/automake, but simpler to use and faster to run. Of course there would be many details to work out. But I think an approach along these lines has real promise to get us out of the current quagmire.


  1. fche said,

    November 16, 2007 @ 7:47 pm

    The use of gnu make in the linux kbuild system is particularly impressive.

  2. fdeweerdt said,

    November 17, 2007 @ 5:20 am

CMake does a fine job actually, especially regarding:
    – Integration with different dev setups (generates Visual Studio as well as GNU Make files)
    – Packaging (NSIS .exe setup, .deb, .rpm)
    Your post doesn’t address these issues though, don’t you think they’re in the scope of a build system?

  3. tromey said,

    November 17, 2007 @ 8:06 am

    The big problem is just that anybody with energy and motivation to work on this invariably decides to write their own tool instead. But… build it and they will come 🙂

    Also, one complaint I hear a lot is that make’s syntax sucks. It seems to me that this is fixable by giving GNU make a new parser that understands a better syntax.

    FWIW make is showing signs of life. There was an interesting SoC project to add content-based dependencies.

    One last note… there are alternatives in use. KDE switched to cmake. Gnome is debating what else to use. But like next-gen version control systems it looks like there is no clear winner.

  4. Ian Lance Taylor said,

    November 19, 2007 @ 12:46 pm

    I didn’t know about cmake. It looks interesting. Packaging is definitely part of a build system. Integration with development setups is not interesting to me personally, but it could certainly be useful.

    As far as I can tell, cmake needs to be run by every user who builds the package. I’m saying that based on the notes about what the generated Makefiles look like. If that is indeed the case, then it doesn’t have a key feature of autoconf/automake/libtool: the user who builds the package does not need to have anything special installed.

  5. The Cliffs of Inanity » Blog Archive » Replacing Automake said,

    November 23, 2007 @ 3:56 pm

    […] Inspired by Ian, this past weekend I took a look at replacing the auto tools with GNU make code. […]

  6. RussNelson said,

    November 26, 2007 @ 11:42 am

    May I suggest instead of using a single config.h which determines the configuration, that you use a config.d holding all the configuration files?

    First, because you see this design pattern all over Unix, where single config files are replaced by multiple files in a foo.d directory. E.g. init.d, and crontab.d. Second, because then, a file need only include the config.d/whatever.h file it depends upon. Then, since every whatever.h has its own timestamp, when you change whatever.h, only those files which depend upon it need to be rebuilt.
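The dependency win described here could be sketched in make terms like this; the file and macro names are hypothetical:

```make
# Each object depends only on the config.d fragments its source
# actually includes, so touching one fragment rebuilds only the
# objects that use it.
foo.o: foo.c config.d/HAVE_MMAP.h
bar.o: bar.c config.d/HAVE_UNISTD_H.h

# A compatibility config.h can still be assembled from the fragments.
config.h: $(wildcard config.d/*.h)
	cat $^ > $@
```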

    Also, what about having a single directory which holds system-wide information? Run the tests, do the discernment once, and then it becomes available to all autotools. Dare I suggest /var/autotools/?

  7. Ian Lance Taylor said,

    November 26, 2007 @ 7:20 pm

    Hi Russ. A config.d makes sense, though we would probably also want the config.h for easier use and compatibility.

    The problem that autoconf has had with system-wide information in the past is that some tests are compiler dependent or even compiler-option dependent. Also of course when you update the OS the test results may change. These are not insoluble problems, but they are the reasons that autoconf generally does not use a system-wide cache file.

  8. jmmv said,

    November 27, 2007 @ 3:14 am

Speaking of system-wide configuration, check “autoswc”, a little tool/package I wrote for the NetBSD packaging system.

    With some manual changes it could be made to work outside of pkgsrc.

    It does exactly that: generate a system-wide configuration cache for autoconf scripts so that you can add to it some tests, run them once and later reuse their results automatically. Speeds things up quite a bit in old machines.

  9. renoX said,

    December 2, 2007 @ 9:01 am

    Cmake has been used as a replacement for auto tools for the KDE projects.
    Given the size of KDE, for me this is a good indication that Cmake could replace auto tools for C/C++ projects.

  10. Ian Lance Taylor said,

    December 3, 2007 @ 3:33 pm

    I don’t doubt that cmake could work. There are actually a number of tools that can work.

    However, I think the only tool which can succeed is one which does not impose any requirements on the system on which the program is built. It’s OK to impose requirements on developers, but it’s not OK to impose them on people who build the source code. That is why the autotools have been successful: they do not impose any requirements.

    As far as I can tell, cmake does impose a requirement: the system on which the program is built must have cmake installed.

  11. Logiciel Libre » Blog Archive » What’s wrong with the GNU autotools? said,

    January 26, 2008 @ 1:39 pm

    […] Ian Lance Taylor captures what’s wrong with autotools quite nicely. Ian says Cmake isn’t a suitable replacement, but perhaps it could evolve into one […]

  12. Ian Lance Taylor said,

    March 22, 2008 @ 11:34 am

    Tom Tromey has started working on some of the ideas in the post. See .

  13. cruz44 said,

    April 4, 2008 @ 12:26 pm

As somebody that’s used their own various build systems for many years (because quite honestly, autotools suck… they’re complex, weird, and just don’t work under Windows), I’ve also wanted to write my own “final solution” for this too. The problem is, it requires a lot of testing, use cases, and platforms to play with.

    I’ve tried a bunch of new build systems like Waf and Scons too. Finally though, I’ve settled on cmake. Quite simply, it’s really nice. It works on all the major platforms, and it’s easy to use (has optional graphical configuration screens too, and works great under Windows).

I don’t think having to install cmake is some kind of onerous requirement on someone who wants to compile software on their system. I’d rather type “apt-get install cmake” than deal with some assembled-from-matchsticks system like autotools is, or this new gmake-based one may be.

Besides, gnu make isn’t installed on that many platforms out of the box. In particular a Windows box/Visual Studio developer almost definitely doesn’t have it installed. Installing cmake on Windows is a 2-click install there, too. Although I’m primarily a Linux developer, using the same build system (one that works with Visual Studio directly) is a godsend.

Let’s not reinvent the wheel, let’s just all use cmake and be happy, like those KDE guys 🙂

  14. Ian Lance Taylor said,

    April 4, 2008 @ 5:33 pm

    Clearly one can reasonably make different choices about which dependencies are acceptable and which are not when it comes to building software. My feeling right now is that GNU make is acceptable. The kinds of programs I write are never going to build with Visual Studio, which means that a Windows user has to install cygwin before they can build them–and that means that they have GNU make. However, I don’t have cmake installed on any system, so that seems like a less acceptable dependency.
