
Discovering the environment versus knowledge repository

As far as users are concerned, the main problem with build systems based on autotools is the long delay caused by executing ./configure scripts. It is, indeed, a bit of a problem from time to time, and I have already expressed my resentment regarding superfluous checks. One of the other solutions proposed in Gentoo to mitigate the problem is running the script through a faster, more basic shell rather than the heavy and slow bash, but that has a few side issues of its own that I’m not going to discuss today.

As a “solution” to this problem (to me, a workaround), a lot of build systems prefer using a “knowledge repository” that records how to deal with the various compiler and linker flags and the various operating systems. While this certainly yields better results for smaller, standards-based software (as I wrote in my other post, one quite easy way out of a long series of tests is simply to require C99), it does not work quite as well for larger software, and it tends to create fairly problematic situations.
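To make the standards-based shortcut concrete, here is a minimal sketch of a configure.ac fragment that simply requires a C99 compiler instead of probing for dozens of individual features; it assumes Autoconf 2.60 or later, which provides AC_PROG_CC_C99, and is not taken from any particular package:

```
dnl Minimal sketch: require C99 rather than testing every single feature.
AC_PROG_CC
AC_PROG_CC_C99
dnl The cache variable ends up as "no" when no C99 mode could be enabled.
AS_IF([test "x$ac_cv_prog_cc_c99" = "xno"],
  [AC_MSG_ERROR([a C99-capable compiler is required])])
```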

One example of such a build system is qmake, as used by Trolltech, sorry, Qt Software. I had to fight with it quite a bit in the past when I was working on Gentoo/FreeBSD, since the spec files used under FreeBSD assumed that the whole environment was what ports provided, and of course Gentoo/FreeBSD had a different environment. But even without going full-blown toward build systems entirely based on knowledge, similar problems can easily show up with autotools-based software, as well as with cmake-based software. Sometimes it’s just a matter of not knowing well enough what the future will look like; sometimes these repositories are simply broken. Sometimes the code is simply wrong.

Let’s take for instance the problem I had to fix today (in a hurry) on PulseAudio: a patch to make PulseAudio work under Solaris checked for the build-time linker (ld) and, if it was the GNU version, used the -version-script option that it provides. On paper this is correct, but it didn’t work and it messed up the test4 release a lot. In this case the cause of the problem is that the macro used had been obsoleted, and thus the linker was never identified as GNU ld; but this was, nonetheless, a bad way to deal with the problem.
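For illustration only, the knowledge-based approach looks roughly like this; it is not the actual patch, and the probe (the old libtool AC_PROG_LD_GNU macro), the variable name and the map-file path are just stand-ins:

```
dnl Knowledge-based sketch: ask whether ld is GNU ld and *assume* that
dnl any GNU ld accepts -version-script.
AC_PROG_LD_GNU
AS_IF([test "x$lt_cv_prog_gnu_ld" = "xyes"],
  [VERSIONING_LDFLAGS='-Wl,-version-script=$(srcdir)/map-file'])
AC_SUBST([VERSIONING_LDFLAGS])
dnl If the probe is obsolete or misdetects the linker, the flag is
dnl silently dropped even where it would have worked.
```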

Instead of knowing that GNU ld supports that option, and just that, the solution I implemented (which works) checks whether the linker accepts the flag we need; if it does, it provides a variable that can be used to deal with it. This is actually quite useful, since as soon as I let Yamato take a break from the tinderbox I can get the thing to work with the Sun linker too. But it’s not just that: nobody can tell me whether a future linker will support the same options as GNU ld (who knows, maybe gold will).
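A discovery-based check of this kind can be sketched as follows; this is not the exact macro that went into PulseAudio, and the variable name and map-file path are illustrative:

```
dnl Discovery-based sketch: try the flag and see whether the link succeeds.
AC_MSG_CHECKING([whether the linker accepts -Wl,-version-script])
save_LDFLAGS=$LDFLAGS
cat > conftest.map <<'EOF'
TEST_1 {
  global: *;
};
EOF
LDFLAGS="$LDFLAGS -Wl,-version-script=conftest.map"
AC_LINK_IFELSE([AC_LANG_PROGRAM([], [])],
  [have_version_script=yes], [have_version_script=no])
LDFLAGS=$save_LDFLAGS
rm -f conftest.map
AC_MSG_RESULT([$have_version_script])
AS_IF([test "x$have_version_script" = "xyes"],
  [VERSIONING_LDFLAGS='-Wl,-version-script=$(srcdir)/map-file'])
AC_SUBST([VERSIONING_LDFLAGS])
```

The substituted variable can then be added to the relevant _LDFLAGS in Makefile.am, and a second branch of the same check could later probe for the Sun linker’s equivalent mapfile option (-M) instead.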

A similar issue applies to Intel’s ICC compiler, which goes so far as to pass itself off as GCC (defining the same internal preprocessor macros) so that software will use the GCC extensions that ICC implements. If everybody used discovery instead of a knowledge repository, this would not have been needed (and you wouldn’t have to work around the cases where ICC does not provide the same features as GCC — I had to do that for FFmpeg some time ago).
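The compiler-side counterpart is the same idea: probe the extension itself instead of looking at __GNUC__, which ICC defines as well. A hedged sketch, using the visibility attribute as the example feature and illustrative names:

```
dnl Sketch: test the GCC extension directly instead of trusting __GNUC__.
dnl -Werror turns "attribute ignored" warnings into hard errors so the
dnl probe does not falsely succeed on compilers that only warn.
AC_MSG_CHECKING([whether the compiler supports __attribute__((visibility))])
save_CFLAGS=$CFLAGS
CFLAGS="$CFLAGS -Werror"
AC_COMPILE_IFELSE([AC_LANG_PROGRAM(
  [[void __attribute__((visibility("hidden"))) foo(void) { }]], [])],
  [have_visibility=yes], [have_visibility=no])
CFLAGS=$save_CFLAGS
AC_MSG_RESULT([$have_visibility])
AS_IF([test "x$have_visibility" = "xyes"],
  [AC_DEFINE([HAVE_VISIBILITY], [1],
     [Define if the compiler supports the visibility attribute])])
```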

Sure, a knowledge repository is faster, but is it just as good? I don’t think so.
