Date: Wed, 27 Jan 2016 16:04:57 -0500
From: Rich Felker <>
Subject: Bits deduplication: details/build system mechanisms?

I think it looks okay to have a simple full-file overlay for the first
phase of the bits deduplication. If we need more fine-grained dedup we
can look at more advanced approaches later. So the big question left
is how to actually do this in the build system.

What we'll have in the source tree is something like:


in that fallback order. I've intentionally left the second item vague
because there's some question of how we implement it. The possible
list of generic-* dirs we could have is:


The idea of generic-mips is that, if we add mips64 and possibly n32,
there are a lot of wacky mips-specific constant values that are shared
by them all.
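
As a sketch of the first-match fallback search this implies (the mips64 arch name and the sem.h header are just for illustration; only arch/$ARCH and arch/generic are taken from the text above):

```shell
#!/bin/sh
# Sketch: resolve a bits header by trying dirs in fallback order.
# Dir names beyond arch/$ARCH and arch/generic are assumptions.
set -e
ARCH=mips64
mkdir -p arch/$ARCH/bits arch/generic-mips/bits arch/generic/bits
touch arch/generic-mips/bits/sem.h arch/generic/bits/sem.h

# Print the first existing copy of a bits header, in fallback order.
find_bits () {
    for dir in arch/$ARCH/bits arch/generic-mips/bits arch/generic/bits; do
        if [ -f "$dir/$1" ]; then
            printf '%s\n' "$dir/$1"
            return 0
        fi
    done
    return 1
}

find_bits sem.h   # prints arch/generic-mips/bits/sem.h
```

An arch-specific copy, if one exists, shadows both generic versions; otherwise the search falls through.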

I don't think it's worth actually having generic-le and -be because
the logic to search these dirs would be larger than the one line in

It probably also doesn't make sense to have generic dirs just for long
double format, especially since archs that vary in float.h might also
have to define different FLT_EVAL_METHOD values. Then ld64 can just be
in generic and the few archs with ld80 or ieee quad can provide their
own (possibly duplicate, but tiny) versions.

One option to start would be not to have any generic-* dirs, just a
single generic. This would be easy to implement, but I really want to
be able to represent some of the genericness of things like the 64-bit
bits/socket.h hacks. Maybe we can do this in two phases, though, since
there are only a couple 64-bit archs so far and dedup'ing them isn't
the first priority.

Now, how to make the build system handle this all?

The first big question is whether to use multiple -I's at build time
and multiple implicit rules to control which bits files get installed
at install time, or to stage bits headers in obj/include/bits first.

The benefits of staging are that we avoid adding a bunch of new -I's
to every invocation of the compiler, and have more flexibility in how
we write the rules to pick the files. If obj/include/bits/*.h just
depend on all possible files they could be generated from, the actual
rule can contain the logic to select the first matching version.
There's also the flexibility to add more complex generation in the
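
A minimal sketch of such a staging pass (the powerpc arch name and the header names are illustrative assumptions; the obj/include/bits destination is from the text above):

```shell
#!/bin/sh
# Sketch: stage bits headers into obj/include/bits, letting the first
# match in fallback order win. Layout and names are for illustration.
set -e
ARCH=powerpc
mkdir -p arch/$ARCH/bits arch/generic/bits obj/include/bits
echo '/* arch */'    > arch/$ARCH/bits/float.h
echo '/* generic */' > arch/generic/bits/float.h
echo '/* generic */' > arch/generic/bits/errno.h

# Iterate candidates in priority order; skip names already staged.
for h in arch/$ARCH/bits/*.h arch/generic/bits/*.h; do
    name=${h##*/}
    [ -f obj/include/bits/$name ] && continue
    cp "$h" obj/include/bits/$name
done
```

After this runs, float.h comes from the arch dir and errno.h falls back to generic; more complex generation could hook in at the `cp` step.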

The disadvantage of staging is that it becomes hard to do any
configure-time tests that rely on bits headers, since those headers
haven't been generated yet at that point. Right now the only test that
actually matters is the
check for incorrect long double on powerpc, and we could just remove
it because there's a second check at the source level during build,
but it's kind of nice to have the early check too.

If we don't stage but instead use -I's, or even if we do stage but let
make control the choice of version copied via overlaid implicit
rules, then make somehow needs to be told about which generic dirs to
use. (Of course this issue does not apply if/as-long-as we only have
one generic dir.) I don't see any really clean way to do this aside
from something like

-include $(srcdir)/arch/$(ARCH)/

and I'm still not sure exactly what the contents of such a file should
look like.
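
For illustration only, such a fragment might do nothing but name the generic dirs to search, in order (the variable name is an invented placeholder, not a decided interface):

```make
# Hypothetical contents of the per-arch fragment: the generic dirs to
# search after arch/$(ARCH)/bits, in fallback order.
GENERIC_DIRS = generic-mips generic
```

The top-level Makefile could then expand this into -I flags or into the prerequisite lists of the staging rules.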

Any good ideas on implementing this? If we don't reach any good
conclusion right away I'll probably start with just one generic dir
and no staging, which is the simplest and just requires one new -I.
Then we can extend later as needed.

