Date: Wed, 13 May 2020 15:33:54 -0400
From: Rich Felker <dalias@...ifal.cx>
To: Anders Magnusson <ragge@...d.ltu.se>
Cc: John Arnold <iohannes.eduardus.arnold@...il.com>, musl@...ts.openwall.com, pcc@...ts.ludd.ltu.se
Subject: Re: Re: [Pcc] PCC unable to build musl 1.2.0 (and likely earlier)

On Wed, May 13, 2020 at 09:09:13PM +0200, Anders Magnusson wrote:
> Den 2020-05-13 kl. 16:30, skrev Rich Felker:
> >On Wed, May 13, 2020 at 09:10:40AM +0200, Anders Magnusson wrote:
> >>Den 2020-05-12 kl. 23:21, skrev Rich Felker:
> >>>Thanks. Adding pcc list to cc.
> >>>
> >>>On Tue, May 12, 2020 at 03:59:36PM -0500, John Arnold wrote:
> >>>>With an i386 PCC 1.2.0.DEVEL built from source from
> >>>>http://pcc.ludd.ltu.se/ftp/pub/pcc/pcc-20200510.tgz, I was unable to
> >>>>build an i386 musl 1.2.0. The compiler first hits this error:
> >>>>
> >>>>../include/limits.h:10: error: bad charcon
> >>>>
> >>>>This line was the only change made in commit cdbbcfb8f5d, but it has a
> >>>>lengthy commit message about the proper way of determining CHAR_MIN
> >>>>and CHAR_MAX.
> >>>I think this is clearly a PCC bug, one they can hopefully fix. The
> >>>commit message cites the example from 184.108.40.206:
> >>Can you please send me the offending line?
> >#if '\xff' > 0
> >
> Thanks, fixed now, it was a missing pushback of ' that was the problem.
>
> Note that this check cannot be used to see whether a target uses
> signed or unsigned char.
> In pcc the above is always true, no matter what char is. See C11
> clause 6.10.1 clause 4.

See the commit message for:

https://git.musl-libc.org/cgit/musl/commit/include/limits.h?id=cdbbcfb8f5d748f17694a5cc404af4b9381ff95f

There is good reason we changed this. I believe you're referring to the
text:

"This includes interpreting character constants, which may involve
converting escape sequences into execution character set members.
Whether the numeric value for these character constants matches the
value obtained when an identical character constant occurs in an
expression (other than within a #if or #elif directive) is
implementation-defined.168) Also, whether a single-character character
constant may have a negative value is implementation-defined."

and the footnote is:

"168) Thus, the constant expression in the following #if directive and
if statement is not guaranteed to evaluate to the same value in these
two contexts.

#if 'z' - 'a' == 25
if ('z' - 'a' == 25)"

The point here seems to be allowing compilers where the preprocessor
character set does not match the execution character set, an awful
possibility the standard didn't want to rule out (mixed ASCII/EBCDIC
environments, etc.). A compiler using the allowance given in the
example in the footnote would definitely not be one I'd want to try to
support. While this isn't really a _binary_ issue, I'd consider it
non-conforming to the platform ABI, which should uniquely determine how
character constants evaluate.

The signedness issue seems to be technically a different allowance, and
one that's not inherently awful like the EBCDIC one above. I'm somewhat
amenable to finding an alternate solution here that's compatible with
PCC, but I think it would be preferable just for PCC to make the
definition of the value consistent here, just like GCC and clang and
other compilers do.