Sign of char literals in #if directives


Ole Nielsby

How should a C++ preprocessor interpret char literals in #if directives?

From the MinGW limits.h (the comment is in the original):

===code snippet begin===

#define SCHAR_MIN (-128)
#define SCHAR_MAX 127

#define UCHAR_MAX 255

/* TODO: Is this safe? I think it might just be testing the preprocessor,
* not the compiler itself... */
#if ('\x80' < 0)
#define CHAR_MIN SCHAR_MIN
#define CHAR_MAX SCHAR_MAX
#else
#define CHAR_MIN 0
#define CHAR_MAX UCHAR_MAX
#endif

===code snippet end===

From the Final Draft of the standard, I gather that the signedness
of char is implementation-defined and that the char signedness of
the compiler isn't required to match that of the preprocessor.

===Final Draft, 16.1 paragraph 4===
Whether the numeric value for these character literals matches the value
obtained when an identical character literal occurs in an expression
(other than within a #if or #elif directive) is implementation-defined.
===draft snippet end===

So the standard (in the draft version, anyway; I assume the final is
the same) doesn't even give a hint: should the preprocessor treat
'\x80' as -128 or as 128?
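
For what it's worth, the divergence is easy to observe. A tiny test
program (mine, not from MinGW; it assumes plain 8-bit char) shows
whether a given toolchain's preprocessor and compiler agree:

===code snippet begin===

/* Compare the preprocessor's and the compiler's view of '\x80'. */
#include <cstdio>

int main()
{
    /* Evaluated by the preprocessor: */
#if ('\x80' < 0)
    std::puts("preprocessor: '\\x80' < 0 (signed)");
#else
    std::puts("preprocessor: '\\x80' >= 0 (unsigned)");
#endif

    /* Evaluated by the compiler: */
    if ('\x80' < 0)
        std::puts("compiler: '\\x80' < 0 (signed)");
    else
        std::puts("compiler: '\\x80' >= 0 (unsigned)");

    return 0;
}

===code snippet end===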

I ask because I'm working on a C++ parser to be used for code
analysis and transformation tools, and it needs to have its own
preprocessor.

Should I settle on signed? Unsigned? Is there a de facto standard?
Or should I make it configurable via a pragma, such as:

#pragma kvernbitr preprocessor_char_signedness signed
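
Roughly, I imagine the #if evaluator would then do something like
this (a minimal sketch; all names are hypothetical):

===code snippet begin===

/* Sketch of an #if evaluator whose char signedness is set by the
 * pragma above. Assumes 8-bit chars. */
class PpEvaluator
{
public:
    PpEvaluator() : charIsSigned_(true) {}   /* default, overridable */

    /* Called when "#pragma kvernbitr preprocessor_char_signedness ..."
     * is seen. */
    void setCharSignedness(bool isSigned) { charIsSigned_ = isSigned; }

    /* Value of a single-character literal inside #if/#elif. */
    long charLiteralValue(unsigned char c) const
    {
        if (charIsSigned_ && c > 0x7F)
            return long(c) - 256;   /* '\x80' -> -128 */
        return long(c);             /* '\x80' -> 128 */
    }

private:
    bool charIsSigned_;
};

===code snippet end===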
 
