Noob
Hello,
I've come across a library which defines TRUE and FALSE thus.
/* BOOL type constant values */
#ifndef TRUE
#define TRUE (1 == 1)
#endif
#ifndef FALSE
#define FALSE (!TRUE)
#endif
Do you know why one would do that, instead of the more
straightforward (IMO)
#define FALSE 0
#define TRUE 1
or, perhaps,
#define FALSE 0
#define TRUE (!FALSE)
(Given that !0 evaluates to 1.)
??
Regards.