How does #define operate?

Timothy Madden

Hello

If I say

#define MASK 0x00F0
#define BIT_SET MASK
#define MASK 0x000F

then what value will the BIT_SET macro expand to?
I mean, does the preprocessor do lazy evaluation or immediate evaluation?
I tried to figure it out from MSDN once but I couldn't.

Thank you
Timothy Madden
Romania
 
Howard

Timothy Madden said:
Hello

If I say

#define MASK 0x00F0
#define BIT_SET MASK
#define MASK 0x000F

then what value will the BIT_SET macro expand to?
I mean, does the preprocessor do lazy evaluation or immediate evaluation?
I tried to figure it out from MSDN once but I couldn't.

I don't think the standard deals with preprocessor stuff, but on my compiler
it gives me a redefinition error. If I want to re-#define something, I have
to #undef it first. Which would imply (assuming you threw in a #undef MASK
before the second #define) that BIT_SET would still be 0x00F0, because
otherwise what would the point be of requiring #undef? I see several places
where something is defined just before including a file that needs it, then
undefined so as not to interfere with other, later, includes.

-Howard
 
Xenos

Howard said:
I don't think the standard deals with preprocessor stuff, but on my compiler
it gives me a redefinition error. If I want to re-#define something, I have
to #undef it first. Which would imply (assuming you threw in a #undef MASK
before the second #define) that BIT_SET would still be 0x00F0, because
otherwise what would the point be of requiring #undef? I see several places
where something is defined just before including a file that needs it, then
undefined so as not to interfere with other, later, includes.

-Howard

Not true: BIT_SET expands to MASK, not 0x00F0. If you redefine MASK, you
redefine BIT_SET.

DrX
 
Howard

Xenos said:
Not true: BIT_SET expands to MASK, not 0x00F0. If you redefine MASK, you
redefine BIT_SET.

DrX

You're correct (at least using CodeWarrior on the Mac).

And it's obvious why. My thinking was backwards. At the time/place of
expansion, it evaluates the symbol, and by that time the symbol has changed.

(But I definitely do need to #undef it before I can #define it again.)

-Howard
 
Timothy Madden

Howard said:
You're correct (at least using CodeWarrior on the Mac).

And it's obvious why. My thinking was backwards. At the time/place of
expansion, it evaluates the symbol, and by that time the symbol has changed.

(But I definitely do need to #undef it before I can #define it again.)

Actually I really meant:

#define MASK 0x00F0
#define BIT_SET MASK
#undef MASK
#define MASK 0x000F

Shouldn't the standard say something about this?
Could MASK in turn be defined as some other defined symbol? How many levels
can this go? Why don't any books say anything about it?
Can anyone tell me how other compilers behave?

Thank you
Timothy Madden
Romania
 
Xenos

Timothy Madden said:
Actually I really meant:

#define MASK 0x00F0
#define BIT_SET MASK
#undef MASK
#define MASK 0x000F

Shouldn't the standard say something about this?
Could MASK in turn be defined as some other defined symbol? How many levels
can this go? Why don't any books say anything about it?
Can anyone tell me how other compilers behave?

It's not the compiler, but the preprocessor. They behave by following the
standard. You are not setting one variable equal to another. You are
telling the preprocessor that the macro BIT_SET is equal to the text
"MASK" (without the quotes). After the text replacement, the code will be
rescanned looking for more macros. If MASK is defined when this is done,
then MASK will be replaced. If not, it will be passed as is to the
compiler.

DrX
 
