ulrich said:
well, if you have
const int A = 10;
it will occupy sizeof(int) bytes of memory as long as it lives.
It might, or it might not. Depends on the compiler.
on the other hand, if you have
#define A 10
the _preprocessor_ will literally replace every "A" in your source code by
"10", so this #defined a uses no memory.
Might also be just the other way round.
however, those variables holding the value of A use memory, of course.
Well, it depends. Some CPUs will insert the value directly into the code
that loads that value into the register. If it has to be loaded from
memory, the memory address needs to be provided in the code instead.
OTOH, some CPUs cannot directly write the value 10 into a register. They
need to read it from some memory location, in which case the #define might
result in extra memory use for every occurrence of A, while the const
would need only one.
Of course that all depends on the capabilities of the compiler and on the
platform. But even if you have the case of the #define needing no memory
and the const needing it, then sizeof(int) isn't a big deal unless you're
on a very limited embedded platform. E.g. on a 32-bit platform, sizeof(int)
is usually 4. With 512MB of memory, 4 bytes are about 7e-7 percent of the
total memory.
remark:
to really avoid hardcoding of constants (my point of view is that
"hardcoded" is anything the change of which requires a re-compilation),
Though I agree here, it seems the OP rather meant avoiding magic numbers
within the code and replacing them with a constant or a #define to
centralize it and to document its meaning by giving it a proper name.
your program should use an initialisation file. then, the name of this
file is the only thing which needs to be hardcoded.
That depends on the purpose of the value. For example, there is no need to
have the following replaced by a configuration file entry:
const int hours_per_day = 24;
because the number of hours per day is not expected to change, and it would
clutter the config file with useless options.