checking bits

Bo Persson

Luke Meyers said:
Portable, sure. Type-safe, no. Maintainable, no. Recommendable,
absolutely not.

This is what const is for.

const unsigned char BIT3 = ...


So you're starting counting at 0? I love zero-indexing as much as the
next fella, but that's not a reasonable convention when speaking in
terms of ordinal numbers. The rightmost bit is the first bit, not the
zeroth bit.

Or it might be the last bit, on some hardware. So much for portability
of bits!

The bit pattern you've produced is 00001000. Surely we
can all agree that's the *fourth* bit?

Unless it's the 60th or 61st bit, counting from the left. :)
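For concreteness, here is the pattern being argued about, written out as a
small sketch (the shift amount is my guess at the elided BIT3 initializer
above, and the names are just illustrative):

#include <cassert>

int main()
{
    // 00001000: zero-based "bit 3", or "the fourth bit" if you count
    // from 1 at the right-hand end.
    const unsigned char BIT3 = 1u << 3;
    assert(BIT3 == 0x08);

    unsigned char flags = 0;
    flags |= BIT3;                 // set it
    assert((flags & BIT3) != 0);   // test it
    return 0;
}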


Bo Persson
 
Marcus Kwok

Shark said:
And he was bashed in comp.lang.c for pretending that the Fake January
interview was somehow related to "The Real IEEE Interview"
http://groups.google.com/group/comp.lang.c/browse_thread/thread/7de485cf8c7795/c8c4dc4844f688b3

The only "bashing" I saw was for posting a C++ article in a C newsgroup.
doesn't "legend" in "C++_legend.html" tell you something?

Of course, which is why I felt the need to clarify; others may not have
picked up on the subtlety. When something is called a "<insert
someone's name>-ism", it implies that it is attributable to that person.
 
Luke Meyers

Shark said:
Yes I agree with you. I am against the use of the C++ preprocessor to
define globals (partly because I read Scott Meyers). But the snippet I
was demonstrating was correct in itself (due to closure, I think.
Question satisfied by answer, and answer is to the point). Meep!

Don't know when to quit, do you? (Nor do I, it seems...)

The snippet was correct and solved the immediate problem. However, it
was a lousy solution to the problem because it was excessively
specific, and provided no benefit in exchange for its rigidity. It's
just bad.
If I were to write something like this for my own use, I'd write
exactly what you suggested. But if I am quoting from another
source... and if they have a good reason for writing a macro instead of
a function, then ranting is pointless.

It defined a macro, then used the macro in a function. I think you're
getting confused, because there are two issues:

* In C++, it's dumb to use #define for constants, because const is
better.
* In both C and C++, it's dumb to write a function like
"set3rdBit(unsigned char)" rather than "setNthBit(int, unsigned char)."
That has nothing to do with C vs. C++, embedded vs. not, etc. It's
just an idiotic programming practice.
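To make the second point concrete, here is a sketch of the generic version
(the signature is the one quoted above; the body and return type are my
assumptions):

// Set bit n of value, counting from 0 at the least significant bit.
unsigned char setNthBit(int n, unsigned char value)
{
    return static_cast<unsigned char>(value | (1u << n));
}

// The needlessly specific variant being criticized:
unsigned char set3rdBit(unsigned char value)
{
    return setNthBit(3, value);
}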
Well, C != C++ obviously, but C is a subset of C++.

No, it's not. C++ has a C-like subset. It's different.
Just because you
are using a C++ compiler shouldn't mean you must use all the WMDs that
C++ provides.

No, but good C code can be bad C++ code. Same code, different context;
yes, it really does make a difference.
Let's use some polymorphism to relate C and C++

Please let's not.
Does that mean anywhere you can use C you can also use C++? Sure you
can!

In most cases, a C++ compiler will compile valid C code. But the
matter at hand is not syntactic correctness, but good programming
practice. C++ allows much more effective practices than the
best-of-breed C practices. The bar is higher, so anything which only
comes as high as the bar for C is substantially sub-par in C++.
Does that mean C++ is C? Open question.

Not really. No more so than C++ is arithmetic. Sure, it incorporates
arithmetic. But if all you do is add, subtract, multiply, and divide,
you're kind of missing the point.
Because years of
conditioning by books, professors, and crappy recruiters looking for
"C++ Professionals" has skewed the answer in favor of "no".

Or the fact that, objectively, C++ isn't C.
translates into english as "He that fights and runs away, lives to
fight another day."

Still haven't seen the back of you. ;)

Luke
 
S

Shark

Luke said:
Please let's not.

Haha, it seems you are evading this issue instead of tackling it. You
don't like using polymorphism here, do you? Because it seems
counterintuitive.

If you can use polymorphism to say:

class mammals : public animals {
};

then why can't we say:

class C++ : public C {
//overload structs unions, crap
// add stroustrup conspiracy
};
Not really. No more so than C++ is arithmetic. Sure, it incorporates
arithmetic. But if all you do is add, subtract, multiply, and divide,
you're kind of missing the point.

What is the point exactly? That C++ is not C and vice versa?

Although comparing C++ and Arithmetic is an extreme example, it can
still be treated with multiple inheritance:

public C : public Arithmetic, public Blah1, public Blah2...... {
//define arithmetic symbols
//override some blah1 stuff
//override some blah2 stuff
...
...
};

and

public C++ : public C {
//override some C stuff
//add b.s. crap
};

So C++ indirectly inherits from Arithmetic. What is the problem? Why
shouldn't we use polymorphism to relate C++ and C?
Still haven't seen the back of you. ;)

It's called guerrilla warfare :)
 
Luke Meyers

Shark said:
Haha, it seems you are evading this issue instead of tackling it. You
don't like using polymorphism here, do you? Because it seems
counterintuitive.

No, your point is plenty obvious, it just doesn't mean what you want it
to mean. Polymorphism in this case (as in general) just means that C++
is (more or less) USABLE-AS C. In the same sense that a laptop
computer is usable-as a paperweight. You can use it that way, but
you're kind of missing the point if you do, and would be out of line to
recommend it to someone asking how best to use their laptop computer.

How many more cute analogies do you need? There is no reason to use
#define for symbolic constants in C++, because const is equal or better
in every way. Advocating anything else indicates that you are
ill-informed, irresponsible, or simply stubborn.
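For the record, a minimal contrast between the two (the names here are
purely illustrative):

#include <cstddef>
#include <iostream>

#define BUFFER_SIZE 256              // no type, no scope: the preprocessor
                                     // pastes 256 in, and the name is gone
                                     // before the compiler ever sees it

const std::size_t bufferSize = 256;  // typed, scoped, obeys namespaces,
                                     // and visible to the debugger

int main()
{
    std::cout << BUFFER_SIZE << ' ' << bufferSize << '\n';
}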
What is the point exactly? That C++ is not C and vice versa?

YES. C is certainly not C++. C++ has a C-like subset, but it is a
language unto itself, and as with any language there are certain idioms
which represent best practice developed through years of experience,
and others which are useless, bad, or dangerous. Certain "good C"
idioms are bad in C++, for a variety of reasons.
So C++ indirectly inherits from Arithmetic. What is the problem? Why
shouldn't we use polymorphism to relate C++ and C?

Because it's irrelevant to do so? Even if we accept it as true, it
doesn't support your point in any way. And, incidentally,
"inheritance" and "polymorphism" are not synonyms.

Luke
 
JustBoo

Luke Meyers said:
How many more cute analogies do you need? There is no reason to use
#define for symbolic constants in C++, because const is equal or better
in every way. Advocating anything else indicates that you are
ill-informed, irresponsible, or simply stubborn.

Just wondering. Does using const vars take storage? IOW, use memory
during runtime? Do #defines take storage? I believe #defines do not
take storage during runtime. Well, to pre-split a hair, other than in
the "code segment." Note the quotes. :)

Thanks.

"There is no such thing as a good tax." - Winston Churchill
 
Luke Meyers

JustBoo said:
Just wondering. Does using const vars take storage? IOW, use memory
during runtime? Do #defines take storage? I believe #defines do not
take storage during runtime. Well, to pre-split a hair, other than in
the "code segment." Note the quotes. :)

Precisely what happens depends on the compiler. Any reasonable
compiler would produce no increased overhead for using a const versus
using a literal (which is what a #define constant amounts to by the
time the compiler sees it). Const data lives in a separate section in
memory from other storage types.

Remember, even a numeric literal has to be stored somewhere.

Luke
 
JustBoo

Luke Meyers said:
Precisely what happens depends on the compiler. Any reasonable
compiler would produce no increased overhead for using a const versus
using a literal (which is what a #define constant amounts to by the
time the compiler sees it). Const data lives in a separate section in
memory from other storage types.

Remember, even a numeric literal has to be stored somewhere.
Luke

Interesting. So in a conformant compiler the same amount of storage
(or approximate storage) would be taken by either a #define or a const
var?
 
Ben Pope

JustBoo said:
Interesting. So in a conformant compiler the same amount of storage
(or approximate storage) would be taken by either a #define or a const
var?

Well... you have to stick the number somewhere. You could point to it,
but that's probably at least as big as the number itself.

Ben Pope
 
Jerry Coffin

JustBoo wrote:

[ ... ]
Interesting. So in a conformant compiler the same amount of storage
(or approximate storage) would be taken by either a #define or a const
var?

As long as you only use the const variable in ways that literals could
be used, most compilers won't allocate any memory for it. If you do
something that requires it to have an address (e.g. take its address or
pass it by reference) then the compiler won't normally have any choice
but to allocate some memory for it.
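A small sketch of that distinction (the names are purely illustrative):

#include <iostream>

const int answer = 42;

// Used only like a literal: most compilers fold the value straight into
// the generated code and never allocate storage for 'answer'.
int add(int x)
{
    return x + answer;
}

// Taking the address forces the compiler to give 'answer' a real
// location in memory after all.
const int* addressOfAnswer()
{
    return &answer;
}

int main()
{
    std::cout << add(1) << ' ' << *addressOfAnswer() << '\n';   // 43 42
}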
 
