const integers


Serve Laurijssen

There's a certain style of coding that uses const as much as possible.
Like

const int foo(const int a, const int b)
{
const int retval = pow(a, b);
return retval;
}

One argument for using code like this is that compilers can theoretically
optimize better.
Is that true? I thought that was only with pointers?
 

lovecreatesbeauty

Serve said:
const int foo(const int a, const int b)
{
const int retval = pow(a, b);
return retval;
}

I read an article that prefers the following style.

const int foo(int a, int b){
cosnt aa = a;
cosnt bb = b;
int retval;

/*...*/

retval = pow(aa, bb);
return retval;
}
 

Giorgio Silvestri

lovecreatesbeauty said:
sorry, the code will be better with the changes,

cosnt int ca = a;
cosnt int cb = b;

better:
const int ca = a;
const int cb = b;

:)

Giorgio Silvestri
 

Frederick Gotham

Serve Laurijssen posted:
There's a certain style of coding that uses const as much as possible.


I myself have that style -- but with one exception: Return types.

Anything returned from a function is going to be an R-value anyway, so
there's no need to add redundant words.

Here would be a sample function of mine:

int Func(int const a, int const b)
{
return a + b;
}
 

Jack Klein

There's a certain style of coding that uses const as much as possible.
Like

const int foo(const int a, const int b)
{
const int retval = pow(a, b);
return retval;
}

One argument for using code like this is that compilers can theoretically
optimize better.
Is that true? I thought that was only with pointers?

A function argument, or a local automatic variable, cannot be changed
without the compiler's knowledge unless its address is passed to a
function. So even rudimentary data flow analysis would permit any
optimizations on such a variable with or without the const qualifier.

Some programmers like to use const on function arguments like this as
a reminder that they don't intend to change them. Perhaps down near
the end of the function, they expect them to have the original passed
value. Example:

int foo(const int a, const int b)
{
int ret;

/* do stuff using current value of a and b */
/* finally... */
ret = pow(a, b);
return ret;
}

Now if they accidentally modify a or b before they get to the final
expression that uses them, the compiler will issue a diagnostic for a
constraint violation.

I rarely try to argue with someone who adds minimal extra typing to
catch errors in their code, but I don't care much for this idea
myself. If the function is large enough that you could modify a or b
and not be seen on a single display screen in the editor at the point
where the value is used later, the function is too large and effort
would better be spent splitting it up.

As for putting a const qualifier on a returned object, that's pretty
useless. You can return a non-const object from a function that
returns a const object, and you can assign a const object returned
from a function to a non-const object in the caller. I don't see how
it achieves anything at all.
 

Keith Thompson

Serve Laurijssen said:
There's a certain style of coding that uses const as much as possible.
Like

const int foo(const int a, const int b)
{
const int retval = pow(a, b);
return retval;
}

One argument for using code like this is that compilers can theoretically
optimize better.
Is that true? I thought that was only with pointers?

<OT>

One of my radical language design ideas is to make all declared
objects const by default. If you want to be able to modify an
object's value, you'd need to explicitly qualify it with, say, "var".

Thus:

int x = 10;
var int y = 20;
x ++; /* error */
y ++; /* ok */

I suspect that most declared objects actually don't change in value
once they're initialized. If my suspicion is correct, the result
would be safer code.

Of course, such a radical change would be inappropriate for C, and I'd
strongly oppose it if there were any chance of it being taken
seriously. But designers of new languages might want to consider it.

</OT>
 

spibou

Keith said:
<OT>

One of my radical language design ideas is to make all declared
objects const by default. If you want to be able to modify an
object's value, you'd need to explicitly qualify it with, say, "var".

Thus:

int x = 10;
var int y = 20;
x ++; /* error */
y ++; /* ok */

I suspect that most declared objects actually don't change in value
once they're initialized. If my suspicion is correct, the result
would be safer code.

Of course, such a radical change would be inappropriate for C, and I'd
strongly oppose it if there were any chance of it being taken
seriously. But designers of new languages might want to consider it.

</OT>

Isn't the same result achieved by declaring x as a macro ?
For example
#define x 10
in C. Most languages will have a similar construct and if the
programmer intends for the value to remain constant he/she
will pick that construct instead of one which allows for the value
to be changed. So how is your proposal different ?

Spiros Bousbouras
 

Keith Thompson

Isn't the same result achieved by declaring x as a macro ?
For example
#define x 10
in C. Most languages will have a similar construct and if the
programmer intends for the value to remain constant he/she
will pick that construct instead of one which allows for the value
to be changed. So how is your proposal different ?

In my (not very serious) proposal, objects would be read-only by
default, not necessarily constant in the sense of being computable
during compilation.

For example:

printf("Continue? ");
fflush(stdout);
int c = getchar();
if (c == 'y') {
...
}

In my hypothetical language, any attempt to modify c after
initializing it would be illegal, unless it's qualified with "var".

The problem this tries to address is that many objects are declared as
variables (without "const") even though they're never actually
modified.

(Again, I absolutely am not advocating making this change to C.)
 

Mark McIntyre

<OT>

One of my radical language design ideas is to make all declared
objects const by default. If you want to be able to modify an
object's value, you'd need to explicitly qualify it with, say, "var".

Interesting, though fairly annoying. I find that the vast bulk of
variables I declare are.
I suspect that most declared objects actually don't change in value
once they're initialized.

I'd be pretty surprised. Certainly it's not my experience.
If my suspicion is correct, the result
would be safer code.

I'm fairly unconvinced that lack of constness makes code 'unsafe'.
...designers of new languages might want to consider it.

I have to say, I can't see why one would want to have variables which
were by default not variable. Why not use constants for that? :)
--
Mark McIntyre

"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are,
by definition, not smart enough to debug it."
--Brian Kernighan
 

Keith Thompson

Mark McIntyre said:
Interesting, though fairly annoying. I find that the vast bulk of
variables I declare are.


I'd be pretty surprised. Certainly it's not my experience.

You may well be right; I haven't done any kind of in-depth study.
I'm fairly unconvinced that lack of constness makes code 'unsafe'.

Consider something like this:

char *s = "hello";
some_func(s);

If some_func() modified the string pointed to by s, the code is
unsafe. If s were a pointer to const char, this would be caught at
the point of the call.
I have to say, I can't see why one would want to have variables which
were by default not variable. Why not use constants for that? :)

I've added an "OT:" tag to the subject header. If this discussion
continues, we might want to take it to comp.lang.misc.

It's an idea I had some years ago in the context of a different
language, one that's more flexible than C in how objects can be
initialized (Ada).

In Ada, there are two kinds of objects: variables and constants.
Objects are the same as in C. Constants are objects declared with the
"constant" keyword, much like "const" in C. Variables are
non-constant objects.

My experience in Ada was that objects very commonly were initialized
to some desired value, and that value was never changed over the
lifetime of the object. I tried to declare objects with "constant"
whenever possible, giving a hint both to the compiler and to the
reader that they weren't going to change after the initial
declaration. My vague impression was that this was the case more
often than not, and that making objects constant by default might be
an improvement.

This may be less true for C than for Ada, for various reasons.

If I see a reference to an object, I can look at its declaration to
get more information about it. If it's declared const, I know that
the value at the point of reference is the same as the value in the
initializer. If it isn't, I have to understand the execution of the
program to figure out whether it might have been modified (and so does
the compiler, but that's a secondary issue).

Here's a trivial example: a swap routine:

void swap_int(int *x, int *y)
{
int tmp = *x;
*x = *y;
*y = tmp;
}

tmp is never modified after its initialization, but how many C
programmers would bother to declare it const?

In general, I tend to think that making the default more restrictive,
while allowing it to be relaxed explicitly, tends to make for cleaner
and safer code. YMMV.

To return to something vaguely approaching topicality, I suppose I'm
advocating more use of "const" for objects that don't change after
they're initialized.
 

Gordon Burditt

if (c == 'y')
I'd get rid of the "==" for equality, and "=" for assignment.

What would you suggest as a replacement?

'y' =: c;
for assignment, and
if (c :=: 'y') { ...
for equality?

Gordon L. Burditt
 

Keith Thompson

What would you suggest as a replacement?

'y' =: c;
for assignment, and
if (c :=: 'y') { ...
for equality?

above. The "if (c == 'y')" is quoted from something I wrote.

Gordon, please let us know when you've gotten more complaints for
snipping attributions than you claim to have gotten in the past for
including them. Or just stop being offensively rude and acknowledge
the people you're quoting. Would it help if I threatened a lawsuit?

If I were designing a new language from scratch, I'd probably use ":="
for assignment and "=" for equality. There are other possibilities,
of course, but the one thing I'd avoid is using a symbol for
assignment that other languages (and ordinary mathematical notation)
use for equality, at least in a language where both can be used in the
same context.

I might also consider ":=" for assignment and "==" for comparison,
leaving the bare "=" unused.
 

lovecreatesbeauty

Keith said:
One of my radical language design ideas is to make all declared
objects const by default. If you want to be able to modify an
object's value, you'd need to explicitly qualify it with, say, "var".

Thus:

int x = 10;
var int y = 20;
x ++; /* error */
y ++; /* ok */

This does not sound like a good idea.
I suspect that most declared objects actually don't change in value
once they're initialized. If my suspicion is correct, the result
would be safer code.

Doesn't every object have an initial value (a determinate value or a
random one) at the point of its definition?
 

Keith Thompson

lovecreatesbeauty said:
This does not sound like a good idea.

You could be right.
Doesn't every object have an initial value (a determinate value or a
random one) at the point of its definition?

An automatic (local) object with no explicit initializer has an
indeterminate value. A hypothetical language that incorporated my
suggestion would probably have to require any non-var object to have
an explicit initializer.

(I had assumed that C required an initializer for a const object, but
apparently it doesn't. For example, this:
const int x;
appears to be legal.)
 

Gordon Burditt

if (c == 'y')
above. The "if (c == 'y')" is quoted from something I wrote.

Gordon, please let us know when you've gotten more complaints for
snipping attributions than you claim to have gotten in the past for
including them. Or just stop being offensively rude and acknowledge
the people you're quoting. Would it help if I threatened a lawsuit?

Misattribution is far worse than being rude. (So are libel, slander,
genocide, and spamming.) Even the possibility of misattribution
is far worse than being rude, and people in this newsgroup regularly
get it wrong and draw complaints about it. Speaking of rude, so
is posting an article that consists entirely of discussions of
attributions, top-posting, Google, trolls, and topicality.

The attribution cabal has raised the bar on correct attributions
high enough that I have no chance of getting it right consistently:
just not deleting the attributions and letting my newsreader do its
thing with the attributions is NOT sufficient, contrary to prior
claims of some of them, and several complaints of misattribution
not involving any post of mine in the last couple of months have
been due to this. And getting attributions right requires that the
attributions in the article I'm replying to are right, something
which doesn't always hold true.

This is USENET. There is no identity here, so the whole idea that
attributions (or even From: lines) are worth reading seems pretty
silly.

for assignment and "=" for equality. There are other possibilities,
of course, but the one thing I'd avoid is using a symbol for
assignment that other languages (and ordinary mathematical notation)
use for equality, at least in a language where both can be used in the
same context.
leaving the bare "=" unused.

This I consider a better choice, since a bare = now has two common
meanings, equality in mathematics and assignment in quite a few
programming languages including C.

Gordon L. Burditt
 

Keith Thompson

Misattribution is far worse than being rude.

Not really.
(So are libel, slander,
genocide, and spamming.) Even the possibility of misattribution
is far worse than being rude, and people in this newsgroup regularly
get it wrong and draw complaints about it.

I rarely see such a problem here, and any complaints about it have
been mild and responded to quickly. Serious complaints would occur
only if a misattribution were deliberate, but I don't recall ever
seeing such a thing here. There have been some cases of deliberately
altered quotations; the offenders were thoroughly flamed.

There have been some cases, and some complaints, about people who have
deleted deeply quoted text without deleting the corresponding
attribution line. This is a minor mistake, and avoiding it just
requires a little attention. Most of us get it right 99% of the time,
and getting it wrong isn't that much of a problem.
Speaking of rude, so
is posting an article that consists entirely of discussions of
attributions, top-posting, Google, trolls, and topicality.

Which I have not done here -- but in any case, meta-discussions about
topicality are, by convention, considered to be topical.
The attribution cabal has raised the bar on correct attributions
high enough that I have no chance of getting it right consistently:
just not deleting the attributions and letting my newsreader do its
thing with the attributions is NOT sufficient, contrary to prior
claims of some of them, and several complaints of misattribution
not involving any post of mine in the last couple of months have
been due to this. And getting attributions right requires that the
attributions in the article I'm replying to are right, something
which doesn't always hold true.

You exaggerate.

If you quote another article with incorrect attributions, nobody is
going to blame you for the previous poster's error.
This is USENET. There is no identity here, so the whole idea that
attributions (or even From: lines) are worth reading seems pretty
silly.

Nonsense. We know each other by our names (real or not) and by our
posting histories. There are some people here I take more seriously
than others.

If you're going to base your decision on the number of complaints,
note that I'm not the only one who finds your posting style offensive
-- and hardly anyone has seriously defended you.
 

Richard Heathfield

Gordon Burditt quoted Keith Thompson as saying:

but failed to attribute it.

Gordon Burditt replied:
Misattribution is far worse than being rude.

Lack of attribution /is/ misattribution, so please stop doing it.

Even the possibility of misattribution is far worse than being rude,

Failing to attribute constitutes incorrect attribution.
and people in this newsgroup regularly
get it wrong and draw complaints about it.

Consider this a complaint at your failure to get attributions right.
Speaking of rude, so is posting an article that consists entirely
of discussions of attributions, top-posting, Google, trolls, and
topicality.

If you attributed correctly, we wouldn't be discussing it, would we?
The attribution cabal

There is no cabal. This is widely known.
has raised the bar on correct attributions
high enough that I have no chance of getting it right consistently:

You are getting it /wrong/ consistently.

<snip>
 
