pointer to const int


Tagore

Please consider following program:
#include<stdio.h>
int main()
{
const int x=5;
int *ptrx;
ptrx=&x;
*ptrx=10;
printf("%d",x);
return 0;
}

Output of this program is 5, which shows that the value of x is not
changed by taking a pointer to it...
but I am unable to understand how ptrx knows that the address it points
to is const...
 

jacob navia

Tagore said:
Please consider following program:
#include<stdio.h>
int main()
{
const int x=5;
int *ptrx;
ptrx=&x;
*ptrx=10;
printf("%d",x);
return 0;
}

Output of this program is 5

No, on my system it is 10... using lcc-win:
lc -A tconst.c
Warning tconst1.c: 6 Different const qualifiers
tconst
10

Using gcc I get
tconst.c:6: warning: assignment discards qualifiers from pointer target type
./a.out
10

Using MSVC I get
tconst1.c(6) : warning C4090: '=' : different 'const' qualifiers

and I get also a result of 10. Which compiler are you using?

which shows that the value of x is not
changed by taking a pointer to it...
but I am unable to understand how ptrx knows that the address it points
to is const...

???????
 

Keith Thompson

Tagore said:
Please consider following program:
#include<stdio.h>
int main()
{
const int x=5;
int *ptrx;
ptrx=&x;
*ptrx=10;
printf("%d",x);
return 0;
}

Output of this program is 5, which shows that the value of x is not
changed by taking a pointer to it...
but I am unable to understand how ptrx knows that the address it points
to is const...

By attempting to modify an object defined with the "const" qualifier,
your program invokes undefined behavior. In fact the assignment

ptrx=&x;

violates a constraint, and should have produced a diagnostic message;
I get:

c.c:6: warning: assignment discards qualifiers from pointer target type

ptrx doesn't know, and doesn't need to know, that x is const (in fact
a pointer object doesn't "know" anything), but the compiler is
entitled to assume that, since you defined x as const, its value will
never change. In the printf call, the compiler undoubtedly generated
code to load a literal value 5 rather than loading the current value
of x.

The lesson: Don't ignore compiler warnings. (And if your compiler
didn't give you a warning, use whatever options are necessary to make
it do so.)
 

Tagore

No, on my system it is 10... using lcc-win:
lc -A tconst.c
Warning tconst1.c: 6  Different const qualifiers
tconst
10

Using gcc I get
tconst.c:6: warning: assignment discards qualifiers from pointer target type
./a.out
10

Using MSVC I get
tconst1.c(6) : warning C4090: '=' : different 'const' qualifiers

and I get also a result of 10. Which compiler are you using?


???????
I see now that it is actually undefined behavior.
I used the online compiler codepad and saw this:
http://codepad.org/CHFJoE41
 

Keith Thompson

jacob navia said:
Tagore said:
Please consider following program:
#include<stdio.h>
int main()
{
const int x=5;
int *ptrx;
ptrx=&x;
*ptrx=10;
printf("%d",x);
return 0;
}

Output of this program is 5
[...]
Using gcc I get
tconst.c:6: warning: assignment discards qualifiers from pointer target type
./a.out
10

Using MSVC I get
tconst1.c(6) : warning C4090: '=' : different 'const' qualifiers

and I get also a result of 10. Which compiler are you using?

Try "gcc -O1", -O2, or -O3; it causes the compiler to generate
code to use a literal value 5 rather than loading the value of x from
memory.

In any case, since the code invokes undefined behavior (assuming the
compiler doesn't reject it outright because of the constraint
violation), any result is valid -- though 5 and 10 are the most likely
results.

(I'll refrain from arguing that the code also exhibits undefined
behavior because of the missing "void" in the declaration of main and
the missing new-line at the end of the output.)
 

Andrey Vul

(I'll refrain from arguing that the code also exhibits undefined
behavior because of the missing "void" in the declaration of main and
the missing new-line at the end of the output.)
Why does C89 have a problem with missing newline at end of input?
 

luserXtrog

Why does C89 have a problem with missing newline at end of input?

You have to output a newline or call fflush to flush the output
buffer. Otherwise, you may never see the output at all.
 

Old Wolf

(I'll refrain from arguing that the code also exhibits undefined
behavior because of the missing "void" in the declaration of main and
the missing new-line at the end of the output.)

What's your basis for the former claim?
 

Keith Thompson

That was meant to be (mostly) facetious; I suppose I should have
refrained harder. :-}
You have been harping about your opinion that the C standard does not
require:

int main()

...to be equivalent to:

int main(void)

...and in fact the former causes undefined behavior.

I have yet to notice any significant number agreeing with you that C90
did, indeed, make every conforming pre-C90 program without command
line arguments undefined, despite its mandate to accept formerly
working code.

Accepting working pre-C90 code was a goal of the C90 standard, but
it's not an explicit requirement stated normatively in the standard
itself. (It would have been difficult to do so, since there was no
standard definition of pre-C90 code.) Sorry to harp about it again,
but since I've been asked:

The standard (C99 5.1.2.2.1; C90 is similar) requires main to be
defined as:
int main(void) { /* ... */ }
or as:
int main(int argc, char *argv[]) { /* ... */ }
or equivalent, or in some other implementation-defined manner. The
"or equivalent" allows for things like different names for argc and
argv, using typedefs, and using char **argv rather than char *argv[].

I argue that this:
int main() { /* ... */ }
is *not* equivalent to this:
int main(void) { /* ... */ }

Why? Because this program:
int main(void) { return 0; }
int foo(void) { return main(42); }
violates a constraint, but this program:
int main() { return 0; }
int foo(void) { return main(42); }
does not. (The latter would invoke undefined behavior if foo() were
called, but it isn't called; I believe the second program is strictly
conforming.) Here the only difference between a program that violates
a constraint and one that does not is the "void" keyword in the
definition of main.

I suspect that the authors of the C90 and C99 standards did not intend
this, but it's the only conclusion I can reach by reading the actual
normative wording.

In practice, this isn't a problem. I don't know of any compiler that
doesn't treat "int main(void)", in the absence of anything that calls
main or otherwise takes its address, as equivalent to "int main()" --
which, assuming "int main()" invokes undefined behavior, is perfectly
acceptable behavior.

It may be that no significant number of people have agreed with me on
this point, but I have yet to see a convincing counterargument.
As for missing a new line on the last line of input causing undefined
behavior, chapter and verse, please, or correct your statement.

It *may* cause undefined behavior, depending on a particular
implementation-defined choice. (I didn't take the time to word my
claim more carefully; it was meant to be a throwaway line.)

C99 7.19.2p2:

A text stream is an ordered sequence of characters composed into
_lines_, each line consisting of zero or more characters plus a
terminating new-line character. Whether the last line requires a
terminating new-line character is implementation-defined.

If an implementation requires the terminating new-line character, what
happens if a program doesn't provide it? The standard doesn't say, so
the behavior is undefined by omission.

For example, in an implementation where text files are structured
entities (as opposed to the simple byte sequences used on Unix and
Windows), closing a text stream without providing a terminating
new-line might leave the corresponding file in an inconsistent state,
making it impossible to open. Or the last partial line of output
might be dropped, or some arbitrary amount of trailing text might be
dropped. An implementation *could* detect this and fix it up by
adding a new-line in fclose() if the last character written was
something other than a new-line, but the standard doesn't require it.
 

Kenny McCormack

....
The standard (C99 5.1.2.2.1; C90 is similar) requires main to be
defined as:
int main(void) { /* ... */ }
or as:
int main(int argc, char *argv[]) { /* ... */ }
or equivalent, or in some other implementation-defined manner. The
"or equivalent" allows for things like different names for argc and
argv, using typedefs, and using char **argv rather than char *argv[].

I argue that this:
int main() { /* ... */ }
is *not* equivalent to this:
int main(void) { /* ... */ }
etc, etc.

Nice to see CLC getting back onto its home territory.

Next up: casting the return value of malloc()...
 

James Kuyper

Andrey said:
Why does C89 have a problem with missing newline at end of input?

Because some real implementations of C at the time C was standardized
had problems with it. Therefore, to accommodate those implementations,
C89 made it undefined behavior. To a large extent, the first C standard
standardized the consensus of existing practice. When such a consensus
was lacking, it often deliberately leaves the behavior sufficiently
unspecified to accommodate the range of existing practice.

Which just pushes the question one step farther back: why did those
implementations have a problem with it? I don't know; the Rationale does
not mention the issue. Sorry.
 

Ben Bacarisse

Jack Klein said:
You have been harping about your opinion that the C standard does not
require:

int main()

...to be equivalent to:

int main(void)

...and in fact the former causes undefined behavior.

I have yet to notice any significant number agreeing with you that C90
did, indeed, make every conforming pre-C90 program without command
line arguments undefined, despite its mandate to accept formerly
working code.

I think that is the wrong way round. The C89 standard made lots of
existing programs undefined on purpose -- that is how it met the
mandate to accept formerly working code. Had the standard defined,
for example, the meaning of signed shifts, it would have invalidated a
whole bunch of programs and implementations.

It is only in the post-standard glow of portability that undefined
behaviour is seen as such a bad thing. C programmers of the early 90s
were quite happy with it because it permitted their compilers to
continue to compile their code as before. Those that needed
portability could then re-write any undefined or implementation
defined code.
As for missing a new line on the last line of input causing undefined
behavior, chapter and verse, please, or correct your statement.

Andrey Vul clearly made a typo, but produced a correct statement (or
at least a defensible one) as a result. C89 "has a problem" with
non-empty source files that don't terminate with a new-line (it is UB)
and it similarly has a less dramatic problem with input streams that
don't end with a new-line (an implementation may require it). Of
course this latter one is odd. For maximum portability you have to
append a new-line to output (because an implementation may require it)
but on input you have to be prepared for there to be none (because an
implementation may not require it).
 

Jun Woong

Keith Thompson said:
That was meant to be (mostly) facetious; I suppose I should have
refrained harder. :-}
You have been harping about your opinion that the C standard does not
require:
int main()
...to be equivalent to:
int main(void)
...and in fact the former causes undefined behavior.
I have yet to notice any significant number agreeing with you that C90
did, indeed, make every conforming pre-C90 program without command
line arguments undefined, despite its mandate to accept formerly
working code.

Accepting working pre-C90 code was a goal of the C90 standard, but
it's not an explicit requirement stated normatively in the standard
itself. (It would have been difficult to do so, since there was no
standard definition of pre-C90 code.) Sorry to harp about it again,
but since I've been asked:

The standard (C99 5.1.2.2.1; C90 is similar) requires main to be
defined as:
int main(void) { /* ... */ }
or as:
int main(int argc, char *argv[]) { /* ... */ }
or equivalent, or in some other implementation-defined manner. The
"or equivalent" allows for things like different names for argc and
argv, using typedefs, and using char **argv rather than char *argv[].

I argue that this:
int main() { /* ... */ }
is *not* equivalent to this:
int main(void) { /* ... */ }

Why? Because this program:
int main(void) { return 0; }
int foo(void) { return main(42); }
violates a constraint, but this program:
int main() { return 0; }
int foo(void) { return main(42); }
does not. (The latter would invoke undefined behavior if foo() were
called, but it isn't called; I believe the second program is strictly
conforming.)
Agreed.

Here the only difference between a program that violates
a constraint and one that does not is the "void" keyword in the
definition of main.

I suspect that the authors of the C90 and C99 standards did not intend
this, but it's the only conclusion I can reach by reading the actual
normative wording.

Probably adding wording for "int main()" to the normative text or to
footnote 9 (I'm using N1256) seems necessary to clarify the intent.

[...]
It *may* cause undefined behavior, depending on a particular
implementation-defined choice. (I didn't take the time to word my
claim more carefully; it was meant to be a throwaway line.)

C99 7.19.2p2:

A text stream is an ordered sequence of characters composed into
_lines_, each line consisting of zero or more characters plus a
terminating new-line character. Whether the last line requires a
terminating new-line character is implementation-defined.

If an implementation requires the terminating new-line character, what
happens if a program doesn't provide it? The standard doesn't say, so
the behavior is undefined by omission.

I think the wording you cited should read in connection with:

7.19.2p2:
Data read in from a text stream will necessarily compare equal to
the data that were earlier written out to that stream only if:
[...] and the last character is a new-line character.

which means that, if the last line of the output does not end with a
newline character when an implementation requires it, it is not
guaranteed that data read back from the stream compares equal to what
was written to the stream.

I think the purpose of the wording cited above is not to add a UB
case, but to define "line" and to specify an exception to that
definition in the last sentence.

Omitting a newline character when writing something to a text stream
does not cause undefined behavior, even if code containing such a
construct is not strictly conforming.
 

Eric Sosman

James said:
Because some real implementations of C at the time C was standardized
had problems with it. Therefore, to accommodate those implementations,
C89 made it undefined behavior.

Not undefined, implementation-defined (7.19.2p2).
To a large extent, the first C standard
standardized the consensus of existing practice. When such a consensus
was lacking, it often deliberately leaves the behavior sufficiently
unspecified to accommodate the range of existing practice.

Which just pushes the question one step farther back: why did those
implementations have a problem with it? I don't know; the Rationale does
not mention the issue. Sorry.

To me, the most plausible explanation arises from I/O
systems that deal in "blocks" or "records" rather than in
streams of characters. Printers that write an entire line
in a single operation cannot write a few characters at the
start of the line, wait a while, and then append a few more
characters (not efficiently, anyhow). Similarly for card
readers and punches that handle a card in one pass, processing
all eighty columns in parallel. Tape drives, disk drives --
there are still lots of block-oriented devices around. Even
some interactive terminals communicate in blocks rather than
in streams, doing some local line-editing and form-filling and
then dumping the data back to the host in one gulp rather than
in many little dribbles.

So, how do you mediate between C's stream-oriented I/O
model and a block-oriented output device? The obvious way
is to collect C's output characters in a buffer until you've
got an entire block's worth, and then to transmit the block.
For a text output stream, the "Move 'em on, head 'em up"
action gets initiated when C writes a '\n' -- and if C writes
forty-two characters and fails to write a '\n', the output
line might never be generated.

Even if the output line *is* generated by fflush() or
the equivalent in fclose(), reading the data back again may
well retrieve a terminal '\n' that the original C did not
write. Again, imagine a block-oriented device where "end of
line" is not a character code but simply a block boundary.
Writing a '\n' may cause the block to be transmitted, with
no actual '\n' character written to the output. On input,
the library will read a block and synthesize a '\n' as an
in-stream marker that separates the data of one block from
the data of the next, even if the '\n' "wasn't there" in the
output stream of the program that wrote the file.

Similar considerations probably underlie the "trailing
blanks may disappear" rule.
 

Jun Woong

I wrote:
[...]
I think the wording you cited should read in connection with:

7.19.2p2:
Data read in from a text stream will necessarily compare equal to
the data that were earlier written out to that stream only if:
[...] and the last character is a new-line character.

which means that, if the last line of the output does not end with a
newline character when an implementation requires it, it is not
guaranteed that data read back from the stream compares equal to what
was written to the stream.

I think the purpose of the wording cited above is not to add a UB
case, but to define "line" and to specify an exception to that
definition in the last sentence.

Omitting a newline character when writing something to a text stream
does not cause undefined behavior, even if code containing such a
construct is not strictly conforming.

Sorry, I take it back.

Getting back home, I found notes in my copy of C90 saying that
not providing a newline character at the end of a text file itself
results in undefined behavior, just as writing more than 254 characters
on a line does.

It's been too long ago that I studied that part of the standard.
 

Harald van Dijk

Keith Thompson said:
[...]
I argue that this:
int main() { /* ... */ }
is *not* equivalent to this:
int main(void) { /* ... */ }
[...]
Probably adding wording for "int main()" to the normative text or to
footnote 9 (I'm using N1256) seems necessary to clarify the intent.

This would allow

int main(argc, argv, envp)
int argc;
char **argv;
char **envp;
{}

because the defined type is int main().

What if "It shall be defined with [...]" were changed to "It shall be
(defined as) compatible with [...]"? This, I believe, allows old-style
definitions that actually define main with 0 or 2 parameters of the
appropriate type, without allowing other old-style definitions.
 

Beej Jorgensen

James Kuyper said:
Which just pushes the question one step farther back: why did those
implementations have a problem with it?

The one time I saw something like that was a 1995-era MS compiler in
which the preprocessor would ignore directives on the last line of the
header file. Those were profane days... but I've not put anything on
the last line of a file ever since.

I just figured it was a lame bug at the time, but maybe it was more of a
historic holdover.

-Beej
 

Keith Thompson

Harald van Dijk said:
Keith Thompson said:
[...]
I argue that this:
int main() { /* ... */ }
is *not* equivalent to this:
int main(void) { /* ... */ }
[...]
Probably adding wording for "int main()" to the normative text or to
footnote 9 (I'm using N1256) seems necessary to clarify the intent.

This would allow

int main(argc, argv, envp)
int argc;
char **argv;
char **envp;
{}

because the defined type is int main().

What if "It shall be defined with [...]" were changed to "It shall be
(defined as) compatible with [...]"? This, I believe, allows old-style
definitions that actually define main with 0 or 2 parameters of the
appropriate type, without allowing other old-style definitions.

Personally, I'd prefer to leave it as it is. Old-style function
declarations and definitions are obsolescent anyway (C90 6.9.4, 6.9.5;
C99 6.11.6, 6.11.7). Adding new text in C201X to accommodate a feature
that's been obsolescent since C90 seems unwise.

De facto backward compatibility has been addressed by the fact that
implementations do accept "int main()", even though they're not
required to. And since old-style declarations have been officially
obsolescent for the past 20 years, perhaps it's finally time to drop
them altogether. "int main()" presumably would then be a syntax
error, though surely most C201X compilers would continue to provide a
non-conforming mode in which it's accepted, or would accept it with a
warning even in conforming mode.

From a programmer's point of view, adding a void keyword to the
definition of main isn't an excessive burden. (Note: add it *between
the parentheses*, not in front of "main"!)

I'm assuming here that the change would be made by making "int main()"
(or "double foo()") a syntax error, and requiring "int main(void)" or
"double foo(void)" for a function with no parameters. An alternative
would be to change the meaning of "int main()" so it declares a
function with no parameters, not with an unspecified number of
parameters. The former would be incompatible with pre-ANSI C and with
C++; the latter would be more consistent, but would quietly change the
meaning of existing valid C90 and C99 code.
 

CBFalconer

Eric said:
.... snip ...

So, how do you mediate between C's stream-oriented I/O
model and a block-oriented output device? The obvious way
is to collect C's output characters in a buffer until you've
got an entire block's worth, and then to transmit the block.
For a text output stream, the "Move 'em on, head 'em up"
action gets initiated when C writes a '\n' -- and if C writes
forty-two characters and fails to write a '\n', the output
line might never be generated.

Even if the output line *is* generated by fflush() or
the equivalent in fclose(), reading the data back again may
well retrieve a terminal '\n' that the original C did not
write. Again, imagine a block-oriented device where "end of
line" is not a character code but simply a block boundary.
Writing a '\n' may cause the block to be transmitted, with
no actual '\n' character written to the output. On input,
the library will read a block and synthesize a '\n' as an
in-stream marker that separates the data of one block from
the data of the next, even if the '\n' "wasn't there" in the
output stream of the program that wrote the file.

Similar considerations probably underlie the "trailing
blanks may disappear" rule.

If you look at the source code for ggets you will see that a final
line without a '\n' termination is automatically treated the same
as a line with a '\n' termination. This is tied into the
absorption of the '\n' terminating all lines.
 
