Does -O3 enable more warnings than just -Wall -Wextra with gcc?


luser- -droog

I thought I remembered reading in this group that enabling maximum
optimizations with -O3 enabled extra code-path analysis which could
turn up warnings that otherwise would not be noticed. Not that it
"enables" warnings per se, but allows more warnings to be issued,
since more were detected.

Searching the group yielded no results. And nothing under
http://gcc.gnu.org/onlinedocs/gcc/Warning-Options.html#Warning-Options
or
http://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html#Optimize-Options

Am I full of crap?
 

glen herrmannsfeldt

luser- -droog said:
I thought I remembered reading in this group that enabling maximum
optimizations with -O3 enabled extra code-path analysis which could
turn up warnings that otherwise would not be noticed. Not that it
"enables" warnings per se, but allows more warnings to be issued,
since more were detected.

Could be, or it might find fewer. At higher optimization levels the
compiler knows the flow structure better, and might generate fewer false
positives.

-- glen
 

Stephen Sprunk

luser- -droog said:
I thought I remembered reading in this group that enabling maximum
optimizations with -O3 enabled extra code-path analysis which could
turn up warnings that otherwise would not be noticed. Not that it
"enables" warnings per se, but allows more warnings to be issued,
since more were detected.

Right. In a sense, many warnings are a side effect; without the
relevant optimization being enabled, GCC isn't doing the analysis
necessary to generate the warning.

Another way of looking at it is that, by default, GCC generates a
near-literal translation of your code; code that invokes undefined behavior
will do what most programmers would expect, so there is no need to warn
them. When you enable optimizations, though, GCC will do clever things
that are legal (in most cases) but likely to cause unexpected results in
the case of undefined behavior, hence the warnings.
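(For illustration, an example of my own, not from the thread: strict
aliasing is one such case. GCC's alias analysis only runs when
-fstrict-aliasing is in effect, which -O2 turns on, so -Wstrict-aliasing,
which -Wall enables, is typically silent at -O0 but may fire at -O2.)

#include <stdio.h>

int main(void) {
    int i = 42;
    float x = *(float *)&i;   /* type-punning; gcc -O2 -Wall may warn
                                 about dereferencing a type-punned
                                 pointer, while -O0 -Wall stays quiet */
    printf("%f\n", x);
    return 0;
}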

A diagnostic is required by the Standard in certain cases (constraint
violations?), but those tend to be fatal errors rather than warnings.

S
 

Ben Bacarisse

luser- -droog said:
I thought I remembered reading in this group that enabling maximum
optimizations with -O3 enabled extra code-path analysis which could
turn up warnings that otherwise would not be noticed. Not that it
"enables" warnings per se, but allows more warnings to be issued,
since more were detected.

Yes. A case in point (though not requiring -O3) is the recent post
about longjmp clobbering locals. The warning disappears when
optimisation is turned off. In fact the warning changes when using -Os
rather than, say, -O1. When optimising for space, the warning is more
accurate in that it names the actual object whose value might become
indeterminate.
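(A minimal sketch of the kind of code that can trigger it; this is an
assumed example, not Ben's actual one. An automatic, non-volatile object
modified between setjmp and longjmp has an indeterminate value afterwards,
and gcc's -Wclobbered, enabled by -Wextra, tends to notice this only once
optimisation is on.)

#include <setjmp.h>

static jmp_buf env;

int f(void) {
    int x = 0;              /* automatic and not volatile */
    if (setjmp(env) == 0) {
        x = 1;              /* modified after setjmp ... */
        longjmp(env, 1);    /* ... so its value is indeterminate here */
    }
    return x;               /* gcc -O1 -Wextra may warn that 'x' might
                               be clobbered by 'longjmp' or 'vfork' */
}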

<snip>
 

Keith Thompson

Stephen Sprunk said:
Right. In a sense, many warnings are a side effect; without the
relevant optimization being enabled, GCC isn't doing the analysis
necessary to generate the warning.

I believe that's correct.
Another way of looking at it is that, by default, GCC generates a
near-literal translation of your code; code that invokes undefined behavior
will do what most programmers would expect, so there is no need to warn
them. When you enable optimizations, though, GCC will do clever things
that are legal (in most cases) but likely to cause unexpected results in
the case of undefined behavior, hence the warnings.

I don't think gcc invokes that kind of reasoning when deciding
whether to issue a warning (and IMHO it shouldn't). If a given
construct has undefined behavior, I'd expect gcc or any reasonable
compiler to warn about it if it's able to do so, even if the behavior
is well defined *by the compiler* at the current optimization
level. If it fails to warn about something at -O0 (little or
no optimization), I expect it's because it doesn't have enough
information, not because it chooses not to warn about it. And the
definition of "what most programmers would expect" is slippery.
A diagnostic is required by the Standard in certain cases (constraint
violations?), but those tend to be fatal errors rather than warnings.

gcc issues non-fatal warnings by default for a lot of constraint
violations. (IMHO this is unfortunate.) You can override this with
"-pedantic-errors".
 

Stephen Sprunk

Keith Thompson said:
I don't think gcc invokes that kind of reasoning when deciding
whether to issue a warning (and IMHO it shouldn't).

I didn't mean to imply that GCC has such reasoning encoded in it, but
rather that this appears to be the view that GCC's _programmers_ take.
If a given construct has undefined behavior, I'd expect gcc or any
reasonable compiler to warn about it if it's able to do so, even if
the behavior is well defined *by the compiler* at the current
optimization level.

That is clearly not the case with any compiler I've ever used, as much
as I might wish it were so. Undefined behavior, aside from constraint
violations, rarely seems to generate a warning. It's also a slippery
slope; should a compiler be required to warn about unspecified or
implementation defined behavior, too?

Also, one of C's strengths is the ability to write non-portable code in
the same language as portable code; some desirable things simply can't
be done without invoking undefined (according to the Standard) behavior,
so one could make a decent argument that it shouldn't require warnings
when done deliberately.
the definition of "what most programmers would expect" is slippery.

It may also be circular, since many programmers seem to learn what to
expect by experimenting with a particular compiler rather than by
reading the Standard (or even that compiler's documentation). It might
be better to say that GCC generates a warning when optimization may
deliver _different_ results, without reference to expectations.
gcc issues non-fatal warnings by default for a lot of constraint
violations. (IMHO this is unfortunate.) You can override this with
"-pedantic-errors".

Or "-pedantic" combined with "-Werror". I nearly always use the latter,
since all of the environments I work in require that code compile with
no errors at "-W -Wall", and I use the former unless dealing with code
that requires certain GCC-specific extensions, e.g. the Linux kernel.
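(For concreteness, that combination would look something like this; the
file name is hypothetical.)

gcc -std=c99 -pedantic -W -Wall -Werror -c foo.c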

S
 

Keith Thompson

Stephen Sprunk said:
I didn't mean to imply that GCC has such reasoning encoded in it, but
rather that this appears to be the view that GCC's _programmers_ take.

I didn't phrase that particularly well. I was referring to the
reasoning of the gcc developers; I don't claim that gcc itself
"reasons".

Do you have any examples of this? What I think we're talking about is
cases where a given construct has undefined behavior, and gcc warns
about it only at a higher optimization level *specifically because the
behavior without optimization is considered acceptable*. I'm not aware
of any such cases. There are cases (I'm fairly sure) where gcc warns
about something only at higher optimization levels because it otherwise
doesn't have the information.

An example:

#include <stdio.h>
#include <limits.h>

int main(void) {
    const int n = INT_MAX;
    printf("%d\n", n + 1);
}

gcc warns about this with "-O1 -std=c99 -Wall", but not with
"-O0 -std=c99 -Wall".
That is clearly not the case with any compiler I've ever used, as much
as I might wish it were so. Undefined behavior, aside from constraint
violations, rarely seems to generate a warning. It's also a slippery
slope; should a compiler be required to warn about unspecified or
implementation defined behavior, too?

Compilers aren't *required* to warn about anything. Decent ones warn
about undefined behavior when they can, but it's impossible to do so in
all cases.
Also, one of C's strengths is the ability to write non-portable code in
the same language as portable code; some desirable things simply can't
be done without invoking undefined (according to the Standard) behavior,
so one could make a decent argument that it shouldn't require warnings
when done deliberately.

Which is why casts typically inhibit warnings. But it's impossible in
general to determine whether a given violation is deliberate.
It may also be circular, since many programmers seem to learn what to
expect by experimenting with a particular compiler rather than by
reading the Standard (or even that compiler's documentation). It might
be better to say that GCC generates a warning when optimization may
deliver _different_ results, without reference to expectations.


Or "-pedantic" combined with "-Werror". I nearly always use the latter,
since all of the environments I work in require that code compile with
no errors at "-W -Wall", and I use the former unless dealing with code
that requires certain GCC-specific extensions, e.g. the Linux kernel.

"-Werror" is often a good idea; on the other hand, it makes gcc
non-conforming, since it causes it to reject some conforming code.
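(A sketch of why, example mine: the following is strictly conforming, but
-Wall diagnoses the unused variable, and -Werror turns that diagnostic
into rejection of a conforming program.)

int main(void) {
    int unused = 0;   /* -Wall warns "unused variable"; fatal with -Werror */
    return 0;
}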
 

James Kuyper

On 12/02/2013 03:02 PM, David Brown wrote:
....
What desirable things can only be achieved by invoking undefined
behaviour? There is a lot that requires implementation-defined
behaviour, and there are times when you need something that has no
spec'ed behaviour at all (using compiler extensions, inline assembly,
etc.), but I don't see off-hand when you would /need/ undefined behaviour.

Talking about these issues can get confusing unless you realize that the
relevant terms are specialized jargon with meanings defined by the C
standard, rather than having the meaning that would normally apply under
the rules of ordinary English. For instance, in the C standard,
"undefined behavior" doesn't mean "behavior that is not defined". It
means "behavior that is not defined by the C standard". Similarly, in
the C standard, "implementation-defined behavior" does not mean
"behavior that is defined by the implementation"; it means "behavior
that the implementation is required by the C standard to document".

Thus, it is not possible to achieve anything by writing code that has
behavior that is not defined by anything; but it is possible to achieve
something useful by writing code that has "undefined behavior" - because
something other than the C standard defines the behavior, such as the
compiler's documentation, or a platform-specific API, or the OS. Such
behavior is still "undefined", as far as the C standard is concerned.

Similarly, it is not necessary that the behavior be "implementation
defined", it's sufficient that it be defined by the implementation, even
if the standard imposes no requirement that the implementation document
the definition that it provides. Such behavior is not
"implementation-defined", as far as the C standard is concerned.
 

Stephen Sprunk

David Brown said:
It is good to warn about code that might not perform as the
programmer expects - whether the behaviour is well-defined in the
specs, implementation defined, or undefined. (It's also good that
such warnings can be controlled by flags to the users' likings.)

For example, gcc will warn about "if (a = 3) ..." with the right
flags. The behaviour is clearly defined in the specs - there is no
ambiguity or undefined behaviour - but it is probably a typo by the
programmer.

Keith's expectation above is that the compiler will warn about any
undefined behavior (that the compiler can detect) in the code presented,
apparently without regard to whether it was deliberate.
So there is no slippery slope that I can see here.

The main motivation I see for warning about undefined behavior is that
the programmer be made aware that the code may not do what he expects it
to do (whatever that may be). That logic holds for
implementation-defined and unspecified behavior as well; the results are
constrained, but it's still possible that the code may not do what he
expects it to do, either on the current implementation or when moved to
another.
David Brown said:
What desirable things can only be achieved by invoking undefined
behaviour? There is a lot that requires implementation-defined
behaviour, and there are times when you need something that has no
spec'ed behaviour at all (using compiler extensions, inline
assembly, etc.), but I don't see off-hand when you would /need/
undefined behaviour.

The most obvious case would be an OS kernel, in particular device
drivers. You must do things that the C Standard leaves undefined, e.g.
writing to particular memory locations that are memory-mapped to device
registers or altering page tables, but it works because the OS (or
hardware, or ABI, or whatever) _does_ define the behavior in those
cases, even though the C Standard does not require it to do so or even
acknowledge such cases.
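(A sketch of the device-register case; the address and register layout are
invented for illustration. The C standard says nothing about what this
store does, but the hardware manual defines it.)

#define LED_REG (*(volatile unsigned int *)0x40021000u)  /* made-up address */

void led_on(void) {
    LED_REG |= 1u;   /* read-modify-write of a memory-mapped register */
}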

Most "interesting" OS APIs probably fall into the same category, and
that's fine because nobody _expects_ such code to be portable, but it is
of great value that one can do such things in C--and mix it with code
that _is_ expected to be portable, written in the same language.

S
 

Stephen Sprunk

Keith Thompson said:
Do you have any examples of this? What I think we're talking about
is cases where a given construct has undefined behavior, and gcc
warns about it only at a higher optimization level *specifically
because the behavior without optimization is considered acceptable*.
I'm not aware of any such cases. There are cases (I'm fairly sure)
where gcc warns about something only at higher optimization levels
because it otherwise doesn't have the information.

No specific examples come to mind, but generally speaking, GCC is
obviously capable of generating such warnings since it does so when
optimization is enabled, yet at some point someone chose not to do the
same analysis when optimization is disabled even though the warnings
could obviously be given without affecting code generation.

When I was first learning C, I had no trouble making my programs compile
quietly and work as expected with -O0. However, with -O3, I would get
dozens of new warnings--and my code no longer worked as expected. So,
correctly or not, I learned that GCC only warns me when it thinks it's
doing something that I don't expect.
Which is why casts typically inhibit warnings.

For the specific case where the problem is a disallowed implicit
conversion, sure, but useful undefined behavior is larger than that.
But it's impossible in general to determine whether a given
violation is deliberate.

Hence the problem with a proposal to _require_ warnings.

IIRC, MSVC has #pragmas to disable individual warnings, but in my
experience that is used more often to protect bad code from discovery
than to suppress incorrect/spurious warnings about valid code. I
couldn't in good conscience suggest that GCC add that misfeature.
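(For reference, the MSVC mechanism looks roughly like this; the warning
number is illustrative, C4996 being the deprecation warning for functions
such as strcpy.)

#pragma warning(push)
#pragma warning(disable: 4996)   /* e.g. silence the strcpy deprecation */
/* ... code the warning would otherwise flag ... */
#pragma warning(pop)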

S
 

James Kuyper

On 02-Dec-13 13:55, Keith Thompson wrote: ....

No specific examples come to mind, but generally speaking, GCC is
obviously capable of generating such warnings since it does so when
optimization is enabled, yet at some point someone chose not to do the
same analysis when optimization is disabled even though the warnings
could obviously be given without affecting code generation.

The analysis affects code generation by delaying completion by the
amount of time that is required to perform the analysis. gcc (rightly or
wrongly) chooses not to spend that time unless explicitly asked to
perform the optimizations enabled by that analysis. That doesn't
necessarily mean:
 

Keith Thompson

Stephen Sprunk said:
No specific examples come to mind, but generally speaking, GCC is
obviously capable of generating such warnings since it does so when
optimization is enabled, yet at some point someone chose not to do the
same analysis when optimization is disabled even though the warnings
could obviously be given without affecting code generation.

I think your understanding is reversed (or, conceivably, mine is).

When you request optimization (via "-O3" or whatever),
gcc performs additional analysis of your code so that it can
determine what optimizations can be performed without breaking the
required behavior. A side effect of that analysis is that it can
detect things that might cause undefined behavior. For example,
tracking the value that an object holds during execution both (a)
enables optimization (such as replacing a reference to the object
with a constant if the compiler can prove that it must hold some
particular value), and (b) enables some warnings (such as an
overflow warning, because it's been able to figure out that the value
you're incrementing at run time happens to be INT_MAX).

It refrains from performing that analysis at -O0 simply because
it takes additional time and memory, and "gcc -O0" means roughly
"Compiler: Please generate correct code quickly".

It might have made some sense not to tie these two things together, so
that the compiler could perform the analysis needed to diagnose (some
instances of) undefined behavior without generating optimized code, but
there probably isn't enough demand for that.
When I was first learning C, I had no trouble making my programs compile
quietly and work as expected with -O0. However, with -O3, I would get
dozens of new warnings--and my code no longer worked as expected. So,
correctly or not, I learned that GCC only warns me when it thinks it's
doing something that I don't expect.

It's likely that your code has undefined behavior regardless of the
optimization level (in fact, UB is always independent of the
optimization level), and it just happened to "work" at "-O0" and not at
"-O3". Which is why compiling with "-O3", even if you don't intend to
run the optimized code, can be a good way to flush out bugs.
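(One possible invocation for that, with a hypothetical file name,
discarding the object file and keeping only the diagnostics:)

gcc -O3 -Wall -Wextra -c foo.c -o /dev/null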
For the specific case where the problem is a disallowed implicit
conversion, sure, but useful undefined behavior is larger than that.


Hence the problem with a proposal to _require_ warnings.

I don't recall any such proposal.
 

Philip Lantz

Stephen said:
IIRC, MSVC has #pragmas to disable individual warnings, but in my
experience that is used more often to protect bad code from discovery
than to suppress incorrect/spurious warnings about valid code.

My experience is the opposite: I have frequently needed to use the MSVC
pragma to disable bogus warnings about perfectly good code. (One
example: it generated a warning about the declaration of a flexible
array member, which was defined and used according to the standard, and
correctly implemented by the compiler; I couldn't figure out why they
felt a need to generate a warning.)
I couldn't in good conscience suggest that GCC add that misfeature.

Gcc has essentially the same feature, except that it puts the control on
the command line, rather than in the source file. Gcc has somewhat less
fine-grained control than MSVC. It also seems to have fewer problems with
bogus warnings, in my experience.
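(For example, gcc's command-line controls look like this; the flags are
standard gcc options and the file name is hypothetical.)

gcc -Wall -Wno-unused-variable -Werror=format foo.c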
 

ais523

Philip said:
Stephen said:
IIRC, MSVC has #pragmas to disable individual warnings, but in my
experience that is used more often to protect bad code from discovery
than to suppress incorrect/spurious warnings about valid code. [snip]
I couldn't in good conscience suggest that GCC add that misfeature.

Gcc has essentially the same feature, except that it puts the control on
the command line, rather than in the source file. Gcc has somewhat less
fine-grained control than MSVC. It also seems to have fewer problems with
bogus warnings, in my experience.

Just thought that you should know that modern gcc does allow warning
control from the source file.

I mostly use this in case of false positives (or occasionally to
increase the warning level for a section of code to include a warning
that I want to make use of but which is disabled by default even with
-Wextra due to too many false positives).

Here's an example of the syntax:

== cut here ==
int main(void)
{
    int x = 1/0;
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wdiv-by-zero"
    int y = 1/0;
#pragma GCC diagnostic pop
    int z = 1/0;
    return x+y+z;
}
== cut here ==

I get warnings for the assignments to x and z, but not for the
assignment to y, because the pragma specifically requests no warning.
 

Keith Thompson

David Brown said:
(I know that the terms "undefined behaviour" and "implementation defined
behaviour" have special meanings in C, and I believe I have a fair
understanding of them - partly due to the good folks in c.l.c - but I am
always happy to have my understandings refined, improved and corrected.)

Suppose the compiler encounters code like this:

extern pid_t waitpid(pid_t pid, int *status, int options);
pid_t x = waitpid(y, &status, 0);


What you are saying here is that the call to "waitpid" is undefined
behaviour, because the C standards don't say anything about this
function (it being an OS kernel call). To my understanding, it /is/
defined behaviour - the compiler will generate code to put the
parameters "y", "&status" and "0" onto the stack (or whatever is
required by the calling conventions), call the externally linked
function, and return the value to "x". The standards don't define the
behaviour of the function itself (unlike for standard functions such as
strcpy), but they define how the function is to be called.

The same applies to other things you mentioned, such as writing to
memory-mapped registers in device drivers. The action of these
registers is not defined, but the act of writing those values to those
addresses /is/ defined behaviour.


The common image of "undefined behaviour" is that the compiler can
generate code to make daemons fly out your nose when you write something
like "*(int*)0 = 0;" - it certainly cannot invoke nasal daemons when you
call "waitpid" !


I realise (as James pointed out) that "undefined behaviour" means
"behaviour undefined by the C standard", rather than generally unknown
behaviour (the behaviour of "waitpid" is hopefully well defined in the
kernel's documentation). But I feel that the act of invoking such
functions /is/ well defined in the C standards.



Presumably, given your and James' posts, something is wrong with my
reasoning above. I just don't see exactly where.

waitpid(y, &status, 0) is a function call, and the standard discusses
how function calls work. It doesn't define the behavior of waitpid()
(POSIX does), so the behavior of that particular function is
undefined *by the C standard*. And if waitpid() happens to execute
`*(int*)0 = 0;`, then the call *could* in principle result in
nasal demons.

To determine whether the behavior of the call is defined by the C
standard, you'd have to look at the code that implements waitpid().
 

James Kuyper

On 12/03/2013 03:05 PM, Keith Thompson wrote:
....
waitpid(y, &status, 0) is a function call, and the standard discusses
how function calls work. It doesn't define the behavior of waitpid()
(POSIX does), so the behavior of that particular function is
undefined *by the C standard*. And if waitpid() happens to execute
`*(int*)0 = 0;`, then the call *could* in principle result in
nasal demons.

To determine whether the behavior of the call is defined by the C
standard, you'd have to look at the code that implements waitpid().

I'm sure you're aware of the issue I'm about to raise, but for the sake
of other readers I want to point out that such functions are often
written in some other language (such as assembler). Even if they are
written in C, they are often written in code that is not strictly
conforming C. In some cases the use of another language or of
not-strictly conforming C will be indirect, through a subroutine call.
Regardless of how such code is reached, the behavior of that code will
not be defined by C.
 

Jorgen Grahn

luser- -droog said:
I thought I remembered reading in this group that enabling maximum
optimizations with -O3 enabled extra code-path analysis which could
turn up warnings that otherwise would not be noticed. Not that it
"enables" warnings per se, but allows more warnings to be issued,
since more were detected.

Searching the group yielded no results. And nothing under

From said:
#include <stdio.h>
#include <math.h>
#define M_PI 3.14159

int main()
{
    double theta, phi, sinth;
    double count;
    double incr;
    double s;

    s = ((double) 180)/M_PI; /* converting to radians */
    incr = 0.5;
    theta = (double) 0;

    for(theta = incr; theta < (double) 180; theta += incr)
        sinth = sin(s *theta);
    for(phi = 0; phi < (double) 360 ; phi += incr/ sinth)
        count ++;
    printf("%f", count);
    return 0;
}

% gcc -std=c99 -Wall -Wextra -pedantic -c foo.c
(no warnings)
% gcc -std=c99 -Wall -Wextra -pedantic -O1 -c foo.c
foo.c: In function 'main':
foo.c:19:28: warning: 'count' may be used uninitialized in this
function [-Wmaybe-uninitialized]

In this case enabling optimization /at all/ made a difference, but -O3
is what I used in that posting.

Perhaps there is more information (or disinformation) around the
postings

<[email protected]>
<[email protected]>
<[email protected]>

/Jorgen
 

Stephen Sprunk

On 12/03/2013 03:05 PM, Keith Thompson wrote: ....

I'm sure you're aware of the issue I'm about to raise, but for the
sake of other readers I want to point out that such functions are
often written in some other language (such as assembler). Even if
they are written in C, they are often written in code that is not
strictly conforming C. In some cases the use of another language or
of not-strictly conforming C will be indirect, through a subroutine
call. Regardless of how such code is reached, the behavior of that
code will not be defined by C.

The internals of nearly every system call will invoke undefined
behavior, e.g. inline assembly for the syscall interface, but I was
thinking of cases where the desired effect either creates undefined
behavior in the caller's environment or essentially requires the caller
to invoke such itself to be useful, e.g. fork(), mmap(), aio_read().

An OS kernel will have lots of UB itself in areas that interact directly
with the hardware, e.g. device drivers and memory managers, though other
areas can be (but still often aren't) written portably. Being able to
do both in the same language is one of the key strengths of C.

S
 

Malcolm McLean

James Kuyper said:
I'm sure you're aware of the issue I'm about to raise, but for the sake
of other readers I want to point out that such functions are often
written in some other language (such as assembler). Even if they are
written in C, they are often written in code that is not strictly
conforming C. In some cases the use of another language or of
not-strictly conforming C will be indirect, through a subroutine call.
Regardless of how such code is reached, the behavior of that code will
not be defined by C.
There are two common situations. One is that you need to read from or
write to a memory-mapped address, and typically that's written in C. The
result of something like *(volatile int *)123 = 42 is usually not just to
set memory location 123 to 42; there's a side effect, like turning on an
LED. Otherwise you'd just use regular memory on the heap or the stack.
The other common case is that you need to issue some sort of special
command to the processor, like generating or returning from an interrupt,
executing a parallel multiply instruction, or starting a background block
transfer, something that doesn't fit the standard C programming model of
reading/writing memory, arithmetical operations, branches on condition,
and subroutine calls. Nowadays you normally see just one or two assembler
instructions embedded in what's essentially still a C subroutine, because
there will be conditional jumps, reads/writes, and arithmetical logic
around the special operation, and it's easier to keep this in C.

C can't define what the behaviour of the subroutine will be, because you've
gone beyond the scope of the language. However the use of C is predicated
on the assumption that most of the C constructs will have an effect which
can be predicted by someone familiar with C. Otherwise you wouldn't try to
use C at all.
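(A sketch of that pattern, using GCC's extended-asm syntax; the x86 "sti"
instruction is chosen purely for illustration.)

static inline void enable_interrupts(void) {
    __asm__ volatile ("sti");   /* one special instruction embedded in C */
}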
 

James Kuyper

On 12/04/2013 02:44 AM, David Brown wrote:
....
But this means that there are two different types of "undefined
behaviour" in C - behaviour that the compiler /knows/ is bad, and
behaviour that the compiler does not know to be good. For example, the
compiler can use the undefined nature of signed overflow to simplify "(x
+ 1) > x" to "1" (for signed x). But it can't use the undefined nature
of "waitpid" to do something unexpected.

That's not quite the right way to think about it. In general, the reason
why the committee chose to leave the behavior of a C construct undefined
was that doing so gives the implementation the freedom needed to do
whatever it considers most appropriate. In particular, if the
behavior of a call to waitpid() were well-defined, that would force the
implementor to make sure that the call had precisely the behavior
defined for it by the C standard (whatever that might be), whether or
not the OS provided a function of the same name, and it would have to do
so even if the behavior specified by the C standard were different from
the behavior provided by the OS function of the same name.

Because the behavior is undefined, an implementation of C is free to
implement the call as simply conforming to the calling conventions of
the corresponding platform, and leaving the details of what happens when
it is called up to the library containing the corresponding function.
 
