Why does this code cause a segmentation fault in Linux?


jacob navia

On 01/10/10 21:23, Keith Thompson wrote:
I agree that "-Wall" is mis-named. It's probably for historical
reasons. In any case, the documentation explains what it does and gives
a good overview of which warnings it enables and why.

As for needing an optimization flag, I suppose it does seem
counterintuitive if you haven't thought about what the compiler
does internally.

[snip explanation]
Bottom line: Enabling optimization typically gives you better
warnings. It's not immediately obvious, but it's something you
should only have to learn once.

Sure. As you say yourself, this is counter-intuitive, and only
historical reasons are proposed.
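
For concreteness, a minimal sketch of the behavior being discussed (the
file name, function, and exact messages below are invented for
illustration; diagnostics vary by gcc version, and newer releases can
warn even without optimization):

    /* uninit.c */
    int f(int x)
    {
        int y;                  /* never set on the x == 0 path */
        if (x)
            y = x * 2;
        return y;               /* possibly uninitialized use of 'y' */
    }

With the gcc releases under discussion (circa 2010):

    gcc -Wall -c uninit.c        (typically silent about 'y')
    gcc -Wall -O2 -c uninit.c    (warns that 'y' may be used
                                  uninitialized in this function)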
 

August Karlstrom

I suppose the alternative would be for the compiler to *always*
perform the analysis but only perform the optimizations when asked
to do so; that, however, would lose the ability to get much faster
compilation by turning off optimization.

I think it would be reasonable to always perform the analysis when
warnings are enabled (at the cost of compilation speed). From a user's
perspective, compiler optimizations and warning messages are two
completely different concepts.


/August
 

Kenny McCormack

The fact that `-Wall' doesn't enable all warnings and that an
optimization flag(!) is needed to get warnings about uninitialized
variables is nothing but laughable.

All true, but you won't get any sympathy here.

Or, IOW, if you want "D", you know where to find it.
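
As a concrete illustration of a diagnostic that `-Wall' leaves out
(the example is invented; -Wshadow is, per the gcc documentation,
enabled by neither -Wall nor -Wextra):

    /* shadow.c */
    int count(int n)
    {
        int total = 0;
        {
            int n = 10;          /* shadows the parameter 'n' */
            total += n;
        }
        return total;
    }

    gcc -Wall -Wextra -c shadow.c     (no warning about the shadowing)
    gcc -Wall -Wshadow -c shadow.c    (warns that the inner 'n' shadows
                                       a parameter)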
 

Keith Thompson

jacob navia said:
On 01/10/10 21:23, Keith Thompson wrote:
I agree that "-Wall" is mis-named. It's probably for historical
reasons. In any case, the documentation explains what it does and gives
a good overview of which warnings it enables and why.

As for needing an optimization flag, I suppose it does seem
counterintuitive if you haven't thought about what the compiler
does internally.

[snip explanation]
Bottom line: Enabling optimization typically gives you better
warnings. It's not immediately obvious, but it's something you
should only have to learn once.

Sure. As you say yourself, this is counter-intuitive, and only
historical reasons are proposed.

I did not say that.

I said that the fact that "-Wall" does not enable all warnings is
probably due to historical reasons.

I said that the fact that uninitialized variables are not detected
unless optimization is enabled is due to technical reasons.
 

Keith Thompson

jacob navia said:
On 01/10/10 18:14, Seebs wrote:

Try harder.

The reason optimization is an option is that it imposes

Yes.

Some people don't want to pay those costs,

Yes.

and spending a lot of time

Yes, but when you want ALL warnings, doing usage analysis is quite
useful. If you want fast compilation, then do not use -Wall.

This is just common sense.


Is it "unreasonable" to expect that using a variable without
initialization generates a warning?

OK. "Reasonable" is everything gcc does because it is GNU.

Sorry, I did not get that...

Perhaps you did not get it because *nobody said it*.

I've described what gcc does, and explained some of the reasons
behind it. I did not say or imply that what gcc does is reasonable
"because it is GNU", and I don't believe anyone else did either.

And Seebs, in case you didn't notice, was (mildly) criticizing gcc
for not including some "reasonable" warnings under "-Wall".

Please stop inventing things like this.

Some warnings require very little extra work. For example, a warning
about a conversion that might overflow would be issued only when the
compiler is already generating code to perform the conversion; most of
the added cost would be just printing the message. Other warnings, as
we've discussed, can require additional analysis that can significantly
slow down compilation.
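
A sketch of that kind of cheap conversion warning (the example is
invented; -Wconversion is one such option that -Wall does not enable):

    /* conv.c -- the narrowing conversion happens in code the compiler
       must generate anyway, so diagnosing it costs little extra */
    unsigned char low_byte(unsigned int v)
    {
        return v;                /* high bits are silently discarded */
    }

    gcc -Wall -c conv.c                 (silent)
    gcc -Wall -Wconversion -c conv.c    (warns that the conversion may
                                         change the value)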

I agree that it's counterintuitive, at least at first glance, that
uninitialized variables are not detected unless you enable optimization.
But the alternative would be that some specific warning options, but not
others, would slow down compilation. Furthermore, if "-Wall" enabled
extra analysis, then either it would also enable optimization (which
would be *really* counterintuitive), or the analysis and subsequent
optimization would have to be decoupled. In the latter case, you'd want
to make sure that the exact same code is generated regardless of what
warnings are enabled. I don't know the internals of gcc, but I can
imagine that might be very difficult.

So, I have some questions for you:

1. What exactly do you think gcc should do that's different from
what it does now? (I suggest being careful about any changes that
would break existing usage.)

2. Have you suggested any such changes to the gcc maintainers?

3. How does lcc-win handle this kind of thing? Does it warn about
uninitialized variables? Does it *always* perform the analysis
needed to detect them? Does enabling optimization enable more
warnings?
 

Seebs

And Seebs, in case you didn't notice, was (mildly) criticizing gcc
for not including some "reasonable" warnings under "-Wall".

Exactly. I am not a big fan of gcc's default behaviors and warning
behaviors. It's nice that you can get it to give a great number of
warnings, but I would be much happier if it were easier to get it
to give "all warnings for things that are genuinely likely to be issues,
but not merely stylistic warnings".

... Much though I used to hate the "extra parentheses around
assignment used as truth value" warning, I'm now willing to admit that
it's probably better on than off.

-s
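
The warning Seebs describes appears to be gcc's -Wparentheses
diagnostic, which -Wall does enable; a small made-up example (message
wording varies by version):

    /* paren.c */
    #include <stdio.h>

    void show(int a, int b)
    {
        if (a = b)               /* warning: suggest parentheses around
                                    assignment used as truth value */
            printf("%d\n", a);

        if ((a = b))             /* the extra parentheses say "I really
                                    do mean assignment", so -Wall
                                    stays quiet */
            printf("%d\n", a);
    }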
 
