Books for advanced C++ debugging


Joshua Maurice

Joshua Maurice said:
C++ is not Java.  C++'s stated primary design goals
include
- runtime performance comparable with assembly

For most programs, we don't care about the speed.
- don't pay for what you don't use

I wish you'd paid for the uncaught bugs left in executables that
affect the users.
[...]
- easy to write code / programmer productivity (with less relative
  emphasis on this one IMHO)

Programmers would be more productive if the implementations helped to
catch bugs at run-time.

I wholeheartedly agree that current C++ compilers make me sad. They
make me sad for lack of standard compliance (some recent versions of
MSVC don't support covariant return types with multiple inheritance,
all compilers have bugs:
http://www.cs.utah.edu/~regehr/papers/emsoft08-preprint.pdf
etc.) and lack of developer-focused tools. I would very much want
every compiler out there to use "fat" pointers and other techniques to
catch all undefined behavior, either at compile time or runtime. I
also very much want this to be entirely optional, and for it to be
expressly stated that no "good" C++ program should depend upon such
checks; they should exist only as "terminate the process" asserts.
For most programs, we don't care about the speed.

Agreed, and with the current state of the C++ industry, C++ is not the
best language for every situation. Perhaps Java is more useful for
most programs.

I very much want C++ to remain focused on runtime performance.
However, *at least* for development purposes, I would also very much
like *optional* Java-like runtime checks to catch all undefined
behavior. Unfortunately, it's impractical because it would break all
platform ABIs, and it requires compiler writers to write such things,
which apparently isn't going to happen anytime soon.
 

Brian Wood

Joshua Maurice said:
On Jul 10, 3:05 am, (e-mail address removed) (Pascal J. Bourguignon)
wrote:
[snip]
The problem is that C compiler writers don't bother
writing the run-time checks that would detect these bugs,
much less doing the type inference that would be needed to
detect a small number of them at compilation time.
You seem to be taking the opinion that compilers should
catch all undefined behavior.
Not necessarily ALL the implementations (compilers or
interpreters), but there should be such implementations, and
those should be the implementation used most of the time,
because most of the time, C++ programs are mere application
programs that would benefit much more from  run-time checking
than from fast instructions (the more so on modern processors,
where it's pointless to go fast in the processor, since you
always are waiting on the RAM).

I think that there are some implementations.  At least in the
past, CenterLine caught most cases of undefined behavior.  I
don't know what its current status is, but it is still being
sold.  (http://www.ics.com/products/centerline/objectcenter/,
for more information.)

I agree with you that such a compiler should be the default and
usually used compiler.  I have the impression, however, that we
are in a very small minority---at any rate, I don't have the
impression that CenterLine is a market leader.  (ICS, which owns
it, seems to push its GUI expertise and products considerably
more.)

    [...]
Implementations of other programming languages are able to do
so, why not implementations of C++?  It's perfectly reasonable
to expect it, and as a user of C++, I'd rather use such an
implementation for 100% of my C++ development, and 99% of my
C++ program deployment.

Implementations of C++ are capable of doing a lot more than they
do.  Apparently, the market doesn't want it.  

My take is that most C++ compiler vendors were/are attempting
to ride 20th-century business models. They have not adapted
to an online model, and what worked well for decades is now
tanking. I've said it before, but I think there are only
two C++ compilers with minor online support. Comeau hasn't
made much progress in its online support in years. They work
on adding new functionality to their existing products, but
not, from what I can tell, on reworking their products to
beef up their online support. There are some dinosaurs out
there, company-wise, that are being punished now for not
understanding the times ten years ago, let alone today.
Your remark about reading what the market wants is related to
online products as well. In the past, with very limited
feedback, vendors had to try to figure out what they should
do next. With an online approach there's much more concrete
information on which to base product development decisions.

(Should we
conclude that C++ programmers don't care about quality, or
programmer productivity?)

Some programmers care primarily about money, and about quality
only insofar as it may affect how much money they make.
I'm thinking of that Russian guy who may have stolen a
bunch of software from Goldman Sachs.


Brian Wood
Ebenezer Enterprises
www.webEbenezer.net
 

James Kanze

[snipped discussion about run-time error when OP accessed
uninitialized memory. OP complained that C++ compiler (gcc)
could not detect this even though its warning level was set to
highest]
You seem to be taking the opinion that compilers should
catch all undefined behavior. C++ is not Java. C++'s stated
primary design goals include
- runtime performance comparable with assembly
- don't pay for what you don't use
- portable
- easy to write code / programmer productivity (with less relative
emphasis on this one IMHO)
With these design goals in mind, it is not reasonable to
expect a compiler to catch all possible undefined behavior
or errors. To do that would necessarily restrict the
language so that it's less comparable to assembly in speed
and/or you start paying for things you don't use.
That's not strictly true. Both the C and the C++ standards
were designed so that all undefined behavior can be caught.
really? Where does it say that? Do you mean at compile time or
at run-time?

At run-time, at the latest. (I think that there is some which
can't be detected at compile time. But probably a lot less than
one might think---compilers have gotten quite good at tracing
intermodular code flow.) And it's scattered throughout the
standard. Mostly in the form of "undefined behavior"---the
behavior is undefined precisely so that a checking
implementation can trap it.
I'd always thought that about half of UB was in the spec precisely
because it was too hard to detect.

Too hard, no. Too expensive, perhaps: to catch all pointer
violations, you need "fat" pointers---each pointer contains a
current address plus the limits; each modification of the
pointer value verifies that the current address stays within the
limits, and each access through the pointer verifies that it
isn't using the end pointer (and that the pointer isn't null,
but most hardware traps this already today).

Of course, a good compiler could eliminate a certain number of
these checks, or at least hoist them outside of a loop. But I
don't think it could easily avoid the fact that the size of a
pointer is multiplied by three, which makes things like copying
significantly more expensive, and can have very negative effects
on locality.
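
To make that concrete, here is a minimal sketch of such a "fat" pointer
written as an ordinary class. It is purely illustrative: a checking
implementation would generate the equivalent for every T* itself, and
the name FatPtr is made up.

#include <cstddef>
#include <cstdio>
#include <cstdlib>

template <typename T>
class FatPtr {
    T* cur;    // current address
    T* first;  // first valid element
    T* last;   // one past the last valid element
public:
    FatPtr(T* base, std::size_t n) : cur(base), first(base), last(base + n) {}

    FatPtr& operator+=(std::ptrdiff_t d) {
        cur += d;
        // Arithmetic may form the end pointer, but nothing outside [first, last].
        if (cur < first || cur > last) {
            std::fprintf(stderr, "pointer arithmetic out of bounds\n");
            std::abort();
        }
        return *this;
    }

    T& operator*() const {
        // A dereference must not use the end pointer (or a null pointer).
        if (cur == 0 || cur < first || cur >= last) {
            std::fprintf(stderr, "out-of-bounds access\n");
            std::abort();
        }
        return *cur;
    }
};

int main() {
    int a[4] = {1, 2, 3, 4};
    FatPtr<int> p(a, 4);
    p += 3;   // ok: still inside the array
    *p = 42;  // ok
    p += 1;   // ok: forming the end pointer is allowed
    *p = 0;   // trapped: dereferencing the end pointer aborts
}

Note that the object is three pointers wide, which is exactly the size
and locality cost described above.
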
The other half was hardware stuff, things like what the modulo
operator does with negative numbers

That's unspecified, not undefined behavior.

Takes a pointer. If the pointer contains the bounds, then it
can easily check.
ITYM detecting the access of uninitialized memory through
aliasing at compile time is equivalent to the Halting Problem.
I can't quite work out how to break a fat-pointer
implementation, but can't you do some very nasty things with
printf("%p") and scanf("%p")?

Given that the standard makes these implementation defined, I
don't think so. It might make detecting undefined behavior
expensive, however. About the only way I think that an
implementation could determine that the value read by
scanf("%p") is "a value converted eariler during the same
program execution" is by saving all of the pointers output by
printf("%p") somewhere. (Inputting any other value is undefined
behavior.)
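
As a purely hypothetical sketch of that bookkeeping (none of the names
below exist in any real library), a checking implementation might route
%p conversions through something like this:

#include <cstdio>
#include <cstdlib>
#include <set>
#include <string>

// Hypothetical sketch only: how a checking implementation might validate
// scanf("%p").  The names are invented; no real library works this way.
static std::set<std::string> printed_pointers;

void checked_print_p(const void* p) {
    char buf[32];
    std::snprintf(buf, sizeof buf, "%p", p);
    printed_pointers.insert(buf);   // remember every value ever printed
    std::printf("%s", buf);
}

void* checked_scan_p() {
    char buf[32];
    if (std::scanf("%31s", buf) != 1)
        return 0;
    // A value never produced by %p during this execution is undefined
    // behavior; the checking implementation traps it.
    if (printed_pointers.count(buf) == 0) {
        std::fprintf(stderr, "scanf(\"%%p\"): value was never printed\n");
        std::abort();
    }
    void* p = 0;
    std::sscanf(buf, "%p", &p);
    return p;
}

int main() {
    int x = 0;
    checked_print_p(&x);
    std::printf("\n");
    void* p = checked_scan_p();   // accepts only a value printed above
    return p == &x ? 0 : 1;
}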
 

James Kanze

[ ... ]
[...]
Also, various versions of MSVC do have optional runtime bounds
checking and other runtime checking.
Good!
But only in the standard library, I think.
Not so -- recent versions have flags to tell it to include
runtime checks in your code. A short description is available
at:

Well, it's a start, although it's still very limited.
[ ... ]
Arrays in C are very poorly designed, and C++ has inherited
this. In order to do full run-time checking, you need fat
pointers. Which not only slows the code down considerably,
but also breaks the ABI. If you rigorously avoid C style
arrays, and only use std::vector, g++ does run-time checking.
(But as soon as you do something like &v[i], all bets are
off with regards to the resulting pointer.)

Interestingly, the run-time checks provided by MS VC++ have
almost exactly the same limitation in one respect -- they can
track (to a degree) whether you use uninitialized variables,
but taking the address is treated as equivalent to
initialization.

Which wasn't really what I was talking about. My point was that
having done &v[i], you have a raw pointer, on which you can do
pointer arithmetic, and access out of bounds without checking,
even if the implementation checks in vector<>::operator[].

But the documentation on how VC++ detects uninitialized
variables did seem weird to me. For a runtime check, I'd have
just associated an additional flag somewhere, setting it when
the variable was set, and checking it otherwise. Perhaps the
problem is that if initialization occurs through a pointer, the
compiler can't generate the code to update the flag, so if the
address is taken (which would allow such initialization), it
just gives up.

Come to think of it, that's very likely the reason. The obvious
way of being able to find the flag through the pointer would
change the size or the range of the data type. And the other
ways I can think of are fairly complex, and probably rather
expensive in run-time. And you're right that that's sort of the
same problem as with &v[i]---the use of a pointer causes the
code to "lose" any associated data.
 

James Kanze

On Jul 15, 1:39 am, (e-mail address removed) (Pascal J. Bourguignon)
wrote:

[...]
I would very much want
every compiler out there to use "fat" pointers and other techniques to
catch all undefined behavior, either at compile time or runtime. I
also very much want this to be entirely optional, and for it to be
expressly stated that no "good" C++ program should depend upon such
checks; they should exist only as "terminate the process" asserts.

The "option" is tricker than you seem to realize. Anything
which changes the size of an object (e.g. fat pointers vs.
normal pointers) breaks the ABI. You can't link object files
compiled with different options. (Or maybe you can link, but
the resulting program will just crash.)

Note that this is already the case with compilers which provide
"debugging" versions of std::vector and others. The debugging
changes the size and the behavior of std::vector, and mixing
code with and without debugging causes core dumps.
Agreed, and with the current state of the C++ industry, C++ is
not the best language for every situation. Perhaps Java is
more useful for most programs.

Only if you can accept a fairly low level of robustness.
 

Bo Persson

Michael said:
Well, I am no expert on Ada, but I had a look at Ada 2005 when
searching for other languages to learn, and wrote only some simple
programs. I finally changed to Haskell and OCaml just to learn some
new principles of programming.

Anyway, the Ada people claim that a lot of these checks can be
optimised out by the compiler and the remaining ones are rather
inexpensive.

This was designed into the language from the beginning, so Ada arrays
know their size so you can iterate over a'range. No need to range
check.

for i in a'range loop
a(i) :=1; -- always in range
end loop;


Also, the index type can be a subtype restricted to the allowed range
of the array type. As the index then just cannot be out of range,
there is no need to check a(i).


In C++ we have it differently.


Bo Persson
 

Pascal J. Bourguignon

Bo Persson said:
This was designed into the language from the beginning, so Ada arrays
know their size so you can iterate over a'range. No need to range
check.

for i in a'range loop
a(i) :=1; -- always in range
end loop;


Also, the index type can be a subtype restricted to the allowed range
of the array type. As the index then just cannot be out of range,
there is no need to check a(i).


In C++ we have it differently.

Notice that with intensive use of classes, we could achieve similar results:

template <int MIN, int MAX> class Integer {
    int value;
public:
    Integer(int aValue) { rangeCheck(MIN, MAX, aValue); value = aValue; }
    // operators...
};

std::vector<X> v;
for (Integer<0, v.size()-1> i = 0; i < v.size()-1; i++) {
    v[i]; // no check needed.
}


Ok, perhaps a more intelligent compiler and some work on the syntax is
needed, but you get the idea.


Now, perhaps it might be slightly easier to write: for i in a'range...
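
For what it's worth, a compilable variant of the same idea can carry the
bounds at run time rather than as template parameters (a template argument
cannot depend on v.size()). The class name below is invented for the example.

#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <vector>

class CheckedIndex {
    std::size_t value;
    std::size_t limit;      // one past the largest valid index
    void check() const {
        if (value > limit) {            // may equal limit, like an end iterator
            std::fprintf(stderr, "index out of range\n");
            std::abort();
        }
    }
public:
    CheckedIndex(std::size_t v, std::size_t lim) : value(v), limit(lim) { check(); }
    std::size_t get() const { return value; }
    CheckedIndex& operator++() { ++value; check(); return *this; }
};

int main() {
    std::vector<int> v(10);
    for (CheckedIndex i(0, v.size()); i.get() < v.size(); ++i)
        v[i.get()] = 1;     // the index is known to be in range here
}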
 

Ian Collins

Pascal said:
Bo Persson said:
This was designed into the language from the beginning, so Ada arrays
know their size so you can iterate over a'range. No need to range
check.

for i in a'range loop
a(i) :=1; -- always in range
end loop;


Also, the index type can be a subtype restricted to the allowed range
of the array type. As the index then just cannot be out of range,
there is no need to check a(i).


In C++ we have it differently.

Notice that with intensive use of classes, we could achieve similar results:

template <int MIN, int MAX> class Integer {
    int value;
public:
    Integer(int aValue) { rangeCheck(MIN, MAX, aValue); value = aValue; }
    // operators...
};

std::vector<X> v;
for (Integer<0, v.size()-1> i = 0; i < v.size()-1; i++) {
    v[i]; // no check needed.
}


Or use tr1::array and iterators. This is closer to the Ada concept of
the array knowing its size.
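
Something along these lines, assuming std::array (std::tr1::array at the
time of this thread): the array type carries its size, so iterating with
its own iterators never produces an out-of-range index, which is close in
spirit to Ada's "for i in a'range".

#include <array>
#include <iostream>

int main() {
    std::array<int, 5> a;

    // The iterators come from the array itself, so every access is in range.
    for (std::array<int, 5>::iterator it = a.begin(); it != a.end(); ++it)
        *it = 1;            // always in range; no per-element check needed

    for (std::array<int, 5>::const_iterator it = a.begin(); it != a.end(); ++it)
        std::cout << *it << ' ';
    std::cout << '\n';
}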
 

Joshua Maurice

On Jul 15, 1:39 am, (e-mail address removed) (Pascal J. Bourguignon)
wrote:

    [...]
I would very much want
every compiler out there to use "fat" pointers and other techniques to
catch all undefined behavior, either at compile time or runtime. I
also very much want this to be entirely optional, and for it to be
expressly stated that no "good" C++ program should depend upon such
checks; they should exist only as "terminate the process" asserts.

The "option" is tricker than you seem to realize.  Anything
which changes the size of an object (e.g. fat pointers vs.
normal pointers) breaks the ABI.  You can't link object files
compiled with different options.  (Or maybe you can link, but
the resulting program will just crash.)

Indeed. I mentioned exactly this in the exact same post you quote.
Note that this is already the case with compilers which provide
"debugging" versions of std::vector and others.  The debugging
changes the size and the behavior of std::vector, and mixing
code with and without debugging causes core dumps.

I just wish they actually caught all errors / undefined behavior in a
systematic fashion instead of in the current haphazard way.
Only if you can accept a fairly low level of robustness.

This intrigues me. If you elaborate or point me to articles, I'd love
to read up on this. IMHO, I could probably write an application faster
in C++ and have it be "more correct" (aka less testing / bug-fixing
time), but the same probably isn't true of the average developer. I'm
just curious how you're defining "robustness". Are we talking
real-time? Or correct in the face of errors? Stuff like how it's easier to
leak file handles, other non-memory resources? Or how it's exceedingly
annoying and difficult to write correct code in the face of
"dispose" / "close" / "release" calls which can throw exceptions? Or
are we talking about how it's impossible to write correct code in the
face of asynchronous exceptions?
 

Jerry Coffin

[ ... ]
Well, it's a start, although it's still very limited.

Beyond a doubt -- I certainly didn't intend to imply that it was any
sort of panacea. At the same time, I suspect for some types of code
it's _really_ helpful.

[ ... ]
Which wasn't really what I was talking about. My point was that
having done &v[i], you have a raw pointer, on which you can do
pointer arithmetic, and access out of bounds without checking,
even if the implementation checks in vector<>::operator[].


Right -- my point wasn't that the checks were the same, or anything
like that, just that (interestingly enough) taking an address happens
to break both.
But the documentation on how VC++ detects uninitialized
variables did seem weird to me. For a runtime check, I'd have
just associated an additional flag somewhere, setting it when
the variable was set, and checking it otherwise. Perhaps the
problem is that if initialization occurs through a pointer, the
compiler can't generate the code to update the flag, so if the
address is taken (which would allow such initialization), it
just gives up.

That's my guess. For code that knows to do so, updating the flag is
easy -- but if you pass the address to the OS (for example) to read
from a file into a buffer, it's going to take a (substantial) update
to the ABI for the OS to find and update the flag appropriately.
Basically, you wouldn't be able to pass raw addresses to the OS
anymore -- you'd have to use some sort of fat pointer. I'm not sure
that's an entirely bad idea either, but I'm afraid in the open market
such an OS would tend to disappear without a trace. Too much emphasis
is still placed on raw speed for such things to survive.

OTOH, I'd almost bet that the little bit MS has done basically came
from the OS side of the house -- specifically, I'd almost bet that
some bright boy (and I do NOT mean that pejoratively at all) thought
about the number of times they've run into problems from simple
buffer overruns and such, and thought that since the programmers
weren't preventing or catching such errors dependably, it would be a
good idea to see how much they could do in the compiler instead.
 

Nick Keighley

this was originally on comp.lang.c++
but discussions about Undefined Behaviour seem on-topic to comp.lang.c
as well


[snipped discussion about run-time error when OP accessed
uninitialized memory. OP complained that C++ compiler (gcc)
could not detect this even though its warning level was set to
highest]
You seem to be taking the opinion that compilers should
catch all undefined behavior. C++ is not Java. C++'s stated
primary design goals include
- runtime performance comparable with assembly
- don't pay for what you don't use
- portable
- easy to write code / programmer productivity (with less relative
emphasis on this one IMHO)
With these design goals in mind, it is not reasonable to
expect a compiler to catch all possible undefined behavior
or errors. To do that would necessarily restrict the
language so that it's less comparable to assembly in speed
and/or you start paying for things you don't use.
That's not strictly true.  Both the C and the C++ standards
were designed so that all undefined behavior can be caught.

this surprised me

At run-time, at the latest.  (I think that there is some which
can't be detected at compile time.  But probably a lot less than
one might think---compilers have gotten quite good at tracing
intermodular code flow.)  And it's scattered throughout the
standard.  Mostly in the form of "undefined behavior"---the
behavior is undefined precisely so that a checking
implementation can trap it.

I thought a fair amount of Undefined Behaviour was implicit.
That is, no behaviour was defined, therefore the behaviour was
undefined, rather than there being an explicit statement that
"this behaviour is not defined". I'm pretty sure this is true
of C if not C++.

Too hard, no.  Too expensive, perhaps: to catch all pointer
violations, you need "fat" pointers---each pointer contains a
current address, plus the limits, each modification of the
pointer value verifies that the current address stays in the
limits, and each access through the pointer verifies that it
isn't using the end pointer (and that the pointer isn't null,
but most hardware traps this already today).

Of course, a good compiler could eliminate a certain number of
these checks, or at least hoist them outside of a loop.  But I
don't think it could easily avoid the fact that the size of a
pointer is multiplied by three, which makes things like copying
significantly more expensive, and can have very negative effects
on locality.


That's unspecified, not undefined behavior.


Takes a pointer.  If the pointer contains the bounds, then it
can easily check.

<snip>
 

James Kanze

this was originally on comp.lang.c++ but discussions about
Undefined Behaviour seem on-topic to comp.lang.c as well

Given that C++ just takes over the C definition here.
this surprised me

On thinking about it, I probably overstated it. The context of
the discussion was things like array bounds and pointer errors,
and that's really what I had in mind. Although I think things
like i = ++i are catchable, I don't think that the intent of
making it undefined was to allow it to be caught at runtime.

There are still large categories of behavior which is undefined
expressly to allow an implementation to catch it; arithmetic
overflow and array bounds and pointer errors are in this
category.
I thought a fair amount of Undefined Behaviour was implicit.
That is, no behaviour was defined, therefore the behaviour was
undefined, rather than there being an explicit statement that "this
behaviour is not defined". I'm pretty sure this is true of C
if not C++.

I don't think so. I think that almost all of the cases of
undefined behavior are explicitly stated as such. I think that
the rule of undefined behavior when the standard doesn't say
anything is mainly there to catch oversights. What "undefined
behaviors" did you have in mind?

(The typical examples of undefined behavior are all explicitly
stated as undefined: pointer and array bounds errors in the
specifications of the various operators on pointers, things like
i=++i in the header text for the Expressions section, illegal
operands to functions in the introductory text of the Library
section, and violations of what C++ calls the one definition
rule in section 3.2 in C++, and in section 6.2.7 in C.)
 

James Kanze

[ ... ]
Which wasn't really what I was talking about. My point was
that having done &v[i], you have a raw pointer, on which you
can do pointer arithmetic, and access out of bounds without
checking, even if the implementation checks in
vector<>::operator[].

Right -- my point wasn't that the checks were the same, or
anything like that, just that (interestingly enough) taking an
address happens to break both.

Yes. The problem is that raw pointers are, well, very raw. And
that in C and C++, you have to use them in contexts where you
really shouldn't, since arrays convert to a pointer at the drop
of a hat, and array indexing is defined in terms of pointer
arithmetic. (Basically, pointer arithmetic should be reserved
for very low level code, like that inside malloc, and not be
used elsewhere. In C or C++, however, you often don't have the
choice.)
 

James Kanze

On Jul 16, 2:09 am, James Kanze <[email protected]> wrote:

[...]
This intrigues me. If you elaborate or point me to articles,
I'd love to read up on this. IMHO, I could probably write an
application faster in C++ and have it be "more correct" (aka
less testing / bug-fixing time), but the same probably isn't
true of the average developer.

I don't think it's possible in Java to reach the level of
robustness I generally require, regardless of the developer.
There are too many things that simply aren't possible, like
programming by contract (which means executable code in the
"interface", which Java doesn't allow), or RAII. Or simply
being able to abort on an assertion failure.
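
For what it's worth, the usual C++ idiom for "executable code in the
interface" looks roughly like the sketch below (class and member names
invented for the example): the public, non-virtual function checks the
contract with assert, so a violation aborts, and delegates the real work
to a private virtual.

#include <cassert>
#include <cstddef>
#include <vector>

class Stack {
public:
    virtual ~Stack() {}
    void push(int value) {
        std::size_t old_size = size();
        do_push(value);                    // the overridable part
        assert(size() == old_size + 1);    // postcondition, enforced here
    }
    int pop() {
        assert(size() > 0);                // precondition, enforced here
        return do_pop();
    }
    virtual std::size_t size() const = 0;
private:
    virtual void do_push(int value) = 0;
    virtual int  do_pop() = 0;
};

class VectorStack : public Stack {
    std::vector<int> data;
public:
    virtual std::size_t size() const { return data.size(); }
private:
    virtual void do_push(int value) { data.push_back(value); }
    virtual int  do_pop() { int v = data.back(); data.pop_back(); return v; }
};

int main() {
    VectorStack s;
    s.push(1);
    s.pop();
    s.pop();    // contract violation: the assert aborts the program
}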

I think that there are pre-processors which resolve some of
Java's problems in this respect---I've heard of one for
programming by contract, for example, and I'd like to see
something like ESC/Java for C++ (and a couple of quick checks on
the web suggest that Java is evolving to address these
problems).
I'm just curious how you're defining "robustness".

Vaguely:). Basically, just that the code is known to be
correct, to a certain point, and that any errors will be
promptly detected and can easily be fixed.
Are we talking real-time?
No.

Or correct in the face of errors?

Possibly. If you accept that the correct behavior in the case
of a programming error is to abort (which is usually a
requirement in my work), then it's impossible to write code with
this behavior in Java.
Stuff like how it's easier to leak file handles, other
non-memory resources? Or how it's exceedingly annoying and
difficult to write correct code in the face of "dispose" /
"close" / "release" calls which can throw exceptions?

Partially. The lack of RAII does make certain types of errors
more likely, or harder to prevent (and missing finally blocks
are a common error in Java).
Or are we talking about how it's impossible to write correct
code in the face of asynchronous exceptions?

I'm not too sure what you mean here. As far as I know, neither
Java nor C++ support what I would call an asynchronous
exception. On the other hand, the fact that you can't guarantee
a function to never raise an exception in Java does mean that
you can't write really exception safe code.
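
Coming back to the RAII point above, here is a minimal sketch of what a
forgotten finally block loses in Java: the destructor releases the handle
on every path out of the scope. The class and file name are made up for
the example.

#include <cstdio>
#include <stdexcept>

class File {
    std::FILE* f;
    File(const File&);              // non-copyable (pre-C++11 style)
    File& operator=(const File&);
public:
    explicit File(const char* name) : f(std::fopen(name, "r")) {
        if (!f) throw std::runtime_error("cannot open file");
    }
    ~File() { std::fclose(f); }     // runs no matter how the scope is left
    std::FILE* get() const { return f; }
};

int main() {
    try {
        File in("example.txt");     // acquired here (file name is made up)
        // ... work that may return early or throw ...
    } catch (const std::exception& e) {
        std::fprintf(stderr, "%s\n", e.what());
    }                               // released here if it was acquired
}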
 

Eric Sosman

James said:
[...]
I thought a fair amount of Undefined Behaviour was implicit.
That is, no behaviour was defined, therefore the behaviour was
undefined, rather than there being an explicit statement that "this
behaviour is not defined". I'm pretty sure this is true of C
if not C++.

I don't think so. I think that almost all of the cases of
undefined behavior are explicitly stated as such. I think that
the rule of undefined behavior when the standard doesn't say
anything is mainly there to catch oversights. What "undefined
behaviors" did you have in mind?
[...]

The Committee's reasons for using three means to declare
behavior "undefined" are unknown to me, but there is no
difference in effect or in quality between "undefined due
to violation," "explicitly undefined" and "undefined by
omission." ISO/IEC 9899:1999, section 4 paragraph 2:

If a ‘‘shall’’ or ‘‘shall not’’ requirement that
appears outside of a constraint is violated, the
behavior is undefined. Undefined behavior is otherwise
indicated in this International Standard by the words
‘‘undefined behavior’’ or by the omission of any
explicit definition of behavior. There is no difference
in emphasis among these three; they all describe
‘‘behavior that is undefined’’.

The final sentence says it all (and says it normatively!):
All three means of un-definition are equivalent. (In C, at
any rate: I don't know That Other Language.)
 

James Kuyper

Eric said:
James said:
[...]
I thought a fair amount of Undefined Behaviour was implicit.
That is, no behaviour was defined, therefore the behaviour was
undefined, rather than there being an explicit statement that "this
behaviour is not defined". I'm pretty sure this is true of C
if not C++.

I don't think so. I think that almost all of the cases of
undefined behavior are explicitly stated as such. I think that
the rule of undefined behavior when the standard doesn't say
anything is mainly there to catch oversights. What "undefined
behaviors" did you have in mind?
[...]

The Committee's reasons for using three means to declare
behavior "undefined" are unknown to me, but there is no
difference in effect or in quality between "undefined due
to violation," "explicitly undefined" and "undefined by
omission." ISO/IEC 9899:1999, section 4 paragraph 2:

If a ‘‘shall’’ or ‘‘shall not’’ requirement that
appears outside of a constraint is violated, the
behavior is undefined. Undefined behavior is otherwise
indicated in this International Standard by the words
‘‘undefined behavior’’ or by the omission of any
explicit definition of behavior. There is no difference
in emphasis among these three; they all describe
‘‘behavior that is undefined’’.

The final sentence says it all (and says it normatively!):
All three means of un-definition are equivalent. (In C, at
any rate: I don't know That Other Language.)

The C++ standard does not mention "shall" as a method of indicating
undefined behavior. Section 1.3.13 says "Undefined behavior may also be
expected when this International Standard omits the description of any
explicit definition of behavior.", which strikes me as bad wording - the
phrase "may ... be expected" reflects and reinforces the misconception
that "undefined behavior" refers to a specific type of undesireable
behavior.

I think that the C++ wording provides more support for James Kanze's
opinion than the C wording does.
 

Nick Keighley

Given that C++ just takes over the C definition here.

which is why I thought it was a legitimate x-post


On thinking about it, I probably overstated it.  The context of
the discussion was things like array bounds and pointer errors,
and that's really what I had in mind.  Although I think things
like i = ++i are catchable, I don't think that the intent of
making it undefined was to allow it to be caught at runtime.

Ah, that's what I was disputing. Or rather, that's what took me by
surprise; I'd always kind of assumed they bunged in UB just to make
the implementor's job easier.
 

Joshua Maurice

I'm not too sure what you mean here.  As far as I know, neither
Java nor C++ support what I would call an asynchronous
exception.  On the other hand, the fact that you can't guarantee
a function to never raise an exception in Java does mean that
you can't write really exception safe code.

http://java.sun.com/docs/books/jls/first_edition/html/11.doc.html

details asynchronous exceptions. I haven't thought it through
thoroughly enough, but at the very least it would be in practice
impossible to write correct code in the face of asynchronous
exceptions, and may in fact be impossible to write it. It depends on
exactly what they call "a transfer of control" and "statements" with
regard to when an asynchronous exception can be raised. However, this
is getting a little off topic, so I guess I'll leave it at that.
 

Jerry Coffin

"Nick Keighley" <[email protected]> ha scritto nel messaggio
this was originally on comp.lang.c++
but discussions about Undefined Behaviour seem on-topic to comp.lang.c
as well

In a CPU there cannot be undefined behaviour, because if the CPU is
in state X and it reads the instruction "a", the result will always
be state X'.

The same goes for the CPU-OS pair: if the CPU-OS is in state XX and
it reads the instruction "a", the result will always be state XX'.

UB exists only in the standards.

Not really.

First of all, some CPUs have instructions that cause undefined
results -- and while on a _specific_ CPU, the result of execution may
be predictable, different versions of the CPU, down to and including
different steppings, may give different behavior for that
instruction.

In other cases, the behavior even on a single CPU could be
unpredictable -- just for example, Intel has included a thermal diode
in some of their CPUs that's intended as a high-quality (albeit slow)
source of truly random numbers. While there are certainly defined
ways to access that diode, it's entirely possible that executing some
undefined instruction could do so as well -- and at least part of the
result state after doing so could be entirely unpredictable.
 

Jerry Coffin

[ ... ]
I am not speaking about standards, I am speaking about a real CPU.
If a 386 CPU in state X (eax=1, ebx=19, ecx=20, ...) reads the binary
of "add eax, ebx", the result will always be the CPU state
X' (eax=20, ebx=19, ecx=20, ...).

Yes, but what if what's executed is 'add eax, [ebx]' instead? If ebx
happens to point to uninitialized memory, you don't know what you'll
get in eax, and it will probably vary from one invocation of the
program to the next. If ebx starts out set to zero (or another small
number, typically anything less than 4 million or so) quite a few
OSes will detect that you're accessing an illegal address, and halt
the program with some sort of error message about it doing something
illegal (of course, the exact message varies between OSes).
 
