Benefit of not defining the order of execution

user923005

[snip]



I guess that I really cannot see what the fuss is about because I do
not seem to grasp the actual issue at hand.

See the reply by Kaz.  Your examples have a point - they deal
with the reordering of the course of computations of well defined
expressions.  The goal typically is to achieve greater accuracy
or more efficient code with the same end result.  While this is
important and of interest, it's not the subject of discussion -
at least what I understand it to be.

What we are talking about is things like what happens between
sequence points.  Tim's original example is the line

    a = f(g(x),h(y));

This code could have different results depending on whether the
compiled code evaluates g(x) first or h(y) first.  This could
happen in a lot of ways; I'm sure you could think of half a dozen
without much effort.
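To make this concrete, here is a minimal sketch (g, h, and f are
hypothetical, sharing a counter purely for illustration) where the two
permitted orders produce different values:

#include <stdio.h>

static int counter = 0;

static int g(int x) { counter += 1; return x + counter; }
static int h(int y) { counter *= 2; return y + counter; }
static int f(int a, int b) { return a - b; }

int main(void)
{
    /* If g(10) runs first: counter becomes 1, g yields 11; then counter
       becomes 2, h yields 22; a == 11 - 22 == -11.
       If h(20) runs first: counter stays 0, h yields 20; then counter
       becomes 1, g yields 11; a == 11 - 20 == -9.
       Either result is a legal outcome under the C standard. */
    int a = f(g(10), h(20));
    printf("a = %d\n", a);
    return 0;
}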

If the expression/statement is evaluated left to right, this cloud
of possible results collapses to a single result.  Is this
collapse a good thing?  I argue that it is; Tim argues that it is not.

In C there are issues having to do with assignment being an
operator and with the comma operator.  The pre- and post-increment
operators add a source of confusion.  For example, consider

    x[++i] = i;

One possible evaluation rule is to evaluate the right-hand side first
and the left-hand side second; another is to evaluate left to
right.  The two rules give different results.  Which is better?
The C standard (IIANM) just says "undefined".
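To spell out the two readings (starting from i == 0), along with one
well-defined rewrite under the assumption that the increment was meant
to happen first:

int i = 0;
int x[2] = {0, 0};

/* x[++i] = i;
   Right-hand side first: read i (0), then ++i makes i 1  ->  x[1] = 0
   Left to right:         ++i makes i 1, then read i (1)  ->  x[1] = 1
   In C as it stands, the statement is simply undefined behavior. */

i++;         /* sequence point: the increment happens first */
x[i] = i;    /* unambiguously x[1] = 1 */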

Does this clarify things?

I guess that one purpose is to take examples of currently undefined
behavior and force a definition.
For example:
x[++i] = i;
would currently result in undefined behavior. I personally think that
creating a definition for an operation of this nature is not
necessarily a good thing.
x[i++] += i++ * ++i / (x[i++] + x[++i]); /* Oh joy, I can't help
but look forward to debugging things like this. */

This one is not as clear cut to me:
a = f(g(x),h(y));
I can see demanding that the commas operate both as function call
separators and as sequence points so that this would be equivalent:
q = g(x);
r = h(y);
a = f(q,r);

On the other hand, what if the function call parameters are
expressions? In that case, I am not sure what sort of action makes
sense.
Consider:
a = f(++i, i++, ++i, i++); // now what?

I think that if we want to add sequencing to define a more exact
meaning, it has to make sense generically.
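For what it's worth, a strict left-to-right rule would pin that
four-argument example down to something like the following (the
temporaries are purely illustrative):

/* Hypothetical left-to-right reading of a = f(++i, i++, ++i, i++);
   starting from i == 0: */
int t1 = ++i;   /* i == 1, t1 == 1 */
int t2 = i++;   /* t2 == 1, i == 2 */
int t3 = ++i;   /* i == 3, t3 == 3 */
int t4 = i++;   /* t4 == 3, i == 4 */
a = f(t1, t2, t3, t4);

Whether that counts as making sense generically is exactly the question.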
 
Guest

There is a more general software engineering issue here.  In
practice there are three main routes to creating robust software
- coding practice, testing, and use of trusted components.

and formal verification?
 
Guest

<snip>

Ah, now we are getting somewhere.  If the compiler running on a 68K chip
optimizes and does the math with the 68K chip, but I'm asking it to generate
code for an i386 chip and expect the math to be done with an i386 chip

I don't follow this. Why is the compiler doing arithmetic?
In order to reorder the expression? If it's generating code for the
i386, wouldn't you expect it to have a model of the i386, regardless
of what processor the compiler was running on?

I may well be
disappointed.  That is why programmers need to take a class in numeric
analysis

they certainly need one of those if they're doing floating-point
calculations.

"I found a bug in the compiler 10.0/3*3 doesn't equal 10.0!"

scarily he was doing billing software...
 
Guest

<snip>

[Kaz is arguing against C's undefined order of evaluation]

I'm not quite convinced, but I'm getting there

Apparently several people are of the opinion that having language
semantics be more deterministic is better than being not as
deterministic, because...  well, just because.

I think Kaz's argument was a little more robust than that.

To that I say, just 'tisn't so.

Even if a clever optimizing compiler could recover (relative to C as
it is now) all the possible parallelism of a C-like language with a
defined left-to-right order of evaluation (and it can't, but never
mind that now), it still isn't automatically better to impose a
left-to-right evaluation order, or any fixed evaluation order.  In
fact, doing that to C expressions would make C a worse language,
not a better one.  Here's why.

If I see a piece of C code like, for example,

   a = f( g(x), h(y) );

then I don't have to look at the definitions for g() and h() to know
they don't interact (by which I mean, interact in any nontrivial
way).  The reason?  If they did interact, the code would have been
written differently, to force a particular order of evaluation.

rubbish. As others have pointed out, you assume error-free programmers.

Of course, technically I don't know that g() and h() don't interact;

quite

I know only that the person who wrote the code didn't think it was
important to force a particular order of evaluation.

or just didn't think. Most programmers assume left to right
evaluation and would be quite surprised if they were told they
were wrong.

 But knowing
the intent (or in this case, the lack of intent) of the code's
author is just as valuable here, or perhaps more valuable.

I don't see that

I can
always go look at the definitions for g() and h() to see if they
interact, but I can't go back and look at what the code's author
was thinking.

Now consider the case where the language specifies a left-to-right
evaluation order.  Let's look again at the example line.  Now I have
to wonder if g() and h() interact;

this is crazy! You claim that making the code more predictable
makes the code harder to analyse!

 to find out I have to go read
their definitions.  If they don't interact, I can breathe a big sigh
of relief and go back to reading the function where they were
called.  But suppose they do interact;  in that case I have to
wonder if the interaction was known and deliberate, or whether it
might have been unintentional.  Short of going back and asking the
original author, there really isn't any way of knowing.  Discovering
what the program does and what the program was intended to do has
become a lot more work.

I cannot agree

<snip>
 
Ike Naar

rubbish. As others have pointed out, you assume error-free programmers.


or just didn't think. Most programmers assume left to right
evaluation and would be quite surprised if they were told they
were wrong.

Your programmer might also think that in

u() - v() * w()

v() and w() would be evaluated before u(), because "multiplication has
higher precedence than subtraction".
Now, imposing left-to-right evaluation order would be at least as bad
as leaving the order unspecified.
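To illustrate the distinction (u, v, and w here are stand-in tracing
functions): precedence fixes the grouping, but says nothing about the
order in which the calls are made.

#include <stdio.h>

static int u(void) { puts("u called"); return 1; }
static int v(void) { puts("v called"); return 2; }
static int w(void) { puts("w called"); return 3; }

int main(void)
{
    /* The grouping is u() - (v() * w()), so the value is always
       1 - 2*3 == -5, but a conforming compiler may call u, v, and w
       in any order it likes. */
    printf("%d\n", u() - v() * w());
    return 0;
}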
 
Kaz Kylheku

Your programmer might also think that in

u() - v() * w()

v() and w() would be evaluated before u(), because "multiplication has
higher precedence than subtraction".
Now, imposing left-to-right evaluation order would be at least as bad
as leaving the order unspecified.

``Specified in a way that surprises someone'' is unequivocally better
than ``unspecified''.

That programmer would still find that although his expectation was incorrect,
the result that is obtained nevertheless does not vary when the compiler
configuration is changed or when the code is ported to a different compiler.
This is valuable to the programmer, so that he's willing to fix his
expectations.

And note that the surprised programmer still had expectations of /an/ order,
just not the defined one.

There is no way to specify evaluation order such that it will conform to
absolutely everyone's expectations, since those expectations are conflicting.
That doesn't constitute rationale for leaving it unspecified.
 
jameskuyper

this is crazy! You claim that making the code more predictable
makes the code harder to analyse!

It sounds paradoxical, but it's true. If the order of evaluation is
indeterminate, then discovery of any feature that makes the order of
evaluation matter terminates analysis immediately, with the conclusion
that the code was defective. Imposing a specific order means that
analysis must continue in that case, to determine whether or not the
guaranteed order of evaluation actually produces the desired result.
 
Keith Thompson

or just didn't think. Most programmers assume left to right
evaluation and would be quite surprised if they were told they
were wrong.

"Most programmers"? If that statement is based on actual data, that's
interesting (and disappointing); if not, it would be good to see some
actual data.

[...]
this is crazy! You claim that making the code more predictable
makes the code harder to analyse!

(The context was a = f( g(x), h(y) );.)

If you assume a sufficiently competent programmer, you can assume
that g(x) and h(y) don't interact, because if they did then the
programmer wouldn't have written that statement. On the other hand,
"sufficiently competent" may approach "never makes mistakes".
On the other other hand, g(x) and h(y) *shouldn't* interact; if they
do, nailing down the order of evaluation makes the code well-defined,
but still poor style.

[...]
 
Kaz Kylheku

(The context was a = f( g(x), h(y) );.)

If you assume a sufficiently competent programmer, you can assume
that g(x) and h(y) don't interact, because if they did then the
programmer wouldn't have written that statement. On the other hand,
"sufficiently competent" may approach "never makes mistakes".
On the other other hand, g(x) and h(y) *shouldn't* interact; if they
do, nailing down the order of evaluation makes the code well-defined,
but still poor style.

I don't agree. It depends on what kinds of abstractions f, g and h
represent. Maybe it's not arithmetic.

node = make_unary_node(pop(parser_stack), pop(parser_stack));

There is no question that pop has a side effect, and that the order
of the parameters to the constructor is critically important.
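Under C's rules as they stand, the cure is to force the order with
temporaries; a sketch (the node type and which pop feeds which
parameter are assumptions for illustration):

struct ast_node *top   = pop(parser_stack);   /* popped first, definitely */
struct ast_node *under = pop(parser_stack);   /* popped second, definitely */
node = make_unary_node(top, under);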
 
Nate Eldredge

I doubt that there is any such data. It sounds plausible,
though, because people read ordinary text left to right.
Presumably they also read code the same way, i.e., by default
they interpret it left to right. This could create the
unconscious assumption that the evaluation is left to right, even
if they know better.

But it's only plausible.

I may be a bad example, but I don't think I would assume this. In
programming, there are constructs that are naturally read right-to-left
as well as left-to-right (assignment, for instance). Intuitively, I
don't feel that evaluation of function arguments is naturally one way or
the other. If I had a situation where the order mattered, I wouldn't
have a clear feeling for which way the language was probably going to do
it, and would have to look it up. (In the case of C, I would find out
that there was no standard, and I'd have to force the ordering with
temporary variables.)
 
Richard

I doubt that there is any such data. It sounds plausible,
though, because people read ordinary text left to right.

Yup. And of course it is left to right in logical conditions in
most languages. This is another example of how pedantry and the desire
to stir up the waters obliterate common sense.
 
Kenny McCormack

I doubt that there is any such data. It sounds plausible,
though, because people read ordinary text left to right.

Yup. And of course it is left to right in logical conditions in
most languages. This is another example of how pedantry and the desire
to stir up the waters obliterate common sense.

And the tragedy, of course, is that most real world C implementations
evaluate, e.g., function parameters right-to-left. I'm aware that we
weren't directly talking about function parameters, but it is still
relevant in a general way to the issue.

(As I'm sure most people in this NG know)
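Anyone curious can probe their own implementation with a throwaway
test along these lines (trace and f are hypothetical):

#include <stdio.h>

static int trace(int n)
{
    printf("argument %d evaluated\n", n);
    return n;
}

static void f(int a, int b, int c)
{
    printf("f(%d, %d, %d)\n", a, b, c);
}

int main(void)
{
    /* On many stack-based calling conventions this prints 3, 2, 1;
       nothing portable may depend on that order. */
    f(trace(1), trace(2), trace(3));
    return 0;
}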
 
Lew Pitcher

(e-mail address removed) wrote: [snip]
"I found a bug in the compiler 10.0/3*3 doesn't equal 10.0!"

scarily he was doing billing software...

I should hope 10.0/3*3 wouldn't equal 10.0.

Funny, but I would hope that 10.0/3*3 /would/ equal (be within FLT_EPSILON
of) 10.0 (within the bounds of floating-point precision). It has to do with
the operator precedence and associativity of mathematical operators.

In this case
10.0 / 3 * 3
consists of operators from the "Multiplicative operators" group, so they all
have equal precedence. As they associate left-to-right, the expression
should be evaluated as if it were written
( 10.0 / 3 ) * 3
and the language even allows for this to be treated "atomically",
and "evaluated as though it were an atomic operation, thereby omitting
rounding errors implied by the source code and the expression evaluation
method" [ISO/IEC 9899:1999, 6.5 paragraph 8]
Now (10.0/3)*3 being close to 10.0 is another matter.

Or phrased another way...
Now 10.0/3*3 being close to 10.0 is another matter.
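A quick sketch of the distinction; since the constants are doubles,
DBL_EPSILON (scaled to the magnitude of the values) is the matching
bound:

#include <stdio.h>
#include <math.h>
#include <float.h>

int main(void)
{
    double r = 10.0 / 3 * 3;   /* grouped as (10.0 / 3) * 3 */

    /* Exact equality may or may not hold, depending on rounding and on
       whether the implementation contracts the expression. */
    printf("r == 10.0 : %s\n", r == 10.0 ? "yes" : "no");

    /* Comparison within a tolerance scaled to the operands' magnitude. */
    printf("r ~= 10.0 : %s\n",
           fabs(r - 10.0) <= 10.0 * DBL_EPSILON ? "yes" : "no");
    return 0;
}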

HTH
--
Lew Pitcher

Master Codewright & JOAT-in-training | Registered Linux User #112576
http://pitcher.digitalfreehold.ca/ | GPG public key available by request
---------- Slackware - Because I know what I'm doing. ------
 
jameskuyper

Lew said:
(e-mail address removed) wrote: [snip]
"I found a bug in the compiler 10.0/3*3 doesn't equal 10.0!"

scarily he was doing billing software...

I should hope 10.0/3*3 wouldn't equal 10.0.

Funny, but I would hope that 10.0/3*3 /would/ equal (be within FLT_EPSILON

You'll get a better error estimate by using 10.0*FLT_EPSILON. Since
there are two separate floating-point computations specified by that
expression, each of which could make its own contribution to the
total error, you'll need to increase that estimate accordingly.
of) 10.0 (within the bounds of floating-point precision). It has to do with
the operator precedence and associativity of mathematical operators.

I understood "equal" to imply the use of the == operator, which does
not perform fuzzy floating-point comparisons; therefore, your
"within ..." clauses are irrelevant to an expression that uses ==.
C doesn't provide a fuzzy-comparison operator; one has to be
cobbled together using fabs() or equivalent.
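A sketch of such a cobbled-together comparison (the relative-tolerance
choice is an assumption; pick whatever suits the application):

#include <math.h>
#include <float.h>

/* True when a and b differ by no more than a few epsilons' worth of
   their magnitude; the factor of 4 is arbitrary. */
static int nearly_equal(double a, double b)
{
    double scale = fmax(fabs(a), fabs(b));
    return fabs(a - b) <= 4.0 * DBL_EPSILON * scale;
}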
In this case
10.0 / 3 * 3
consists of operators from the "Multiplicative operators" group, so they all
have equal precedence. As they associate left-to-right, the expression
should be evaluated as if it were written
( 10.0 / 3 ) * 3
and the language even allows for this to be treated "atomically",
and "evaluated as though it were an atomic operation, thereby omitting
rounding errors implied by the source code and the expression evaluation
method" [9989-1999 Chapter 6.5, point 8]

While it would be convenient if those rounding errors were omitted,
it's certainly nothing you should be expecting, unless the
implementation's documentation gives you reason to do so. Even if
your code contains the line

#pragma STDC FP_CONTRACT ON

contraction is still only permitted, not required.
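A sketch of the point: the pragma grants permission only; fma() is the
portable way to actually demand a fused operation.

#include <math.h>

#pragma STDC FP_CONTRACT ON   /* permission, not an obligation */

double maybe_contracted(double a, double b, double c)
{
    return a * b + c;     /* the implementation may fuse this, or not */
}

double definitely_fused(double a, double b, double c)
{
    return fma(a, b, c);  /* one rounding, guaranteed */
}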
 
CBFalconer

<snip enormous quote>

please trim your posts!

If you had read the last of the 'enormous quote' you would have
found the following:

the reading of which should have removed any urge to make the
top-posted reply.
 
dj3vande

If I had a situation where the order mattered, I wouldn't
have a clear feeling for which way the language was probably going to do
it, and would have to look it up. (In the case of C, I would find out
that there was no standard, and I'd have to force the ordering with
temporary variables.)

If I had a situation where the order mattered, I would re-write it so
that it was obvious what order things happened in without having to
look things up.

If the order matters, and it's not obvious when you read the code what
order things happen in, that's not a problem with the language; it's a
problem with the code.


dave
(If you want Java, you know where to find it.)
 
Flash Gordon

Richard said:
I doubt that there is any such data. It sounds plausible,
though, because people read ordinary text left to right.
Presumably they also read code the same way, i.e., by default
they interpret it left to right. This could create the
unconscious assumption that the evaluation is left to right, even
if they know better.

But it's only plausible.

There are people who strongly believe that the arguments are evaluated
right to left and pushed onto the stack as this is done. Thus when the
function is called the leftmost argument is a fixed offset from the
stack pointer, making life simple for the compiler.

Since different people assume different orders (and some know the order
is unspecified as far as the language is concerned), forcing a specific
order on compilers will certainly still leave people getting it wrong
*and* will make some implementations less efficient. Seems like a
lose-lose option to me.

All of this depends upon the kind of analysis. If you are trying
to understand the operation of a piece of code, the default
assumption is that the author got it right. If you are trying to
debug it, the default assumption is that the author got it wrong.

When I'm analysing code for any reason my default assumption is that
there will be a bug somewhere; after all, no one is perfect!
 
Guest

[I heard someone say...]
Funny, but I would hope that 10.0/3*3 /would/ equal (be within FLT_EPSILON
of) 10.0 (within the bounds of floating-point precision). It has to do with
the operator precedence and associativity of mathematical operators.

the terms "within FLT_EPSILON of" and "within the bounds of floating-point
precision" did not appear in my statement for a reason.

In this case
  10.0 / 3 * 3
consists of operators from the "Multiplicative operators" group, so they all
have equal precedence. As they associate left-to-right, the expression
should be evaluated as if it were written
  ( 10.0 / 3 ) * 3

do not educate your mother's mother in the extraction of the embryonic
juices of the bird


<snip>
 
Guest

"Most programmers"?  If that statement is based on actual data, that's
interesting (and disappointing); if not, it would be good to see some
actual data.

no data I'm afraid, but many programmers are remarkably poorly
informed about the tools they use.

<snip>
 
