tconkling
I have an if statement that looks like this:
if(foo(&x) && x > y)
...
where foo modifies the value of x, and the comparison between x and
y only makes sense after that modification (and, of course, only if
foo returns true). Am I guaranteed (assuming my compiler generates
correct code) that x > y is evaluated after foo(&x) returns?
Assuming things work the way I think they do, is it considered bad form
to write code like this? It saves me from doing something like the
following, which I think is ugly-looking:
if(foo(&x))
{
    if(x > y)
    {
        ...
    }
}
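
For reference, here's a minimal, self-contained version of the
situation I mean (foo here is just a hypothetical stand-in that sets
x and returns nonzero on success; the real one does more work):

#include <stdio.h>

/* Hypothetical stand-in for the real foo: writes through the
   pointer and returns nonzero to signal success. */
static int foo(int *px)
{
    *px = 10;  /* side effect that the later comparison depends on */
    return 1;
}

int main(void)
{
    int x = 0;
    int y = 5;

    /* In C, && is a sequence point: foo(&x) is evaluated
       completely, side effect included, before x > y is looked at,
       and if foo returns 0, x > y is never evaluated at all. */
    if(foo(&x) && x > y)
        printf("foo succeeded and x (%d) > y (%d)\n", x, y);

    return 0;
}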
Thanks,
Tim