Rouben Rostamian
Consider the following illustrative program:
#include <stdio.h>

double f(double x)
{
    return x*x;
}

double g(double x)
{
    puts("hello world");
    return x*x;
}

int main(void)
{
    double s = 0.0;
    int i;
    for (i = 0; i < 10000; i++)
        s += f(2.0);    /* f(2.0) is always 4.0 */
    for (i = 0; i < 10000; i++)
        s += g(2.0);    /* g(2.0) is always 4.0 */
    printf("s = %g\n", s);
    return 0;
}
Does standard C provide a mechanism whereby one can reassure the
compiler that the function f() produces no side-effects while the
function g() has side-effects?
If such a mechanism existed, then an optimizing compiler might replace
f(2.0) by 4.0 to avoid 10000 function calls, but it wouldn't replace
g(2.0) by 4.0.
Something in the back of my mind tells me that such a mechanism exists
but I may be (a) just imagining it; (b) thinking of a non-standard
compiler; or (c) a language other than C.
I would appreciate it if someone would set me straight.