Charles M. Reinke
I'm using the function clock() to measure the run time of a program so that
I can compare among several different algorithms. My code looks like:
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>
#include <time.h>
int main(void) {
    clock_t start, stop;
    double t = 0.0;

    /* Start timer -- don't put the assignment inside assert(),
       or it vanishes when NDEBUG is defined */
    start = clock();
    assert(start != (clock_t)-1);

    /* Do lotsa fancy calculations */

    /* Stop timer */
    stop = clock();
    assert(stop != (clock_t)-1);

    t = (double)(stop - start) / CLOCKS_PER_SEC;
    printf("Run time: %f\n", t);
    return 0;
} /* main */
The question is: does this give me the "real life" time that passes while
the process is executing, or just the processor time actually used by this
process? Put another way, if I run the exact same code when the machine is
"idle" and again when the processor is being shared by a bunch of other
processes, will the above give me *roughly* the same results, or a
significantly longer time in the latter case?
Thanx all!
--
Charles M. Reinke
Georgia Institute of Technology
School of Electrical and Computer Engineering
(404) 385-2579