how do I know how long my function call took?


Kevin Schultz

Hello all. I am interfacing my computer to the outside world for an
experiment and we would like to know how much time the calling function
takes. The specs say this should happen on the order of a microsecond,
which is faster than we need. We would like to call 30 or so of these
functions in 0.5 milliseconds and before committing to purchasing this
interface system, we would like to make sure it fits our needs. However,
we are having difficulties timing how long our functions take. We are
running Microsoft Visual Developer (using C) v 6.0 on a computer running
Win98. We tried something like the following

#include <time.h>
clock_t start, stop;
double duration;
start=clock();
my_function_to_test();
stop=clock();
duration=(stop-start)/CLOCKS_PER_SEC;

It has become apparent that the resolution of clock() is only on the
order of a second. I have tried looking at the C FAQ and Google to find
something that would give me better resolution, but have failed. We are
thinking of looping our test function thousands or millions of times and
dividing the total duration by the total number of loops, but I would
like to get a better idea how long each function will take.

Thanks in advance

Kevin
 

Gordon Burditt

Kevin said:
Hello all. I am interfacing my computer to the outside world for an
experiment and we would like to know how much time the calling function
takes.

A function call always takes at least two moments (a term *NOT*
defined by the ANSI C standard). That way, another program (if
there is one) has a chance to do something else in between them,
usually with the effect of messing things up.
#include <time.h>
clock_t start, stop;
double duration;
start=clock();
my_function_to_test();
stop=clock();
duration=(stop-start)/CLOCKS_PER_SEC;

clock_t is probably an integer type.
The above division is INTEGER DIVISION!

duration=(stop-start)*1.0/CLOCKS_PER_SEC;
It has become apparent that the resolution of clock() is only on the
order of a second.

Maybe, maybe not, but your division lost the fractional part, if
there ever was one.
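
For example (a tiny demo; it assumes clock_t is an integer type and
CLOCKS_PER_SEC is 1000, which is a common value but not guaranteed):

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t start = 0;
    clock_t stop = 437;    /* pretend the call took 437 clock ticks */

    /* Integer division throws away the fraction: 437/1000 == 0. */
    double bad = (stop - start) / CLOCKS_PER_SEC;

    /* Promote to floating point first: 0.437 seconds. */
    double good = (stop - start) * 1.0 / CLOCKS_PER_SEC;

    printf("bad  = %f\ngood = %f\n", bad, good);
    return 0;
}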

Gordon L. Burditt
 

Kevin Schultz

Gordon said:
A function call always takes at least two moments (a term *NOT*
defined by the ANSI C standard). That way, another program (if
there is one) has a chance to do something else in between them,
usually with the effect of messing things up.

Thanks for the quick response, but what do you mean by a moment (aside
from each person's subjective notion of time)?
 

SM Ryan

# It has become apparent that the resolution of clock() is only on the
# order of a second. I have tried looking at the C FAQ and Google to find

Given some of the hardware out there, be happy it's that small.

# something that would give me better resolution, but have failed. We are
# thinking of looping our test function thousands or millions of times and
# dividing the total duration by the total number of loops, but I would
# like to get a better idea how long each function will take.

You probably have system-specific clock functions with millisecond,
microsecond, or nanosecond resolution. Not ANSI C, but probably
available anyway.

Note that with virtual memory and multiprocessing, high-precision clocks
can wobble quite a bit even with the same code and data.
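
On Win32, for example, there's QueryPerformanceCounter (not standard C,
and only a rough sketch here; my_function_to_test is a stand-in for the
real call, and the counter's resolution depends on the hardware):

#include <stdio.h>
#include <windows.h>

/* Stand-in for the real interface call being timed. */
static void my_function_to_test(void)
{
}

int main(void)
{
    LARGE_INTEGER freq, start, stop;

    if (!QueryPerformanceFrequency(&freq)) {
        fprintf(stderr, "no high-resolution counter available\n");
        return 1;
    }

    QueryPerformanceCounter(&start);
    my_function_to_test();
    QueryPerformanceCounter(&stop);

    /* Elapsed time in seconds. */
    printf("elapsed: %f seconds\n",
           (double)(stop.QuadPart - start.QuadPart) / (double)freq.QuadPart);
    return 0;
}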
 

pete

Kevin said:
Hello all. I am interfacing my computer to the outside world for an
experiment and we would like to know how much time the calling function
takes. The specs say this should happen on the order of a microsecond,
which is faster than we need. We would like to call 30 or so of these
functions in 0.5 milliseconds and before committing to purchasing this
interface system, we would like to make sure it fits our needs. However,
we are having difficulties timing how long our functions take. We are
running Microsoft Visual Developer (using C) v 6.0 on a computer running
Win98. We tried something like the following

#include <time.h>
clock_t start, stop;
double duration;
start=clock();
my_function_to_test();
stop=clock();
duration=(stop-start)/CLOCKS_PER_SEC;

It has become apparent that the resolution of clock() is only on the
order of a second. I have tried looking at the C FAQ and Google to find
something that would give me better resolution, but have failed. We are
thinking of looping our test function thousands or millions of times and
dividing the total duration by the total number of loops, but I would
like to get a better idea how long each function will take.

Time your thousand million loops with the function call,
and then time a thousand million empty loops and subtract
the empty-loop time from the call-loop time.
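
Something along these lines, as a rough sketch (my_function_to_test and
the loop count are placeholders, and an optimizer may throw away the
empty loop, so check what the compiler actually generates):

#include <stdio.h>
#include <time.h>

#define LOOPS 1000000L  /* pick a count large enough to run for several seconds */

/* Stand-in for the real interface call being timed. */
static void my_function_to_test(void)
{
}

int main(void)
{
    long i;
    clock_t start, stop;
    double call_loop, empty_loop;

    /* Time the loop that makes the call. */
    start = clock();
    for (i = 0; i < LOOPS; i++)
        my_function_to_test();
    stop = clock();
    call_loop = (stop - start) * 1.0 / CLOCKS_PER_SEC;

    /* Time an empty loop to estimate the loop overhead itself. */
    start = clock();
    for (i = 0; i < LOOPS; i++)
        ;
    stop = clock();
    empty_loop = (stop - start) * 1.0 / CLOCKS_PER_SEC;

    printf("roughly %g seconds per call\n", (call_loop - empty_loop) / LOOPS);
    return 0;
}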
 
