Hi there.
If I make a function in C (I actually use GNU right now), is there
any way to find out how many clocksycluses that function takes?
("Clocksycluses?" After some puzzlement the light dawned:
I *think* what you mean is the two-word phrase "clock cycles."
However, the word "clocksycluses" has a certain fascination,
and people may take it up and start using it. You may have
gained immortality by enriching the lexicon!)
C provides the clock() function, which returns the amount
of CPU time consumed by your program since some fixed arbitrary
moment. You use it like this:
#include <stdio.h>
#include <time.h>
...
clock_t t0, t1;
t0 = clock();    /* CPU time used so far, before the call */
do_something();
t1 = clock();    /* CPU time used so far, after the call */
printf ("Used %g CPU seconds\n",
    (t1 - t0) / (double)CLOCKS_PER_SEC);
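If you'd like a complete program to experiment with, here's a
minimal sketch built on the same pattern; do_something() is just a
hypothetical stand-in for whatever you want to measure:

#include <stdio.h>
#include <time.h>

/* Hypothetical stand-in for the code being timed. */
static void do_something(void)
{
    volatile double x = 0.0;
    long i;
    for (i = 1; i <= 10000000L; i++)
        x += 1.0 / i;    /* busy-work so the clock has something to see */
}

int main(void)
{
    clock_t t0, t1;
    t0 = clock();
    do_something();
    t1 = clock();
    printf ("Used %g CPU seconds\n",
        (t1 - t0) / (double)CLOCKS_PER_SEC);
    return 0;
}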
If I divide some numbers, e.g. Var1 = Var2/Var3, is it a fixed
amount of clocksycluses that is used for that division, or does it vary?
There are at least two problems here. First, the Standard
says nothing about how precise the clock() measurement is, how
rapidly the clock "ticks." On typical systems, the "tick rate"
is somewhere between 18Hz and 1000Hz; 100Hz is a fairly common
value. What this means is that clock() is probably too coarse-
grained to measure the execution time of a few instructions or
even a few tens of instructions; the measured time for something
as short as one division will probably be zero.
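The usual workaround for that coarseness is to repeat the
operation many times and divide the total by the repetition count.
A rough sketch (the values are arbitrary, the volatile qualifiers
are a crude defense against the compiler optimizing the division
out of the loop, and the loop overhead is still included in the
measurement):

#include <stdio.h>
#include <time.h>

int main(void)
{
    volatile double v1, v2 = 355.0, v3 = 113.0;  /* arbitrary operands */
    long i, reps = 100000000L;
    clock_t t0, t1;

    t0 = clock();
    for (i = 0; i < reps; i++)
        v1 = v2 / v3;                /* the operation under test */
    t1 = clock();

    printf ("~%g CPU seconds per division (plus loop overhead)\n",
        (t1 - t0) / (double)CLOCKS_PER_SEC / reps);
    return 0;
}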
The second problem is that the C language says nothing about
how much time various operations take. On actual machines, the
time taken for your division will probably be affected by many
influences, such as
- Operand type: floating-point divisions and integer
divisions might run at different speeds
- Operand values: dividing by a denormal might take more
or less time than dividing by a normalized value
- Operand location: there's probably a cascade of different
places the operands might reside (CPU, various caches,
main memory, swap device), all with different speeds
- Interference: the division might compete with other
operations for scarce resources like pipelines, floating-
point units, internal CPU latches, and whatnot
... and, of course, many more. Modern computers are complicated
systems, and it is all but meaningless to speak of "the" amount
of time a single operation takes.
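Still, if you want to watch some of that variability yourself,
here's a rough sketch along the same lines, comparing integer
against floating-point division. The operand values are arbitrary,
loop overhead is included, and the numbers it prints depend
entirely on your machine, compiler, and optimization settings:

#include <stdio.h>
#include <time.h>

#define REPS 100000000L

int main(void)
{
    volatile long   lr, la = 1234567L, lb = 89L;
    volatile double dr, da = 1234567.0, db = 89.0;
    clock_t t0, t1;
    long i;

    t0 = clock();
    for (i = 0; i < REPS; i++)
        lr = la / lb;                /* integer division */
    t1 = clock();
    printf ("integer division:        %g CPU seconds\n",
        (t1 - t0) / (double)CLOCKS_PER_SEC);

    t0 = clock();
    for (i = 0; i < REPS; i++)
        dr = da / db;                /* floating-point division */
    t1 = clock();
    printf ("floating-point division: %g CPU seconds\n",
        (t1 - t0) / (double)CLOCKS_PER_SEC);

    return 0;
}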