marcus said:
how come this snippet gives output like:
12297359.000000
12297359.000000
12297375.000000
12297375.000000
12297390.000000
12297390.000000
12297406.000000
.
.
while (1)
    printf("%f\n", (double)timeGetTime()); /* timeGetTime() returns a DWORD, so cast before printing with %f */
why does it sometimes claim that no time has passed and sometimes
claim that about 16 milliseconds has passed?
Because the time counting chip in your computer has
a resolution of only 16 milliseconds?
It should have millisecond
resolution.
It does have millisecond resolution, in the sense that
all the time values are expressed in milliseconds.
But nowhere does the documentation say that the value will
increase in steps of 1 millisecond.
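If the ~16 ms stepping is the problem, the usual workaround is to
ask the multimedia timer for finer granularity with timeBeginPeriod
before measuring and undo it with timeEndPeriod afterwards. A minimal
sketch of that idea, assuming a Windows build linked against
winmm.lib (on many machines this brings timeGetTime down to roughly
1 ms steps, but that is not guaranteed):

#include <windows.h>
#include <mmsystem.h>   /* timeGetTime, timeBeginPeriod, timeEndPeriod */
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    /* Request 1 ms timer resolution; the request can be refused,
       so check the return value. */
    if (timeBeginPeriod(1) == TIMERR_NOERROR) {
        for (int i = 0; i < 10; ++i)
            printf("%lu\n", (unsigned long)timeGetTime());
        timeEndPeriod(1);   /* restore the previous resolution */
    }
    return 0;
}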
(OK, I know Windows is not a realtime operating
system,
'realtime operating system' has nothing to do with it.
but anyway I think it should do better than this; I was
planning to do some profiling on my code.)
You can do it.
Execute the code in question 16 times and divide the resulting
time by 16, and you get an accuracy of about 1 millisecond. (Well,
sort of: provided the process doesn't get swapped out or
interrupted, and nothing else is happening on your machine besides
your program. You get the idea.)
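A minimal sketch of that averaging idea (work_under_test here is
just a hypothetical stand-in for the code being profiled, not
anything from the original post):

#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

/* hypothetical stand-in for the code being profiled */
static void work_under_test(void)
{
    volatile long sink = 0;
    for (long i = 0; i < 1000000L; ++i)
        sink += i;
}

int main(void)
{
    const int runs = 16;
    DWORD start = timeGetTime();

    for (int i = 0; i < runs; ++i)
        work_under_test();

    DWORD elapsed = timeGetTime() - start;

    /* The average is only meaningful if the total elapsed time is
       well above the ~16 ms granularity of a single reading. */
    printf("%.3f ms per run\n", (double)elapsed / runs);
    return 0;
}

If the work is so fast that 16 iterations still finish inside one
timer tick, raise the iteration count until the total time is well
above 16 ms before dividing.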