Double Clock Experiment


Bill Hobba

pete said:
Keith said:
Bill Hobba said:
In any case, clock() measures CPU time, not real time (to make this at
least *vaguely* topical in at least one of these newsgroups).

'Real time'????? In physics, especially relativistic physics, time is
what a clock reads. Clock accuracy is a statistical thing based on
comparisons with other clocks, astronomical data etc. At present atomic
clocks are the most accurate.

Ok, the clock() function is intended to measure the CPU time consumed
by a program, rather than (any approximation of) the time that might
be measured by a clock that measures so-called "real time" (such as an
atomic clock, sun dial, or whatever).
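
For the comp.lang.c side of the cross-post, a minimal sketch of that
distinction (the busy-loop bound is arbitrary; on a loaded process, or
one that sleeps, the two readings diverge):

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t c0 = clock();      /* CPU time charged to this process */
    time_t  t0 = time(NULL);   /* calendar ("wall clock") time */
    volatile unsigned long n = 0;
    unsigned long i;

    for (i = 0; i < 100000000UL; i++)   /* burn some CPU */
        n += i;

    printf("CPU time:  %.2f s\n", (double)(clock() - c0) / CLOCKS_PER_SEC);
    printf("Wall time: %.0f s\n", difftime(time(NULL), t0));
    return 0;
}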

And this is why cross-posts between comp.lang.c and
sci.physics.relativity are a bad idea. (I have no idea why
comp.lang.c has been getting cross-posts from alt.magick lately.)

Yes. "Real time" actually means something else in programming.

Ahhhhh. Got it.

Thanks
Bill
 

CoreyWhite

A 24-bit counter running at 1 kHz will overflow after about 4 1/2 hours.
In my case the maximum error you can get is +/- 1 tick. I guess it was
my luck that both runs produced the same result. The +/- 1 tick error
is due to the possibility of sampling the counter in between
transitions. Say for example you sample the counter just when it is
incrementing from 1000 to 1001. In which case you have a 50% chance of
getting either 1000 or 1001. But just like your Unix experiment
this says nothing about time travelling but more about sampling theory.

To me what that says is that the moment we sample is a probability and
there is a 50% chance of it being either 1000 or 1001. I think we
would find this probability exists even in atomic clocks. What I meant
by time travel wasn't that the computer was literally time traveling,
but that time was behaving as if it was uncertain about what time it
was.
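
A rough sketch of that quantisation (the numbers are made up; a true
elapsed time that falls between two ticks of a 1 kHz counter reads as
one whole tick or the other, and a 24-bit count wraps as described
above):

#include <stdio.h>

int main(void)
{
    double tick_hz = 1000.0;    /* counter clock: 1 kHz */
    double elapsed = 1.0005;    /* assumed true elapsed seconds, mid-tick */

    /* The reading quantises to a whole tick, hence the +/- 1 ambiguity. */
    printf("counter reads %lu ticks\n", (unsigned long)(elapsed * tick_hz));

    /* A 24-bit counter at 1 kHz wraps after 2^24 / 1000 seconds. */
    printf("overflow after %.2f hours\n", 16777216.0 / tick_hz / 3600.0);
    return 0;
}
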
I can actually construct a set-up that can guarantee the same result
for every run simply by using the same clock source to drive both the
counter and the CPU. In which case the CPU is running in-sync with the
clock regardless of the accuracy of the clock source. Such a setup even
works if you keep varying the clock frequency because the CPU executes
instructions synchronously with the clock.


Think of it this way. If the CPU needs to execute exactly 100
instructions for each round of loop and each instruction executes in
exactly 2 clock cycles then each round of the loop will execute in
exactly 200 clock cycles. Now, when talking about 'clock' here we are
talking about the square wave used to drive the CPU. If we now use this
same square wave as the basis for the CPU to measure time then of
course the CPU will never disagree with its time measurement assuming
nothing else introduces delays or jitter to our instruction stream such
as interrupts.
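
Putting hypothetical numbers on that (the 4 MHz clock rate is purely an
assumption, just to make the arithmetic concrete):

#include <stdio.h>

int main(void)
{
    unsigned long insns_per_pass  = 100;  /* instructions per loop pass */
    unsigned long cycles_per_insn = 2;    /* clock cycles per instruction */
    double        clock_hz        = 4e6;  /* assumed 4 MHz CPU clock */

    unsigned long cycles_per_pass = insns_per_pass * cycles_per_insn;  /* 200 */
    printf("%lu cycles per pass = %.1f microseconds\n",
           cycles_per_pass, cycles_per_pass / clock_hz * 1e6);
    return 0;
}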

I believe you that it would always be consistent, but I don't think it
would really measure time. Let's say that it was taking longer to
execute some of the instructions but they still took only one clock
cycle. We would just be counting clock cycles then and not time.
If, like my experiment above, we use two different clock sources: one to
drive the CPU and another to drive the counter, then what you are
measuring is not "time travel" but simply the relative accuracy between
the square waveforms, which can indeed be seen visually if the two
square waves are fed into an oscilloscope. In this case an error can
occur if you happen to sample the counter at a harmonic interval
between the two square waves.

This actually sounds like the same idea that I had when writing the
program for my experiment. I like it :).
I guess it is a matter of philosophy to ask if the accuracy of the
clocks says anything about the nature of time. Even the best atomic
clock NIST has is off by 0.00000000002 seconds per second. Although,
I'm sure even the accuracy of NIST's clock varies considerably. If we
ran this experiment with one of their clocks, I believe we could get
considerable interest in the experiment because it would show much
more obvious variations in the atomic clock, and would suggest time was
behaving unusually.
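
To put that figure in perspective (the arithmetic below just scales the
0.00000000002 s/s quoted above over longer intervals):

#include <stdio.h>

int main(void)
{
    double frac_err = 2e-11;     /* seconds gained or lost per second, as quoted */
    double day      = 86400.0;   /* seconds in a day */

    printf("offset after one day:  %.2g s\n", frac_err * day);          /* ~1.7e-06 */
    printf("offset after one year: %.2g s\n", frac_err * day * 365.0);  /* ~6.3e-04 */
    return 0;
}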
 

slebetman

To me what that says is that the moment we sample is a probability and
there is a 50% chance of it being either 1000 or 1001. I think we
would find this probability exists even in atomic clocks. What I meant
by time travel wasn't that the computer was literally time traveling,
but that time was behaving as if it was uncertain about what time it
was.

If you are trying to explore the concept of time itself then be aware
that:

1. Currently we use the periodic vibrations of excited crystals to measure
time.
2. We measure the period/frequency of the vibrations of excited
crystals in units of time.

Hence what we mean by time is defined recursively. Physics per se
currently does not have any concept of time which relates to the "time"
that we experience. The closest thing is the concept of entropy which
is often used to indicate an "arrow of time". Time is just a concept we
use to measure events. What we are really measuring is the events
themselves: the hand of the clock moving, crystals vibrating, radio
waves propagating. Any real attempt to measure time ends up measuring
events defined in relation to time itself - again we end up with a
cyclic/recursive definition.

To be pedantic, in these experiments we are not really measuring
against time. Instead we are measuring against the number of times a
crystal oscillates under excitation. So the count value of 1000 does
not really represent "time" but rather that the crystal has oscillated
1000 times since the beginning of the experiment. We only assume it
measures "time" since the oscillations are specified in terms of time:
1 kHz = 1000 times each second. So what is a "second"? The time it takes
the crystal to oscillate 1000 times -- again we end up with a cyclical
definition.

I believe you that it would always be consistent, but I don't think it
would really measure time. Let's say that it was taking longer to
execute some of the instructions but they still took only one clock
cycle. We would just be counting clock cycles then and not time.

But on a simple CPU like the PIC, indeed on most simple RISC CPUs, most
instructions take exactly the same number of clock cycles to execute.
Hence counting instruction cycles IS a measure of time in terms of
instruction cycles. Besides, you misunderstand me, I am not measuring
against INSTRUCTION CYCLES, I am measuring against CLOCK CYCLES. Indeed
some instruction cycles take multiple clock cycles to complete but it
doesn't matter. My clock is fed directly to hardware counters from the
clock source, not from the instruction clock. So my setup is a perfect
measure of time against the oscillations of a quartz crystal.

In short, counting clock cycles is how we measure time, even on atomic
clocks we use the vibrations of the atom as the source of the clock
cycle. It's just in my case I'm using the vibrations of a quartz
crystal. If I have an atomic clock to be the clock source of my CPU
then my CPU will never disagree with the atomic clock because the CPU
executes in lock-step with the vibrations of the atomic clock and the
result will always be consistent.
 

Richard Bos

Hexenmeister said:
| There was a recent article in either Scientific American or
| American Scientist (I forget which), which indicated that clocks
| are now approaching sufficient precision that it would be impossible
| to synchronize any two of the ultra-precision clocks. Apparently
| on those timescales, any movement of the clocks has noticeable relativistic
| effects.
There was a recent article in either the New York Times, the Chicago
Tribune, the London Times or the National Enquirer ( I forget which )
which indicated that the Pope was an ardent relativist who believed
prayers could reach the throne of God ( i9.0 light years away) no
faster than the speed of light.

Recent, as in, last Saturday's issue, I presume?

YMHBTIRL.

Richard
 

Archangel

Corey,
Your post is kinda fun and interesting, irrespective of its accuracy.
If you get ahold of an English version of "On the Electrodynamics of
Moving Bodies" (1905) by Albert Einstein, and peruse it for its
equations and explanations, you will be able to generalize your idea
into more interesting vistas.
Even if you had ideal clocks running under ideal conditions there still
exists a finite amount of time for "signal-sync" or communication.
Also, since there (scientifically) exists NO simultaneous NOW and the
infinitely small is as vast as the infinitely large, the idea of
time travel becomes a bit illusory. This is because there could be NO
ABSOLUTE moment for multiple observers (or yourself at multiple
different "times") and "revisiting" the so-called past would really be
a new time for the observer.

Interesting, but I wish you hadn't mentioned Einstein. Now that idiot
Hexenmeister is likely to put in another ridiculous appearance.

A
 

edprochak

Archangel said:
what is Godwin's Law?

A

greg might add more detail but it basically says that as the argument in a
thread progresses, once name-calling begins, the one that invokes Nazi
references loses, and the argument is over.
 

Archangel

Bill Hobba said:
'Real time'????? In physics, especially relativistic physics, time is
what a clock reads. Clock accuracy is a statistical thing based on
comparisons with other clocks, astronomical data etc. At present atomic
clocks are the most accurate.

Thanks
Bill


Indeed so, one of the major implications of Relativity is that there is no
such thing as real (or absolute) time. There is only relative time.

A
 

Archangel

greg might add more detail but it basically says that as the argument in a
thread progresses, once name-calling begins, the one that invokes Nazi
references loses, and the argument is over.



Lol, that rings a bell, I suspect I have heard of it before. Either that or
I have just had an amazing case of deja-vu. This is alt.magick after all...

Thanks.

A
 

Richard Bos

greg might add more detail but it basically says that as the argument in a
thread progresses, once name-calling begins, the one that invokes Nazi
references loses, and the argument is over.

No, it's not. RTFJF.

Richard
 

Martin Swain

Bill said:
'Real time'????? In physics, especially relativistic physics, time is what
a clock reads. Clock accuracy is a statistical thing based on comparisons
with other clocks, astronomical data etc. At present atomic clocks are the
most accurate.

Thanks
Bill

Even atomic clocks are measuring an arbitrary standard. In order
to calculate 'real time', whatever that might mean, one would have
to compute the astronomical position of the Earth, since the
standards we use for time are derived from that.
 

Martin Swain

To me what that says is that the moment we sample is a probability and
there is a 50% chance of it being either 1000 or 1001. I think we
would find this probability exists even in atomic clocks. What I meant
by time travel wasn't that the computer was literally time traveling,
but that time was behaving as if it was uncertain about what time it
was.

That is because you are using a digital system to measure an analog
value, which will always involve sampling, and therefore some measure
of uncertainty.

If you want to eliminate that uncertainty, eliminate the digital system.
Learn to calculate the Earth's position in space, and you will always
know what time it is by the position of the sun in the sky; you will
always know what the date is by the length of the day, and so on.

Honestly I don't know why you are so preoccupied with computer math.
It really is a clunky approximation of the real thing. If you learn
a little calculus you will see that what is simple with a pencil and
paper is often difficult to a Sisyphean degree on a digital system.
 

Sn!pe

Richard Bos said:
No, it's not. RTFJF.

Richard

Stone me, here we have the exception to prove the rule. I propose
Sn!pe's Corollary: Once Godwin's Law is invoked the thread is dead,
except in the Bit-Bucket.
 

Martin Swain

Walter said:
There was a recent article in either Scientific American or
American Scientist (I forget which), which indicated that clocks
are now approaching sufficient precision that it would be impossible
to synchronize any two of the ultra-precision clocks. Apparently
on those timescales, any movement of the clocks has noticeable relativistic
effects.

According to relativity, simultaneity is relative to the observer.

However, any two clocks could be synchronised by finding the
plane of simultaneity. Subsequently moving either clock would require
that they be re-synchronised, since time and space are pretty much
the same thing. I suppose there could be other technical difficulties
due to the extremely small timescales involved, but a difficulty is
not the same as an impossibility.

http://physics.syr.edu/courses/modules/LIGHTCONE/minkowski.html
 

Paul B. Andersen

Keith said:
In any case, clock() measures CPU time, not real time (to make this at
least *vaguely* topical in at least one of these newsgroups).

A keeper! :)

Paul
 

Richard G. Riley

pete said:
Keith said:
Bill Hobba said:
In any case, clock() measures CPU time, not real time (to make this at
least *vaguely* topical in at least one of these newsgroups).

'Real time'????? In physics, especially relativistic physics, time is
what a clock reads. Clock accuracy is a statistical thing based on
comparisons with other clocks, astronomical data etc. At present atomic
clocks are the most accurate.

Ok, the clock() function is intended to measure the CPU time consumed
by a program, rather than (any approximation of) the time that might
be measured by a clock that measures so-called "real time" (such as an
atomic clock, sun dial, or whatever).

And this is why cross-posts between comp.lang.c and
sci.physics.relativity are a bad idea. (I have no idea why
comp.lang.c has been getting cross-posts from alt.magick lately.)

Yes. "Real time" actually means something else in programming.

http://www.cs.york.ac.uk/rts/RTSBookThirdEdition.html

"Real Time" generally means just that - "sun dial"
time. The ability of a program to generate a response in a fixed "REAL
TIME" to meet real world timing requirements: such as sampling a
modulated waveform at least twice its fastest frequency component for
proper a-d conversion. Unless you were referring to the difference being
from the CPU usage statement-in which case I agree fully.
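
A small sketch of that constraint (the signal bandwidth here is an
assumed figure, just to show the arithmetic):

#include <stdio.h>

int main(void)
{
    double f_max  = 4000.0;       /* assumed fastest frequency component, Hz */
    double fs_min = 2.0 * f_max;  /* minimum sample rate for proper A-D conversion */

    printf("sample at >= %.0f Hz, i.e. one conversion every %.0f microseconds\n",
           fs_min, 1e6 / fs_min);
    return 0;
}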
 
