Java Date to C# ticks

Discussion in 'Java' started by Peter K, Mar 2, 2010.

  1. Peter K

    Peter K Guest

    Hi

    I have a c# application which reads data from a database. Some of the data
    is a "time" which is actually a c# "ticks" value (written to the db by
    another c# application).

    Now I am writing a java application, which collects data and writes it to
    the database for the c# application to read. So how do I convert a java
    "Date" to a value which the c# application can interpret as "ticks"?


    Thanks,
    Peter
     
    Peter K, Mar 2, 2010
    #1

  2. On 02.03.10 08:09, Peter K wrote:
    > Hi
    >
    > I have a c# application which reads data from a database. Some of the
    > data is a "time" which is actually a c# "ticks" value (written to the db
    > by another c# application).
    >
    > Now I am writing a java application, which collects data and writes it
    > to the database for the c# application to read. So how do I convert a
    > java "Date" to a value which the c# application can interpret as "ticks"?
    >


    First, you should use a more common DB schema instead of this proprietary
    one.

    Second, you should have provided the definition of ticks. From MSDN:
    A single tick represents one hundred nanoseconds or one ten-millionth of
    a second. There are 10,000 ticks in a millisecond.
    The value of this property represents the number of 100-nanosecond
    intervals that have elapsed since 12:00:00 midnight, January 1, 0001.


    java.util.Date has a constructor public Date(long date), with date being
    the number of milliseconds since January 1, 1970, 00:00:00 GMT.
    So it should be no problem for a programmer to solve this now.
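
    For example, a minimal sketch of the conversion (class and constant names
    are illustrative, not from any library; the epoch constant is the .NET
    tick count at 1970-01-01T00:00:00 UTC):

        import java.util.Date;

        public class NetTicks {
            // .NET DateTime.Ticks value at the Unix epoch:
            // 621,355,968,000,000,000 hundred-nanosecond intervals since 0001-01-01.
            private static final long TICKS_AT_UNIX_EPOCH = 621355968000000000L;
            private static final long TICKS_PER_MILLISECOND = 10000L;

            /** Convert a java.util.Date to a .NET DateTime.Ticks value. */
            public static long toTicks(Date date) {
                return date.getTime() * TICKS_PER_MILLISECOND + TICKS_AT_UNIX_EPOCH;
            }

            /** Convert ticks back to a Date; sub-millisecond precision is truncated. */
            public static Date fromTicks(long ticks) {
                return new Date((ticks - TICKS_AT_UNIX_EPOCH) / TICKS_PER_MILLISECOND);
            }
        }

    Note that Date.getTime() is already UTC-based, so no timezone arithmetic
    is needed as long as the c# side stores UTC ticks.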
     
    Frank Langelage, Mar 2, 2010
    #2

  3. Peter K

    Peter K Guest

    "Frank Langelage" <> wrote in message
    news:...
    > On 02.03.10 08:09, Peter K wrote:
    >> Hi
    >>
    >> I have a c# application which reads data from a database. Some of the
    >> data is a "time" which is actually a c# "ticks" value (written to the db
    >> by another c# application).
    >>
    >> Now I am writing a java application, which collects data and writes it
    >> to the database for the c# application to read. So how do I convert a
    >> java "Date" to a value which the c# application can interpret as "ticks"?
    >>

    >
    > First you should use a more common DB schema instead of this proprietary
    > one.


    What would be a better format for the dates?

    We have found that using a real date field in SQL Server 2005 gives odd
    rounding issues: dates very close to midnight (eg 1 ms before midnight)
    are rounded up to the next day, apparently because the datetime type only
    stores times to about 1/300 of a second. We don't want this behaviour, so
    the c# application was programmed to use the ticks value (where no
    rounding up to the next day occurs).

    Using a formatted string for the date makes it harder to select a range of
    dates (this is very easy with a numeric value).

    What other/better possibilities are there?



    Thanks,
    Peter
     
    Peter K, Mar 2, 2010
    #3
  4. Peter K wrote:

    > Now I am writing a java application, which collects data and writes it to
    > the database for the c# application to read. So how do I convert a java
    > "Date" to a value which the c# application can interpret as "ticks"?


    Assuming that "ticks" means NT Time, GIYF, giving you e.g.
    http://support.citrix.com/article/CTX109645

    [...]
    How to convert Windows NT Time to UNIX Time:
    Divide by 10,000,000 and subtract 11,644,473,600.
    How to convert UNIX Time to Windows NT Time:
    Add 11,644,473,600 and multiply by 10,000,000.
    [...]
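
    In Java those two formulas transcribe directly (a sketch only, assuming
    that definition of NT time, i.e. 100-ns units since 1601-01-01 UTC; note
    they work in whole seconds):

        public class NtTime {
            /** Windows NT time (100-ns ticks since 1601-01-01 UTC) to UNIX seconds. */
            static long ntToUnixSeconds(long ntTicks) {
                return ntTicks / 10000000L - 11644473600L;
            }

            /** UNIX seconds to Windows NT time (100-ns ticks since 1601-01-01 UTC). */
            static long unixSecondsToNt(long unixSeconds) {
                return (unixSeconds + 11644473600L) * 10000000L;
            }
        }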


    Regards, Lothar
    --
    Lothar Kimmeringer E-Mail:
    PGP-encrypted mails preferred (Key-ID: 0x8BC3CD81)

    Always remember: The answer is forty-two, there can only be wrong
    questions!
     
    Lothar Kimmeringer, Mar 2, 2010
    #4
  5. Peter K

    Lew Guest

    Frank Langelage wrote:
    > java.util.Date has a constructor public Date(long date) with date as the
    > number of milliseconds since January 1, 1970, 00:00:00 GMT.
    > So this should be no problem for a programmer to solve this now.


    Given that the Java type to match databases is java.sql.Timestamp, and that
    has a resolution of one nanosecond, it's potentially a better choice.
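
    For example, a sketch of going from ticks to a Timestamp without losing
    the sub-millisecond part (assuming .NET-style ticks, 100-ns units since
    0001-01-01 as quoted from MSDN above; the epoch constant and names are
    illustrative):

        import java.sql.Timestamp;

        public class TicksToTimestamp {
            // .NET DateTime.Ticks value at the Unix epoch (1970-01-01T00:00:00 UTC).
            private static final long TICKS_AT_UNIX_EPOCH = 621355968000000000L;

            /** Keeps the 100-ns fraction that a java.util.Date would throw away. */
            static Timestamp fromTicks(long ticks) {
                long sinceEpoch = ticks - TICKS_AT_UNIX_EPOCH;
                long seconds = sinceEpoch / 10000000L;
                int fracTicks = (int) (sinceEpoch % 10000000L); // 100-ns units within the second
                Timestamp ts = new Timestamp(seconds * 1000L);
                ts.setNanos(fracTicks * 100);
                return ts;
            }
        }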

    --
    Lew
     
    Lew, Mar 2, 2010
    #5
  6. Peter K

    Peter K Guest

    "Lothar Kimmeringer" <> wrote in message
    news:...
    > Peter K wrote:
    >
    >> Now I am writing a java application, which collects data and writes it to
    >> the database for the c# application to read. So how do I convert a java
    >> "Date" to a value which the c# application can interpret as "ticks"?

    >
    > Assuming that "ticks" are using NT Time, GIYF giving you e.g.
    > http://support.citrix.com/article/CTX109645
    >
    > [...]
    > How to convert Windows NT Time to UNIX Time:
    > Divide by 10,000,000 and subtract 11,644,473,600.
    > How to convert UNIX Time to Windows NT Time:
    > Add 11,644,473,600 and multiply by 10,000,000.
    > [...]


    Thanks to all for your input.

    C# (.net) ticks are based on nanoseconds since 1/1/0001.
    Actually I had managed to write a satisfactory conversion routine - but
    during the testing I kept getting wrong answers, which turned out to be
    because I overlooked that Java months are 0-based while .net months are
    1-based. (And timezones also confused the picture - most of the data is
    generated in a different timezone from where I am).
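
    For the record, the month pitfall in miniature (dates are illustrative):

        import java.util.Calendar;
        import java.util.TimeZone;

        public class MonthPitfall {
            public static void main(String[] args) {
                // java.util.Calendar months are 0-based: Calendar.MARCH == 2,
                // so passing a literal 3 here would silently mean April.
                // The named constants avoid the off-by-one.
                Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
                cal.clear();
                cal.set(2010, Calendar.MARCH, 2);
                System.out.println(cal.getTimeInMillis()); // 1267401600000 = 2010-03-02T00:00:00Z
            }
        }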

    At the moment I will keep the .net ticks as the value in the database. I
    understand it is not a generally accepted date/time representation, but the
    original software used this representation, and it seems the easiest to
    keep. Data can be input into the database from several sources (applications
    written in both c# and now in java). Data is read by a .net (c#)
    application.


    /Peter
     
    Peter K, Mar 2, 2010
    #6
  7. Peter K

    Wojtek Guest

    Peter K wrote :
    > And timezones also confused the picture - most of the data is generated in a
    > different timezone from where I am


    And another good reason to always store dates in GMT.
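
    In Java that takes one extra line per formatter (a sketch; the pattern is
    arbitrary):

        import java.text.SimpleDateFormat;
        import java.util.Date;
        import java.util.TimeZone;

        public class GmtFormat {
            public static void main(String[] args) {
                // Render the current instant in GMT, regardless of the JVM's default zone.
                SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
                fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
                System.out.println(fmt.format(new Date()));
            }
        }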

    --
    Wojtek :)
     
    Wojtek, Mar 3, 2010
    #7
  8. Peter K

    Roedy Green Guest

    On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K" <>
    wrote, quoted or indirectly quoted someone who said :

    >
    >Now I am writing a java application, which collects data and writes it to
    >the database for the c# application to read. So how do I convert a java
    >"Date" to a value which the c# application can interpret as "ticks"?


    There are so many definitions of "ticks".

    AT ticks were in the neighbourhood of 20 ms.

    Dates are just a wrapper around a long ms since 1970-01-01

    You might have a look at the code in FileTimes
    http://mindprod.com/products.html#FILETIMES
    which interconverts between Java-ticks and MS file timestamp ticks.

    Java timestamps use a 64-bit count of milliseconds since 1970 GMT.
    Windows timestamps use a 64-bit value representing the number of
    100-nanosecond intervals since January 1, 1601, i.e. ten thousand
    times the precision.
    DIFF_IN_MILLIS is the difference between January 1, 1601 and January 1,
    1970 in milliseconds. This magic number came from
    com.mindprod.common11.TestDate, computed on the Gregorian calendar with
    no correction for the switchover in which 1752-09-02 Wednesday was
    followed immediately by 1752-09-14 Thursday, dropping 12 days. Also
    according to http://gcc.gnu.org/ml/java-patches/2003-q1/msg00565.html

        // milliseconds between 1601-01-01 and 1970-01-01
        private static final long DIFF_IN_MILLIS = 11644473600000L;

        // fileTime is a Windows timestamp in 100-ns ticks since 1601
        long javaTime = ( fileTime / 10000 ) - DIFF_IN_MILLIS;


    See http://mindprod.com/jgloss/time.html
    --
    Roedy Green Canadian Mind Products
    http://mindprod.com

    The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.
    ~ Douglas Adams (born: 1952-03-11 died: 2001-05-11 at age: 49)
     
    Roedy Green, Mar 3, 2010
    #8
  9. Peter K

    Arne Vajhøj Guest

    On 02-03-2010 20:56, Roedy Green wrote:
    > On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K"<>
    > wrote, quoted or indirectly quoted someone who said :
    >> Now I am writing a java application, which collects data and writes it to
    >> the database for the c# application to read. So how do I convert a java
    >> "Date" to a value which the c# application can interpret as "ticks"?

    >
    > There are so many definitions of "ticks".
    >
    > AT ticks were in the neighbourhood of 20 ms.
    >
    > Dates are just a wrapper around a long ms since 1970-01-01
    >
    > You might have a look at the code in FileTimes
    > http://mindprod.com/products.html#FILETIMES
    > which interconverts between Java-ticks and MS file timestamp ticks.
    >
    > Java timestamps use 64-bit milliseconds since 1970 GMT. Windows
    > timestamps use 64-bit value representing the
    > number of 100-nanosecond intervals since January 1, 1601, with ten
    > thousand times as much precision.


    There are many definitions of ticks, but the original poster
    did say C#, and in that case he must mean System.DateTime.Ticks,
    which is year 0001 based, not 1601 based.
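
    The two bases are easy to confuse but simple to sanity-check (a sketch;
    the constants are the well-known tick counts at the Unix epoch on each
    base, not from any library):

        public class TickEpochs {
            static final long DATETIME_TICKS = 621355968000000000L; // since 0001-01-01
            static final long FILETIME_TICKS = 116444736000000000L; // since 1601-01-01

            public static void main(String[] args) {
                long diffDays = (DATETIME_TICKS - FILETIME_TICKS) / 10000000L / 86400L;
                System.out.println(diffDays); // 584388 days = exactly 1600 Gregorian years
            }
        }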

    Arne
     
    Arne Vajhøj, Mar 3, 2010
    #9
  10. In comp.lang.java.programmer message <qrfro5lqhcao7vsii7gmbbb6s8mrsjhogh
    @4ax.com>, Tue, 2 Mar 2010 17:56:20, Roedy Green <
    om.invalid> posted:
    >On Tue, 2 Mar 2010 20:09:39 +1300, "Peter K" <>
    >wrote, quoted or indirectly quoted someone who said :
    >
    >>
    >>Now I am writing a java application, which collects data and writes it to
    >>the database for the c# application to read. So how do I convert a java
    >>"Date" to a value which the c# application can interpret as "ticks"?

    >
    > There are so many definitions of "ticks".
    >
    >AT ticks were in the neighbourhood of 20 ms.


    Exactly 0x1800B0 per 24-hour day; just over 0x10000 per hour; about 54.9
    ms.
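
    (0x1800B0 = 1,573,040 ticks, and 86,400,000 ms / 1,573,040 ≈ 54.925 ms
    per tick, the classic 18.2 Hz PC timer.)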

    > This magic number came from
    >com.mindprod.common11.TestDate. Done according to Gregorian Calendar,
    >no correction for 1752-09-02 Wednesday was followed immediately by
    >1752-09-14 Thursday dropping 12 days.


    Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.

    --
    (c) John Stockton, nr London, UK. ?@merlyn.demon.co.uk Turnpike v6.05.
    Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
    PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm
    Dates - miscdate.htm estrdate.htm js-dates.htm pas-time.htm critdate.htm etc.
     
    Dr J R Stockton, Mar 3, 2010
    #10
  11. Peter K

    Eric Sosman Guest

    On 3/2/2010 4:28 PM, Peter K wrote:
    > [...]
    > C# (.net) ticks are based on nanoseconds since 1/1/0001.


    Assertion: The low-order thirty-two bits of such a value
    at any given moment (NOW!) are unknown -- and unknowable.

    Y'know those "Star Trek" moments where Scotty looks at the
    enormous alien space ship and says "It's huge! It must be half
    a mile across!" and Spock says "Zero point five eight three two
    two miles, to be precise?" Spock's folly of over-precision (who
    measures the alien space ship to plus-or-minus six inches?) is as
    nothing compared to that of a time standard that pretends to
    measure two millennia's worth of itsy-bitsy wobbles in Earth's
    rotation. My claim that thirty-two bits are unknowable says
    nothing more than "We don't know the history of Earth's rotation
    to plus-or-minus four seconds over the last two thousand years,"
    and I'll stand by the claim.

    Put it this way: Can you think of ANY physical quantity that
    has been measured to (let's see: 1E9 nanoseconds in a second,
    86,400 seconds in a day ignoring leap seconds, 365.25 days in a
    year ignoring adjustments, 2010 years, 63,430,776,000,000,000,000
    nanoseconds in all) TWENTY decimal places?

    Add to this the fact that light travels only ~1 foot per
    nanosecond. Every mile between you and the time standard amounts
    to five *micro*seconds' worth of slop ...

    --
    Eric Sosman
    lid
     
    Eric Sosman, Mar 4, 2010
    #11
  12. Peter K

    Arne Vajhøj Guest

    On 03-03-2010 20:45, Eric Sosman wrote:
    > On 3/2/2010 4:28 PM, Peter K wrote:
    >> [...]
    >> C# (.net) ticks are based on nanoseconds since 1/1/0001.

    >
    > Assertion: The low-order thirty-two bits of such a value
    > at any given moment (NOW!) are unknown -- and unknowable.


    It is not 1 ns units but 100 ns units, and the low 32 bits
    amount to around 430 seconds.

    We probably do not have any measurements with 430-second accuracy
    for year 1, but we do have them today. And it would be rather
    inconvenient to use different units for different periods.

    Arne
     
    Arne Vajhøj, Mar 4, 2010
    #12
  13. Peter K

    Eric Sosman Guest

    On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
    > On 03-03-2010 20:45, Eric Sosman wrote:
    >> On 3/2/2010 4:28 PM, Peter K wrote:
    >>> [...]
    >>> C# (.net) ticks are based on nanoseconds since 1/1/0001.

    >>
    >> Assertion: The low-order thirty-two bits of such a value
    >> at any given moment (NOW!) are unknown -- and unknowable.

    >
    > It is not 1 ns unit but 100 ns units. And the low 32 bits
    > is around 430 seconds.


    Thanks for the information. I'll revise my claim: "The
    low-order twenty-five bits are unknown and unknowable."

    > We do probably not have any measurements at 430 seconds accuracy
    > for year 1. But do have it today. And it would be rather inconvenient
    > to use different units for different periods.


    Intervals between contemporary events can (sometimes) be
    measured to nanosecond precision. In the laboratory, femtosecond
    precision may be attainable. But extending the scale to longer
    periods is pure fiction! Claim: You cannot measure the time
    between an event at lunchtime yesterday and one at lunchtime today
    with nanosecond precision. You probably can't measure it with
    millisecond precision, and even one-second precision would require
    a good deal of care.

    Even in one single lunch hour, you cannot measure the time
    between the swallow and the belch with nanosecond precision.

    --
    Eric Sosman
    lid
     
    Eric Sosman, Mar 4, 2010
    #13
  14. Peter K

    Roedy Green Guest

    On Wed, 3 Mar 2010 22:56:52 +0000, Dr J R Stockton
    <> wrote, quoted or indirectly quoted
    someone who said :

    >
    >Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.


    The full story is quite complex. Different parts of the world accepted
    the Gregorian calendar at different times. There are parts of the
    world today still on the Julian calendar.

    BigDate works off two different definitions, the papal and the British
    adoption.
    --
    Roedy Green Canadian Mind Products
    http://mindprod.com

    The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.
    ~ Douglas Adams (born: 1952-03-11 died: 2001-05-11 at age: 49)
     
    Roedy Green, Mar 4, 2010
    #14
  15. Eric Sosman wrote:

    > Intervals between contemporary events can (sometimes) be
    > measured to nanosecond precision. In the laboratory, femtosecond
    > precision may be attainable. But extending the scale to longer
    > periods is pure fiction! Claim: You cannot measure the time
    > between an event at lunchtime yesterday and one at lunchtime today
    > with nanosecond precision.


    With intervals of that size, nobody will anyway. The point is that
    you don't want to change data structures depending on the
    size of the interval. You also want to keep some kind of
    reserve for the future, to avoid the problem the runtime libraries
    of Borland TurboPascal had, where a cycle-counter value became
    larger than the maximum value that could be represented by a Word.

    > You probably can't measure it with
    > millisecond precision, and even one-second precision would require
    > a good deal of care.


    Like with all physical measures you have an error. Assuming it
    to be constant (e.g. 0.01%), an interval of 10 µs can be expected
    to lie between 9999 ns and 10001 ns, whereas in terms of
    a day, the error alone is plus or minus about 9 seconds.

    > Even in one single lunch hour, you cannot measure the time
    > between the swallow and the belch with nanosecond precision.


    Most measurements in IT I'm aware of are about the time of a
    method call, the execution time of an SQL query, the round-
    trip time of a network request, etc. Hopefully most of them
    are in the range of micro- or milliseconds, so having a data
    structure with some kind of "reserve" for the future isn't the
    worst thing to have.


    Regards, Lothar
    --
    Lothar Kimmeringer E-Mail:
    PGP-encrypted mails preferred (Key-ID: 0x8BC3CD81)

    Always remember: The answer is forty-two, there can only be wrong
    questions!
     
    Lothar Kimmeringer, Mar 4, 2010
    #15
  16. Peter K

    Arne Vajhøj Guest

    On 03-03-2010 21:21, Eric Sosman wrote:
    > [...]
    > Intervals between contemporary events can (sometimes) be
    > measured to nanosecond precision. In the laboratory, femtosecond
    > precision may be attainable. But extending the scale to longer
    > periods is pure fiction! Claim: You cannot measure the time
    > between an event at lunchtime yesterday and one at lunchtime today
    > with nanosecond precision. You probably can't measure it with
    > millisecond precision, and even one-second precision would require
    > a good deal of care.
    >
    > Even in one single lunch hour, you cannot measure the time
    > between the swallow and the belch with nanosecond precision.


    All true.

    But still it is a lot easier to use the same unit for
    both long and short intervals.

    Arne
     
    Arne Vajhøj, Mar 5, 2010
    #16
  17. Peter K

    Eric Sosman Guest

    On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
    > [...]
    > All true.
    >
    > But still it is a lot easier to use the same unit for
    > both long and short intervals.


    I've no quarrel with measuring *intervals* in tiny units.
    The thing that started me ranting and foaming at the mouth was
    the statement that "C# (.net) ticks are based on nanoseconds
    since 1/1/0001." *That's* the association I regard as fiction,
    bordering on nonsense.

    I seem to have mislaid those pills the court psychiatrist
    ordered me to take. Anybody know where they are? ;-)

    --
    Eric Sosman
    lid
     
    Eric Sosman, Mar 5, 2010
    #17
  18. Peter K

    Arne Vajhøj Guest

    On 05-03-2010 09:14, Eric Sosman wrote:
    > [...]
    > I've no quarrel with measuring *intervals* in tiny units.
    > The thing that started me ranting and foaming at the mouth was
    > the statement that "C# (.net) ticks are based on nanoseconds
    > since 1/1/0001." *That's* the association I regard as fiction,
    > bordering on nonsense.


    Nanoseconds in year 1 is absurd.

    But it is not absurd to measure nanoseconds (or at least milliseconds
    today).

    And it is not absurd to be able to store days many years back.

    And it is not absurd to use the same unit for all times.

    So we have now proven that:
    3 x not absurd = absurd

    Arne
     
    Arne Vajhøj, Mar 5, 2010
    #18
  19. Peter K

    Peter K Guest

    "Eric Sosman" <> wrote in message
    news:hmr3jt$biv$-september.org...
    > [...]
    > I've no quarrel with measuring *intervals* in tiny units.
    > The thing that started me ranting and foaming at the mouth was
    > the statement that "C# (.net) ticks are based on nanoseconds
    > since 1/1/0001." *That's* the association I regard as fiction,
    > bordering on nonsense.


    Yes, sorry, I mis-wrote the definition from Microsoft.

    The .net DateTime structure represents dates and times ranging from 1/1/0001
    to 31/12/9999. The values are measured in 100ns units called ticks.

    http://msdn.microsoft.com/en-us/library/system.datetime.aspx

    But is your quarrel that if I actually went back the billions of nanoseconds
    from the value for today's nanosecond value, I wouldn't actually end up at
    1/1/0001 - due to vagaries in the Earth's orbit, spin etc?
     
    Peter K, Mar 5, 2010
    #19
  20. On Sat, 06 Mar 2010 10:02:22 +1300, Peter K wrote:

    > "Eric Sosman" <> wrote in message
    > news:hmr3jt$biv$-september.org...
    >> On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
    >>> On 03-03-2010 21:21, Eric Sosman wrote:
    >>>> On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
    >>>>> On 03-03-2010 20:45, Eric Sosman wrote:
    >>>>>> On 3/2/2010 4:28 PM, Peter K wrote:
    >>>>>>> [...]
    >>>>>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
    >>>>>>
    >>>>>> Assertion: The low-order thirty-two bits of such a value at any
    >>>>>> given moment (NOW!) are unknown -- and unknowable.
    >>>>>
    >>>>> It is not 1 ns unit but 100 ns units. And the low 32 bits is around
    >>>>> 430 seconds.
    >>>>
    >>>> Thanks for the information. I'll revise my claim: "The low-order
    >>>> twenty-five bits are unknown and unknowable."
    >>>>
    >>>>> We do probably not have any measurements at 430 seconds accuracy for
    >>>>> year 1. But do have it today. And it would be rather inconvenient to
    >>>>> use different units for different periods.
    >>>>
    >>>> Intervals between contemporary events can (sometimes) be measured to
    >>>> nanosecond precision. In the laboratory, femtosecond precision may be
    >>>> attainable. But extending the scale to longer periods is pure
    >>>> fiction! Claim: You cannot measure the time between an event at
    >>>> lunchtime yesterday and one at lunchtime today with nanosecond
    >>>> precision. You probably can't measure it with millisecond precision,
    >>>> and even one-second precision would require a good deal of care.
    >>>>
    >>>> Even in one single lunch hour, you cannot measure the time between
    >>>> the swallow and the belch with nanosecond precision.
    >>>
    >>> All true.
    >>>
    >>> But still it is a lot easier to use the same unit for both long and
    >>> short intervals.

    >>
    >> I've no quarrel with measuring *intervals* in tiny units.
    >> The thing that started me ranting and foaming at the mouth was the
    >> statement that "C# (.net) ticks are based on nanoseconds since
    >> 1/1/0001." *That's* the association I regard as fiction, bordering on
    >> nonsense.

    >
    > Yes, sorry, I mis-wrote the definition from Microsoft.
    >
    > The .net DateTime structure represents dates and times ranging from
    > 1/1/0001 to 31/12/9999. The values are measured in 100ns units called
    > ticks.
    >
    > http://msdn.microsoft.com/en-us/library/system.datetime.aspx
    >
    > But is your quarrel that if I actually went back the billions of
    > nanoseconds from the value for today's nanosecond value, I wouldn't
    > actually end up at 1/1/0001 - due to vagaries in the Earth's orbit, spin
    > etc?
    >

    To say nothing of the transitions between the various calendars, which,
    over the mere 2009 years in that range, are probably more significant
    than spin rate and orbit deviations.


    --
    martin@ | Martin Gregorie
    gregorie. | Essex, UK
    org |
     
    Martin Gregorie, Mar 6, 2010
    #20