getTime, setTime, recent change in daylight savings

Zamdrist

I'm looking at an application written by a third party; it's not my
code. While I'm a programmer type, I'm not terribly versed in the
details of JavaScript.

Our problem lies in that during the recent daylight savings change,
we've had a user who resides in the Mountain time zone. The software,
a web application, was showing a timer with the wrong start time,
one hour wrong. Both the server and his machine said the correct time,
that is, Windows showed the correct time.

I think I've tracked down the code from the application that
determines the time. Can someone tell me if this code is correct, and
if it's possible to determine whether it could be the culprit?

var date = new Date();
date.setTime(date.getTime() + 3650*24*60*60*1000);

I know that getTime returns the number of milliseconds from 1/1/1970,
so supposedly the addition of (3650*24*60*60*1000) brings the value of
getTime to a current time...?

Ideas? Thanks
 
T

Thomas Allen

Well, the added milliseconds part is about ten years (3650 * ms in a
day). I have no idea why that code would be there. A new Date object
is current by default, so no need to "bring it up to date" as you
mentioned.
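
For what it's worth, a quick sketch (not from the application, just to
illustrate both points) showing that a new Date is already "now" and
that the extra milliseconds push it roughly ten 365-day years ahead:

var MS_PER_DAY = 24 * 60 * 60 * 1000;        // 86,400,000 ms (864e5)
var offset = 3650 * MS_PER_DAY;              // 3650 days, ignoring leap days

var now = new Date();                        // already the current local time
var shifted = new Date(now.getTime() + offset);

alert('Now: ' + now +
      '\nShifted: ' + shifted +
      '\nYears added: ' + offset / (365.25 * MS_PER_DAY)); // about 9.99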

Thomas
 
Z

Zamdrist

What's interesting, using Firefox's JavaScript code window I added
window.alert(date);

And sure enough it's one hour off (early). It also says GMT -0600 (CST).

Is it just expressing that the current time CST is 6 hours prior to
GMT?
 
Zamdrist

I think because the code runs server-side (CST), the calculation is
there to account for the user being in a different time zone? It's the
only thing that makes sense to me.
 
Thomas Allen

Date checks the local time, not a server. I don't know enough about
the implementation to speculate as to why the time zones may not
match, and I've never experienced this problem before. It sounds like
something OS-level.

Thomas
 
Evertjan.

Thomas Allen wrote on 06 apr 2009 in comp.lang.javascript:
Date checks the local time, not a server.

It can be server-local or client-local,
depending on where the JavaScript executes.

=======================================
Server time was:
<%
response.write(new Date());
%>

<br>
and client time was:

<script type='text/javascript'>
document.write(new Date());
</script>

<br>
when this page was loaded.
=======================================
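
One way to extend that check (assuming classic ASP with JScript, as in
the snippet above) is to have the server splice its own millisecond
timestamp into the page and let the client compare. Since getTime()
counts from 1970-01-01 UTC on both sides, time zones and DST cancel
out, so any large difference means a mis-set clock rather than a time
zone setting:

<script type='text/javascript'>
// The server writes its epoch milliseconds into the page source.
var serverMs = <% response.write(new Date().getTime()); %>;
var clientMs = new Date().getTime();
document.write('Client clock differs from the server by about '
    + Math.round((clientMs - serverMs) / 1000) + ' second(s).');
</script>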
 
Zamdrist

That's what I figured; it all depends on where the code executes. And so
far as I can tell, it executes on the server.

If it's an OS-level issue with the client or server, I don't see how,
when the times on both the server and the client have always shown the
correct time. It's only the web application, and I'm presuming the code
I posted, that calculates the wrong time.

Try it:

var date = new Date();
date.setTime(date.getTime() + 3650*24*60*60*1000);
window.alert(date);
 
Thomas Allen

What you have there posts the wrong time, about ten years ahead.
Hours:Minutes:Seconds are correct, however.

Thomas
 
Zamdrist

Yeah, it's goofy. Like I said, I didn't write it. Someone who makes a
rude amount of money more than I do, did.
 
RobG

That is very strange code. It adds about 10 years to the date object
created by new Date, but does not account for leap years. Since
it is roughly (not exactly) a whole number of years, daylight saving
will affect the calculation for a few days per year.

If you want to find out how many seconds there are between now and
some date in the future, use the setYear, setMonth and setDate methods
of a date object, then subtract the two times, e.g.

var now = new Date();
var tenYearsHence = new Date(+now);
tenYearsHence.setYear(now.getFullYear() + 10);

alert(
'Now: ' + now
+ '\nTen years hence: ' + tenYearsHence
+ '\nSeconds between: ' + (tenYearsHence - now)/1000
);


Of course, using local calculations for dates is always problematic, as
you don't know whether the system's clock is correctly set or whether the
user knows what it is (it may not be their system, nor might they have
control over the date and time settings).


No, it does not bring it to a current time. It gives a date about 1 to 3
days short of 10 years in the future. Depending on when the date is
calculated, there might be one, two or three leap years in between that
aren't accounted for, and it might therefore also not account for a
daylight saving time change, so it might be out by an hour one way or
the other occasionally.


A new Date object is current for the system settings, which may not be correct.

What's interesting, using Firefox's JavaScript code window I added
window.alert(date);

Presumably you did:

alert(new Date());
And sure enough it's one hour off (early).

That means you have a setting somewhere that is inconsistent with what
you think the local time is.

It also says GMT -0600 (CST)

Is it just expressing that the current time CST is 6 hours prior to
GMT?

The opposite: the time is 6 hours behind UTC (GMT is more or less
deprecated, although still commonly used).

<URL: http://en.wikipedia.org/wiki/GMT >

You can use date.getTimezoneOffset() to see the offset in minutes (for
UTC -0600 it should say 360, my timezone is UTC +1000 so I get -600).
The timezone offset is added to the local time to get UTC.

When dealing with time, it is best to do everything in UTC and convert
to local time only for the sake of display. It may also be prudent to
display the result of new Date() so the user can check if it is
correct (and hence infer whether calculations based on it are likely
to be correct).
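
A small sketch of that approach (the names here are mine, not from the
application): keep the canonical value in UTC milliseconds and only
format it locally for display, checking the offset along the way:

var nowMs = new Date().getTime();       // ms since 1970-01-01 UTC
var d = new Date(nowMs);

// getTimezoneOffset() is the number of minutes to ADD to local time
// to reach UTC: 360 for UTC -0600, -600 for UTC +1000.
var offsetMinutes = d.getTimezoneOffset();

alert('Local: ' + d +
      '\nUTC: ' + d.toUTCString() +
      '\nOffset (minutes): ' + offsetMinutes);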
 
Evertjan.

Zamdrist wrote on 06 apr 2009 in comp.lang.javascript:
[please do not quote signatures on usenet]
That's what I figured, all depends on where the code executes. And so
far as I can tell, it executes on the server.

How can you not be sure, when you are writing the code?
If it's an OS-level issue with the client or server, I don't see how,
when the times on both the server and the client have always shown the
correct time. It's only the web application, and I'm presuming the code
I posted, that calculates the wrong time.

Try it:

var date = new Date();
date.setTime(date.getTime() + 3650*24*60*60*1000);

Do you know what the client's local summer time [DST] rules will be in 10
years' time?
window.alert(date);

This is not possible on the server: window.alert()
[and if there were a window on the server,
you would not see it on the client.]
 
Dr J R Stockton

In comp.lang.javascript message
<dac07a05-93de-4fa6-8424-fd7f3a33c26e@e5g2000vbe.googlegroups.com>,
Mon, 6 Apr 2009 11:47:49, Zamdrist posted:
Our problem lies in that during the recent daylight savings change,
we've had a user who resides in the Mountain time zone.

Many people live in places with mountains. This is an international
newsgroup, and it is foolish to expect readers world-wide to be familiar
with every state's local arrangements. To be courteous, be explicit
and accurate.

Given that there has been a clock change, the location is irrelevant,
provided that one can assume it is not Lord Howe Island - which, for its
size, is actually modestly mountainous, and has a time zone shared only
with unpopulated(?) neighbours.

The software,
a web application, was showing a timer with the wrong start time,
one hour wrong. Both the server and his machine said the correct time,
that is, Windows showed the correct time.

I think I've tracked down the code from the application that
determines the time. Can someone tell me if this code is correct, and
if it's possible to determine whether it could be the culprit?

var date = new Date();
date.setTime(date.getTime() + 3650*24*60*60*1000);

One cannot tell whether such a code fragment is correct without knowing
what it needs to do. The "designer" might have specified, wisely or
foolishly but authoritatively, that the time should be moved ahead by 10
years of 365 days of 24 hours, in which case the code is satisfactory
though a numerate coder would know that a day usually has 86400 seconds
and avoid writing unnecessary multiplications. One should know that
there are 864e5 ms/day.

I know that getTime returns the number of milliseconds from 1/1/1970,

From 1970-01-01 00:00:00 UTC.

so supposedly the addition of (3650*24*60*60*1000) brings the value of
getTime to a current time...?

It is adding ten years' worth of absolute time, on the assumption that
Leap Years have been abolished and the state of Summer Time will be the
same at the end as at the beginning.

A likely intent is to get a date/time about ten years from now, and for
that one should use
var date = new Date(); // then one of
1. date.setYear(date.getYear()+10)
2. date.setFullYear(date.getFullYear()+10)
3. date.setUTCFullYear(date.getUTCFullYear()+10) // faster

There, 1 is slovenly but probably safe; 2 will move ahead exactly 10
civil years, if possible (there is a missing hour each Spring, in many
places); 3 is faster but may appear an hour in error if an odd number of
Summer Time changes are crossed. Your code is equivalent to 3, apart
from having the year-length error.

To fix the problem, you will need to understand the needs and intent of
the time-handling of the system; looking at a fragment is insufficient.
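
For comparison, a sketch (illustrative only, not the application's code)
of option 2 next to the original fixed-millisecond addition:

var now = new Date();

// Original approach: a fixed 3650-day offset; ignores leap days and
// can land on the other side of a Summer Time change.
var byMs = new Date(now.getTime() + 3650 * 864e5);

// Option 2: step the civil year, keeping month, day and time of day.
var byYear = new Date(now.getTime());
byYear.setFullYear(now.getFullYear() + 10);

alert('Now: ' + now +
      '\nBy milliseconds: ' + byMs +
      '\nBy setFullYear: ' + byYear);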
 
Dr J R Stockton

In comp.lang.javascript message
<14741c42-d7e8-4370-95d5-b48f2cbea3d0@y33g2000prg.googlegroups.com>,
Mon, 6 Apr 2009 16:40:59, RobG posted:
That is very strange code. It adds about 10 years to the date object
created by new Date, but does not account for leap years. Since
it is roughly (not exactly) a whole number of years, daylight saving
will affect the calculation for a few days per year.

The ordinal and calendar dates of Summer Time vary more or less
systematically from year to year. Therefore, Summer Time would still be
able to have an effect, occasionally, even if the step were an exact
number of calendar years. In most places using Summer Time, stepping a
multiple of 28 years (except across a missing leap year), or a multiple
of 400 years, would be safe.

If you want to find out how many seconds there are between now and
some date in the future, use the setYear, setMonth and setDate methods

Better to use setFullYear. Firstly, setYear ... getFullYear looks wrong
(though it's OK); secondly, setFullYear can be given all three
arguments. In fact, one should not set year, month, and date
individually unless the day of the month is known to be, or is changed
to, less than 29. Consider 2009-01-31: set year 2010, 2010-01-31, OK;
set Month 1 (Feb), 2010-03-03, OOOH!; set Date 22, 2010-03-22. BUT
2009-01-31, set Full Year 2010 1 22, gives 2010-02-22.
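
A sketch of that 2009-01-31 example, showing the overflow when the
fields are set one at a time versus a single setFullYear call:

var a = new Date(2009, 0, 31);   // 2009-01-31
a.setYear(2010);                 // 2010-01-31
a.setMonth(1);                   // Feb 31 overflows to 2010-03-03
a.setDate(22);                   // 2010-03-22, not the intended date

var b = new Date(2009, 0, 31);   // 2009-01-31
b.setFullYear(2010, 1, 22);      // all three at once: 2010-02-22

alert(a + '\n' + b);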


No. It gives a date about 1 to 3 days short of 10 years in the
future. Depending on when the date is calculated, there might be one,
two or three leap years in between that aren't accounted for and it
might therefore also not account for a daylight saving time change, so
it might be out by an hour one way or the other occasionally.

Given that new Date() is used, the chances of there being only one leap
year in a ten-year span within the lifetime of the code - or its author
- are probably negligible.


If the intention is like to set a cookie date ten years ahead, so that
in practice it will not expire, it would be smarter just to set a fixed
literal date such as 9999-12-25.
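
For the cookie case, a sketch with a fixed far-future literal (the
cookie name here is made up, just for illustration):

// A fixed literal avoids any leap-year or Summer Time arithmetic.
var farFuture = new Date(9999, 11, 25);   // 9999-12-25
document.cookie = 'prefs=1; expires=' + farFuture.toUTCString() + '; path=/';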
 
Zamdrist

Had I known I'd be flogged in such a manner, I'd rather not have
bothered.

Duly noted.
 
