ASNI C runtime date bug - myth or real

Ade

Hello All,

The company I work for has asked me to research the possibility of an ASNI C
runtime library overflow bug for 2010 in the date routines, specifically as
to whether it is a real problem or whether it was fabricated at Y2K time as
a means of further scare-mongering. The reason I say scare-mongering is
because I can find no absolute description of a problem relating to this on
the internet (my biggest source of information). I can find many documents
that state it exists but no details and, funnily enough, they all seem to be
cloned from an original source - the wording is normally exact.

If a problem does exist in the date routines of the ASNI libraries, does
anyone have any examples or theoretical papers on what may (or may not) be
the cause?

Personally, I feel that any such overflow might occur in date manipulation
routines (adding, subtracting, etc.), particularly in smaller word-length
processor environments (16-bit, for example). I cannot prove this, nor do I
wish to. I submit that as a precursor to discussion only.

I can assure you that my intention is purely research and by no means is
this query meant to provoke any kind of heated response or business-wide
'panic'. I simply need expert help.

Any assistance you can give would be greatly appreciated.

Regards,

Adrian Birkett
 
Keith Thompson

Ade said:
The company I work for has asked me to research the possibility of an ASNI C
runtime library overflow bug for 2010 in the date routines, specifically as
to whether it is a real problem or whether it was fabricated at Y2K time as
a means of further scare-mongering. The reason I say scare-mongering is
because I can find no absolute description of a problem relating to this on
the internet (my biggest source of information). I can find many documents
that state it exists but no details and, funnily enough, they all seem to be
cloned from an original source - the wording is normally exact.

If a problem does exist in the date routines of the ASNI libraries, does
anyone have any examples or theoretical papers on what may (or may not) be
the cause?

Personally, I feel that any such overflow might occur in date manipulation
routines (adding, subtracting, etc.), particularly in smaller word-length
processor environments (16-bit, for example). I cannot prove this, nor do I
wish to. I submit that as a precursor to discussion only.

I can assure you that my intention is purely research and by no means is
this query meant to provoke any kind of heated response or business-wide
'panic'. I simply need expert help.

(You mean "ANSI", not "ASNI" -- and you probably *should* mean ISO,
since that's the organization that owns and publishes the C standard.)

I know of no such problems related to the year 2010. The C standard's
definition of the time_t type is so vague (it merely needs to be
an arithmetic type capable of representing times) that you can't infer
anything about time_t overflow from the standard itself. Some of the
manipulation functions might have problems; for example, asctime() (and
hence ctime(), which is defined in terms of it) would overflow its
static buffer in the year 10000.

The most common representation for time_t is an integer representing
the number of seconds since 1970-01-01 00:00:00 UTC. With a 32-bit
signed representation, that will overflow at Tue 2038-01-19 03:14:07
UTC (the infamous Y2038 problem). IMHO that leaves us more than
enough time to transition to 64-bit time_t, unless you're dealing with
timestamps in the future.
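
A quick sketch of where that boundary falls, assuming the usual POSIX-style
encoding (seconds since the 1970 epoch), which, again, the C standard itself
does not promise:

#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* Last value representable in a 32-bit signed count of seconds
       since 1970-01-01 00:00:00 UTC. */
    time_t t = (time_t)INT32_MAX;
    struct tm *utc = gmtime(&t);
    char buf[64];

    if (utc != NULL &&
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc) > 0)
        printf("32-bit signed time_t runs out after %s\n", buf);
    return 0;
}

On an implementation that uses that encoding, it prints
2038-01-19 03:14:07 UTC.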

There have been a number of other times that have caused, or might
cause, problems for some systems. For example, I saw a problem on one
system when the current time_t value reached one billion (Sun
2001-09-09 01:46:40 UTC). Other problems have occurred for leap
years, such as for code that doesn't anticipate a year having 366
days. But these problems are system-specific, not related to the ISO
C language or library.
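
As a trivial illustration of the leap-year case (my own sketch, nothing
standard-specific), here's the Gregorian rule that naive "365 days per
year" code gets wrong:

#include <stdio.h>

/* Gregorian rule: divisible by 4, except centuries, except every
   fourth century. */
static int days_in_year(int year)
{
    int leap = (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    return leap ? 366 : 365;
}

int main(void)
{
    printf("2000 has %d days\n", days_in_year(2000));  /* 366 */
    printf("2010 has %d days\n", days_in_year(2010));  /* 365 */
    return 0;
}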
 
Lew Pitcher

Ade said:
Hello All,

The company I work for has asked me to research the possibility of an ASNI
C runtime library overflow bug for 2010 in the date routines,


I do not know of /any/ "overflow bug" in the ANSI/ISO C suite that will
occur in 2010.

Perhaps you are thinking of the well-known exposure on certain systems (of
Unix-like derivation) where time_t is defined as a 32-bit signed integer,
and the results of the time() call are expressed as a count of seconds
since midnight UTC, January 1, 1970. This representation will overflow at
03:14:07 UTC on January 19, 2038.

However, the C standard does /not/ dictate that time_t be defined as a
signed 32-bit integer, or that the time() call return a value of seconds
since midnight Jan 1, 1970.

--
Lew Pitcher

Master Codewright & JOAT-in-training | Registered Linux User #112576
http://pitcher.digitalfreehold.ca/ | GPG public key available by request
---------- Slackware - Because I know what I'm doing. ------
 
Beej Jorgensen

Ade said:
The company I work for has asked me to research the possibility of an ASNI C
runtime library overflow bug for 2010 in the date routines

You should provide the links you already have.

I've never heard of this, but apparently there's some problem with
Newton OS which I guess has a 30-bit time_t with some weird offset, or
something--I didn't really read it:

http://40hz.org/Pages/Newton Year 2010 Problem

The Spec pretty much allows any range and precision of times by an
implementation, so the Newton's not out of line; there's nothing in the
spec that prevents a system from having an atomic clock good until the
year 10 billion, or, for that matter, a clock that fails in 2010.

C99 7.23.1p4:
# The range and precision of times representable in clock_t and time_t
# are implementation-defined.

C99 7.23.2.4p3:
# The time function returns the implementation's best approximation to
# the current calendar time.
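
If you're curious what your own implementation picked, here's a quick (and
entirely implementation-specific) probe; every answer it prints is allowed
to differ from one system to the next:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* The standard only says time_t is an arithmetic type; the rest is
       up to the implementation. */
    printf("sizeof(time_t): %zu bytes\n", sizeof(time_t));
    printf("signed?         %s\n", (time_t)-1 < (time_t)0 ? "yes" : "no");
    printf("floating-point? %s\n", (time_t)1 / 2 != 0 ? "yes" : "no");
    return 0;
}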

I'm pretty sure my Unix boxes (and other people's Windows boxes :) )
will keep humming past 2010 without a date problem.

-Beej
 
Antoninus Twink

Keith Thompson said:
I know of no such problems related to the year 2010.

Right. 2038 is the key date, though by then all the world will be
64-bit and there won't be any problem.

Some dry humor from Wikipedia:
"Using a (signed) 64-bit value introduces a new wraparound date in
approximately 292 billion years on Sunday, December 4, 292,277,026,596."

Keith Thompson said:
The most common representation for time_t is an integer representing
the number of seconds since 1970-01-01 00:00:00 UTC.

Is it really the most common? I'm a *nix buff as much as the next man,
but you'd have to admit that Microsoft has a pretty large share of the
desktop market.
 
jacob navia

Antoninus said:
Some dry humor from Wikipedia:
"Using a (signed) 64-bit value introduces a new wraparound date in
approximately 292 billion years on Sunday, December 4, 292,277,026,596."


Went to the mart to buy some cigarettes, and
when the maid smiled behind the counter I thought
of the time when she will be completely gone.

I thought that a time will come when the cigarette store
will be gone, together with its enclosing mart, yes.

A time will come when the continent will be gone where
we build our cigarette stores, a time when the sun that
gives us light, and even the galaxy will be gone.

Together with all species in the galaxy
that suicide themselves with smoke.

Gone up in smoke. All of them. Yes, a time will come when
December 4th 292 277 026 596 arrives.

What a surprise when preparing for Christmas. I thought,
yes I thought I should get some cigarettes from the store.

Was a long time coming 292 277 026 596 but eventually...

Eventually December 4th arrived.
 
robertwessel2

Antoninus Twink said:
Right. 2038 is the key date, though by then all the world will be
64-bit and there won't be any problem.

Some dry humor from Wikipedia:
"Using a (signed) 64-bit value introduces a new wraparound date in
approximately 292 billion years on Sunday, December 4, 292,277,026,596."


Is it really the most common? I'm a *nix buff as much as the next man,
but you'd have to admit that MicroSoft has a pretty large share of the
desktop market.


32-bit time_t's on Windows are typically the same as the *nix
versions. Windows itself tends not to use that as a time base
internally, but MS's CRT does.
 
Ade

Thanks all for your responses.

Err... yes, I did mean "ANSI" and can't think why I spelt it incorrectly
throughout - sorry.

Some links that mention it...

http://www.ugu.com/sui/ugu/showclassic?I=y2k.dates&F=0111111111&G=Y

http://www.ismosys.com/downloads/general/Dates potentially causing computer problems.pdf

I wasn't aware of the Newton problem, but this does seem to be the likely
candidate:

http://40hz.org/Pages/Newton Year 2010 Problem

Thanks again. I hope there are no smelling pistakes in this posting :)


Ade
 
Keith Thompson

Ade said:
Thanks all for your responses.

Err... yes, I did mean "ANSI" and can't think why I spelt it incorrectly
throughout - sorry.

Some links that mention it...

http://www.ugu.com/sui/ugu/showclassic?I=y2k.dates&F=0111111111&G=Y

This mentions:

January 1, 2010 - Overflow ANSI C Library (Note: This event is
alleged to be a valid Y2K problem date. I do not have any
additional information on this claim)

As far as I know, this is nonsense.

This also says:

2010/01/01: Overflow for ANSI C library.

And:

2034/09/30: Overflow for UNIX time function.

I can't figure out the significance of that date, and I don't believe
there is any.

[...]
 
Phil Carmody

Keith Thompson said:
This mentions:

January 1, 2010 - Overflow ANSI C Library (Note: This event is
alleged to be a valid Y2K problem date. I do not have any
additional information on this claim)

As far as I know, this is nonsense.


This also says:

2010/01/01: Overflow for ANSI C library.

That's clearly the date I will fill the entirety of RAM on a 4GB machine
with non-NUL characters, and run strlen on it.

And:

2034/09/30: Overflow for UNIX time function.

I can't figure out the significance of that date, and I don't believe
there is any.

I suspect the nonsense density of that document is quite high.

Phil
 
Ade

Gordon Burditt said:
ANSI C specifies very little about the date routines. In particular:

- It does NOT specify that a time_t is 32 bits.
- It does NOT specify that a time_t represents a count of the number
of <insert some time unit here> since <fixed time>. It especially
doesn't specify that a time_t represents a number of seconds.
- It does NOT specify that if you subtract two time_t's, you get anything
meaningful. Even the sign of the result is useless. Try, for example,
subtracting 12311999 (December 31, 1999) from 01012000 (January 1, 2000).
You get -11299999, which doesn't mean much.

There is no problem with "ANSI C date routines" in common; you have
to specify a particular implementation. "Cars will run out of gas
by June 17, 2009". Well, mine probably will if I don't fill it up
first.

A time_t could, for example, increment by one every time there's
an earthquake in California. For that, I guess you'd need a 128-bit
integer. A time_t could, for example, increment by one every time
there's an earthquake in Texas. With 5 in a week recently in or
near one city, that might overflow an 8-bit integer faster than you
think.


Post the documents, or at least pointers to them. I can do Google
searches
too, but I want to see what *YOU* see.


So let's see the exact wording.


The common implementation of time_t, as a 32-bit number of seconds
since January 1, 1970 00:00 GMT, will overflow into the sign bit
on Jan 19, 2038. The obvious fix for this is to extend it to 64
bits. Many systems have already done this.

The asctime() and ctime() routines have a Y10K problem, since the
format for a printable date is specified as having 4 digits. Someone
might misinterpret Y10K as meaning 2010 instead of 10000.

The struct tm element tm_year is subject to overflow (of either 16-bit
or 32-bit integers) if you extend time_t to 64 bits as a number of seconds
since January 1, 1970 0:00:00 GMT. This first happens for the year
32768+1900 on a machine with 16-bit ints, unless time_t runs out first.
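
To make the tm_year convention concrete, here is a tiny sketch (an
illustration only): tm_year counts years since 1900, which is exactly the
field the old "19%d" printing bugs tripped over.

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);
    struct tm *utc = gmtime(&now);

    if (utc != NULL) {
        /* Right: tm_year is years since 1900, so add 1900. */
        printf("year:  %d\n", utc->tm_year + 1900);
        /* The classic bug: gluing "19" on the front prints 19110 in 2010. */
        printf("wrong: 19%d\n", utc->tm_year);
    }
    return 0;
}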

There *is* a real 2010 time bug in NewtonOS on January 5, 2010, in
some functions in NewtonScript. This clock uses a number of seconds
since January 1, 1993, and uses a 30-bit integer. As far as I know,
there is no implementation of C built on NewtonScript. There is a
(or maybe several) C++ implementation on NewtonOS; it uses (unsigned
32-bit) dates from 1904-2040 without problems. Does your company
depend on Apple Newtons to conduct its business?

The GPS week number (0-1023) overflowed on August 21, 1999. The
next overflow is due about 19.6 years from then. That's not close
to 2010.

There is a real problem representing the maturity date of a 28-year
mortgage issued in 2010. As far as I know, nobody issues 28-year
mortgages. 30-year mortgages used to be common, and that overflow
has already happened. Banks used to have money, also. Besides,
banks want to represent the day, not the time, as time_t does funny
things with time zones that don't jibe with the way banks do business.

Some economists are projecting the US national debt will overflow
an integer in 2010. That's a 64-bit, 128-bit, 256-bit, or 512-bit
integer, take your pick.


It is possible that something in *one* ANSI C implementation used
an int where a long should have been used. This is not a problem
on systems with 32-bit or larger ints. I'm also having trouble
imagining what overflows. The number of *days* since January 1, 1970
isn't close to overflowing into the sign bit of a 16-bit int.


Gordon,

Thanks for the detailed response; it's basically what I was looking for. I
have posted a couple of links in an earlier reply, and I'd appreciate your
comments on them, since they seem cloned from one another and altogether
vague.

My company was contacted by a large customer that uses C code to control
some, err..., let's just say, sensitive equipment, along with Ada, Pascal,
and Fortran. This code has to be re-certified after each change, so the
cost of suddenly realizing that a potential problem might exist set alarm
bells ringing. I should also add that, although my question concerns
ANSI-C-based compilers in particular, the end platform might be Solaris,
VMS, or code cross-compiled onto 68000-based chips/circuits depending on
the application; very few of them rely on Windows-based technology.

Thanks again,

Ade
 
Ben Pfaff

Gordon Burditt said:
ANSI C specifies very little about the date routines. In particular:

- It does NOT specify that a time_t is 32 bits.
- It does NOT specify that a time_t represents a count of the number
of <insert some time unit here> since <fixed time>. It especially
doesn't specify that a time_t represents a number of seconds.
- It does NOT specify that if you subtract two time_t's, you get anything
meaningful. Even the sign of the result is useless. Try, for example,
subtracting 12311999 (December 31, 1999) from 01012000 (January 1, 2000).
You get -11299999, which doesn't mean much.

It is, however, a common implementation choice for a time_t to
represent a number of seconds since the epoch, and in particular
it is a POSIX requirement.
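
If you do need the difference between two calendar times, difftime() is
the portable way to get it: it returns the difference in seconds as a
double, no matter how time_t is encoded. A minimal sketch (the specific
dates are just for illustration):

#include <stdio.h>
#include <time.h>

int main(void)
{
    struct tm new_year = { .tm_year = 2000 - 1900, .tm_mon = 0,
                           .tm_mday = 1, .tm_isdst = -1 };
    struct tm old_year = { .tm_year = 1999 - 1900, .tm_mon = 11,
                           .tm_mday = 31, .tm_isdst = -1 };

    time_t t1 = mktime(&new_year);   /* 2000-01-01 00:00 local time */
    time_t t0 = mktime(&old_year);   /* 1999-12-31 00:00 local time */

    if (t1 != (time_t)-1 && t0 != (time_t)-1)
        printf("difference: %.0f seconds\n", difftime(t1, t0)); /* 86400 */
    return 0;
}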
 
