On my system (at home) the only reliable method would be asking the
user to provide it. The machine is running on UTC while the official
timezone is MESZ (UTC+2).
But Unix machines, although they "run" on UTC, can be configured with a
local timezone. This is the time you see when you run the 'date' command,
for example. On most machines I've seen, this is set by making
/etc/localtime a symlink to /usr/share/zoneinfo/<whatever>; it can also
be overridden by setting the TZ environment variable.
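For example, here's a quick sketch of the TZ override from Ruby (assuming MRI
on a Unix system with the zoneinfo files installed; MRI notices changes to
ENV['TZ'] and re-reads the zone):

```ruby
# TZ in the environment overrides /etc/localtime for this process.
ENV['TZ'] = 'UTC'
puts Time.now.zone                # "UTC"

ENV['TZ'] = 'America/New_York'    # assumes this zone is installed
puts Time.now.zone                # "EST" or "EDT", depending on the date
```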
In many parts of the world the timezone offset varies throughout the year,
but a modern implementation of localtime() will tell you the name and offset
currently in effect:
#define _DEFAULT_SOURCE   /* glibc: expose the BSD tm_zone/tm_gmtoff fields */
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);
    struct tm *tm = localtime(&now);
    printf("The current timezone name is %s\n", tm->tm_zone);
    printf("The current timezone offset is %ld secs from GMT\n",
           (long)tm->tm_gmtoff);
    return 0;
}
When I run this I get:
The current timezone name is BST
The current timezone offset is 3600 secs from GMT
(I am in the UK). So the information is there - whether there's a Ruby
wrapper for it I don't know.
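Having just checked, it looks like core Ruby's Time class exposes the same two
fields as Time#zone and Time#utc_offset, so the C program above translates
roughly to:

```ruby
t = Time.now
puts "The current timezone name is #{t.zone}"                        # like tm->tm_zone
puts "The current timezone offset is #{t.utc_offset} secs from GMT"  # like tm->tm_gmtoff
```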
If you're on a Windows machine, though, you're in a whole different sorry
state. At work I get e-mails from people using the Outlook calendar saying
"meeting scheduled for 14:00 GMT (London, Lisbon, ...)" when in fact they
mean 2pm local time, which in the summer is 13:00 GMT. The clock is actually
moved forwards and backwards when daylight saving time starts and ends,
instead of the offset being calculated. If you turn your computer on at
1.30am on the day the change takes place, it has no idea what the correct
time is.
Cheers,
Brian.