I am using strtotime() to get a timestamp from a date and time string. I will be running strtotime() during the summer (daylight savings) to give me a timestamp of a winter date (non-daylight savings).
In the winter, I will need to convert my timestamp to a readable date using date() -- will it be the same date/time I put into strtotime() during the summer?
On each one of my pages, I set my timezone with date_default_timezone_set() using my city.
So, running this during the summer (daylight savings):
date_default_timezone_set('America/Los_Angeles');
echo strtotime("Dec 1 2014 8:00 am");
This gives me the timestamp 1417449600.
Will running this during the winter (non-daylight savings) return 8:00am as I need it to do?
date_default_timezone_set('America/Los_Angeles');
echo date("g:ia",1417449600);
Yes. If the timezone you set doesn't explicitly say whether it's standard or daylight-saving time, PHP automatically determines the state of DST from the time you give it and the rules for when that timezone switches into and out of DST.
Yes. A UNIX timestamp such as 1417449600 represents a completely, globally, universally unique point in time, independent of fussy timezone notation. There's only one "December 1st 2014 8 am in Los Angeles", which is the same point in time as "December 1st 2014 17:00 CET" and a number of other local notations across the world. The UNIX timestamp 1417449600 expresses that point in time, regardless of whatever your wall clock says exactly.
When you format this unique point in time back into a human-readable form using date(), PHP works out what the wall clock read in the configured timezone at that moment, including whether DST was in effect then. The result won't change based on what the time or DST settings are now.
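A minimal round-trip sketch (assuming the default timezone is set as in the question): even if the strtotime() half runs in summer and the date() half runs in winter, the output is still 8:00am.
date_default_timezone_set('America/Los_Angeles');
$ts = strtotime("Dec 1 2014 8:00 am"); // run during summer: 1417449600
// ...months later, during winter...
date_default_timezone_set('America/Los_Angeles');
echo date("M j Y g:ia", $ts);          // Dec 1 2014 8:00am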
Related
I just want to check if time() returns a UTC/GMT timestamp or do I need to use date_default_timezone_set()?
time() returns a UNIX timestamp, which is timezone independent. Since a UNIX timestamp denotes the number of seconds elapsed since January 1 1970 UTC, you could say it's UTC, but it really has no timezone.
To be really clear, a UNIX timestamp is the same value all over the world at any given time. At the time of writing it's 1296096875 in Tokyo, London and New York. To convert this into a "human readable" time, you need to specify which timezone you want to display it in. 1296096875 in Tokyo is 2011-01-27 11:54:35, in London it's 2011-01-27 02:54:35 and in New York it's 2011-01-26 21:54:35.
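A quick sketch of the same thing in PHP (the '@' prefix tells DateTime that the value is a UNIX timestamp, which is always UTC-based):
$d = new DateTime('@1296096875');
echo $d->setTimezone(new DateTimeZone('Asia/Tokyo'))->format('Y-m-d H:i:s');       // 2011-01-27 11:54:35
echo $d->setTimezone(new DateTimeZone('Europe/London'))->format('Y-m-d H:i:s');    // 2011-01-27 02:54:35
echo $d->setTimezone(new DateTimeZone('America/New_York'))->format('Y-m-d H:i:s'); // 2011-01-26 21:54:35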
In effect you're usually dealing with (a mix of) these concepts when handling times:
absolute points in time, which I like to refer to as points in human history
local time, which I like to refer to as wall clock time
complete timestamps in any format which express an absolute point in human history
incomplete local wall clock time
Visualise time like this:
-------+-------------------+-------+--------+----------------+------>
| | | | |
Dinosaurs died Jesus born Y2K Mars colonised ???
(not to scale)
An absolute point on this line can be expressed as:
1296096875
Jan. 27 2011 02:54:35 Europe/London
Both formats express the same absolute point in time in different notations. The former is a simple counter which started roughly here:
start of UNIX epoch
|
-------+-------------------+------++--------+----------------+------>
| | | | |
Dinosaurs died Jesus born Y2K Mars colonised ???
The latter is a much more complicated but equally valid and expressive counter which started roughly here:
start of Gregorian calendar
|
-------+-------------------+-------+--------+----------------+------>
| | | | |
Dinosaurs died Jesus born Y2K Mars colonised ???
UNIX timestamps are simple. They're a counter which started at one specific point in time and which keeps increasing by 1 every second (for the official definition of what a second is). Imagine someone in London started a stopwatch at midnight Jan 1st 1970, which is still running. That's more or less what a UNIX timestamp is. Everybody uses the same value of that one stopwatch.
Human readable wall clock time is more complicated, and it's even more complicated by the fact that it's abbreviated and parts of it omitted in daily use. 02:54:35 means almost nothing on the timeline pictured above. Jan. 27 2011 02:54:35 is already a lot more specific, but could still mean a variety of different points on this line. "When the clock struck 02:54:35 on Jan. 27 2011 in London, Europe" is now finally an unambiguous absolute point on this line, because there's only one point in time at which this was true.
So, timezones are a "modifier" of "wall clock times" which are necessary to express a unique, absolute point in time using a calendar and hour/minute/second notation. Without a timezone a timestamp in such a format is ambiguous, because the clock struck 02:54:35 on Jan. 27 2011 in every country around the globe at different times.
A UNIX timestamp inherently does not have this problem.
To convert from a UNIX timestamp to a human readable wall clock time, you need to specify which timezone you'd like the time displayed in. To convert from wall clock time to a UNIX timestamp, you need to know which timezone that wall clock time is supposed to be in. You either have to include the timezone every single time with each such conversion, or you set the default timezone to be used with date_default_timezone_set.
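A sketch of both directions, using an explicit DateTimeZone instead of relying on the default timezone:
// wall clock time -> UNIX timestamp (the timezone makes the input unambiguous)
$wall = new DateTime('2011-01-27 02:54:35', new DateTimeZone('Europe/London'));
echo $wall->getTimestamp();              // 1296096875
// UNIX timestamp -> wall clock time (the timezone picks the notation)
echo date('Y-m-d H:i:s', 1296096875);    // formatted in the currently set default timezone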
Since PHP 5.1.0 (when the date/time functions were rewritten), every call to a date/time function will generate an E_NOTICE if the timezone isn't valid, and/or an E_WARNING message if using the system settings or the TZ environment variable.
So, in order to get a UTC timestamp, you should check what the current timezone is and work off of that, or just use:
$utc_str = gmdate("M d Y H:i:s", time());
$utc = strtotime($utc_str);
http://us3.php.net/time
"Returns the current time measured in the number of seconds since the Unix Epoch (January 1 1970 00:00:00 GMT)."
So I believe the answer to your question is yes.
From the documentation
Returns the current time measured in the number of seconds since the Unix Epoch (January 1 1970 00:00:00 GMT).
One of the comments claimed: "both time() and strtotime(gmdate("M d Y H:i:s", time())) return the same result"
Since I wasn't sure about that, I ran a test:
$now = strtotime(gmdate("Y-m-d H:i:s", time()));
$now2 = time();
echo ' now='.$now.' now2='.$now2.' diff='.($now - $now2);
Output was:
now=1536824036 now2=1536806036 diff=18000
Diff is 18000 seconds = 5 hours = the timezone offset for the server running the test.
Today at work we had an argument. You register a user and write the creation date of the account to the database. php.ini is set to UTC, so the date is written in UTC. The problem is what happens when you convert that time for each user (they can set their timezone to Europe/London or other European zones). So, the argument was: do you get a different time depending on daylight saving or not, and is that a problem for the database?
As long as you are storing the value in a timestamp column, timezones will not affect your time functions. This is because a timestamp stores an absolute time (epoch time), i.e. the number of seconds elapsed since Jan 1 1970 UTC. All you have to do is convert to/from epoch time and localize the display to each user's timezone. Of course, by storing epoch time, your times in the database will always be relative to UTC.
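A sketch of that conversion at display time, assuming the epoch value comes back from the database in $row['created_at'] and the user's preferred zone sits in $userTz (both names are placeholders):
$created = new DateTime('@' . $row['created_at']);    // '@' = UTC-based epoch value
$created->setTimezone(new DateTimeZone($userTz));     // e.g. 'Europe/London'
echo $created->format('Y-m-d H:i:s');                 // local wall clock time, DST handled by the tz rules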
I'm having a problem with PHP's gmstrftime() function.
Please look:
<?php
$ts[] = 1348573985; // '2012-09-25 13:53:05' (date returned from mysql's from_unixtime() function)
$ts[] = 1233958620; // '2009-02-06 23:17:00' (date returned from mysql's from_unixtime() function)
foreach ($ts as $t) {
    echo $t." => ".gmstrftime("%d %B %Y - %H:%M", $t)."\n";
}
?>
Output will be:
1348573985 => 25 September 2012 - 11:53
1233958620 => 06 February 2009 - 22:17
As you can see, the first timestamp is 2 hours off (from mysql's output), which is normal because of timezone settings. But the second one is only 1 hour off, even though I did not change the timezone between the two gmstrftime() calls. Why?
Is this a bug in PHP's gmstrftime() function, or anything else?
From the manual for gmstrftime:
Behaves the same as strftime() except that the time returned is Greenwich Mean Time (GMT).
Greenwich Mean Time is the same all year around. This is different from the local time in the UK, which is set as GMT in winter, but "British Summer Time" (GMT+1, i.e. one hour ahead of GMT) in the summer. The same happens in Western Europe, which is GMT+1 in winter, but GMT+2 in summer.
Your MySQL database is presumably configured for local European time, so when converting a Unix timestamp that occurs during the summer, it adds an extra hour to line up with the Summer Time adjustment.
In my opinion, the best policy is to set all your systems to use 'UTC' (basically the same as GMT) and then convert to a local timezone "at the last minute". You could standardise on some other timezone, but UTC acts as a good baseline for debugging.
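For example, a sketch of that policy applied to the first timestamp above (Europe/Paris is just a stand-in for whatever local zone you display in):
date_default_timezone_set('UTC');                      // store and compute in UTC
$ts = 1348573985;
$local = new DateTime('@' . $ts);
$local->setTimezone(new DateTimeZone('Europe/Paris')); // convert only at display time
echo $local->format('d F Y - H:i');                    // 25 September 2012 - 13:53 (CEST, UTC+2)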
In a table, all the records are stored in GMT time. But through my application I want to display only those records which fall into the UTC timezone, i.e., in a web page I want to display only records that come under the UTC time zone.
How do I convert from GMT to UTC, or query the database to get all the records in the UTC timezone?
I would really appreciate an early reply.
I am using an Oracle database and the application is in PHP.
From Greenwich Mean Time on Wikipedia:
It [GMT] is arguably the same as Coordinated Universal Time (UTC).
Regarding converting, you can add/remove intervals, but as far as I'm aware, Oracle also supports timestamps with/without time zones, for example:
select current_timestamp at time zone 'UTC' as utc,
       current_timestamp at time zone 'EST' as est,
       current_timestamp at time zone 'Europe/London' as london
  from dual;
The last example, if it works, would allow you to not worry about daylight savings and so forth.
I'll assume that your times are stored as DATE values, that all the values are stored as UTC times, and that the timezone you're interested in is constant. To convert from UTC to a given timezone you add the timezone's offset. In this case, since the timezone of interest has a negative offset you need to add in the same negative number. Thus, the following might be useful:
SELECT DATE_FIELD + INTERVAL '-5' HOUR
FROM SOME_TABLE
WHERE <whatever>
FWIW, there are some places where the conversion to local time uses a non-whole-hour offset - for example, Adelaide, Australia uses a +9.5 hour offset from GMT, and Kathmandu, Nepal uses +5.75 hours.
Share and enjoy.
EDIT: Given the data as you've described it, your best bet is probably to simply add in the session time zone, as follows:
SELECT your_gmt_timestamp_field AT TIME ZONE SESSIONTIMEZONE
FROM your_table
Give this a try and see if it helps.
I need to format a UNIX timestamp in GMT format to a local date/time. I'm using gmstrftime to do this and I can get the correct result if I use an offset. I just so happen to know what my offset is for the pacific timezone but I don't want to have to get the correct time like this. I've used date_default_timezone_set so gmstrftime is reporting the correct timezone but the time is off by like a day.
I don't get it. If gmstrftime knows what timezone I'm in, why is the time off?
If you have the correct timezone set (such as with date_default_timezone_set) then you only need to use date() for the formatting, no extra coding. UNIX timestamps are in GMT by definition of a UNIX timestamp -- number of seconds since January 1, 1970 00:00:00 GMT.
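For example, a sketch reusing the first timestamp from the earlier question, with a Pacific default timezone:
date_default_timezone_set('America/Los_Angeles');
$ts = 1348573985;
echo gmdate('Y-m-d H:i', $ts); // 2012-09-25 11:53 (always UTC)
echo date('Y-m-d H:i', $ts);   // 2012-09-25 04:53 (PDT, UTC-7, since it's September)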
gmstrftime will always return the time as UTC; it sounds like you want strftime instead.