In Actionscript, the Unix timestamp in milliseconds is obtainable like this:
public static function getTimeStamp():uint {
    var now:Date = new Date();
    return now.getTime();
}
The doc clearly states the following:
getTime():Number Returns the number of milliseconds since midnight January 1, 1970, universal time, for a Date object.
When I trace it, it returns the following:
824655597
So, 824655597 / 1000 / 60 / 60 / 24 / 365 = 0.02 years. This is obviously not correct, as it should be around 39 years.
Question #1: What's wrong here?
Now, onto the PHP part: I'm trying to get the timestamp in milliseconds there as well. The microtime()
function returns either a string ("0.29207800 1246365903") or a float (1246365134.01), depending on the argument you pass it. I thought timestamps would be easy enough to handle myself, but between this float and my ActionScript problems I really have no clue anymore.
Question #2: how do I get a Unix timestamp in milliseconds from PHP?
Timestamps should be so easy; I'm probably missing something obvious. Sorry about that, and thanks in advance.
EDIT1: Answered the first question by myself. See below.
EDIT2: Answered second question by myself as well. See below. Can't accept answer within 48 hours.
getTime() does return the number of milliseconds since the Unix epoch, as documented. The classic Unix timestamp is an integer counting seconds elapsed since January 1, 1970, but ECMAScript-style dates (JavaScript, and likewise ActionScript) express it in milliseconds.
The bug was my return type: I declared the function as returning uint, a 32-bit unsigned integer, which the millisecond value overflows. It should be Number.
public static function getTimeStamp():Number {
    var now:Date = new Date();
    return now.getTime();
}
Think I got the function for getting milliseconds in PHP5 now.
function msTimeStamp() { return round(microtime(true) * 1000); }