I am trying to understand something about getTime().

My problem is this: I am setting up a new Date with, let's say, 23/07/2012. When I call getTime() on it, I should get the milliseconds since 01/01/1970. When I divide the value I get from getTime() by (1000*60*60*24), I should get the number of days that have passed between 01/01/1970 and 23/07/2012, but somehow I get a number with a decimal point (15543.875). I don't understand why; I would expect the number of days between 01/01/1970 and 23/07/2012 to be an integer. Clearly I am missing something, so can someone please help me understand where the decimal point in the result comes from?
If you are setting a date like new Date(2012, 06, 23), it is interpreted in the client's local timezone, whereas getTime() returns a UTC-based timestamp. You want Date.UTC:
Date.UTC(2012, 6, 23) / (1000*60*60*24)
// 15544 on any computer

new Date(2012, 06, 23) / (1000*60*60*24)
// 15543.875 on my computer; I happen to be in the same timezone as Israel.
// This result depends on the client's timezone.
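If you already have a local Date and want whole days from it, one option (a sketch, not the only approach) is to shift the timestamp back to UTC using getTimezoneOffset(), which returns the local offset from UTC in minutes:

var d = new Date(2012, 6, 23);
// getTimezoneOffset() is negative east of UTC (e.g. -180 for UTC+3),
// so subtracting it moves the timestamp to midnight UTC of the same calendar date
var utcMs = d.getTime() - d.getTimezoneOffset() * 60 * 1000;
utcMs / (1000 * 60 * 60 * 24)
// 15544 regardless of the client's timezone

This works for local-midnight dates like the one above; for full generality, constructing the timestamp with Date.UTC in the first place is simpler.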