I have this code in C#:
public static decimal StringDateToDecimal(this string stringDate)
{
    var cultureInfo = new CultureInfo("ru-Ru");
    var date = DateTime.Parse(stringDate, cultureInfo);
    var number = date.ToBinary();
    var unix = new DateTime(1970, 1, 1);
    var diff = (date - unix).Ticks;
    return (decimal)diff;
}
For 1 March 2018 it returns 15198624000000000.
But this code in JS:
var from = $input.split(".");
var d = new Date(from[2], from[1] - 1, from[0]);
var dStart = new Date(1970, 1, 1);
var seconds = d.getTime();
var secondsStart = dStart.getTime();
var dateDifference = seconds - secondsStart;
For "01.03.2018" it returns 1517184000000
or
var ticks2 = ((d.getTime() * 10000) + 621355968000000000);
var ticks1 = ((dStart.getTime() * 10000) + 621355968000000000);
var difference = ticks2 - ticks1;
What am I doing wrong in the JS code? I want to get the same value in JS.
In JavaScript, to get the current timestamp you can use Date.now(), which returns the number of milliseconds elapsed since January 1, 1970 00:00:00 UTC.
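For example (a minimal sketch; the printed value depends on when you run it):

// Both expressions return milliseconds elapsed since 1970-01-01T00:00:00 UTC.
var nowMs = Date.now();
var sameMs = new Date().getTime();
console.log(nowMs);           // e.g. 1519862400000 (depends on the current time)
console.log(sameMs - nowMs);  // 0, or a few milliseconds, since the calls run back to back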
If the DateTime object has its Kind property set to Unspecified, its ticks represent the time elapsed since 12:00:00 midnight, January 1, 0001 in an unknown time zone. In general, the ticks represent the time according to the time zone specified by the Kind property.
JavaScript: Date.getTime() returns a number representing the number of milliseconds since midnight, January 1, 1970.
C#: a single tick represents one hundred nanoseconds, or one ten-millionth of a second. There are 10,000 ticks in a millisecond, or 10 million ticks in a second.
So JavaScript milliseconds = C# ticks / 10,000, and your code looks OK. You just need to account for that difference in whichever code (C# or JavaScript) you choose.
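For example, a minimal sketch of that conversion (the helper names msToTicks and ticksToMs are just illustrative):

// 1 millisecond = 10,000 C# ticks
function msToTicks(ms) { return ms * 10000; }
function ticksToMs(ticks) { return ticks / 10000; }

console.log(msToTicks(1519862400000));     // 15198624000000000 (the C# value from the question)
console.log(ticksToMs(15198624000000000)); // 1519862400000 (1 March 2018 UTC, in milliseconds)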
If you take an unambiguous day-month date in JavaScript (e.g. 23 December 2018):
var $input = "23.12.2018";
var from = $input.split(".");
var dNew = new Date(from[2], from[1]-1, from[0]);
//Note dStart uses 0 for the month because JavaScript months are zero-based (0 = January, 11 = December)
var dStart = new Date(1970, 0, 1);
var seconds = dNew.getTime();
var secondsStart = dStart.getTime();
var dateDifference = seconds - secondsStart;
// multiply by 10000 to reconcile with C# ticks
console.log("===> " + dateDifference * 10000);
Both versions come to 15455232000000000 if you call StringDateToDecimal("23.12.2018") in your C# code.
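If it helps, the same steps can be wrapped in a small helper (a sketch; the name stringDateToTicks is just illustrative, the input is assumed to be "dd.mm.yyyy", and the local time zone is assumed to have the same UTC offset on both dates, otherwise a DST shift can skew the result by an hour):

function stringDateToTicks(stringDate) {
    var from = stringDate.split(".");                // "dd.mm.yyyy"
    var d = new Date(from[2], from[1] - 1, from[0]); // months are zero-based
    var epoch = new Date(1970, 0, 1);                // 1 January 1970
    return (d.getTime() - epoch.getTime()) * 10000;  // milliseconds -> C# ticks
}

console.log(stringDateToTicks("23.12.2018")); // 15455232000000000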
I hope I'm getting the gist of what you are asking.
The JavaScript Date type's origin is the Unix epoch, starting at midnight on 1 January 1970.
The .NET DateTime type's origin is midnight on 1 January 0001.
And for ticks, from the DateTime.Ticks docs:
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond, or 10 million ticks in a second.
So with all of that in mind, your code can be reduced to this:
var d = new Date(2018, 2, 1); // your date: 1 March 2018 (months are zero-based)
var dStart = new Date(1970, 0, 1); // 1 January 1970, the Unix epoch
var dateDifference = ((d.getTime() - dStart.getTime()) * 10000);
console.log(dateDifference); // 15198624000000000, the same value as the C# code (assuming both dates share the same UTC offset)
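If you want to take the local time zone out of the picture entirely, one option (a sketch, not part of the original code) is Date.UTC, which already gives milliseconds measured from the Unix epoch in UTC:

// Date.UTC returns milliseconds since 1970-01-01T00:00:00 UTC,
// so no separate epoch date is needed.
var utcMs = Date.UTC(2018, 2, 1); // 1 March 2018; months are zero-based
console.log(utcMs * 10000);       // 15198624000000000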