I'm converting a DateTime to an OADate. I expected to get the exact same DateTime back when converting the OADate again, but the result only has millisecond resolution and is therefore different.
var a = DateTime.UtcNow;
double oadate = a.ToOADate();
var b = DateTime.FromOADate(oadate);
int compare = DateTime.Compare(a, b);
//Compare is not 0; the date times are not the same
Ticks from a: 634202170964319073
Ticks from b: 634202170964310000
The OADate double: 40437.290467951389
What is the reason for this? The resolution of DateTime is clearly good enough.
I think this is an excellent question. (I just discovered it.)
Unless you're operating with dates quite close to the year 1900, a DateTime will have a higher precision than an OA date. But for some obscure reason, the authors of the DateTime struct just love to truncate to whole milliseconds when they convert between DateTime and something else. Needless to say, doing this throws away a lot of precision without good reason.
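You can see the truncation directly. A small sketch (not from the BCL; variable names are mine) showing that a BCL round-trip always lands on a whole-millisecond tick boundary:

var now = DateTime.UtcNow;
var roundTripped = DateTime.FromOADate(now.ToOADate());
Console.WriteLine(now.Ticks % TimeSpan.TicksPerMillisecond);          // usually non-zero
Console.WriteLine(roundTripped.Ticks % TimeSpan.TicksPerMillisecond); // always 0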
Here's a work-around:
// Note: for the extension method below to compile, these members must live in a static class.
static readonly DateTime oaEpoch = new DateTime(1899, 12, 30);

public static DateTime FromOADatePrecise(double d)
{
    if (!(d >= 0))
        throw new ArgumentOutOfRangeException(); // NaN or negative d not supported

    return oaEpoch + TimeSpan.FromTicks(Convert.ToInt64(d * TimeSpan.TicksPerDay));
}

public static double ToOADatePrecise(this DateTime dt)
{
    if (dt < oaEpoch)
        throw new ArgumentOutOfRangeException();

    return Convert.ToDouble((dt - oaEpoch).Ticks) / TimeSpan.TicksPerDay;
}
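With these in place, the question's example round-trips much more faithfully. A sketch (assuming it runs in the same static class that holds the methods):

var a = DateTime.UtcNow;
double oadate = a.ToOADatePrecise();
var b = FromOADatePrecise(oadate);
Console.WriteLine(a.Ticks - b.Ticks); // now only a few ticks (tenths of a µs), not up to a whole millisecond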
Now, let's consider (from your question) the DateTime given by:

var ourDT = new DateTime(634202170964319073);
// .ToString("O") gives 2010-09-16T06:58:16.4319073
The precision of any DateTime is 0.1 µs. Near the date and time we're considering, the precision of an OA date is Math.Pow(2.0, -37.0) days, or circa 0.6286 µs.
We conclude that in this region a DateTime is more precise than an OA date by (just over) a factor of six.
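Here's a quick back-of-the-envelope check of those numbers (a sketch; the variable names are mine):

// 40437 lies between 2^15 = 32768 and 2^16 = 65536, so its ulp as a double is 2^-37 days.
double ulpDays = Math.Pow(2.0, -37.0);
double ulpMicroseconds = ulpDays * 24 * 60 * 60 * 1e6;  // ≈ 0.6286 µs
double tickMicroseconds = 0.1;                          // one DateTime tick
Console.WriteLine(ulpMicroseconds / tickMicroseconds);  // ≈ 6.29, i.e. just over a factor of six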
Let's convert ourDT to a double using my extension method above:

double ourOADate = ourDT.ToOADatePrecise();
// .ToString("G") gives 40437.2904679619
// .ToString("R") gives 40437.290467961888
Now, if you convert ourOADate back to a DateTime using the static FromOADatePrecise method above, you get 2010-09-16T06:58:16.4319072 (written with the "O" format). Comparing with the original, we see that the loss of precision is in this case 0.1 µs. We expect the loss of precision to be within ±0.4 µs, since this interval has length 0.8 µs, which is comparable to the 0.6286 µs mentioned earlier.
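In code, the loss for this particular value is a single tick (again assuming the methods above):

var roundTripped = FromOADatePrecise(ourOADate);     // 2010-09-16T06:58:16.4319072
Console.WriteLine(ourDT.Ticks - roundTripped.Ticks); // 1, i.e. 0.1 µs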
If we go the other way, starting with a double representing an OA date not too close to the year 1900, and first use FromOADatePrecise and then ToOADatePrecise, we get back to a double, and because the precision of the intermediate DateTime is superior to that of an OA date, we expect a perfect round-trip in this case. If, on the other hand, you use the BCL methods FromOADate and ToOADate in the same order, it is extremely improbable to get a good round-trip (unless the double we started with has a very special form).
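A sketch of that double-first round-trip, again assuming the methods above:

double start = 40437.290467961888;                           // an OA date well away from 1900
double precise = FromOADatePrecise(start).ToOADatePrecise();
double bcl = DateTime.FromOADate(start).ToOADate();
Console.WriteLine(precise == start);                         // True: the intermediate DateTime preserves the value
Console.WriteLine(bcl == start);                             // False here: rounding to whole milliseconds changed it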
The static method called by ToOADate clearly divides the ticks by 10,000 and then stores the result in a long, thus removing any sub-millisecond info. Does anyone know where to find the specs of the OADate format?
private static double TicksToOADate(long value)
{
    if (value == 0L)
    {
        return 0.0; // a zeroed DateTime maps to a zeroed OA date
    }
    if (value < 0xc92a69c000L) // less than one day's worth of ticks: treat it as a time of day on the OA epoch
    {
        value += 0x85103c0cb83c000L;
    }
    if (value < 0x6efdddaec64000L) // OA dates cannot represent dates before 0100-01-01
    {
        throw new OverflowException(Environment.GetResourceString("Arg_OleAutDateInvalid"));
    }
    // Integer division by 10,000 (ticks per millisecond) is where the sub-millisecond ticks are truncated.
    long num = (value - 0x85103c0cb83c000L) / 0x2710L;
    if (num < 0L)
    {
        // For dates before the OA epoch, the fractional (time) part must stay positive.
        long num2 = num % 0x5265c00L;
        if (num2 != 0L)
        {
            num -= (0x5265c00L + num2) * 2L;
        }
    }
    return (((double)num) / 86400000.0); // whole milliseconds to fractional days
}
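For reference, the magic numbers decode to familiar DateTime constants; here's a quick sanity check (my decoding, not part of the decompiled source):

Console.WriteLine(0x2710L == TimeSpan.TicksPerMillisecond);                // True: 10,000 ticks per millisecond
Console.WriteLine(0x5265c00L == 86400000L);                                // True: milliseconds per day
Console.WriteLine(0xc92a69c000L == TimeSpan.TicksPerDay);                  // True: 864,000,000,000
Console.WriteLine(0x85103c0cb83c000L == new DateTime(1899, 12, 30).Ticks); // True: the OA date epoch
Console.WriteLine(0x6efdddaec64000L == new DateTime(100, 1, 1).Ticks);     // True: the earliest representable OA date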
Probably has something to do with the precision of the double, not the DateTime.