I have an integer column in MySQL storing DateTime.Ticks values.
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001.
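In code terms, the unit arithmetic looks like this (a Python sketch just to illustrate the numbers, not part of my actual setup):

```python
# Sketch of the .NET tick unit arithmetic: 1 tick = 100 ns.
TICKS_PER_MICROSECOND = 10
TICKS_PER_MILLISECOND = 10 * 1000        # 10,000 ticks per millisecond
TICKS_PER_SECOND = 10 * 1000 * 1000      # 10,000,000 ticks per second

# Dividing ticks by 10,000,000 gives whole seconds since 0001-01-01:
ticks = 634128921500016150
seconds, leftover_ticks = divmod(ticks, TICKS_PER_SECOND)
print(seconds, leftover_ticks)  # 63412892150 whole seconds, 16150 leftover ticks
```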
How can I convert this to a DateTime in a query? I’ve tried many things, but cannot get it to work.
For the ticks 634128921500016150 I hope to get the MySQL datetime ‘2010-06-23 12:06:50’.
I would have believed the following should work, but it gives ‘4009-06-22 12:15:50.001600’. It seems to be off by 1999 years, 1 day and 9 minutes… If the years and days are consistent, I can just fix them manually, but the minutes seem a little odd.
SELECT DATE_ADD('0000-01-01 00:00:00',
INTERVAL 634128921500016150/10000000 SECOND_MICROSECOND);
I’ve tried adding more zeros, but it never matches 😐
I also tried Jon Skeet’s suggestion, but it gives nearly the same result (some fraction of a second different)
Rather than adding using SECOND_MICROSECOND, try just adding via MICROSECOND:

SELECT DATE_ADD('0001-01-01 00:00:00',
       INTERVAL 634128921500016150/10 MICROSECOND);
EDIT: I’ve just worked out why the years are so wrong. MySQL’s minimum date is the year 1000. So I suggest you change it to:

SELECT DATE_ADD('1001-01-01 00:00:00',
       INTERVAL (634128921500016150 - base_ticks)/10 MICROSECOND);

where base_ticks is the value of the ticks from new DateTime(1001, 1, 1).Ticks. Heck, you could rebase wherever you want (e.g. 2000) – that might even work round the 9 minutes issue. It’s possible that it’s making up for leap seconds over the years, or something like that.
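As a cross-check outside MySQL, here's the same arithmetic sketched in Python (Python's datetime uses the same proleptic Gregorian calendar as .NET; base_ticks is computed here rather than hard-coded):

```python
from datetime import datetime, timedelta

TICKS_PER_SECOND = 10_000_000  # 1 tick = 100 ns

# base_ticks: the tick count at 1001-01-01, i.e. new DateTime(1001, 1, 1).Ticks
epoch = datetime(1, 1, 1)
delta = datetime(1001, 1, 1) - epoch
base_ticks = (delta.days * 86400 + delta.seconds) * TICKS_PER_SECOND
print(base_ticks)  # 315569088000000000

# Decode the ticks from the question the same way the query should:
ticks = 634128921500016150
seconds, leftover_ticks = divmod(ticks, TICKS_PER_SECOND)
result = epoch + timedelta(seconds=seconds, microseconds=leftover_ticks // 10)
print(result)  # 2010-06-23 12:15:50.001615
```

Note that decoding the ticks directly gives 12:15:50, matching the time-of-day MySQL produced, so the 9-minute difference may lie in the expected value rather than in the query's arithmetic.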