I know that when I use DateTime.Now.Ticks in C# it returns a long value, but I need to store it in an int variable, and I am confused as to whether I can maintain that precision. Right now I just have a cast:
int timeStampValue = (int)DateTime.Now.Ticks;
This is a project constraint, so I understand a lot of precision is lost. I just couldn't think of another way to do a timestamp stored in an int that I could then compare to other timestamps.
Any suggestions or advice on how to maintain the precision, if possible, would be much appreciated.
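For context on why the plain cast is lossy: `(int)` keeps only the low 32 bits of the 64-bit tick count, and those bits wrap around every 2^32 ticks, which is about 429 seconds. A minimal sketch (the fixed date is illustrative):

```csharp
// Sketch showing why the plain (int) cast is lossy: it keeps only the low
// 32 bits of the 64-bit tick count, which wrap every 2^32 ticks (~429 s).
using System;

class Program
{
    static void Main()
    {
        long ticks = new DateTime(2012, 6, 1).Ticks;

        int now   = (int)ticks;                  // low 32 bits only
        int later = (int)(ticks + (1L << 32));   // exactly one wrap later

        Console.WriteLine(now == later);         // True: the cast can't tell them apart
        Console.WriteLine(TimeSpan.FromTicks(1L << 32).TotalSeconds); // seconds per wrap
    }
}
```

So two timestamps taken more than ~7 minutes apart can collide or compare in the wrong order.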
Everyone's answers were illustrative. I actually ended up setting up a process involving counters: when an item is used, its counter is set to 0 and all other counters are incremented by 1. Whichever item has the highest counter is then the next item to use.
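The counter scheme described above can be sketched roughly like this (class and method names are illustrative, not from the original post):

```csharp
// Counter-based "least recently used" tracking: using an item resets its
// counter to 0 and ages every other item by one step; the item with the
// highest counter was used longest ago, so it is the next one to use.
using System;
using System.Collections.Generic;
using System.Linq;

class LruCounters
{
    private readonly Dictionary<string, int> counters = new Dictionary<string, int>();

    public void Add(string item) => counters[item] = 0;

    public void Use(string item)
    {
        foreach (var key in counters.Keys.ToList())
            counters[key]++;     // age every item by one step
        counters[item] = 0;      // the item just used becomes the "youngest"
    }

    // Highest counter = least recently used item.
    public string NextToUse() =>
        counters.OrderByDescending(kv => kv.Value).First().Key;
}

class Program
{
    static void Main()
    {
        var lru = new LruCounters();
        lru.Add("a"); lru.Add("b"); lru.Add("c");
        lru.Use("a"); lru.Use("b"); lru.Use("c");
        Console.WriteLine(lru.NextToUse()); // "a" was used longest ago
    }
}
```

This sidesteps the timestamp-precision problem entirely, at the cost of touching every counter on each use.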
Do you need all the most-significant bits? (e.g. which year)
Do you need all the least-significant bits? (e.g. sub-microsecond precision; one tick is 100 ns)
How long an interval do you need to measure over?
If you only need millisecond precision, why not drop the least-significant bits?
int timeStamp = (int)(DateTime.Now.Ticks >> 10); // drop the lowest 10 bits (2^10 ticks ≈ 0.1 ms)
The OP wants to store the times of recently used items. If these are user selections for a single user, you probably don't need anything finer than a second, and since there are 10^7 ticks per second, there are log(10^7)/log(2) ≈ 23 excess bits in the long value.
So how much space do you need? Your values ought to specify year, month, day, hour, minute and second. There are about 32 million seconds in a year, which takes about 25 bits; add another 3 or 4 bits if you want to cover the last 10 years' worth. That still fits easily into an Int32. I'd suggest
int timeStamp = (int)(DateTime.Now.Ticks >> 23); // retain bits 23 to 54 (~0.84 s resolution)
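To illustrate that ordering survives the shift, here is a small sketch using fixed dates instead of DateTime.Now (the helper name and the dates are illustrative):

```csharp
// Sketch: shifting away the low 23 bits of Ticks gives roughly one-second
// resolution (2^23 ticks = 0.8388608 s), and two such timestamps still
// compare correctly as ints as long as no wraparound falls between them.
using System;

class Program
{
    static int ToTimestamp(DateTime t) => (int)(t.Ticks >> 23);

    static void Main()
    {
        var earlier = new DateTime(2012, 1, 1, 12, 0, 0);
        var later   = earlier.AddMinutes(5);

        int a = ToTimestamp(earlier);
        int b = ToTimestamp(later);

        Console.WriteLine(a < b);                 // True: ordering survives
        Console.WriteLine((b - a) * 0.8388608);   // elapsed seconds, roughly 300
    }
}
```

Note the retained bits still wrap eventually, so this only works for comparing timestamps within the same era of a few decades.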