I am not sure if this question belongs on StackOverflow but here it is.
I need to generate a timestamp using C# for some data that is to be transferred from one party to another, and I need to know the worst-case precision of the system clock across operating systems (Windows, Linux, and Unix). In other words, I need to figure out a precision that every operating system can represent, so that all of them are able to validate this timestamp.
As an example, the clock's resolution on the Windows Vista operating system is approximately 10-15 milliseconds.
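For reference, here is a rough sketch of how one could probe the effective clock step on a given machine (just an illustration; the loop count is arbitrary):

    using System;

    class ClockResolutionProbe
    {
        static void Main()
        {
            // Sample DateTime.UtcNow in a tight loop and record how far apart
            // successive distinct readings are; that gap is the effective
            // resolution of the system clock on this machine.
            long previous = DateTime.UtcNow.Ticks;
            long smallestStep = long.MaxValue;

            for (int i = 0; i < 1_000_000; i++)
            {
                long current = DateTime.UtcNow.Ticks;
                if (current != previous)
                {
                    smallestStep = Math.Min(smallestStep, current - previous);
                    previous = current;
                }
            }

            // Ticks are 100 ns, so divide by 10,000 to express the step in milliseconds.
            Console.WriteLine($"Observed clock step: {smallestStep / 10_000.0} ms");
        }
    }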
Interesting. The major operating systems have, at worst, centisecond resolution (0.01 seconds), although timestamps are often stored with more precision than that.
Linux offers up to microsecond resolution in its timestamps (see man utime), depending on the computer's clock hardware. Windows NT/Win2K/XP/etc. offer millisecond precision in file timestamps (on NTFS only), though the OS records all system timestamps in 0.000 000 1 second units (ten million per second).

If accurate and precise time resolution is needed between systems, GPS receivers easily achieve 100 nanosecond precision as a side effect of how they work, and many inexpensive models do as well as 10 ns. Special GPS models make the derived time available for external use.
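To tie that back to C#: DateTime stores time in those same 100-nanosecond ticks, so one option is to truncate the timestamp to a resolution every platform can honour before sending it. A minimal sketch (the TruncateTo helper is my own, not a framework method):

    using System;
    using System.Globalization;

    class TimestampDemo
    {
        static void Main()
        {
            // DateTime stores time as ticks of 100 ns (10,000,000 per second),
            // regardless of how precise the underlying system clock actually is.
            DateTime now = DateTime.UtcNow;
            Console.WriteLine($"Ticks since 0001-01-01: {now.Ticks}");

            // Truncate to a coarser resolution (here 1 ms) so every OS involved
            // can represent and re-check the same value.
            DateTime truncated = TruncateTo(now, TimeSpan.FromMilliseconds(1));

            // The ISO 8601 round-trip format keeps the value unambiguous in transit.
            Console.WriteLine(truncated.ToString("o", CultureInfo.InvariantCulture));
        }

        // Drops any ticks finer than the requested resolution from the timestamp.
        static DateTime TruncateTo(DateTime value, TimeSpan resolution)
        {
            long ticks = value.Ticks - (value.Ticks % resolution.Ticks);
            return new DateTime(ticks, value.Kind);
        }
    }

Whether you truncate to milliseconds, centiseconds, or whole seconds is up to you; the point is that the sender and all validating parties agree on one resolution that is no finer than the coarsest clock involved.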