Update: in the original version of this post I jumped to the incorrect conclusion that the timer resolution depends on the operating system. However, thanks to a comment from James Manning who corrected me, I realized that this is likely not related to the OS version (because we're seeing high-resolution DateTime on both Win7 and Windows Server 2008 R2).
I've just discovered an interesting thing: apparently one of my machines has a timer resolution of 1 millisecond or better (as opposed to the typical 15 ms that I'm used to seeing elsewhere).
I ran this test on one machine:
var dt1 = DateTime.UtcNow;
var dt2 = DateTime.UtcNow;
Console.WriteLine(dt2 - dt1);
It outputs 00:00:00.0156006 (common knowledge – 15-16 milliseconds)
However on my other machine, it outputs: 00:00:00.0110011
Also, I ran this test:
var sw = Stopwatch.StartNew();
var start = DateTime.UtcNow;
int changes = 0;
int idle = 0;
while (sw.ElapsedMilliseconds < 50)
{
    var now = DateTime.UtcNow;
    if (now != start) { changes++; start = now; } // the reported time moved forward
    else idle++;                                  // the reported time was unchanged
}
var elapsed = sw.Elapsed.ToString();
Console.WriteLine("{0} changes, {1} idle iterations, elapsed {2}", changes, idle, elapsed);
and it printed:
Which means it only updated 3 times within a 50 millisecond interval. On my new machine, this program printed:
Which means that it updated at least every millisecond.
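The same measurement can be sketched in a portable form. Here is a rough Python equivalent (Python is my choice for illustration only; the post's snippets are C#, and the exact counts you see depend on your OS clock granularity): sample the clock in a tight loop for 50 ms and count how many distinct values it reports.

```python
import time

# Sample the wall clock in a tight loop for 50 ms and count how many
# times the reported value changes vs. how many reads see a stale value.
start = time.time()
last = start
changes = 0
idle = 0
while (time.time() - start) < 0.050:
    now = time.time()
    if now != last:
        changes += 1   # the clock ticked forward since the last read
        last = now
    else:
        idle += 1      # same value as before: read happened within one tick

print(changes, "changes,", idle, "idle reads in 50 ms")
```

On a system with a coarse 15.6 ms timer you would expect only a handful of changes and a large idle count; on a fine-grained clock the changes dominate.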
I don't believe this is a server OS thing.
FWIW, DateTime.UtcNow just uses the Win32 call GetSystemTimeAsFileTime (see msdn.microsoft.com/.../ms724397.aspx), so the managed code isn't doing anything special here.
As such, the *precision* is actually 100ns (like the Win32 call it makes), but the accuracy is based on the timer interval (15.6ms or 1ms). The precision of 100ns is documented via the Ticks property: "The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001"
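The Ticks definition quoted above is easy to reproduce in any language. A small Python sketch (again, Python only for illustration; the property itself is .NET's) that counts 100-nanosecond intervals from midnight, January 1, year 1:

```python
from datetime import datetime

def to_ticks(dt):
    """Number of 100 ns intervals between midnight, 1 Jan 0001 and dt
    (mirrors the documented definition of DateTime.Ticks)."""
    delta = dt - datetime(1, 1, 1)
    # days/seconds/microseconds are combined with integer arithmetic
    # to avoid float rounding; one microsecond is 10 ticks of 100 ns.
    return (delta.days * 86_400 + delta.seconds) * 10**7 + delta.microseconds * 10

# One full day is 86,400 s * 10^7 ticks/s = 864,000,000,000 ticks.
print(to_ticks(datetime(1, 1, 2)))  # → 864000000000
```

This makes the precision/accuracy distinction concrete: the representation can express 100 ns steps, but the value only advances when the underlying timer fires.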
The timer resolution could always be set lower (even in XP), but not many things used it.
You can check your current timer resolution (and min/max) using the ClockRes utility from sysinternals @ technet.microsoft.com/.../bb897568
On this 32-bit Win7 SP1 machine I'm typing this on, for instance, the current resolution is 1 ms (although this may not be the default at boot; I haven't tested that):
C:\Users\James\Downloads\ClockRes » .\Clockres.exe
ClockRes v2.0 - View the system clock resolution
Copyright (C) 2009 Mark Russinovich
SysInternals - www.sysinternals.com
Maximum timer interval: 15.600 ms
Minimum timer interval: 0.500 ms
Current timer interval: 1.000 ms
And running your second snippet in LINQPad on the machine gives:
Mark's 1997 paper discussing the timer resolution is at www.decuslib.com/.../timer.txt
A slightly prettier version of the same is at v3ps.narod.ru/.../timer.shtml.htm
Hey James - thanks so much for correcting me. I should be ashamed of jumping to conclusions without verifying the hypothesis.
I should add that multimedia timers are a common cause of seeing a lower timer resolution on some systems. There's also powercfg /energy, which can report which applications are tinkering with the timer resolution -- which is, generally, a BAD thing as it adversely affects power consumption.
I don't think this has anything to do with UtcNow. The "type" is consistent across OS editions (i.e. 64 bits of 100-nanosecond granularity). What you are probably seeing is the different granularity of Thread.Sleep() between Server and Client SKUs of Windows, or the different speed (or amount of CPU given to that process at that time) between client/server SKUs...
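Sleep granularity is easy to observe directly. A rough Python sketch (the comment above concerns Thread.Sleep in .NET; this is just an analogous experiment, and the overshoot you see depends on your system's timer interval): request a 1 ms sleep and measure how long it actually takes.

```python
import time

requested = 0.001  # ask for a 1 ms sleep
t0 = time.perf_counter()
time.sleep(requested)
actual = time.perf_counter() - t0

# On a system with a coarse timer interrupt (e.g. 15.6 ms), the actual
# sleep can be an order of magnitude longer than what was requested.
print(f"requested {requested * 1000:.1f} ms, got {actual * 1000:.3f} ms")
```

Running this on machines with different timer intervals is a quick way to tell them apart without any Win32 calls.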
You can find out more on this topic as well as a little program I wrote to modify the system timer at www.lucashale.com/timer-resolution