
While troubleshooting a test that validated concurrency, I discovered that some of our build machines (running in VMs) consistently report that tasks aren't waiting the full Task.Delay interval. To confirm, I wrote a test that builds a histogram of how many milliseconds beyond the requested delay each iteration took.

On my machine, the results are as you'd expect:

interval (ms) => number of iterations completing in that interval
--
0 - 4 ms => 13 # 13 iterations completed within 4 ms of the requested delay
5 - 9 ms => 194
10 - 14 ms => 714
15 - 19 ms => 61
20 - 24 ms => 12
25 - 29 ms => 3
40 - 44 ms => 2
45 - 49 ms => 1

But on the build machine, the test reports that all of the tasks completed before the requested delay had even elapsed:

-10 - -6 ms => 999
-5 - -1 ms => 1

Here's the test / histogram generator:

[Test]
[Timeout(60_000)]
public async Task TaskDelayAccuracyCheck()
{
  var results = new List<long>();
  for (var i = 0; i < 1000; ++i)
  {
    var sw = new Stopwatch();
    sw.Start();
    await Task.Delay(20);
    sw.Stop();
    results.Add((sw.ElapsedTicks - 20 * 10_000) / 10_000);
  }
  var histo = results.GroupBy(t => t / 5).OrderBy(x => x.Key);
  foreach (var group in histo)
  {
    Console.WriteLine($"{group.Key*5} - {(group.Key+1)*5 - 1} ms => {group.Count()}");
  }
  Assert.Multiple(() =>
  {
    foreach (var group in histo)
    {
      Assert.That(group.Key, Is.GreaterThanOrEqualTo(0));
    }
  });
}

The Task.Delay documentation says:

This method depends on the system clock. This means that the time delay will approximately equal the resolution of the system clock if the delay argument is less than the resolution of the system clock, which is approximately 15 milliseconds on Windows systems.

In this case, I have ensured that the delay is not less than the 15 ms referred to above, and I've used the Stopwatch class to ensure the timings are as precise as possible. In addition, Stopwatch.IsHighResolution returns true, so I should be able to trust the timings.
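
For reference, here's a minimal sketch of how those timer properties can be inspected (the output will vary by machine):

using System;
using System.Diagnostics;

public static class TimerInfo
{
    public static void Main()
    {
        // True when Stopwatch is backed by a high-resolution performance counter.
        Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
        // Ticks per second of the underlying timer on this system.
        Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks/s");
    }
}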

I have been operating under the presumption that the delay will always be at least the requested time, but could run longer by up to the resolution of the system clock. Should I infer instead that the delay will only be within the system clock resolution (roughly 15 ms on Windows) of the requested time, in either direction? Or is something else going on?


1 Answer


I don't think you're measuring correctly.

You're assuming that there are always 10,000 ticks per millisecond, but that isn't guaranteed. You can look at Stopwatch.Frequency to see the ticks per second it uses on the current system. That value is set the first time Stopwatch is used, based on a call to the native Windows QueryPerformanceFrequency function.

Divide by 1000 to get ticks per millisecond.

var ticksPerMillisecond = Stopwatch.Frequency / 1000;
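
With that, a frequency-aware version of the question's measurement line would look something like this (a sketch; the 20 is the requested delay in milliseconds from the test above):

// Convert raw Stopwatch ticks to milliseconds using the actual
// timer frequency instead of assuming 10,000 ticks per millisecond.
var elapsedMs = sw.ElapsedTicks * 1000.0 / Stopwatch.Frequency;
results.Add((long)(elapsedMs - 20));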

But even more simply, you can use the ElapsedMilliseconds property, which does that conversion to milliseconds for you.

results.Add(sw.ElapsedMilliseconds - 20);
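
Dropped into the question's loop, the corrected measurement would read something like this (a sketch of just the measurement body; the histogram and assertions are unchanged):

var results = new List<long>();
for (var i = 0; i < 1000; ++i)
{
    var sw = Stopwatch.StartNew();
    await Task.Delay(20);
    sw.Stop();
    // ElapsedMilliseconds already accounts for Stopwatch.Frequency,
    // so no manual tick arithmetic is needed.
    results.Add(sw.ElapsedMilliseconds - 20);
}
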
answered 2019-12-12T18:40:21.533