I made a simple async method that calls a SQL stored procedure asynchronously.
In my console program, I call this method 1000 times in a loop, sleeping for 1 ms (Thread.Sleep) between calls. I start a Stopwatch before entering the loop, stop it when exiting, and display the time spent in the loop.
On my development machine (Win7 - VS 2012 RC), I see what I was expecting to see:
Completed in 1006 ms
This seems logical, considering that the call to the async method returns almost immediately (when it reaches the first await keyword), so there is just a small overhead (6 ms) incurred by the code executed before the await.
However, when I run the exact same code on a server machine (Win2008 R2 SP1) with .NET Framework 4.5 RC installed, the code runs fine but the execution time is far from what I expect, and bears no comparison with the time obtained on my development machine:
Completed in 15520 ms
It is as if the async method were not really being called asynchronously, and the first await were somehow blocking.
Here is the code of the async method I am calling :
public async void CallSpAsync()
{
    using (var cmd = new SqlCommand("sp_mysp"))
    using (var conn = new SqlConnection(connectionString))
    {
        cmd.Connection = conn;
        cmd.CommandType = CommandType.StoredProcedure;
        // [...Filling command parameters here - nothing interesting...]
        await cmd.Connection.OpenAsync();
        await cmd.ExecuteNonQueryAsync();
    }
}
And here is the main program testing code (loop) :
Stopwatch sw = new Stopwatch();
sw.Start();
for (int i = 0; i < 1000; i++)
{
CallSpAsync();
Thread.Sleep(1);
}
sw.Stop();
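Note that because CallSpAsync returns void, the loop above is fire-and-forget: there is no Task to await, so the Stopwatch only measures the synchronous start-up of each call plus the sleeps, not the database work itself. A minimal sketch of measuring the full round trips, assuming the method is changed to return Task (the Task.Delay stub below stands in for the real stored-procedure call, which needs a database):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading.Tasks;

class Program
{
    // Stand-in for the real method; assumes the signature is changed
    // from "async void" to "async Task" so callers can await it.
    static async Task CallSpAsync()
    {
        await Task.Delay(10); // placeholder for OpenAsync/ExecuteNonQueryAsync
    }

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        var tasks = new List<Task>();
        for (int i = 0; i < 1000; i++)
        {
            tasks.Add(CallSpAsync()); // start each call without blocking
        }
        Task.WhenAll(tasks).Wait();   // wait for every call to finish
        sw.Stop();
        Console.WriteLine("Completed in {0} ms", sw.ElapsedMilliseconds);
    }
}
```

With the stub, all 1000 delays overlap, so the total is close to a single 10 ms delay rather than 10 seconds, which is the behavior one would hope to see from truly concurrent calls.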
I am running the exact same executable (console program compiled in Release) on both machines.
I am trying to figure out why the method is not truly called asynchronously when running the program on the server machine.
Any idea ?
Thanks !
EDIT
The problem has nothing to do with async/await, which is working just fine; it is due to the system timer resolution on the server being about 15 times coarser than on my workstation. The code is not running slower at all: each Thread.Sleep(1) actually waits for the next timer tick (roughly 15.6 ms on the server instead of ~1 ms), which is what inflates the elapsed time.
See answer from James Manning below.
This might be a function of the timer resolution on the server machine. Check the ClockRes utility. You can get some more details in my comment on High accuracy DateTime.UtcNow.
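The effect is easy to demonstrate without ClockRes: Stopwatch itself is high-resolution (it uses the performance counter), but Thread.Sleep rounds up to the system timer tick. A small probe, making no assumptions beyond the standard Stopwatch and Thread APIs:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TimerResolutionProbe
{
    static void Main()
    {
        // Stopwatch reads the high-resolution performance counter,
        // so its measurements are accurate regardless of the timer tick.
        Console.WriteLine("IsHighResolution: {0}, Frequency: {1}",
            Stopwatch.IsHighResolution, Stopwatch.Frequency);

        // Thread.Sleep(1), however, cannot wake up before the next
        // system timer tick, so 100 iterations take at least 100 ticks.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 100; i++)
        {
            Thread.Sleep(1);
        }
        sw.Stop();
        Console.WriteLine("100 x Thread.Sleep(1) took {0} ms",
            sw.ElapsedMilliseconds);
    }
}
```

On a machine with the default 15.6 ms tick, the loop takes on the order of 1500 ms rather than 100 ms, matching the 15x slowdown seen in the question.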