await, WhenAll, WaitAll, oh my!!

If you are dealing with asynchronous work in .NET, you probably know that the Task class has become the main building block for wrapping asynchronous calls. Although this class was officially introduced in .NET 4.0, the programming model for consuming tasks was greatly simplified in C# 5.0 and .NET 4.5 with the addition of the async/await keywords. In a nutshell, these keywords let you write asynchronous calls as if they were sequential, avoiding explicit forks or callbacks in the code. The compiler takes care of the rest.
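
To see what that means in practice, here is a minimal sketch (the method names are just for illustration, and Task.Delay stands in for the real asynchronous work) of the same continuation written with ContinueWith and with await:

static Task ShowWithCallback()
{
    // Continuation style: the rest of the method lives inside the callback.
    return Task.Delay(1000).ContinueWith(t =>
    {
        Console.WriteLine("Done (callback style).");
    });
}

static async Task ShowWithAwaitAsync()
{
    // await style: reads top to bottom; the compiler generates the continuation for us.
    await Task.Delay(1000);
    Console.WriteLine("Done (await style).");
}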

Yesterday I was writing some code to make multiple asynchronous calls to backend services in parallel. The code looked as follows,

var allResults = new List<Result>();
foreach(var provider in providers)
{
  var results = await provider.GetResults();
  allResults.AddRange(results);
}
return allResults;

You see, I assumed the await keyword was making those calls in parallel. Something I did not consider was what this code actually does once compiled: each call is awaited before the next one starts. I started an interesting discussion with some smart folks on Twitter. One of them, Tugberk Ugurlu, had the brilliant idea of actually writing some code to compare the performance of this approach with another one using Task.WhenAll.

There are two additional methods on the Task class you can use to wait for the results of multiple calls running in parallel: WhenAll and WaitAll.

WhenAll returns a new task that completes when all the supplied tasks have completed, so it does not block the calling thread. WaitAll, on the other hand, blocks the calling thread until all the tasks have completed. This is the code Tugberk initially wrote, which I modified afterwards to also show the results of WaitAll.

class Program
{
    // The first delegate starts the shared stopwatch and the last one stops it,
    // so the elapsed time covers all three simulated calls.
    private static Func<Stopwatch, Task>[] funcs = new Func<Stopwatch, Task>[]
    {
        async (watch) =>
        {
            watch.Start();
            await Task.Delay(1000);
            Console.WriteLine("1000 one has been completed.");
        },
        async (watch) =>
        {
            await Task.Delay(1500);
            Console.WriteLine("1500 one has been completed.");
        },
        async (watch) =>
        {
            await Task.Delay(2000);
            Console.WriteLine("2000 one has been completed.");
            watch.Stop();
            Console.WriteLine(watch.ElapsedMilliseconds + "ms has been elapsed.");
        }
    };

    static void Main(string[] args)
    {
        Console.WriteLine("Await in loop work starts...");
        DoWorkAsync().ContinueWith(task =>
        {
            Console.WriteLine("Parallel work starts...");
            DoWorkInParallelAsync().ContinueWith(t =>
            {
                Console.WriteLine("WaitAll work starts...");
                WaitForAll();
            });
        });
        Console.ReadLine();
    }

    // Awaits each delegate one at a time, so the delays add up.
    static async Task DoWorkAsync()
    {
        Stopwatch watch = new Stopwatch();
        foreach (var func in funcs)
        {
            await func(watch);
        }
    }

    // Starts all three tasks and awaits a single task that completes when they all do.
    static async Task DoWorkInParallelAsync()
    {
        Stopwatch watch = new Stopwatch();
        await Task.WhenAll(funcs[0](watch), funcs[1](watch), funcs[2](watch));
    }

    // Starts all three tasks and blocks the calling thread until they all complete.
    static void WaitForAll()
    {
        Stopwatch watch = new Stopwatch();
        Task.WaitAll(funcs[0](watch), funcs[1](watch), funcs[2](watch));
    }
}

After running this code, the results were very conclusive.

Await in loop work starts...
1000 one has been completed.
1500 one has been completed.
2000 one has been completed.
4532ms has been elapsed.

Parallel work starts...
1000 one has been completed.
1500 one has been completed.
2000 one has been completed.
2007ms has been elapsed.

WaitAll work starts...
1000 one has been completed.
1500 one has been completed.
2000 one has been completed.
2009ms has been elapsed.

The await keyword in a loop does not really make the calls in parallel: each iteration waits for the previous call to complete before starting the next one.
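
Going back to my original loop, one way to run the calls in parallel when you do not know how many providers are in the list is to start all the tasks first and only then await them with Task.WhenAll. This is just a sketch against the same hypothetical providers collection and GetResults method from the snippet at the beginning of the post:

// Start every call without awaiting it, so they all run concurrently
// (requires System.Linq for Select/SelectMany).
var tasks = providers.Select(provider => provider.GetResults());

// Await a single task that completes once every call has completed.
var resultsPerProvider = await Task.WhenAll(tasks);

// Flatten the per-provider results into a single list.
return resultsPerProvider.SelectMany(results => results).ToList();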

1 Comment

  • This seems to work as you state, but in the example you are passing in a known number of functions. How would you go about this when you don't know how many items were in your list?
