Asynchronous Streaming in ASP.NET WebApi

Tags: .NET, ASP.NET WebAPI, Async, C#, NET4.5

Hi everyone, if you use the cool MVC4 WebApi you might find yourself in a common situation where you need to return a rather large amount of data (most probably from a database) and you want to accomplish two things:

  1. Use streaming so the client fetches the data as needed, which in turn drives incremental fetching on the server side (from our database, for example) without consuming large amounts of memory.
  2. Leverage the new MVC4 WebApi and .NET 4.5 async/await asynchronous execution model to free ASP.NET thread pool threads (if possible). 

So, #1 and #2 are not directly related to each other, and we could implement our code fulfilling one, the other, or both. The main point about #1 is that we want our method to return a stream to the caller immediately, with that client-side stream backed by a server-side stream that is written to (together with its related database fetches) only when needed. For this we need some form of "state machine" that keeps running on the server and "knows" what to fetch next into the output stream when the client asks for more content.

This technique is generally called a "continuation" and is nothing new in .NET; in fact, returning an IEnumerable<> and using the "yield return" keyword does exactly that, so our first impulse might be to write our WebApi method more or less like this:

 

        public IEnumerable<Metadata> Get([FromUri] int accountId)
        {
            // Execute the command and get a reader
            using (var reader = GetMetadataListReader(accountId))
            {
                // Read each row synchronously and yield the mapped record to the caller
                while (reader.Read())
                {
                    yield return MapRecord(reader);
                }
            }
        }
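
By the way, GetMetadataListReader and MapRecord are just plain ADO.NET helpers; something along these lines would do (the connection string name, SQL statement and Metadata shape below are illustrative assumptions, not part of the original code):

        // Illustrative helper (assumed connection string name and schema): opens a connection
        // and returns a data reader that closes the connection when the reader is disposed.
        // Needs: using System.Configuration; using System.Data;
        //        using System.Data.Common; using System.Data.SqlClient;
        private DbDataReader GetMetadataListReader(int accountId)
        {
            var connectionString = ConfigurationManager.ConnectionStrings["Default"].ConnectionString;
            var connection = new SqlConnection(connectionString);
            var command = new SqlCommand(
                "SELECT Name, Value FROM Metadata WHERE AccountId = @accountId", connection);
            command.Parameters.AddWithValue("@accountId", accountId);
            connection.Open();

            // CommandBehavior.CloseConnection ties the connection lifetime to the reader
            return command.ExecuteReader(CommandBehavior.CloseConnection);
        }

        // Illustrative mapper: projects the current row into a Metadata entity
        private Metadata MapRecord(IDataRecord record)
        {
            return new Metadata
            {
                Name = record.GetString(0),
                Value = record.GetString(1)
            };
        }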
 

While the above method compiles and runs, unfortunately it doesn't accomplish our objective of returning to the caller immediately. That's because the MVC WebApi infrastructure doesn't yet recognize our intention: when it finds an IEnumerable return value, it enumerates it completely before sending the values to the client. To prove the point, I can write a test method that calls this action, for example:

        [TestMethod]
        public void StreamedDownload()
        {
            var baseUrl = @"http://localhost:57771/api/metadata/1";
            var client = new HttpClient();
 
            var sw = Stopwatch.StartNew();
            var stream = client.GetStreamAsync(baseUrl).Result;
            sw.Stop();
            Debug.WriteLine("Elapsed time Call: {0}ms", sw.ElapsedMilliseconds);
         }

 

So, I would expect the line "var stream = client.GetStreamAsync(baseUrl).Result" to return immediately, without the server fetching all the data from the database reader first, but that is not what happens. To make the behavior more evident, you can insert a wait (like Thread.Sleep(1000);) inside the "while" loop, as sketched below, and you will see that the client call (GetStreamAsync) does not return control until roughly n seconds have passed (where n is the number of records the reader fetches).
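
For reference, that experiment is nothing more than slowing down the loop of the first version, for example:

                // Artificial one-second delay per row (needs using System.Threading;).
                // If the response were truly streamed, GetStreamAsync would still return
                // right away; instead it blocks for roughly one second per record.
                while (reader.Read())
                {
                    Thread.Sleep(1000);
                    yield return MapRecord(reader);
                }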

Ok, we know this doesn't work, and the question would be: is there a way to do it?

Fortunately, YES! And it is not very difficult, although a little more convoluted than our simple IEnumerable return value. Maybe in the future this scenario will be automatically detected and supported in MVC/WebApi.

The solution is to use a very handy class named PushStreamContent. Our method signature needs to change to accommodate this, returning an HttpResponseMessage instead of the IEnumerable<> we used before. The final code looks like this:

 

        public HttpResponseMessage Get([FromUri] int accountId)
        {
            HttpResponseMessage response = Request.CreateResponse();
 
            // Create push content with a delegate that will get called when it is time to write out 
            // the response.
            response.Content = new PushStreamContent(
                async (outputStream, httpContent, transportContext) =>
                {
                    try
                    {
                        // Execute the command and get a reader
                        using (var reader = GetMetadataListReader(accountId))
                        {
 
                            // Read rows asynchronously, put data into buffer and write asynchronously
                            while (await reader.ReadAsync())
                            {
                                var rec = MapRecord(reader);
 
                                var str = await JsonConvert.SerializeObjectAsync(rec);
 
                                var buffer = UTF8Encoding.UTF8.GetBytes(str);
 
                                // Write out data to output stream
                                await outputStream.WriteAsync(buffer, 0, buffer.Length);
                            }
                        }
                    }
                    catch(HttpException ex)
                    {
                        if (ex.ErrorCode == -2147023667) // The remote host closed the connection. 
                        {
                            return;
                        }
                    }
                    finally
                    {
                        // Close output stream as we are done
                        outputStream.Close();
                    }
                });
 
            return response;
        }

 

As an extra bonus, all the classes involved already support the async/await asynchronous execution model, so taking advantage of it was very easy. Please note that the PushStreamContent class receives a lambda in its constructor (specifically an Action) and that we decorated our anonymous method with the async keyword (not a very well known technique, but quite handy), so we can await the I/O-intensive calls we execute: reading from the database reader, serializing our entity and finally writing to the output stream.
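
One caveat: because that constructor parameter is an Action, the async lambda compiles to an "async void" method, so exceptions thrown from it cannot be observed through a Task. Newer releases of System.Net.Http.Formatting (Web API 2) add an overload that takes a Func<Stream, HttpContent, TransportContext, Task>; if your version has it, typing the delegate explicitly makes sure that overload is the one used. A minimal sketch under that assumption (error handling omitted):

            // Assumes the Task-returning PushStreamContent overload is available (Web API 2).
            // An explicit delegate type forces the Func<..., Task> overload, so the write loop
            // runs as a real async Task instead of async void.
            Func<Stream, HttpContent, TransportContext, Task> onStreamAvailable =
                async (outputStream, httpContent, transportContext) =>
                {
                    using (outputStream)
                    using (var reader = GetMetadataListReader(accountId))
                    {
                        while (await reader.ReadAsync())
                        {
                            var json = JsonConvert.SerializeObject(MapRecord(reader));
                            var buffer = Encoding.UTF8.GetBytes(json);
                            await outputStream.WriteAsync(buffer, 0, buffer.Length);
                        }
                    }
                };

            response.Content = new PushStreamContent(onStreamAvailable);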

 

Well, if we execute the test again we will notice that the client line (var stream = client.GetStreamAsync(baseUrl).Result;) now returns immediately, and the rest of the server code executes only as the client reads through the obtained stream. We therefore get low memory usage and far greater scalability for our beloved application when serving big chunks of data.
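
To actually watch the progressive fetching from the client, the test can read the stream in chunks instead of just grabbing it; a rough sketch along the same lines as the earlier test (the chunk size is arbitrary):

        [TestMethod]
        public void StreamedDownloadIsReadIncrementally()
        {
            var baseUrl = @"http://localhost:57771/api/metadata/1";
            var client = new HttpClient();

            var sw = Stopwatch.StartNew();
            using (var stream = client.GetStreamAsync(baseUrl).Result)
            using (var reader = new StreamReader(stream))
            {
                Debug.WriteLine("Stream obtained after {0}ms", sw.ElapsedMilliseconds);

                // Each chunk read here lets the server-side delegate fetch and serialize
                // more rows on demand, instead of buffering everything up front.
                var buffer = new char[4096];
                int read;
                while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
                {
                    Debug.WriteLine("Read {0} chars at {1}ms", read, sw.ElapsedMilliseconds);
                }
            }
        }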

Enjoy!

Andrés.

Comments

  • andresv said

    From what I understand, SignalR is all about real-time signaling to client browser applications, for example keeping a data grid synchronized with changes coming from multiple users. The technique I'm showing here could be used for the same thing, but you would still need to build the high-level abstractions SignalR already provides. The point of this example is just to show a very specific solution to a very specific problem: how to serve large amounts of data without consuming large amounts of server memory and resources. Hope this helps! Andrés.
