Handling batch operations with Entity Framework and EF Utilities

In this post I am going to provide a hands-on example of how to handle batch operations efficiently with Entity Framework and the EF Utilities library.

Entity Framework is an object-relational mapping (ORM) framework for the .NET Framework. EF addresses the problem of object-relational impedance mismatch. I will not be talking about that mismatch here because it is well documented on many sites on the Internet.

Through this framework we can program against a conceptual application model instead of programming directly against a relational schema. By doing so we can decrease the amount of code we write to access data storage and thus decrease maintenance time. You can find many posts regarding Entity Framework in this blog.

It is easy to implement basic CRUD (create, read, update, delete) functionality using EF, but there are performance issues when performing batch operations on thousands of domain objects (records).

In this post I will do a batch update on a table with traditional EF techniques and then with another library called EF Utilities. Performance can be significantly improved with the batch CUD (create, update, delete) operations in EF Utilities.

EF Utilities (EntityFramework.Utilities) was created by Mikael Eliasson (http://mikee.se/ - @MikaelEliasson).

1) I will create an empty ASP.NET application (Web Forms Application) and give it the name EFUtilitiesUpdate. I am using Visual Studio 2013 Ultimate edition.

2) I add a new Web Forms page to the application. Leave the default name.

3) I will use the AdventureWorks2014 database (you can download it here) for this application, and more specifically the Person.Person table. I have installed SQL Server 2014 Enterprise edition on my machine; SQL Server Express edition will work fine as well.

4) I will add an ADO.NET Entity Data Model using Database First. Follow the wizard steps, create the connection string and then import the Person.Person table into the conceptual model, where it will become an entity in the domain model.

Have a look at the picture below to see what you must have in Visual Studio.


5) I will write a SQL statement to acquire rows from the Person.Person table. Connect to your SQL Server instance and type the following query.

SELECT  [EmailPromotion]
FROM    [AdventureWorks2014].[Person].[Person]
WHERE   EmailPromotion = 0

After execution this query returns 11158 rows. We will create a batch update operation through our ASP.NET Web Forms application that affects all of those rows, and we will measure its performance. If we did the batch update from T-SQL we would use the following statement.

UPDATE [AdventureWorks2014].[Person].[Person]
SET EmailPromotion = 10
WHERE EmailPromotion = 0

6) In our web form's Page_Load event handler we write the following code, which adds 10 to the EmailPromotion column of every matching Person entity.

    protected void Page_Load(object sender, EventArgs e)
    {
        using (AdventureWorks2014Entities ctx = new AdventureWorks2014Entities())
        {
            IQueryable<Person> persons = ctx.People.Where(x => x.EmailPromotion == 0);

            foreach (Person person in persons)
            {
                person.EmailPromotion = person.EmailPromotion + 10;
            }

            // Persist the changes; EF issues one UPDATE statement per modified entity
            ctx.SaveChanges();
        }
    }


Make sure you build your application but do not run it just yet.
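If you want a rough timing without a profiler, you can wrap the update code above in a Stopwatch. This is just a sketch (the log message and variable names are mine); the profiler in the next step gives far more detail per statement.

```csharp
// requires: using System.Diagnostics; using System.Linq;
var stopwatch = Stopwatch.StartNew();

using (AdventureWorks2014Entities ctx = new AdventureWorks2014Entities())
{
    foreach (Person person in ctx.People.Where(x => x.EmailPromotion == 0))
    {
        person.EmailPromotion += 10;
    }

    ctx.SaveChanges();   // the per-row UPDATE statements are sent here
}

stopwatch.Stop();
// Writes the total duration to the Visual Studio Output window
Debug.WriteLine("Batch update took {0} ms", stopwatch.ElapsedMilliseconds);
```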

7) I will use the Entity Framework Profiler to gain valuable insight into the batch operation and record performance metrics. You can download it here: http://www.hibernatingrhinos.com/products/efprof. You can acquire a trial license very easily. Make sure you download and install this fantastic EF profiler.

Inside the folder where you downloaded the Entity Framework Profiler, locate the HibernatingRhinos.Profiler.Appender DLL.
We will add a reference to this DLL in our web application. Make sure you do that.

In the Global.asax file we need to add the following code to the Application_Start() event handling routine.
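The profiler's appender exposes an Initialize method for exactly this purpose; it must run before any EF context is created, which is why it goes in Application_Start. A minimal version looks like this:

```csharp
protected void Application_Start(object sender, EventArgs e)
{
    // Attach the EF Profiler appender before any DbContext/ObjectContext is used
    HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.Initialize();
}
```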


Launch the EF Profiler and then build and run the application. The application will update the 11158 rows.

The results I got on my machine can be seen in the pictures below. You can see that the database has been hit 11158 times and the average duration of each hit is 8.41 ms.

You can also see the total duration of the whole batch operation: we needed 111 seconds (almost 2 minutes) to update 11158 rows. As you can understand this is not acceptable, and we need to find a more efficient way to do the batch update.

8) Now we must install the EF Utilities library. We will do that through the NuGet Package Manager. Launch NuGet from Visual Studio and install EFUtilities. Have a look at the picture below.

Now you must add a using directive for the EntityFramework.Utilities namespace to your application. Inside WebForm1.aspx.cs add the following:

using EntityFramework.Utilities;

9) Comment out everything in the Application_Start() event handler in the Global.asax file.

Comment out everything in the web form's Page_Load event handler and type only the following inside it.

        protected void Page_Load(object sender, EventArgs e)
        {
            using (AdventureWorks2014Entities ctx = new AdventureWorks2014Entities())
            {
                // Find the rows with EmailPromotion = 10 (11158 rows) and add 10 to each one
                EFBatchOperation.For(ctx, ctx.People)
                    .Where(b => b.EmailPromotion == 10)
                    .Update(b => b.EmailPromotion, b => b.EmailPromotion + 10);
            }
        }

10) I will use the SQL Server Profiler to get metrics for the batch operation that uses the EntityFramework.Utilities library. I launch the SQL Server Profiler, connect to the SQL Server Enterprise edition instance on my machine and then create a trace with the SQL:BatchCompleted and SQL:BatchStarting events and a filter on the Database column.

Have a look at the picture below

11) Build and run your application and look at the SQL Server Profiler trace. The Duration column indicates that the batch operation needed only 56 ms to update 11158 rows.

Have a look at the picture below.

The results are amazing. The reason for such a vast performance increase is that with traditional EF a separate UPDATE statement is issued for each row in the table, while the batch update operation provided by EF Utilities runs as a single bulk SQL statement, utilizing the SqlBulkCopy class under the hood. Inserts and deletes also work directly against the database.
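Batch deletes follow the same API shape as the update we used above. The sketch below assumes the same AdventureWorks2014Entities context; the Where(...).Delete() overload is part of the EntityFramework.Utilities EFBatchOperation API and, like the update, sends a single statement to the database instead of loading the entities first.

```csharp
using (AdventureWorks2014Entities ctx = new AdventureWorks2014Entities())
{
    // Batch delete: one DELETE statement runs against the database,
    // no entities are materialized in memory
    EFBatchOperation.For(ctx, ctx.People)
        .Where(p => p.EmailPromotion == 20)
        .Delete();
}
```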

My laptop is a Toshiba Qosmio with an Intel(R) Core(TM) i7-4700MQ CPU @ 2.40GHz, 32 GB of RAM and a 256 GB SSD drive.

Batch Update Operation with EF (11158 rows affected): 111 seconds

Batch Update Operation with EF Utilities (11158 rows affected): 56 ms, which makes it roughly 1982 times faster

Hope it helps!!!
