Entity Framework Core – Isolation Of Integration Tests

When working with Entity Framework Core (EF) a lot of code can be tested using the In-Memory database provider, but sometimes you want (or have) to go to the real database. For example, you are using not just LINQ but custom SQL statements for performance reasons, or you want to check that a specific exception is thrown by the database under certain conditions, such as a primary key violation.

The biggest challenge of integration tests is the isolation of one test from another. In this post we will look at 3 options for achieving that. The code for the 3rd option is on GitHub: PawelGerr/EntityFrameworkCore-Demos

Remarks: in my demos I’m using the 3rd-party libraries FluentAssertions and xunit.

Given is a DemoRepository with a method AddProduct that we want to test. (The code is kept oversimplified for clarity.)

public class DemoRepository
{
  ...

  public void AddProduct(Guid id)
  {
    _dbContext.Products.Add(new Product { Id = id });
    _dbContext.SaveChanges();
  }
}
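
As mentioned above, checking for a primary key violation is a typical case where the real database is needed because the In-Memory provider does not enforce key constraints. A minimal sketch of such a test, assuming that CreateDbContext() creates a new DbContext instance (as in the test constructor shown below) and that FluentAssertions' Invoking/Throw syntax is available:

[Fact]
public void Should_throw_when_adding_duplicate_product()
{
  var productId = new Guid("DBD9439E-6FFD-4719-93C7-3F7FA64D2220");
  _repository.AddProduct(productId);

  // use a second DbContext so that the error comes from the database
  // (primary key violation) and not from EF's change tracker
  using (var otherDbContext = CreateDbContext())
  {
    var otherRepository = new DemoRepository(otherDbContext);

    otherRepository.Invoking(r => r.AddProduct(productId))
                   .Should().Throw<DbUpdateException>();
  }
}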


Using Transaction Scopes

EF Core added support for TransactionScope (see https://docs.microsoft.com/en-us/ef/core/saving/transactions#using-systemtransactions) in version 2.1.

The isolation of tests via TransactionScope is very simple: just wrap the call to AddProduct in a TransactionScope to revert all changes at the end of the test. But there are a few preconditions: the method under test must not start transactions using BeginTransaction(), or it has to use a TransactionScope as well.

Also, I recommend reading my other blog post: Entity Framework Core: Use TransactionScope with Caution!

public DemoRepositoryTests()
{
  _dbContext = CreateDbContext();
  _repository = new DemoRepository(_dbContext);
}

[Fact]
public void Should_add_new_product()
{
  var productId = new Guid("DBD9439E-6FFD-4719-93C7-3F7FA64D2220");

  using(var scope = new TransactionScope())
  {
    _repository.AddProduct(productId);

    _dbContext.Products.FirstOrDefault(p => p.Id == productId).Should().NotBeNull();

    // the transaction is going to be rolled back because the scope is not completed
    // scope.Complete();
  } 
}
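
If the method under test has to manage a transaction itself, it can use a TransactionScope instead of BeginTransaction() so that it enlists in the ambient transaction opened by the test. A minimal sketch with a hypothetical method AddProducts:

public void AddProducts(IEnumerable<Guid> ids)
{
  // TransactionScopeOption.Required (the default) joins the ambient
  // transaction created by the test instead of starting a new one,
  // so disposing the outer scope without Complete() still reverts everything
  using (var scope = new TransactionScope(TransactionScopeOption.Required))
  {
    foreach (var id in ids)
    {
      _dbContext.Products.Add(new Product { Id = id });
    }

    _dbContext.SaveChanges();
    scope.Complete();
  }
}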

Using new Databases

Creating a new database for each test is very easy, but such tests are very time-consuming. On my machine each test takes about 10 seconds to create and delete a database on the fly.

The steps of each test are: generate a new database name, create the database by running the EF migrations, and delete the database at the end.

public class DemoRepositoryTests : IDisposable
{
  private readonly DemoDbContext _dbContext;
  private readonly DemoRepository _repository;
  private readonly string _databaseName;

  public DemoRepositoryTests()
  {
    _databaseName = Guid.NewGuid().ToString();

    var options = new DbContextOptionsBuilder<DemoDbContext>()
              .UseSqlServer($"Server=(local);Database={_databaseName};...")
              .Options;

    _dbContext = new DemoDbContext(options);
    _dbContext.Database.Migrate();

    _repository = new DemoRepository(_dbContext);
  }

  // Tests come here

  public void Dispose()
  {
    // switch to master first: SQL Server cannot drop the database the connection
    // is currently using, and SINGLE_USER kicks out other (pooled) connections
    // that would otherwise block the DROP
    _dbContext.Database.ExecuteSqlCommand((string)$@"
      USE master;
      ALTER DATABASE [{_databaseName}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
      DROP DATABASE [{_databaseName}];");

    _dbContext.Dispose();
  }
}

Using different Database Schemas

The 3rd option is to use the same database but different schemas. The creation of a new schema and running the EF migrations usually takes less than 50 ms, which is totally acceptable for an integration test. The prerequisites for running queries with different schemas are schema-aware instances of DbContext and schema-aware EF migrations. Read my blog posts for more information about how to change the database schema at runtime.
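
The referenced posts describe these building blocks in detail. As a rough orientation, here is a minimal sketch of the schema-aware part, assuming an interface IDbContextSchema with a single Schema property; the types DbSchemaAwareMigrationAssembly and DbSchemaAwareModelCacheKeyFactory used further below come from those posts:

public interface IDbContextSchema
{
  string Schema { get; }
}

public class DbContextSchema : IDbContextSchema
{
  public string Schema { get; }

  public DbContextSchema(string schema)
  {
    Schema = schema ?? throw new ArgumentNullException(nameof(schema));
  }
}

public class DemoDbContext : DbContext, IDbContextSchema
{
  public string Schema { get; }

  public DbSet<Product> Products { get; set; }

  public DemoDbContext(DbContextOptions<DemoDbContext> options, IDbContextSchema schema = null)
    : base(options)
  {
    Schema = schema?.Schema;
  }

  protected override void OnModelCreating(ModelBuilder modelBuilder)
  {
    // place all tables of this DbContext into the schema provided at runtime;
    // if no schema is provided the provider's default schema (dbo) is used
    if (Schema != null)
      modelBuilder.HasDefaultSchema(Schema);

    base.OnModelCreating(modelBuilder);
  }
}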

The class executing the integration tests consists of two parts: the creation of the tables in the constructor and the deletion of them in Dispose().

I’m using a generic base class to reuse the same logic for different types of DbContext.

In the constructor we generate the name of the schema using Guid.NewGuid(), create the DbContextOptions using the DbSchemaAwareMigrationAssembly and DbSchemaAwareModelCacheKeyFactory described in my previous posts, create the DbContext, and run the EF migrations. The database is now fully prepared for executing the tests. After the execution of the tests the EF migrations are rolled back using IMigrator.Migrate("0"), the EF history table __EFMigrationsHistory is deleted, and the newly generated schema is dropped.

public abstract class IntegrationTestsBase<T> : IDisposable
  where T : DbContext
{
  private readonly string _schema;
  private readonly string _historyTableName;
  private readonly DbContextOptions<T> _options;

  protected T DbContext { get; }

  protected IntegrationTestsBase()
  {
    _schema = Guid.NewGuid().ToString("N");
    _historyTableName = "__EFMigrationsHistory";

    _options = CreateOptions();
    DbContext = CreateContext();
    DbContext.Database.Migrate();
  }

  protected abstract T CreateContext(DbContextOptions<T> options, 
                                     IDbContextSchema schema);

  protected T CreateContext()
  {
    return CreateContext(_options, new DbContextSchema(_schema));
  }

  private DbContextOptions<T> CreateOptions()
  {
    return new DbContextOptionsBuilder<T>()
        .UseSqlServer($"Server=(local);Database=Demo;...", 
                builder => builder.MigrationsHistoryTable(_historyTableName, _schema))
        .ReplaceService<IMigrationsAssembly, DbSchemaAwareMigrationAssembly>()
        .ReplaceService<IModelCacheKeyFactory, DbSchemaAwareModelCacheKeyFactory>()
        .Options;
  }

  public void Dispose()
  {
    DbContext.GetService<IMigrator>().Migrate("0");
    DbContext.Database.ExecuteSqlCommand(
           (string)$"DROP TABLE [{_schema}].[{_historyTableName}]");
    DbContext.Database.ExecuteSqlCommand((string)$"DROP SCHEMA [{_schema}]");

    DbContext?.Dispose();
  }
}

The usage of the base class looks as follows:

public class DemoRepositoryTests : IntegrationTestsBase<DemoDbContext>
{
  private readonly DemoRepository _repository;

  public DemoRepositoryTests()
  {
    _repository = new DemoRepository(DbContext);
  }

  protected override DemoDbContext CreateContext(DbContextOptions<DemoDbContext> options, 
                                                 IDbContextSchema schema)
  {
    return new DemoDbContext(options, schema);
  }

  [Fact]
  public void Should_add_new_product()
  {
    var productId = new Guid("DBD9439E-6FFD-4719-93C7-3F7FA64D2220");

    _repository.AddProduct(productId);

    DbContext.Products.FirstOrDefault(p => p.Id == productId).Should().NotBeNull();
  }
}

Happy testing!
