Unnecessary Fuzzy Searches May Hurt Your Entity Framework Core Performance

After talking about performance issues like N+1 Queries and the Cartesian Explosion, which made its comeback in Entity Framework Core 3, we will now look at a performance issue that is not tied to any particular Entity Framework version but is a more general one.

What do I mean by "unnecessary fuzzy searches"?

In this article, I mean any filter criterion (i.e. WHERE clause) that is less precise than it has to be.
For example, when filtering for a Product by its Name (assuming all names are unique and of the same length, so both filters return the same result), we could use either Contains() or the equality operator ==:

				
var name = "my product";

var product1 = Context.Products.FirstOrDefault(p => p.Name.Contains(name));
// OR
var product2 = Context.Products.FirstOrDefault(p => p.Name == name);

The same applies when comparing numbers or dates using operators such as >, <, and ==.
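A small, hypothetical example for numbers (the Quantity property is an assumption and not part of the article's model): if quantities can never be negative, both of the following filters return the same products, yet only the second one expresses the intent precisely.

// both filters return the same rows because Quantity is never negative,
// but the equality comparison is the more precise criterion
var outOfStock1 = Context.Products.Where(p => p.Quantity <= 0).ToList();
var outOfStock2 = Context.Products.Where(p => p.Quantity == 0).ToList();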

Why would someone do an unnecessary fuzzy search?

Hardly anyone would intentionally do something unnecessary, but unintentionally, it happens…

One cause of sub-optimal filters is a lack of exact domain knowledge and/or of awareness of side effects in the programming model used. I have seen code like the example above, using Contains() instead of ==, multiple times, and the authors could not fully explain why they took this approach, because both ways work. The best thing we can do in such a situation is to sensitize users of EF Core to issues like these.

Another cause can be unfavorable code sharing. Imagine that in one of our core projects we found a method LoadProducts that takes a few parameters, among them the parameter string name.
Our calling code could then look like this:

				
var name = "my product";

var product = someRepository.LoadProducts(name).FirstOrDefault();

A quick test confirms that the method returns the correct result, which is true, but let us take a closer look at its implementation.

				
public List<Product> LoadProducts(
    string name = null,
    string someOtherFilter = null)
{
    var query = Context.Products.AsQueryable();

    if (name != null)
        query = query.Where(p => p.Name.Contains(name));

    if (someOtherFilter != null)
        query = query.Where(...);

    return query.ToList();
}

As we can see, there are a few issues with this method. First, it is clearly not made for our use case but for some kind of global product search, probably for displaying the data on a user interface. Second, it resides in a core project although we are pretty sure it does not implement general-purpose business logic but serves a specific use case that differs from ours.
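For our use case, a dedicated method with an exact comparison would be the better fit. A minimal sketch of what it could look like (the method name LoadProductByName is an assumption, not something taken from the original codebase):

public Product LoadProductByName(string name)
{
    if (name == null)
        throw new ArgumentNullException(nameof(name));

    // exact comparison instead of Contains(): the database can use the unique index on Name
    return Context.Products.FirstOrDefault(p => p.Name == name);
}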

Increased database load due to fuzzy searches

In the previous examples, we saw that the LINQ queries, or rather the filter criteria, could be changed without affecting the outcome, but there was no evidence yet that the comparison via == performs better than Contains().

Now we will execute two slightly different LINQ queries, look at the SQL statements generated by EF and analyze the execution plans to determine the winner. An execution plan describes all operations the database performs to handle the SQL request.
I’m using MS SQL Server for the following tests.
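If you want to inspect the generated SQL yourself: starting with EF Core 5, it can be obtained directly via ToQueryString(); with EF Core 3.x you can log the statements instead, for example by attaching a logger factory to the DbContext. A minimal sketch (not part of the original article):

// ToQueryString() is available in EF Core 5 and later (namespace Microsoft.EntityFrameworkCore)
var sql = Context.Products
                 .Where(p => p.Name == name)
                 .ToQueryString();

Console.WriteLine(sql);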

Given is a table with 100 Products having the columns Id and Name and a unique index on the column Name. The LINQ queries are similar to the one in the method LoadProducts and return exactly one product.

				
var name = "my product";

// query 1
var product1 = Context.Products.Where(p => p.Name.Contains(name)).ToList();

// query 2
var product2 = Context.Products.Where(p => p.Name == name).ToList();
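As a side note, the unique index on Name mentioned above could be configured with EF Core's Fluent API, for example like this (a sketch; the actual configuration of the test project is not shown in the article):

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // unique index on Name so that equality filters can be answered with an index seek
    modelBuilder.Entity<Product>()
                .HasIndex(p => p.Name)
                .IsUnique();
}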

The SQL statements corresponding to the two LINQ queries (simplified for the sake of readability):

				
-- query 1
SELECT *
FROM Products
WHERE CHARINDEX(@__name_0, Name) > 0

-- query 2
SELECT *
FROM Products
WHERE Name = @__name_0

In both cases, there is just one operation (worth mentioning) being executed by the database. For the query using Contains() the database performs a so-called Index Scan, i.e. the database scans the whole data source record by record. The other query using the == comparison does an Index Seek, i.e. the database uses the index to jump right to the requested record.

Depending on how large the data source is, scanning it can lead to a considerable load on the database, but that is not all.

(Unnecessary) Scanning of the data source is just the tip of the iceberg

The performance loss due to the scanning of the whole data source is just the beginning of a chain reaction. If the query in the example above is just a small part of a bigger query, the Index Scan will lead to bigger internal working sets the database has to deal with. Because of these bigger working sets, the database may decide on a (from our perspective) sub-optimal JOIN order. The increased complexity may lead to parallelization and/or the use of internal temp tables, neither of which comes for free. In the end, the database will require more memory to handle the query (a high query memory grant), leaving us with very bad performance.

Summary

When writing database queries with EF Core, the filter criteria should be as precise and as restrictive as possible. Otherwise, not just the query itself but the overall database performance could suffer greatly. If you need numbers to compare when optimizing database requests, I recommend analyzing the execution plans.
