Here are some frequent errors that can cause your application to perform poorly. But what’s the point of knowing the errors if we don’t know how to fix or avoid them? That’s why I’m providing you with some tips right here.

Constant access to the same data

In a C# application, it’s quite common to query the database multiple times, not only to retrieve operational information needed to complete a use case but also to obtain information that changes infrequently. We’ll address the latter case.

Imagine you have a table storing credentials to authenticate with a third-party service. Typically, before you can utilize the service, you need to authenticate. In your application, you probably have a routine to retrieve these credentials from a table in your database. Now, this may not seem like a problem if the routine is executed infrequently or only a few times a day. However, in a larger use case where you need to use the third-party service more frequently, this can become a performance issue in your application over time. Considering that credentials are information that doesn’t update frequently, you can opt to use the application’s in-memory cache.

In recent versions of .NET, this can be quite simple and quick to implement. Consider the following scenario.

// Hits the database every time a caller needs the third-party credentials
public async Task<List<AccountSenders>> GetAccountSenders()
{
  return await _context.AccountSenders.ToListAsync();
}

In the above code, we retrieve a list of credentials from a table in our database. This method itself doesn’t seem to pose a performance issue. However, if we consider that we use our third-party service multiple times in our application, the usage of this method multiplies, resulting in unnecessary database queries. This is because the credentials don’t change or won’t change as frequently as we authenticate. Let’s consider the following approach using the in-memory cache provided by .NET.

public async Task<List<AccountSenders>?> GetAccountSenders(IMemoryCache cache)
{
    const string cacheKey = "AccountSenders";

    // Only query the database when the entry is not already in the cache
    if (!cache.TryGetValue(cacheKey, out List<AccountSenders>? accountSenders))
    {
        accountSenders = await _context.AccountSenders.ToListAsync();

        // Keep the entry alive for five minutes after its last access
        var cacheEntryOptions = new MemoryCacheEntryOptions()
            .SetSlidingExpiration(TimeSpan.FromMinutes(5));

        cache.Set(cacheKey, accountSenders, cacheEntryOptions);
    }

    return accountSenders;
}

You’re probably thinking that more code doesn’t necessarily translate to better performance, but let’s take a look behind the scenes…

This is where caching comes into play. It’s a way of storing data in memory so you don’t have to fetch it every time. And although setting it up might initially seem a bit more challenging, believe me, it’ll be worth it in the end.

Going back to the previous code: before querying the database, we check whether the information is already stored in the cache under the given key. If it isn't, we query the database and then store the result in the cache, so the next time the method runs it can serve the data from memory instead of going back to the database.

Also, note that when configuring the cache entry we set an expiration time. This is essential: once it elapses, the next call queries the database again, because we still want reasonably fresh information.
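
If you want a hard upper bound on how stale the cached data can get, the entry options in the method above can combine a sliding window with an absolute limit. Here's a minimal sketch; the ten-minute value is just an illustrative choice, not something the original code prescribes.

var cacheEntryOptions = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(5))       // timer resets on every read
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(10));     // but the entry never lives longer than this

cache.Set(cacheKey, accountSenders, cacheEntryOptions);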

So there you have it: a simple, straightforward technique that can make a real difference for frequently executed queries, and a practical way to start introducing caching into your .NET applications.
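
One last practical detail: IMemoryCache has to be registered in the dependency injection container before it can be injected. Assuming a typical ASP.NET Core application using the minimal hosting model, a sketch of Program.cs might look like this.

var builder = WebApplication.CreateBuilder(args);

// Registers IMemoryCache so it can be injected into services and controllers
builder.Services.AddMemoryCache();

var app = builder.Build();
app.Run();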

Misusing HttpClient: a silent enemy of performance

When we need to talk to an external service over HTTP, we'll almost certainly reach for HttpClient. So what lies behind a simple new HttpClient() that can turn it into a silent enemy of our application's performance? Quite a lot. Every time an HttpClient is created and disposed, the underlying sockets aren't released immediately; under high load, repeatedly creating clients can exhaust the available sockets and become a serious bottleneck.

public async Task<string> GetAsync(string url)
{
  // A brand-new HttpClient (and its own connection pool) is created on every call
  using (var client = new HttpClient())
  {
    var response = await client.GetAsync(url);
    return await response.Content.ReadAsStringAsync();
  }
}

In the above code, there doesn’t seem to be any issue, just a simple instance of HttpClient making a request. While its usage appears straightforward, instantiating it incorrectly or using it inadequately can be the root cause of a myriad of performance issues in your application.

The recommended practice is to create clients through IHttpClientFactory and let the framework manage the underlying connections for you.

public async Task<string> GetAsync(string url)
{
  // _httpClientFactory is an injected IHttpClientFactory (see the registration sketch below)
  using (var client = _httpClientFactory.CreateClient())
  {
    var response = await client.GetAsync(url);
    return await response.Content.ReadAsStringAsync();
  }
}

With HttpClientFactory, the clients you create are properly configured and managed by the framework itself. The factory pools and reuses the underlying HttpMessageHandler instances, so fresh connections aren't opened every time a client is needed, and handlers are recycled periodically, which also protects you from stale DNS entries.
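
For this to work, the factory has to be registered and then injected into the class that makes the requests. Here's a minimal sketch; ExternalApiService is a hypothetical name, and the registration assumes an ASP.NET Core host (the AddHttpClient extension comes from Microsoft.Extensions.Http).

// Program.cs: register IHttpClientFactory
builder.Services.AddHttpClient();

// A consuming class receives the factory through constructor injection
public class ExternalApiService
{
    private readonly IHttpClientFactory _httpClientFactory;

    public ExternalApiService(IHttpClientFactory httpClientFactory)
    {
        _httpClientFactory = httpClientFactory;
    }

    // The GetAsync method from the example above would live here,
    // calling _httpClientFactory.CreateClient() per request
}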

In summary, incorrectly instantiating HttpClient or using it without proper configuration can negatively impact your application’s performance. Utilizing HttpClientFactory and proper configuration can significantly enhance the speed and efficiency of your HTTP requests.

Concatenating strings

Concatenating strings in .NET can affect performance compared to using StringBuilder due to the way strings are handled in memory.

When you concatenate strings using the + operator or String.Concat() method, .NET creates a new string object every time a concatenation operation occurs. This means that memory allocations happen frequently, especially in scenarios where multiple concatenations are performed within loops or in large-scale string manipulations. Each new string object requires memory allocation and deallocation, which can lead to memory fragmentation and increased garbage collection overhead.

On the other hand, StringBuilder provides a more efficient way to manipulate strings, especially when concatenating multiple strings or performing repeated string manipulations. StringBuilder uses a resizable buffer internally to store the string data, which minimizes memory allocations and reduces the overhead associated with creating new string objects.

Here’s an example to illustrate the difference:

// Using string concatenation
string result = "";
for (int i = 0; i < 10000; i++) {
    result += i.ToString();  // Each concatenation creates a new string object
}

// Using StringBuilder (requires using System.Text;)
var sb = new StringBuilder();
for (int i = 0; i < 10000; i++) {
    sb.Append(i);  // Appends to the internal buffer; no new string per iteration
}
string builderResult = sb.ToString();  // Convert to a string only once, at the end

In the above example, using string concatenation (+=) repeatedly inside the loop creates a new string object for each concatenation operation, resulting in poor performance and increased memory usage. However, using StringBuilder's Append() method appends data to its internal buffer efficiently, avoiding unnecessary memory allocations and improving performance.

In summary, string concatenation is fine for small, one-off operations, but reach for StringBuilder whenever you build strings in loops or perform large-scale manipulation; it keeps memory usage and garbage collection pressure under control.
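
As a side note, when all you need is to join an existing sequence of values, string.Join (or string.Concat without a separator) is often simpler than a manual loop and just as efficient. A small sketch:

using System.Linq;

// Joins the numbers 0..9999 in a single pass instead of allocating a new string per iteration
string joined = string.Join(",", Enumerable.Range(0, 10000));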

Exceptions and NULL

Throwing too many exceptions and incorrect usage of returning null can impact the performance of a .NET application due to the overhead involved in exception handling and null checking.

When exceptions are thrown frequently, especially in performance-critical sections of the code, it can lead to significant performance degradation. Exception handling involves capturing stack traces, unwinding the call stack, and other operations that consume CPU cycles and memory.

Example:

try {
    // Code that might throw exceptions
} catch (Exception ex) {
    // Exception handling logic
}

In the above example, if the code within the try block frequently throws exceptions, the performance overhead of handling these exceptions can become noticeable, impacting the application’s overall performance.
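
A typical case is parsing input: if invalid values are a normal, expected part of the workload, the Try-pattern handles them as an ordinary return path instead of paying for an exception each time. A quick sketch for comparison:

// Exception-driven: every invalid value pays the full cost of throwing and catching
int ParseWithExceptions(string input)
{
    try
    {
        return int.Parse(input);
    }
    catch (FormatException)
    {
        return 0;
    }
}

// Try-pattern: invalid values never throw, so there is no exception overhead
int ParseWithTryPattern(string input)
{
    return int.TryParse(input, out var value) ? value : 0;
}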

Incorrect Usage of Returning Null

Returning null instead of handling null values appropriately can lead to null reference exceptions (NREs) when the returned value is accessed without proper null checks. This can introduce bugs and unexpected behavior in the application and requires additional null checks throughout the codebase, which can impact performance.

Example:

public string GetCustomerName(int customerId) {
    // Some logic to fetch customer name
    if (customerExists) {
        return customerName;
    } else {
        return null; // Incorrect usage if caller does not handle null properly
    }
}

In the above example, if the caller of GetCustomerName does not handle the possibility of a null return value, it may result in null reference exceptions, leading to runtime errors and potential application crashes.
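
One way to make that risk visible, assuming nullable reference types are enabled in the project, is to declare the return type as string? so the compiler warns callers that dereference the result without a check. This is a sketch built on the placeholders from the example above, not a drop-in implementation.

// The nullable annotation makes "not found" part of the method's contract
public string? GetCustomerName(int customerId)
{
    // Some logic to fetch the customer name (customerExists/customerName as above)
    return customerExists ? customerName : null;
}

// Caller side: the null case is handled explicitly instead of blowing up later
var name = GetCustomerName(42);
if (name is null)
{
    // react to the missing customer here
}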

To mitigate these performance issues:

  • Use exceptions for exceptional conditions, not for regular control flow.
  • Minimize the use of exceptions in performance-critical sections of the code.
  • Handle expected error conditions using return codes or other mechanisms instead of exceptions.
  • Ensure that methods returning null clearly document the possibility of null return values, and callers handle them appropriately.

By following these best practices, you can improve the performance and reliability of your .NET applications.

Conclusion

In the realm of .NET application development, it’s not uncommon to overlook performance considerations, assuming that minor inefficiencies won’t significantly impact the application’s functionality. However, this complacency can lead to a gradual accumulation of suboptimal practices that collectively erode the performance of the application over time. While it may seem inconsequential to prioritize other aspects of development initially, neglecting performance considerations can result in severe consequences as the application scales or encounters increased usage.

Despite the allure of expediency in development, it’s essential to recognize that each instance of poor performance-inducing practice contributes to the overall degradation of the application’s performance. Whether it’s inefficient database queries, excessive exception throwing, or improper string manipulation, these seemingly minor lapses in optimization can compound and manifest as noticeable performance bottlenecks. As developers, it’s crucial to cultivate a culture of performance consciousness, where considerations for optimization are integrated into every stage of the development lifecycle.

Ultimately, the long-term success and viability of a .NET application hinge on the collective diligence of developers in prioritizing performance considerations from the outset. By fostering a proactive approach to performance optimization and adhering to best practices, developers can mitigate the risk of a performance-sapping application. Through a concerted effort to address performance bottlenecks and uphold optimization standards, developers can ensure that their .NET applications deliver optimal performance, scalability, and user experience in the face of evolving demands and usage patterns.

Happy coding!