Supercharging Performance with Caching in .NET
Application performance is often constrained not by the business logic itself, but by repetitive database queries and API calls. Caching is one of the most effective techniques to address this problem. By storing frequently accessed data in memory or a fast distributed store, applications can reduce latency, scale more effectively, and lower infrastructure costs.
According to Microsoft, caching can reduce response times by up to 80% for frequently requested data (Microsoft Docs). Similarly, Redis benchmarks demonstrate the ability to handle millions of requests per second with sub-millisecond latency (Redis Labs).
What is Caching?
Caching is the process of storing data in a faster medium for repeated access. Instead of querying a database or an external API on every request, the result is stored and reused for a defined period.
The benefits are clear:
- Reduced response times.
- Decreased load on databases and APIs.
- Improved scalability under high traffic.
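The idea can be boiled down to a few lines. Here is a minimal cache-aside sketch using a plain dictionary with per-entry expiry — illustrative only; real applications should use IMemoryCache or IDistributedCache as shown in the sections that follow:

```csharp
using System;
using System.Collections.Concurrent;

public static class SimpleCache
{
    private static readonly ConcurrentDictionary<string, (object Value, DateTime Expires)> _entries = new();

    public static T GetOrAdd<T>(string key, Func<T> fetch, TimeSpan ttl)
    {
        if (_entries.TryGetValue(key, out var entry) && entry.Expires > DateTime.UtcNow)
            return (T)entry.Value;           // Cache hit: reuse the stored result

        var value = fetch();                 // Cache miss: query the slow source
        _entries[key] = (value, DateTime.UtcNow.Add(ttl));
        return value;
    }
}
```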
Benchmark: With and Without Cache
To demonstrate the impact, consider a service call that takes ~200ms:
- Without caching: 1,000 requests = ~200 seconds total.
- With in-memory caching: 1,000 requests = ~1.5 seconds total.
That is an improvement of roughly two orders of magnitude from a simple caching layer.
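A rough sketch of how such a benchmark could be run, simulating a 200ms lookup with Thread.Sleep (the timings above are illustrative, not measured output):

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.Extensions.Caching.Memory;

class CacheBenchmark
{
    static string SlowLookup()
    {
        Thread.Sleep(200);                   // Simulated database/API latency
        return "result";
    }

    static void Main()
    {
        var cache = new MemoryCache(new MemoryCacheOptions());
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 1_000; i++)
        {
            // Only the first iteration pays the 200ms cost;
            // the remaining 999 are served from memory.
            cache.GetOrCreate("key", _ => SlowLookup());
        }
        sw.Stop();
        Console.WriteLine($"With cache: {sw.ElapsedMilliseconds} ms");
    }
}
```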
Caching Options in .NET
1. In-Memory Cache
The simplest option is IMemoryCache, which stores data in the application’s memory.
// Requires the Microsoft.Extensions.Caching.Memory package
var cache = new MemoryCache(new MemoryCacheOptions());

// Store the value with a 5-minute absolute expiration
cache.Set("CurrentTime", DateTime.Now, TimeSpan.FromMinutes(5));

if (cache.TryGetValue("CurrentTime", out DateTime cachedTime))
{
    Console.WriteLine($"Cached Time: {cachedTime}");
}
- Retrieval times are in microseconds.
- Best suited for single-server deployments.
- Not shared across multiple instances.
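IMemoryCache also supports richer entry policies than a simple TTL. A sketch of common options — sliding expiration, an absolute cap, and an eviction callback (the key and value here are placeholders):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());

var options = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(2))      // Reset the TTL on each access
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(10))    // Hard upper bound regardless of access
    .RegisterPostEvictionCallback((key, value, reason, _) =>
        Console.WriteLine($"Evicted {key}: {reason}"));

cache.Set("user-42", "Jane Doe", options);
```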
2. Distributed Cache
For cloud and multi-server applications, .NET provides IDistributedCache, with implementations such as Redis, SQL Server, and NCache.
// _cache is an injected IDistributedCache instance
public async Task<string> GetWeatherAsync()
{
    var cacheKey = "weather-today";

    // Return the cached value if one exists
    var cached = await _cache.GetStringAsync(cacheKey);
    if (cached != null) return cached;

    var weather = "Sunny"; // Simulated API call

    // Cache the result for 30 minutes
    await _cache.SetStringAsync(cacheKey, weather,
        new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
        });

    return weather;
}
- Data is shared across all application instances.
- Redis is a widely adopted choice, proven to deliver sub-millisecond response times even at scale.
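Wiring IDistributedCache to Redis takes one registration call via the Microsoft.Extensions.Caching.StackExchangeRedis package. The connection string and instance name below are placeholder assumptions:

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // Redis endpoint
    options.InstanceName = "myapp:";          // Key prefix isolating this app's entries
});

var app = builder.Build();
```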
3. Response and Output Caching
ASP.NET Core includes built-in support for response caching, allowing entire HTTP responses to be cached.
[ResponseCache(Duration = 60)]
public IActionResult Index()
{
    return View();
}
This approach reduces server load and is effective for content that does not change frequently.
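Where ResponseCache works through HTTP headers (and so depends on clients and proxies honoring them), ASP.NET Core 7.0+ also offers output caching middleware, which stores responses on the server itself. A minimal sketch with an assumed "/products" endpoint:

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddOutputCache();

var app = builder.Build();
app.UseOutputCache();

// Cache this endpoint's response on the server for 60 seconds
app.MapGet("/products", () => "product list")
   .CacheOutput(policy => policy.Expire(TimeSpan.FromSeconds(60)));

app.Run();
```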
4. Hybrid Caching
Hybrid caching is an advanced strategy supported in .NET 9+ (via the Microsoft.Extensions.Caching.Hybrid package) that blends in-memory caching with a distributed cache (like Redis).
The main idea:
- Your app first checks the local memory cache for lightning-fast access.
- If the data isn’t there, it falls back to the distributed cache shared across all servers.
- If still missing, the data source (database or API) is queried, then stored in both caches for future requests.
This reduces latency while still keeping data consistent across a multi-server or cloud environment.
Example:
builder.Services.AddHybridCache(options =>
{
    options.DefaultEntryOptions = new HybridCacheEntryOptions
    {
        Expiration = TimeSpan.FromMinutes(5)
    };
});
Developers can then inject and use the HybridCache abstract class to store and retrieve items efficiently:
public class WeatherService
{
    private readonly HybridCache _cache;

    public WeatherService(HybridCache cache) => _cache = cache;

    public async Task<string> GetForecastAsync(string city)
    {
        return await _cache.GetOrCreateAsync(
            key: $"forecast-{city}",
            factory: async cancel =>
            {
                // Fetch from the underlying source on a cache miss
                await Task.Delay(50, cancel); // Simulated API call
                return $"Forecast for {city} at {DateTime.Now}";
            },
            options: new HybridCacheEntryOptions
            {
                Expiration = TimeSpan.FromMinutes(10)
            });
    }
}
Why Hybrid Matters
- Performance: Fast in-memory lookups.
- Scalability: Shared state across clusters or cloud deployments.
- Simplicity: First-class support in .NET 9 with HybridCache.
At InSync Software, we often implement hybrid caching in modern SaaS solutions. It allows us to deliver sub-millisecond response times while ensuring consistency across multiple nodes — a balance many businesses need when scaling to thousands of users.
When to Use Caching
Appropriate scenarios:
- Product catalogs or reference data.
- Exchange rates and financial market data.
- Weather information.
- Expensive report generation.
Avoid caching:
- Highly volatile data requiring strict consistency (e.g., inventory levels).
- Sensitive data such as passwords or tokens.
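When volatile data must stay consistent, a common alternative to a long TTL is explicit invalidation: evict the cache entry on every write so the next read goes to the source of truth. A sketch with a hypothetical InventoryService:

```csharp
using Microsoft.Extensions.Caching.Memory;

public class InventoryService
{
    private readonly IMemoryCache _cache;

    public InventoryService(IMemoryCache cache) => _cache = cache;

    public void UpdateStock(int productId, int quantity)
    {
        SaveToDatabase(productId, quantity);   // Write to the source of truth first
        _cache.Remove($"stock-{productId}");   // Evict the stale entry immediately
    }

    private void SaveToDatabase(int productId, int quantity)
    {
        /* persist to the database — omitted */
    }
}
```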
Conclusion
Caching in .NET offers substantial performance improvements with relatively low implementation effort.
- In-memory caching is the fastest option for single-server setups.
- Distributed caching, particularly with Redis, enables horizontal scalability.
- Response caching in ASP.NET Core reduces backend processing for repeated requests.
At InSync Software, we leverage these caching strategies when designing custom solutions for clients. Whether it’s improving responsiveness in high-traffic APIs, scaling multi-tenant SaaS platforms, or reducing cloud costs, caching consistently proves to be one of the highest-impact optimizations.
The key is to cache selectively: cache data that is expensive to compute or fetch, and that does not require constant freshness. When applied strategically, caching can improve responsiveness by over 100x while reducing infrastructure costs.