In a typical Microservices-based application, multiple Microservices work together while remaining loosely coupled and scalable. Such applications employ many different services to satisfy core business requirements, such as keeping track of and processing critical business data. They also include dedicated Microservices that handle authentication and load monitoring, or serve as API gateways.
A key feature of such an application is that each Microservice is designed, developed, and deployed independently, using any of various technology stacks. Since each Microservice is a standalone, autonomous application in its own right, it also has its own persistent storage, be it a relational database, a NoSQL DB, or even a legacy file storage system. This allows the individual Microservices to scale independently and makes real-time infrastructure changes much more manageable.
Why does your microservice need NCache?
However, despite the numerous architectural advantages that Microservices offer, there are still cases where bottlenecks arise as application transactions increase. This is common in architectures where Microservices store data in relational databases, which cannot scale out. In such situations, scaling out the Microservice itself does not resolve the issue.
To counter these issues, you can seamlessly introduce NCache as your distributed cache between your Microservices and the aforementioned datastores.
Scalability Through Pub/Sub
Microservices communication is frequently implemented using the Publisher/Subscriber model, which allows messaging between different Microservices. In this regard, NCache serves as a scalable, in-memory Pub/Sub broker through which all the Microservices in the application can publish events and subscribe to them. The scalability and reliability inherent in NCache clustering carry over automatically to Pub/Sub. Find out more about NCache as a message broker in the Microservices environment through our blog on Scaling .NET Microservices Communication with In-Memory Pub/Sub.
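As a brief sketch of what this can look like (the topic name, payload, and callback are illustrative, assuming the NCache 5.x Pub/Sub messaging API), one Microservice can publish events to a topic while another subscribes to them:

```csharp
using System;
using Alachisoft.NCache.Client;
using Alachisoft.NCache.Runtime.Caching;

public static class OrderEventsExample
{
    public static void PublishOrderEvent(ICache cache, string orderPayload)
    {
        // Get the topic that carries order events, creating it if it does not exist.
        // The topic name "OrderEvents" is purely illustrative.
        ITopic topic = cache.MessagingService.GetTopic("OrderEvents")
                       ?? cache.MessagingService.CreateTopic("OrderEvents");

        // Publish the event to all registered subscribers
        topic.Publish(new Message(orderPayload), DeliveryOption.All);
    }

    public static ITopicSubscription SubscribeToOrderEvents(ICache cache)
    {
        ITopic topic = cache.MessagingService.GetTopic("OrderEvents");

        // Register a callback that is invoked whenever a message
        // arrives on the topic
        return topic.CreateSubscription((sender, args) =>
        {
            Console.WriteLine($"Received order event: {args.Message.Payload}");
        });
    }
}
```

Because the topic lives in the clustered cache rather than in any single service, publishers and subscribers scale independently of one another.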
Scalability Through Caching
NCache offers real-time scalability, letting you add as many server nodes as you want in your running cache cluster without incurring any application downtime. NCache enhances the performance of individual Microservices and improves application response time and availability with its clustering architecture. This is especially true when considering workflows with dozens of Microservices spread across multiple hosts.
How to Use NCache for Data Caching?
With NCache in place, if a Microservice requires data, it first checks the cache instead of directly accessing the database. Given that the most frequently accessed data typically constitutes a small portion of the entire data in the store, caching this data is highly effective. Since this data is already cached and readily available, it significantly reduces any database-related latency. Additionally, this eases the database load, as most data requests are served by the cache.
Microservices-based applications are inherently slower than more monolithic designs due to the distributed nature of their components, which makes the performance boost provided by NCache essential. NCache reduces overall latency during complex transactions (for example, operations that involve multiple services working in sequence), enabling you to fully leverage the benefits of a Microservices architecture.
NCache offers several unique features for precise control over caching operations. These include enforcing cache consistency through Expiration and database synchronization, and providing rich APIs for implementing various caching patterns using backing source providers. Additionally, NCache supports SQL-like queries on the cache and serves as a caching provider for Object-Relational Mappers (ORM) like EF Core.
To get started with NCache in your Microservices-based application, the first thing you need is to configure your services. This provides the information your Microservices need to start using NCache. Here’s an overview of how to establish a shared context between NCache and your Microservices-based application:
```csharp
public IServiceProvider ConfigureServices(IServiceCollection services)
{
    // Add additional code here
    services.AddDbContext<CatalogContext>(options =>
    {
        var cacheID = configuration["CatalogCache"];
        if (string.IsNullOrEmpty(cacheID))
            cacheID = "CatalogCache";

        NCacheConfiguration.Configure(cacheID, DependencyType.Other);

        // Change the default behavior when client evaluation occurs to throw.
        // The default in EF Core is to log a warning when client evaluation is performed.
        options.ConfigureWarnings(warnings =>
            warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
        // Check Client vs. Server evaluation: https://docs.microsoft.com/en-us/ef/core/querying/client-eval
    });

    var container = new ContainerBuilder();
    container.Populate(services);
    return new AutofacServiceProvider(container.Build());
}
```
Next, you need to deploy a controller with logic that retrieves an item from the cache if it is present there. If not, the controller fetches the item from the database and stores it in the cache. A sample implementation of such a controller is shown below:
```csharp
[Route("api/v1/[controller]")]
[ApiController]
public class CatalogController : ControllerBase
{
    private readonly CatalogContext _catalogContext;
    private readonly CatalogSettings _settings;
    private readonly ICatalogIntegrationEventService _catalogIntegrationEventService;

    public CatalogController(CatalogContext context, IOptionsSnapshot<CatalogSettings> settings,
        ICatalogIntegrationEventService catalogIntegrationEventService)
    {
        _catalogContext = context ?? throw new ArgumentNullException(nameof(context));
        _catalogIntegrationEventService = catalogIntegrationEventService
            ?? throw new ArgumentNullException(nameof(catalogIntegrationEventService));
        _settings = settings.Value;
    }

    [HttpGet]
    [Route("items/{id:int}")]
    [ProducesResponseType((int)HttpStatusCode.NotFound)]
    [ProducesResponseType((int)HttpStatusCode.BadRequest)]
    [ProducesResponseType(typeof(CatalogItem), (int)HttpStatusCode.OK)]
    public async Task<ActionResult<CatalogItem>> ItemByIdAsync(int id)
    {
        if (id <= 0)
        {
            return BadRequest();
        }

        CatalogItem item = null;
        var cache = _catalogContext.GetCache();
        string catalogItemKey = "CatalogItem:" + id;

        // Get the item from the cache
        item = cache.Get<CatalogItem>(catalogItemKey);

        if (item == null)
        {
            // On a cache miss, fetch from the database and cache the result
            item = await _catalogContext.CatalogItems.SingleOrDefaultAsync(ci => ci.Id == id);
            if (item != null)
            {
                cache.Insert(catalogItemKey, item);
            }
        }

        // Your logic here
        if (item != null)
            return item;

        return NotFound();
    }
}
```
Let’s explore further key NCache features to understand its capabilities for supporting Microservices.
Keep Cache Fresh, Always
One key consideration when using a cache is the risk of it retaining stale data, i.e., no longer synchronized with the primary datastore. To ensure that the application Microservices always receive updated data from the cache, it’s essential to refresh the cache regularly. Fortunately, NCache offers features like Database Synchronization and Expiration to keep the cached data consistent with the primary datastore.
You can keep the cache synchronized with the datastore by setting an Expiration time for cached items. When an item expires, NCache automatically removes it, ensuring that new cache requests receive up-to-date data. NCache offers both Absolute and Sliding Expiration strategies, enabling you to select the most appropriate option based on how long the data needs to remain updated.
As an example, the following code snippet shows how easily you can add Absolute Expiration to a particular cache item:
```csharp
// Create a new CacheItem from this product
var cacheItem = new CacheItem(product);
var expiration = new Expiration(ExpirationType.Absolute, TimeSpan.FromMinutes(5));
cacheItem.Expiration = expiration;
cache.Insert(key, cacheItem);
```
To use Sliding expiration, all you have to do is change the ExpirationType as demonstrated below:
```csharp
var expiration = new Expiration(ExpirationType.Sliding, TimeSpan.FromMinutes(5));
```
When setting expirations in a cache to maintain data consistency, it’s crucial that the expiration times align with how quickly the data changes in the datastore. If expiration times are too short, data may be removed unnecessarily, leading to costly trips to the database. Conversely, if expiration times are too long, the cache may serve stale data.
Determining the optimal expiration times requires a deep understanding of data change patterns, which is often not practical. When strict cache consistency is required, database synchronization strategies are recommended instead. NCache offers several such strategies to ensure data consistency between the cache and the datastore. With these in place, whenever an item changes in the datastore, the cache removes that item automatically.
The following code snippet demonstrates how you can synchronize NCache with a SQL Server database by adding an NCache SQL Dependency to the cached items.
```csharp
// Create the SQL dependency
string query = "SELECT ProductName, UnitPrice FROM dbo.Products WHERE CategoryID = 'Dairy';";
SqlCacheDependency sqlDependency = new SqlCacheDependency(connectionString, query);

// Get orders that contain products with the given category ID
Order[] orders = FetchOrdersByProductCategoryID("Dairy");

foreach (var order in orders)
{
    // Generate a unique cache key for this order
    string key = $"Order:ProductCategory-Dairy:{order.OrderID}";

    // Create a new CacheItem and add the SQL dependency to it
    CacheItem item = new CacheItem(order);
    item.Dependency = sqlDependency;

    // Add the cache item to the cache with the SQL dependency
    cache.Insert(key, item);
}
```
SQL Query on Cache
NCache enables your Microservices to query indexed cache data through an SQL-like querying mechanism. This feature is useful when the keys of the cached items are not known in advance. It also abstracts the lower-level cache API calls, making your application code easier to understand and maintain. It's particularly useful if you're more comfortable with SQL-like commands.
An example code snippet demonstrating the use of the NCache SQL Query feature is given below:
```csharp
string query = "SELECT * FROM FQN.Product WHERE ProductID > ?";

// Use QueryCommand for query execution
var queryCommand = new QueryCommand(query);

// Provide parameters for the query
queryCommand.Parameters.Add("ProductID", 50000);

// Execute the QueryCommand through ICacheReader
ICacheReader reader = cache.SearchService.ExecuteReader(queryCommand);

// Check if the result set is not empty
if (reader.FieldCount > 0)
{
    while (reader.Read())
    {
        string result = reader.GetValue<string>(1);
        // Perform operations using the retrieved keys
    }
}
else
{
    // Null query result set retrieved
}
```
SQL queries can work with query indexes, NCache distributed data structures, as well as cache tags. Follow the link for more information on how to use the NCache SQL Query feature.
Read-Thru and Write-Thru
Using the NCache Data Source Providers feature, you can set up NCache as the single point of entry to the data access layer from the perspective of the Microservice. When a Microservice requires data, it first accesses the cache. If the data is available, the cache serves it directly. If not, the cache retrieves the data from the datastore using a Read-through handler, caches it, and then delivers it to the Microservice.
Similarly, by utilizing a Write-through handler, a Microservice only has to execute a write operation (Add, Update, Delete) on the cache, and the cache then performs the relevant write operation on the datastore automatically. What's more, you can even force the cache to retrieve data directly from the datastore, irrespective of whether the cache holds a possibly stale version of it. This is critical when the Microservice requires up-to-date information, and it builds on the cache consistency strategies mentioned previously.
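As a sketch of that forced retrieval, assuming the `ReadThruForced` read mode of the NCache read-through options (the `key` variable and `Product` type are illustrative):

```csharp
// Force the cache to bypass any cached copy and reload the item
// from the datastore via the configured read-through provider
var readThruOptions = new ReadThruOptions();
readThruOptions.Mode = ReadMode.ReadThruForced;

// The returned item reflects the current state of the datastore,
// and the cache is refreshed with it as a side effect
Product freshData = cache.Get<Product>(key, readThruOptions);
```

This keeps the data access path identical for normal and forced reads; only the read mode changes.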
Not only does the backing Data Source Provider feature streamline your application code, but when used together with other NCache database synchronization features, it also ensures the cache is automatically refreshed with up-to-date data, keeping it always ready for computation.
The following code snippet will help you start using Read-through in your microservices:
```csharp
// Specify the ReadThruOptions for read-through operations
var readThruOptions = new ReadThruOptions();
readThruOptions.Mode = ReadMode.ReadThru;

// Retrieve the item with read-through enabled
Product data = cache.Get<Product>(key, readThruOptions);
```
Similarly, you can implement Write-through by using the following code snippet:
```csharp
// Enable write-through for the cache item
var writeThruOptions = new WriteThruOptions();
writeThruOptions.Mode = WriteMode.WriteThru;

// Add the item to the cache; the cache then writes it to the
// datastore through the configured write-through provider
cache.Insert(key, cacheItem, writeThruOptions);
```
For more detailed information on how to use these providers, refer to our documentation at Read-through Caching and Write-through Caching.
EF Core Caching
Entity Framework (EF) Core is a powerful Object-Relational Mapper (O/RM) frequently used in enterprise .NET applications. Given its popularity, NCache offers an EF Core caching provider that lets you seamlessly add caching to EF Core code using extension methods such as FromCache. This enables EF Core developers, even those not deeply familiar with NCache APIs, to leverage the full capabilities of NCache.
The following code demonstrates the ease of use of the NCache EF Core caching provider to introduce caching in your existing Microservice application logic.
```csharp
var options = new CachingOptions
{
    // Store the result as a collection in the cache
    StoreAs = StoreAs.Collection
};
options.SetAbsoluteExpiration(DateTime.Now.AddMinutes(_settings.NCacheAbsoluteExpirationTime));

// Get the item from the cache. If not found, fetch it from the database and cache it.
item = await _catalogContext.CatalogItems
    .DeferredSingleOrDefault(ci => ci.Id == id)
    .FromCacheAsync(options);
```
You can find more information about the EF Core Caching Provider API and how it can help your business case at NCache EF Core Provider.
Conclusion
NCache provides essential tools to enhance the performance, scalability, and data consistency of Microservices-based applications. By integrating features like database synchronization, expiration strategies, and EF Core caching, NCache reduces latency and eases database load, ensuring your Microservices operate efficiently. As your application scales, NCache keeps it responsive and reliable, making it a critical component for optimizing Microservices architecture.