Before we jump into the Client Cache, let's zoom out a bit and understand the basics of caching. In computing, a cache is high-speed storage used as an auxiliary store to avoid expensive data trips, and it works well for applications that run on a single server node. But in a load-balanced distributed system, requests for the same data are handled by multiple application nodes, so a standalone cache on each node quickly becomes inconsistent and loses much of its benefit.
To address this, an in-memory distributed cache is introduced for applications that handle extreme transaction loads per second and require high availability and linear scalability.
Developers can choose from many popular distributed caching options on the market, such as NCache. NCache is an extremely fast and linearly scalable in-memory distributed cache that caches application data to reduce expensive database trips and improve response time. Yet there is still room for a further performance boost, made possible by adding a cache on top of the cache: a Client Cache.
How does Client Cache work in NCache?
In most cases, a distributed cache is hosted on a set of dedicated cache servers across the network, so your application has to make network trips to fetch data, which is not as fast as accessing data locally from within the application process.
To handle this problem, NCache provides the Client Cache: a local store that resides near (or even inside) the application process so that data can be fetched more quickly. In OutProc mode, multiple applications running on the same client machine can share data through a common Client Cache. In InProc mode, the Client Cache brings the hot data set right inside the application's process, which gives the biggest performance boost.
Using the Client Cache in NCache is quite simple. No code changes are required at the application end; it is a simple configuration option. You can create a Client Cache through the NCache Web Manager or the NCache-supported PowerShell cmdlets. Once the Client Cache has been configured, client applications automatically start using it.
Let's understand the working of the Client Cache (also called the first-level or L1 Cache) with the example of an e-commerce application. The application frequently accesses the product catalog and the data of currently active users. Such data can be kept in the Client Cache running on the client box (where the application resides), so that a network trip to the clustered cache (also called the second-level or L2 Cache) is avoided and the data is served from the L1 Cache much faster.
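For illustration, here is a minimal sketch of what such a product lookup might look like with the NCache .NET client. It assumes the NCache 5.x client API (Alachisoft.NCache.Client); the cache name, key format, Product class, and database helper are placeholders, and exact overloads may differ in your NCache version. The point is that this code stays the same whether or not a Client Cache is configured for the cache.

```csharp
using System;
using Alachisoft.NCache.Client;

[Serializable]   // may be required so the object can be stored in an OutProc/remote cache
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class CatalogLookup
{
    // The application always talks to the cache by name; if a Client Cache is
    // configured for "demoCache", the same calls below are transparently served
    // from the local L1 Cache whenever the item is present there.
    private static readonly ICache Cache = CacheManager.GetCache("demoCache");

    public static Product GetProduct(int productId)
    {
        string key = $"product:{productId}";
        Product product = Cache.Get<Product>(key);

        if (product == null)
        {
            product = LoadFromDatabase(productId);        // hypothetical data-access call
            Cache.Insert(key, new CacheItem(product));    // writes go to the clustered cache
        }
        return product;
    }

    private static Product LoadFromDatabase(int productId) =>
        new Product { Id = productId, Name = "Sample product", Price = 9.99m };
}
```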
Now you must be wondering how data is synchronized between the two caches so that the application is always served up-to-date data from the L1 Cache.
Data synchronization between the L1 and L2 Cache
To make sure that the application always gets up-to-date data, two background threads operate in the Client Cache, keeping it synchronized with the clustered cache without sacrificing performance or scalability. The two threads are:
Notification-based thread
Whenever an item is added to the L1 Cache, a data change notification for that item is registered with the L2 Cache instantly. The L2 Cache therefore keeps track of which items each L1 Cache holds and monitors them for updates and removals. When such an item is modified in the L2 Cache, the L1 Cache receives a change notification and, in response, synchronizes itself with the L2 Cache.
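Conceptually, the notification-based mechanism behaves something like the sketch below. This is not NCache's internal code; the IL2Cache interface, the event, and the method names are purely illustrative.

```csharp
// Conceptual sketch only: an L1 cache that registers interest in the keys it
// holds and reacts to change notifications raised by the L2 (clustered) cache.
using System;
using System.Collections.Concurrent;

public enum ChangeType { Updated, Removed }

public interface IL2Cache
{
    event Action<string, ChangeType> ItemChanged;
    void RegisterInterest(string key);
    object Get(string key);
}

public class NotificationSyncedL1Cache
{
    private readonly ConcurrentDictionary<string, object> _items = new();
    private readonly IL2Cache _l2;

    public NotificationSyncedL1Cache(IL2Cache l2)
    {
        _l2 = l2;
        _l2.ItemChanged += OnItemChanged;   // L2 pushes change notifications
    }

    public void Add(string key, object value)
    {
        _items[key] = value;
        _l2.RegisterInterest(key);          // tell L2 which items this L1 holds
    }

    private void OnItemChanged(string key, ChangeType change)
    {
        if (!_items.ContainsKey(key)) return;

        if (change == ChangeType.Removed)
            _items.TryRemove(key, out _);
        else
            _items[key] = _l2.Get(key);     // re-sync the updated item from L2
    }
}
```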
Polling-based thread
This is a fallback mechanism that gets triggered only when communication between the L1 and L2 Cache is interrupted due to a network glitch or connection loss. In this case, the L1 Cache waits 10 seconds and then polls the L2 Cache for any data changes. Upon receiving the changes, it synchronizes itself with the L2 Cache.
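The polling fallback can be pictured roughly like the sketch below. Again, this is conceptual, not NCache's implementation; the 10-second interval is the only detail taken from the description above, and the delegates standing in for L2 calls are hypothetical.

```csharp
// Conceptual sketch only: a fallback poller that asks the L2 cache every 10
// seconds for the keys that changed since the last poll and refreshes them in L1.
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static class PollingFallback
{
    private static readonly TimeSpan PollInterval = TimeSpan.FromSeconds(10);

    public static async Task RunAsync(
        IDictionary<string, object> l1Items,
        Func<DateTime, IEnumerable<string>> getChangedKeysFromL2,   // hypothetical L2 call
        Func<string, object> getFromL2,                             // hypothetical L2 call
        CancellationToken token)
    {
        DateTime lastPoll = DateTime.UtcNow;
        while (!token.IsCancellationRequested)
        {
            try { await Task.Delay(PollInterval, token); }
            catch (TaskCanceledException) { break; }

            // Ask L2 which items changed since the last poll ...
            foreach (string key in getChangedKeysFromL2(lastPoll))
            {
                // ... and refresh only the items this L1 cache actually holds.
                if (l1Items.ContainsKey(key))
                    l1Items[key] = getFromL2(key);
            }
            lastPoll = DateTime.UtcNow;
        }
    }
}
```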
To fully reap the benefits of Client Cache, you can use it in either of the two process-level isolation modes available.
Isolation modes in the Client Cache
The Client Cache runs on the client node where your applications are running. Depending on your performance needs and application architecture, you can choose one of the two process-level isolation modes it supports: InProc and OutProc.
InProc mode:
In InProc mode, Client Cache runs inside the application process, eliminating inter-process communication. InProc mode provides maximum performance to the application as the data remains in object form, reducing the cost of serialization and deserialization.
In InProc mode, data is not shared between application instances; each instance of the application hosts its own dedicated Client Cache, which keeps every lookup in-process and fast.
OutProc mode:
In OutProc mode, the Client Cache runs in its own dedicated process on the client node, and applications communicate with it over TCP sockets. Its major advantage is data sharing: multiple application instances connect to the same Client Cache, so data loaded or updated by one application becomes available to the others.
Comparing InProc and OutProc modes:
The two modes can be compared on several factors: data availability, resource consumption, and speed.
Data availability: In OutProc mode, the Client Cache runs outside the application, so restarting the application does not wipe out the locally cached data. In InProc mode, the cached data lives inside the application process and is lost whenever that process restarts.
Resource consumption: InProc mode provides maximum performance when resources like memory are not constrained, since each Client Cache instance holds its own copy of the data. OutProc mode requires fewer physical resources than InProc mode because a single Client Cache process serves all application instances on the machine.
Speed: InProc mode is the fastest because the cached data behaves like objects on your application's heap and is kept in deserialized form, saving the serialization/deserialization cost that any OutProc or remote cache access incurs.
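The speed difference comes down to what each access path has to do. The sketch below is purely conceptual: an InProc-style read is a plain heap lookup, while an OutProc-style read must serialize the value across a process boundary (the TCP hop is omitted, and JSON merely stands in for whatever wire format is actually used).

```csharp
// Conceptual sketch of why InProc access is cheaper than OutProc access.
using System.Collections.Generic;
using System.Text.Json;

public class CatalogItem { public int Id { get; set; } public string Name { get; set; } }

public static class AccessCostDemo
{
    private static readonly Dictionary<string, CatalogItem> InProcStore = new()
    {
        ["product:1"] = new CatalogItem { Id = 1, Name = "Laptop" }
    };

    // InProc-style access: a plain heap lookup, no serialization, no socket.
    public static CatalogItem GetInProc(string key) => InProcStore[key];

    // OutProc-style access: the value crosses a process boundary, so it is
    // serialized on one side and deserialized on the other (TCP hop omitted).
    public static CatalogItem GetOutProc(string key)
    {
        string wireFormat = JsonSerializer.Serialize(InProcStore[key]);
        return JsonSerializer.Deserialize<CatalogItem>(wireFormat);
    }
}
```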
The operational flow of Client Cache
The Client Cache sits closest to the application layer and holds a subset of the clustered cache's data, so cache operations are split between the two levels as follows:
All key-based read operations are performed directly on the L1 Cache. Key-based write operations such as Add, Insert, and Remove are first performed on the L2 Cache and then applied to the L1 Cache before control returns to the application.
On a read miss, the data fetched from the L2 Cache is placed in the L1 Cache before being returned to the application, so the next time the same data is requested it is served directly from the L1 Cache.
Non-key-based read and write operations are performed only on the L2 Cache, because the L1 Cache holds only a subset of the L2 Cache's data.
Other Client Cache instances are then brought up to date through the background data synchronization mechanisms described earlier.
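Putting the flow together, a Client Cache can be pictured as a thin layer over the clustered cache, roughly like the sketch below. This is a conceptual model only, not NCache's implementation; the IClusteredCache interface is a hypothetical stand-in for the L2 Cache.

```csharp
// Conceptual sketch of the operational flow: key-based reads hit L1 first,
// read misses populate L1 from L2, and key-based writes go to L2 first.
using System.Collections.Concurrent;

public interface IClusteredCache
{
    object Get(string key);
    void Insert(string key, object value);
    void Remove(string key);
}

public class L1OverL2Cache
{
    private readonly ConcurrentDictionary<string, object> _l1 = new();
    private readonly IClusteredCache _l2;   // hypothetical handle to the clustered (L2) cache

    public L1OverL2Cache(IClusteredCache l2) => _l2 = l2;

    public object Get(string key)
    {
        // Key-based read: served from L1 when present.
        if (_l1.TryGetValue(key, out object value))
            return value;

        // Read miss: fetch from L2 and place in L1 before returning,
        // so the next request for this key is local.
        value = _l2.Get(key);
        if (value != null)
            _l1[key] = value;
        return value;
    }

    public void Insert(string key, object value)
    {
        // Key-based write: L2 first (the source of truth), then L1.
        _l2.Insert(key, value);
        _l1[key] = value;
    }

    public void Remove(string key)
    {
        _l2.Remove(key);
        _l1.TryRemove(key, out _);
    }
}
```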
Synchronization modes in Client Cache:
While being local to your application, a Client Cache is not stand-alone. Instead, it is always synchronized with the clustered cache. This ensures that data in the Client Cache is never stale.
The Client Cache holds a copy of clustered cache data, and the application layer communicates directly with the Client Cache, so any change that occurs in the clustered cache must be synced to the Client Cache. NCache provides two modes for this synchronization: optimistic and pessimistic.
Optimistic mode is the default mode for data synchronization in NCache. Synchronization takes place in the background, and when the application requests data, it is returned straight from the Client Cache without first checking the clustered cache.
Applications that are more sensitive to stale data use pessimistic mode. In this mode, when the application requests data, the L1 Cache first verifies the item's version with the L2 Cache and returns the up-to-date data.
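The difference between the two modes can be pictured with the following sketch. It is conceptual only; item versions, the IVersionedL2Cache interface, and the method names are illustrative stand-ins, not NCache's actual API.

```csharp
// Conceptual sketch: optimistic reads trust L1, pessimistic reads verify the
// item's version against L2 before returning it.
using System.Collections.Concurrent;

public record CachedEntry(object Value, long Version);

public interface IVersionedL2Cache
{
    long GetVersion(string key);
    CachedEntry GetWithVersion(string key);
}

public class SyncModeReader
{
    private readonly ConcurrentDictionary<string, CachedEntry> _l1 = new();
    private readonly IVersionedL2Cache _l2;

    public SyncModeReader(IVersionedL2Cache l2) => _l2 = l2;

    // Optimistic: trust L1; background synchronization keeps it fresh.
    public object GetOptimistic(string key) =>
        _l1.TryGetValue(key, out CachedEntry entry) ? entry.Value : FetchIntoL1(key);

    // Pessimistic: check the item's version against L2 on every read and
    // re-fetch if the local copy is stale.
    public object GetPessimistic(string key)
    {
        if (_l1.TryGetValue(key, out CachedEntry entry) &&
            entry.Version == _l2.GetVersion(key))
            return entry.Value;

        return FetchIntoL1(key);
    }

    private object FetchIntoL1(string key)
    {
        CachedEntry latest = _l2.GetWithVersion(key);
        if (latest != null) _l1[key] = latest;
        return latest?.Value;
    }
}
```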
Conclusion
NCache allows you to take advantage of a Client Cache on top of your distributed cache. The best part is that no programming is required at the application end: it is a simple configuration setting, and the Client Cache is automatically plugged in. A Client Cache can boost performance many times over for applications that perform far more reads than writes. So, download a fully working 60-day trial of NCache Enterprise and try it out for yourself.