distributed cache
A distributed cache is a data cache configured to span multiple servers, storing frequently requested data so it can be retrieved quickly.
Distributed caches are used on web servers and application servers to provide non-local storage, which improves redundancy, serves multiple regions, and allows both storage capacity and transaction throughput to be extended.
The data stored in a distributed cache is generally determined by what is most frequently accessed from a given web or application server. As previously requested entries go unrequested, more recently requested data takes precedence, and stale entries are eventually evicted from the cache.
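The eviction behavior described above is essentially a least-recently-used (LRU) policy. A minimal single-node sketch of that policy (the `LRUCache` class name and capacity here are illustrative, not part of any particular product):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: recently accessed keys survive,
    while untouched keys are evicted once capacity is reached."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._items = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None
        self._items.move_to_end(key)  # mark as most recently used
        return self._items[key]

    def put(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # touch "a" so it becomes most recently used
cache.put("c", 3)      # capacity exceeded: "b" is evicted
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

A production distributed cache applies the same idea per node, typically with configurable eviction policies rather than a fixed LRU.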
One of the most common uses of distributed caches is storing users' web session data. Data caches can be configured in many different ways. For example, servers may be spread geographically to better serve global customers. Servers may hold duplicated data for failover, or only unduplicated content to better cater to different requesting groups. A distributed cache can also combine those approaches.
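One way such a cache can decide which servers hold a given user's session, while also keeping a duplicate for failover, is hash-based placement: the session key deterministically picks a primary node and a backup. The node names and replication factor below are illustrative assumptions, not a specific product's scheme:

```python
import hashlib

NODES = ["cache-us-east", "cache-eu-west", "cache-ap-south"]  # hypothetical servers
REPLICAS = 2  # keep each session on two nodes for failover

def nodes_for_key(key, nodes=NODES, replicas=REPLICAS):
    """Map a key to `replicas` distinct nodes: the hash picks a primary,
    and the following nodes in the ring hold duplicates."""
    start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(replicas)]

# Any front-end server computes the same placement for the same session ID,
# so session data can be found without a central directory.
print(nodes_for_key("session:user-42"))
```

Real deployments typically use consistent hashing so that adding or removing a node remaps only a small fraction of keys; this sketch keeps only the deterministic-placement idea.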
Distributed caching has become increasingly viable as the cost of memory per gigabyte has fallen and as 10-gigabit network interfaces have become inexpensive and commonplace.