Cachemere
Modular Caching Library for C++
Cachemere is a modular header-only caching library for C++.
It provides implementations for some state-of-the-art caching algorithms, as well as a comprehensive set of primitives for building custom caching solutions.
Cachemere supports using arbitrary types as keys. The page on Using Custom Keys covers this in greater detail.
Like std::map, cachemere::Cache supports heterogeneous lookup. Take a look at the documentation on heterogeneous lookup for more details and usage instructions.
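As a point of reference, heterogeneous lookup in the standard library works by opting into a transparent comparator, which lets lookups accept key-like types without constructing a full key object. The snippet below only illustrates the std::map side of the analogy; for Cachemere's own usage, consult its heterogeneous lookup documentation.

```cpp
#include <iostream>
#include <map>
#include <string>

int main()
{
    // std::less<> is a transparent comparator: it enables find() overloads
    // that accept any type comparable with the key, so no temporary
    // std::string is constructed for the lookup below.
    std::map<std::string, int, std::less<>> sizes{{"header.bin", 64}, {"payload.bin", 4096}};

    if (auto it = sizes.find("payload.bin"); it != sizes.end()) {
        std::cout << it->first << " -> " << it->second << '\n';
    }

    return 0;
}
```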
For more information, take a look at the full documentation.
At its core, a cache has much in common with a hash map; both are data structures used for associating a key with a value. The main difference between the two structures is that a cache usually introduces an additional set of constraints to prevent it from growing too much in memory. Some implementations restrict the number of items allowed at once in the cache, others restrict the total amount of memory used by the cache, while some do away with the size constraints entirely and instead restrict the amount of time an item can stay in the cache before expiring.
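For illustration only, the sketch below shows the simplest form of such a constraint: a map paired with a recency list, capped at a fixed number of items. This is a generic LRU toy used to make the idea concrete, not Cachemere code.

```cpp
#include <cstddef>
#include <list>
#include <optional>
#include <string>
#include <unordered_map>
#include <utility>

// Toy cache bounded by item count: once the limit is reached,
// the least-recently-used entry is dropped to make room.
class ToyLRU {
public:
    explicit ToyLRU(std::size_t capacity) : m_capacity{capacity} {}

    void insert(const std::string& key, int value)
    {
        if (m_capacity == 0) return;

        if (auto it = m_index.find(key); it != m_index.end()) {
            it->second->second = value;
            m_order.splice(m_order.begin(), m_order, it->second);  // Mark as most recent.
            return;
        }
        if (m_index.size() == m_capacity) {
            m_index.erase(m_order.back().first);  // Evict the least-recently-used item.
            m_order.pop_back();
        }
        m_order.emplace_front(key, value);
        m_index[key] = m_order.begin();
    }

    std::optional<int> get(const std::string& key)
    {
        auto it = m_index.find(key);
        if (it == m_index.end()) return std::nullopt;
        m_order.splice(m_order.begin(), m_order, it->second);  // Touch on access.
        return it->second->second;
    }

private:
    using Entry = std::pair<std::string, int>;

    std::size_t                                                        m_capacity;
    std::list<Entry>                                                   m_order;
    std::unordered_map<std::string, std::list<Entry>::iterator>       m_index;
};
```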
A caching scheme is the set of algorithms used to select which items should be kept in cache over others. Since many different metrics can be prioritized when implementing a cache, there is no one-size-fits-all caching scheme (e.g. a CPU cache usually only needs to prioritize the cache hit ratio, while a web server cache also needs to consider latency on cache miss).
Because no single caching scheme performs well across all of these use cases, many organizations and products fall back on using a simple Least-Recently Used (LRU) cache everywhere. Although LRU is suitable for many use cases, it is far from optimal, and using it without further thought might lead to excessive memory usage or degraded performance.
Cachemere is designed with the explicit goal of providing a single reusable cache that can be customized using different schemes, allowing applications to get the maintainability benefits of a single cache implementation, while at the same time getting all the performance benefits of a purpose-built cache.
Cachemere tackles this goal in a modular fashion with the concept of policies. A cachemere::Cache is parameterized by a Constraint Policy, an Insertion Policy, and an Eviction Policy.
Broadly speaking, the job of the Constraint Policy is to determine whether an item can be in the cache (e.g. whether there is enough free space), the job of the Insertion Policy is to determine whether an item should be inserted or kept in the cache, and the job of the Eviction Policy is to determine which item(s) should be evicted from the cache. The cache queries its policies whenever it has to decide whether an item should be admitted, kept, or evicted.
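As a rough illustration of how these three roles divide the work, the sketch below pairs each policy with the single question a cache might ask it. The class and method names are illustrative placeholders, not Cachemere's actual policy interface; refer to the documentation for the real concepts.

```cpp
#include <cstddef>
#include <deque>
#include <string>

// Hypothetical constraint policy: "is there room for one more item?"
struct CountConstraint {
    std::size_t maximum_count = 1000;
    bool        can_add(std::size_t current_count) const { return current_count < maximum_count; }
};

// Hypothetical insertion policy: "is this item worth caching at all?"
struct AlwaysInsert {
    bool should_add(const std::string& /*key*/) const { return true; }
};

// Hypothetical eviction policy: "if space is needed, who goes first?"
// Assumes the cache only asks for a victim while it holds items.
struct FIFOEviction {
    std::deque<std::string> order;

    void on_insert(const std::string& key) { order.push_back(key); }

    std::string victim()
    {
        std::string key = order.front();
        order.pop_front();
        return key;
    }
};

// Sketch of the decision flow a cache could follow when inserting a key:
//   1. Ask the insertion policy whether the key is worth caching.
//   2. Ask the constraint policy whether the cache can hold it.
//   3. If not, ask the eviction policy for victims until the constraint is satisfied.
```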
This modular design allows for bi-directional code reuse and customization: the core cache implementation can be used with different policies, and the policies can also be used with different cache implementations.
For instance, instead of using a more common policy like Least-Recently Used (LRU), a product might need a custom cache that takes both the frequency of access and the size of the item into consideration when establishing which item to evict. To do this, one could implement a SizeBasedEvictionPolicy and use it with the existing cachemere::Cache.
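To make the example concrete, a policy along those lines could rank candidates by a frequency-to-size score and evict large, rarely-accessed items first. The skeleton below is a hypothetical sketch of such a SizeBasedEvictionPolicy; it does not follow Cachemere's actual policy interface, which is documented separately.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <limits>
#include <string>
#include <unordered_map>

// Hypothetical eviction policy that prefers to evict items with a low
// access frequency relative to their size in bytes.
class SizeBasedEvictionPolicy {
public:
    void on_insert(const std::string& key, std::size_t size_in_bytes) { m_stats[key] = Stats{size_in_bytes, 0}; }

    void on_cache_hit(const std::string& key) { ++m_stats[key].access_count; }

    void on_evict(const std::string& key) { m_stats.erase(key); }

    // Pick the key with the worst frequency-to-size ratio as the next victim.
    // Returns an empty string if the policy is not tracking any items.
    std::string victim() const
    {
        std::string worst_key;
        double      worst_score = std::numeric_limits<double>::max();

        for (const auto& [key, stats] : m_stats) {
            const auto   size  = std::max<std::size_t>(stats.size_in_bytes, 1);
            const double score = static_cast<double>(stats.access_count + 1) / static_cast<double>(size);
            if (score < worst_score) {
                worst_score = score;
                worst_key   = key;
            }
        }
        return worst_key;
    }

private:
    struct Stats {
        std::size_t   size_in_bytes = 0;
        std::uint64_t access_count  = 0;
    };

    std::unordered_map<std::string, Stats> m_stats;
};
```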