What is cache-manager?
The cache-manager npm package is a flexible caching library for Node.js applications, which supports a variety of storage solutions and provides a uniform API to interact with different caching mechanisms. It allows for easy integration and switching between different cache stores without changing the underlying application code.
What are cache-manager's main functionalities?
Caching and Retrieving Data
This feature allows you to cache data in memory and retrieve it using a key. The 'set' method stores the value, and the 'get' method retrieves it. The 'ttl' option specifies the time-to-live in seconds.
{"const cacheManager = require('cache-manager');
const memoryCache = cacheManager.caching({ store: 'memory', max: 100, ttl: 10/*seconds*/ });
// Now set a value
memoryCache.set('myKey', 'myValue', { ttl: 5 }, (err) => {
if (err) { throw err; }
// Get the value
memoryCache.get('myKey', (error, result) => {
console.log(result);
// >> 'myValue'
});
});
}
Cache Store Agnosticism
Cache-manager supports different stores such as memory, Redis, and more. This feature allows you to switch between different cache stores seamlessly. The example shows how to use Redis as the cache store.
{"const cacheManager = require('cache-manager');
const redisStore = require('cache-manager-redis-store');
const redisCache = cacheManager.caching({ store: redisStore, host: 'localhost', port: 6379, auth_pass: 'XXXX', db: 0, ttl: 600 });
// Listen for redis ready event
redisCache.store.events.on('redisReady', () => {
console.log('Redis is ready');
});
// Listen for redis error event
redisCache.store.events.on('redisError', (error) => {
console.error('Redis error', error);
});
}
Multi-Level Caching
Cache-manager allows for multi-level caching, where you can have a hierarchy of cache stores. Data is first checked in the fastest cache (e.g., memory), and if not found, it falls back to slower caches (e.g., Redis).
{"const cacheManager = require('cache-manager');
const memoryCache = cacheManager.caching({ store: 'memory', max: 100, ttl: 10 });
const redisCache = cacheManager.caching({ store: require('cache-manager-redis-store'), ttl: 600 });
const multiCache = cacheManager.multiCaching([memoryCache, redisCache]);
multiCache.set('foo', 'bar', { ttl: 5 }, (err) => {
if (err) { throw err; }
multiCache.get('foo', (error, result) => {
console.log(result);
// >> 'bar'
});
});
}
Other packages similar to cache-manager
node-cache
node-cache is an in-memory caching package similar to cache-manager's memory store. It offers a simple and fast caching solution but does not support multiple backends or a tiered caching system.
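For comparison, a minimal node-cache sketch (assuming node-cache v5+, where stdTTL is the default time-to-live in seconds):
const NodeCache = require('node-cache');
const myCache = new NodeCache({ stdTTL: 10 }); // default TTL of 10 seconds per entry
myCache.set('myKey', 'myValue');
console.log(myCache.get('myKey'));
// >> 'myValue'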
lru-cache
lru-cache is an in-memory cache that implements the LRU (Least Recently Used) eviction policy. Unlike cache-manager, it is specifically tailored for LRU caching and does not support multiple storage backends.
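A minimal lru-cache sketch (assuming lru-cache v10+, where the class is exported as LRUCache and ttl is given in milliseconds):
const { LRUCache } = require('lru-cache');
const cache = new LRUCache({ max: 100, ttl: 10 * 1000 }); // at most 100 entries, evicted least-recently-used first
cache.set('foo', 'bar');
console.log(cache.get('foo'));
// >> 'bar'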
keyv
keyv is a simple key-value storage with support for multiple backends, including Redis, MongoDB, SQLite, and more. It provides a unified interface across different stores but does not have built-in support for multi-level caching.
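A minimal keyv sketch (assuming the @keyv/redis adapter is installed; the connection string is an illustrative placeholder, and omitting it falls back to in-memory storage):
const Keyv = require('keyv');
const keyv = new Keyv('redis://localhost:6379');
keyv.set('foo', 'bar', 10 * 1000) // third argument is an optional TTL in milliseconds
  .then(() => keyv.get('foo'))
  .then((value) => console.log(value));
// >> 'bar'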
node-cache-manager
Flexible NodeJS cache module
A cache module for nodejs that allows easy wrapping of functions in cache, tiered caches, and a consistent interface.
Features
- Made with Typescript and compatible with ESModules
- Easy way to wrap any function in cache.
- Tiered caches -- data gets stored in each cache and fetched from the highest priority cache(s) first.
- Use any cache you want, as long as it has the same API.
- 100% test coverage via vitest.
Installation
pnpm install cache-manager
Usage Examples
Single Store
import { caching } from 'cache-manager';
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000, // milliseconds
});
const ttl = 5 * 1000;
await memoryCache.set('foo', 'bar', ttl);
console.log(await memoryCache.get('foo')); // >> 'bar'
await memoryCache.del('foo');
console.log(await memoryCache.get('foo')); // >> undefined
const getUser = (id: string) => Promise.resolve({ id: id, name: 'Bob' });
const userId = 123;
const key = 'user_' + userId;
console.log(await memoryCache.wrap(key, () => getUser(userId), ttl));
See unit tests in test/caching.test.ts for more information.
Example setting/getting several keys with mset() and mget()
await memoryCache.store.mset(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl,
);
console.log(await memoryCache.store.mget('foo', 'foo2'));
await memoryCache.store.mdel('foo', 'foo2');
Custom Stores
You can use your own custom store by creating one with the same API as the built-in memory stores.
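For illustration, here is a minimal sketch of a Map-backed custom store. It assumes the v5 Store interface (get, set, del, reset, mset, mget, mdel, keys, ttl) and that caching() accepts a store object directly; check the typings and the built-in memory store for the authoritative shape. TTL handling is omitted for brevity.
import { caching, Store } from 'cache-manager';

const mapStore = (): Store => {
  const data = new Map<string, unknown>();
  return {
    async get<T>(key: string) { return data.get(key) as T | undefined; },
    async set(key: string, value: unknown) { data.set(key, value); },
    async del(key: string) { data.delete(key); },
    async reset() { data.clear(); },
    async mset(pairs: [string, unknown][]) { for (const [key, value] of pairs) data.set(key, value); },
    async mget(...keys: string[]) { return keys.map((key) => data.get(key)); },
    async mdel(...keys: string[]) { for (const key of keys) data.delete(key); },
    async keys() { return [...data.keys()]; },
    async ttl() { return -1; }, // this sketch does not track per-key expiry
  };
};

const customCache = await caching(mapStore());
await customCache.set('foo', 'bar');
console.log(await customCache.get('foo')); // >> 'bar'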
Multi-Store
import { multiCaching } from 'cache-manager';
const multiCache = multiCaching([memoryCache, someOtherCache]);
const userId2 = 456;
const key2 = 'user_' + userId2;
const ttl = 5 * 1000; // milliseconds
await multiCache.set('foo2', 'bar2', ttl);
console.log(await multiCache.get('foo2'));
await multiCache.del('foo2');
await multiCache.mset(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl
);
console.log(await multiCache.mget('foo', 'foo2')); // >> ['bar', 'bar2']
await multiCache.mdel('foo', 'foo2');
See unit tests in test/multi-caching.test.ts for more information.
Refresh cache keys in background
Both the caching and multicaching modules support a mechanism to refresh expiring cache keys in the background when using the wrap function.
This is done by adding a refreshThreshold attribute when creating the caching store.
If refreshThreshold is set, then after a value is retrieved from the cache its remaining TTL is checked. If the remaining TTL is less than refreshThreshold, the value is refreshed asynchronously, following the same rules as a standard fetch. In the meantime, the old value is returned until it expires.
NOTES:
- In case of multicaching, the store that will be checked for refresh is the one where the key will be found first (highest priority).
- If the threshold is low and the worker function is slow, the key may expire and you may encounter a race condition with updating values.
- The background refresh mechanism currently does not support providing multiple keys to the wrap function.
For example, pass the refreshThreshold to caching like this:
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000, // milliseconds
refreshThreshold: 3 * 1000, // milliseconds
});
When a value is then retrieved from the cache with a remaining TTL of less than 3 seconds, it will be refreshed in the background.
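As an illustrative sketch (getUserFromDb and the key name are hypothetical), wrap picks up the store's refreshThreshold on its own:
// Hypothetical slow loader, e.g. a database or HTTP call
const getUserFromDb = async (id: number) => ({ id, name: 'Bob' });

// Cache miss: getUserFromDb runs and its result is cached for 10 seconds
console.log(await memoryCache.wrap('user_123', () => getUserFromDb(123)));

// Cache hit with remaining TTL below refreshThreshold (3 seconds):
// the cached value is returned immediately and refreshed in the background
console.log(await memoryCache.wrap('user_123', () => getUserFromDb(123)));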
Store Engines
Official and updated to the latest version
Third party
Contribute
If you would like to contribute to the project, please fork it and send us a pull request. Please add tests
for any new features or bug fixes.
License
node-cache-manager is licensed under the MIT license.