Caching
Caching functionality with decorators for declarative use.
Install: @travetto/cache
npm install @travetto/cache
yarn add @travetto/cache
Provides a foundational structure for integrating caching at the method level. It is easy to extend with a variety of providers, and is usable with or without Dependency Injection. The module focuses on common, basic usage patterns.
The cache module requires a model service with Expiry support to provide functionality for reading, writing, and expiring cache entries. You can use any of the existing providers to serve as your Expiry, or you can roll your own.
Install: provider
npm install @travetto/model-{provider}
yarn add @travetto/model-{provider}
Currently, the following packages provide Expiry support:
Decorators
The caching framework provides method decorators that enable simple use cases. To use the caching decorators, the method arguments and return values must be serializable into JSON. Other data types are not currently supported, and require either using the caching services directly or specifying serialization/deserialization routines in the cache config.
Additionally, to use the decorators you will need a CacheService object accessible on the class instance. This can be dependency injected or manually constructed. The decorators detect the field at the time of method execution, which decouples construction of your class from construction of the cache.
@Cache is a decorator that caches all successful results, keyed by a computation based on the method arguments. Given the desire to support remote caches (e.g. redis, memcached), only asynchronous methods are supported.
Code: Using decorators to cache expensive async call
import { MemoryModelService } from '@travetto/model';
import { Cache, CacheService } from '@travetto/cache';

async function request(url: string): Promise<string> {
  let value: string;
  // ... perform the actual fetch and populate `value`
  return value!;
}

export class Worker {

  myCache = new CacheService(
    new MemoryModelService({ namespace: '' })
  );

  @Cache('myCache', '1s')
  async calculateExpensiveResult(expression: string): Promise<string> {
    const value = await request(`https://google.com?q=${expression}`);
    return value;
  }
}
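The example above constructs the CacheService by hand. As noted earlier, the field can also be supplied through Dependency Injection; the decorator only needs the field to be present at the time the method executes. A minimal sketch of that variant, assuming CacheService is resolvable by the DI container (the class and method names here are illustrative):
Code: Injecting the CacheService field
import { Inject, Injectable } from '@travetto/di';
import { Cache, CacheService } from '@travetto/cache';

@Injectable()
export class LookupWorker {

  @Inject()
  myCache: CacheService; // resolved by the container; found by field name at method execution

  @Cache('myCache', '1s')
  async lookup(expression: string): Promise<string> {
    // Hypothetical expensive computation worth caching for a second
    return `computed: ${expression}`;
  }
}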
Cache
The @Cache decorator supports configuration of:
- name: the field name on the current class which points to the desired cache source.
- config: additional/optional config options, on a per-invocation basis (several of these options are shown together in the sketch after this list):
  - keySpace: the key space within the cache. Defaults to the class name plus the method name.
  - key: the function that uses the inputs to determine the cache key. Defaults to JSON.stringify of all params.
  - params: the function used to determine the inputs for computing the cache key. This is an easier place to start when defining which parameters matter for caching. Defaults to all inputs.
  - maxAge: the number of milliseconds a value will be held before the cache entry is considered invalid. By default, values live indefinitely.
  - extendOnAccess: determines whether the cache timeout should be extended on access. This only applies to cache values that have specified a maxAge.
  - serialize: the function to execute before storing a cacheable value. This allows for any custom data modification needed to persist the value properly as a string.
  - reinstate: the function to execute on return of a cached value. This allows for any operations necessary to conform to the expected output (e.g. re-establishing class instances). This should rarely be needed, as method return values should naturally serialize to/from JSON and be usable either way.
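A minimal sketch combining several of these options (the service and option values are illustrative, and the exact serialize/reinstate signatures are assumptions based on the descriptions above):
Code: Combining cache configuration options
import { MemoryModelService } from '@travetto/model';
import { Cache, CacheService } from '@travetto/cache';

export class SearchService {

  myCache = new CacheService(new MemoryModelService({ namespace: '' }));

  @Cache('myCache', '1m', {
    keySpace: 'search.results',                        // override the default class + method key space
    params: (query: string) => [query.toLowerCase()],  // normalize the inputs used to compute the key
    extendOnAccess: true,                              // each read pushes the one-minute expiry forward
    serialize: (results: string[]) => JSON.stringify(results), // assumed: value -> string before storage
    reinstate: (text: string) => JSON.parse(text)               // assumed: string -> value on retrieval
  })
  async search(query: string): Promise<string[]> {
    // Hypothetical expensive operation whose results are worth caching
    return [`result for ${query}`];
  }
}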
EvictCache
Additionally, there is support for planned eviction via the @EvictCache decorator. On successful execution of a method with this decorator, the matching keySpace/key value will be evicted from the cache. This requires coordination between multiple methods, which must use the same keySpace and key to compute the expected cache key.
Code: Using decorators to cache/evict user access
import { MemoryModelService } from '@travetto/model';
import { Cache, EvictCache, CacheService } from '@travetto/cache';

class User { }

export class UserService {

  myCache = new CacheService(new MemoryModelService({ namespace: '' }));

  database: {
    lookupUser(id: string): Promise<User>;
    deleteUser(id: string): Promise<void>;
    updateUser(user: User): Promise<User>;
  };

  @Cache('myCache', '5m', { keySpace: 'user.id' })
  async getUser(id: string): Promise<User> {
    return this.database.lookupUser(id);
  }

  @EvictCache('myCache', { keySpace: 'user.id', params: user => [user.id] })
  async updateUser(user: User): Promise<void> {
    await this.database.updateUser(user);
  }

  @EvictCache('myCache', { keySpace: 'user.id' })
  async deleteUser(userId: string): Promise<void> {
    await this.database.deleteUser(userId);
  }
}
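As a hypothetical illustration of that coordination (the stubbed database and the hit/miss comments are assumptions for demonstration, reusing the UserService and User classes from the example above):
Code: Illustrative cache/evict sequence
export async function demo(): Promise<void> {
  const svc = new UserService();

  // Stub database so the sequence is self-contained
  svc.database = {
    async lookupUser(id: string) { return new User(); },
    async deleteUser(id: string) { },
    async updateUser(user: User) { return user; }
  };

  await svc.getUser('u-1');    // miss: reads the database, stores the result under key space 'user.id'
  await svc.getUser('u-1');    // hit: served from the cache for up to five minutes
  await svc.deleteUser('u-1'); // evicts the entry computed from the same key space and id
  await svc.getUser('u-1');    // miss again: re-reads from the database
}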
Extending the Cache Service
By design, the CacheService relies solely on the Data Modeling Support module, specifically on the Expiry contract. This combines basic CRUD support with knowledge of how to manage expirable content. Any model service that honors these contracts is a valid candidate to power the CacheService. The CacheService expects the model service to be registered using the CacheModelⲐ symbol (@travetto/cache:model):
Code: Registering a Custom Model Source
import { InjectableFactory } from '@travetto/di';
import { MemoryModelService, ModelExpirySupport } from '@travetto/model';
import { CacheModelⲐ } from '@travetto/cache';

class Config {
  @InjectableFactory(CacheModelⲐ)
  static getModel(): ModelExpirySupport {
    return new CustomAwesomeModelService({});
  }
}

class CustomAwesomeModelService extends MemoryModelService {
  // Extend/override the memory model as needed; any ModelExpirySupport implementation will do
}