@rlvt/cache-cache: cache all the things
You are right, there are already a few packages that do that:
So why bother rewriting one from scratch? Here are the reasons:
Our objectives when writing this package were simple:
By default the package is configured to use the MEMORY cache layer with a TTL of 15s; you can of course change those defaults.
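For instance, a minimal sketch that just relies on those defaults, assuming the configuration argument of the getMemoize helper (documented below) can simply be omitted:

import { getMemoize } from '@rlvt/cache-cache'

const loadSettings = async () => {
  // do something expensive
  return { featureFlag: true }
}

// no configuration given: in-memory layer with the default 15s TTL (assumed)
const cachedLoadSettings = getMemoize(loadSettings)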
Each method accepts a whole configuration object; if you want a common one, you can use useAsDefault:
import { getMemoize, useAsDefault, AvailableCacheLayer } from '@rlvt/cache-cache'

// these defaults will be applied to every cache created afterwards
useAsDefault({
  layerConfigs: {
    [AvailableCacheLayer.MEMORY]: {
      ttl: 10 * 1000
    },
    [AvailableCacheLayer.REDIS]: {
      ttl: 30 * 60 * 1000
    }
  },
  layerOrder: [
    AvailableCacheLayer.MEMORY,
    AvailableCacheLayer.REDIS
  ]
})

const expensiveFunction = async () => {
  // do something expensive
  return {}
}

// and you can override them per method if you want
const noMoreExpensive = getMemoize(expensiveFunction, {
  layerConfigs: {
    [AvailableCacheLayer.MEMORY]: {
      ttl: 10 * 1000
    }
  },
  layerOrder: [
    AvailableCacheLayer.MEMORY
  ]
})
Things to know when memoizing a function call:
NOTE: You can only memoize an async function, since calls to the cache can be asynchronous.
WARNING: If your function returns undefined, it will not be cached; prefer returning null.
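For instance, when a lookup can miss, return null so the miss itself is cached. A sketch with a hypothetical in-memory data source, assuming the wrapper keys on the function's arguments:

import { getMemoize, AvailableCacheLayer } from '@rlvt/cache-cache'

// hypothetical data source, for illustration only
const users = new Map<string, { name: string }>([['1', { name: 'Ada' }]])

const findUser = async (id: string) => {
  // returning undefined would skip the cache entirely; null is stored like any other value
  return users.get(id) ?? null
}

const cachedFindUser = getMemoize(findUser, {
  layerConfigs: { [AvailableCacheLayer.MEMORY]: { ttl: 10 * 1000 } },
  layerOrder: [AvailableCacheLayer.MEMORY]
})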
You currently have two APIs to memoize a function:
The @Memoize decorator:

import { Memoize, AvailableCacheLayer } from '@rlvt/cache-cache'

class ExpensiveService {
  // decorators can only be applied to class members, so the memoized
  // function lives on a class here
  @Memoize({
    layerConfigs: {
      [AvailableCacheLayer.MEMORY]: {
        ttl: 10 * 1000
      }
    },
    layerOrder: [
      AvailableCacheLayer.MEMORY
    ]
  })
  async expensiveFunction () {
    // do something expensive
    return {}
  }
}

// calling expensiveFunction will now automatically use the memoized version
The getMemoize API:

import { getMemoize, AvailableCacheLayer } from '@rlvt/cache-cache'

const expensiveFunction = async () => {
  // do something expensive
  return {}
}

// pass a configuration to override the defaults for this function
const noMoreExpensive = getMemoize(expensiveFunction, {
  layerConfigs: {
    [AvailableCacheLayer.MEMORY]: {
      ttl: 10 * 1000
    }
  },
  layerOrder: [
    AvailableCacheLayer.MEMORY
  ]
})
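Calling the memoized wrapper then works like calling the original function (a sketch, assuming the wrapper keeps the original signature):

// first call runs expensiveFunction and stores the result in the memory layer
const first = await noMoreExpensive()
// a second call within the 10s TTL is served from the cache instead
const second = await noMoreExpensive()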
You can use the getStore API to get a cache that you can use with simple get/set functions:
import { getStore, AvailableCacheLayer } from '@rlvt/cache-cache'

const store = getStore({
  layerConfigs: {
    [AvailableCacheLayer.MEMORY]: {
      ttl: 10 * 1000
    }
  },
  layerOrder: [
    AvailableCacheLayer.MEMORY
  ]
})

await store.set('key', { value: 1 })
const obj = await store.get('key')
// obj.value === 1
await store.clear('key')
const cleared = await store.get('key')
// cleared === undefined
Since the data stored inside the cache isn't typed, by default you only get a raw object type when fetching from the cache. Our APIs accept a generic argument that is the type of the memoized function:
import { Memoize } from '@rlvt/cache-cache'

type Person = {
  age: number
  name: string
}

class PersonService {
  // the generic argument is the type of the memoized function
  @Memoize<() => Promise<Person>>()
  async expensiveFunction (): Promise<Person> {
    // do something expensive (placeholder values, for illustration)
    return { age: 42, name: 'Jane' }
  }
}
Here is the full configuration accepted by each API:

enum AvailableCacheLayer {
  MEMORY = 'MEMORY',
  REDIS = 'REDIS'
}

type Config = {
  /**
   * Configuration for each cache layer
   */
  layerConfigs: {
    [AvailableCacheLayer.REDIS]?: {
      /**
       * Default Time-To-Live for all keys on this layer
       */
      ttl: number,
      /**
       * When applying a custom TTL to a specific key you may want to increase the TTL
       * for a given layer; this multiplier will be applied to compute the
       * final TTL for the layer.
       */
      ttlMultiplier?: number
      /**
       * How much time to wait for the cache to respond when fetching a key.
       * If the timeout is reached, the next layer will be called (or, if no
       * layer is left, the original function will be called).
       */
      timeout?: number
      /**
       * Set shallowErrors to true if you want to ignore all cache issues
       * (like failing to connect to a redis server) and fall back to undefined
       */
      shallowErrors?: boolean
      /**
       * Provide your own ioredis client
       */
      redisClient: IORedis.Redis,
      /**
       * A custom prefix used to differentiate keys.
       * By default memoized functions use their name as prefix
       */
      prefix?: string,
      /**
       * Enable or not the hashmap mode. When enabled, cache-cache will use
       * redis' hashmaps (hget/hset) with namespace+prefix as key and hash as field.
       * NOTE: The expiration is set on the hashmap
       * (defaults to false)
       */
      hashmap?: boolean
    }
    [AvailableCacheLayer.MEMORY]?: {
      /**
       * Default Time-To-Live for all keys on this layer
       */
      ttl: number,
      /**
       * When applying a custom TTL to a specific key you may want to increase the TTL
       * for a given layer; this multiplier will be applied to compute the
       * final TTL for the layer.
       */
      ttlMultiplier?: number
      /**
       * How much time to wait for the cache to respond when fetching a key.
       * If the timeout is reached, the next layer will be called (or, if no
       * layer is left, the original function will be called).
       */
      timeout?: number
      /**
       * Set shallowErrors to true if you want to ignore all cache issues
       * (like failing to connect to a redis server) and fall back to undefined
       */
      shallowErrors?: boolean
      /**
       * Maximum number of keys to store inside the in-memory cache; if it's reached,
       * the cache implements the `least-recently-used` algorithm, see
       * https://github.com/isaacs/node-lru-cache
       *
       * defaults to 100000 keys
       */
      maxEntries?: number
    }
  }
  /**
   * Order in the array is the order in which each layer will be called when getting a value
   */
  layerOrder: AvailableCacheLayer[]
}
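To tie these options together, here is a configuration sketch exercising most of the fields above; the redis connection string, prefix, and TTL values are placeholders:

import IORedis from 'ioredis'
import { getStore, AvailableCacheLayer } from '@rlvt/cache-cache'

// placeholder connection; point it at your own redis instance
const redisClient = new IORedis('redis://localhost:6379')

const store = getStore({
  layerConfigs: {
    [AvailableCacheLayer.MEMORY]: {
      ttl: 15 * 1000,        // keep entries in memory for 15s
      maxEntries: 10000,     // evict least-recently-used keys past this size
      timeout: 50,           // wait at most 50ms before falling through to redis
      shallowErrors: true    // ignore memory-layer errors and fall back
    },
    [AvailableCacheLayer.REDIS]: {
      ttl: 30 * 60 * 1000,   // keep entries in redis for 30min
      ttlMultiplier: 2,      // custom per-key TTLs are doubled on this layer
      redisClient,
      prefix: 'users',       // differentiate these keys from other stores
      shallowErrors: true    // ignore redis errors (e.g. connection failures)
    }
  },
  // memory is consulted first, then redis
  layerOrder: [
    AvailableCacheLayer.MEMORY,
    AvailableCacheLayer.REDIS
  ]
})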
License: Apache-2.0