@alexmchan/memocache
---
title: Usage
description: Learn how to use the Memocache package to cache data in your Node.js applications.
---
This package provides a flexible and extensible caching solution for Node.js applications. It supports multiple storage backends and offers features like TTL (Time-To-Live), automatic background revalidation, and function memoization.
Create a cached version of any function, with automatically typed arguments and return values:

```ts
const cachedFunction = createCachedFunction(anyFunction)
```

Install the package:

```sh
pnpm install @alexmchan/memocache
```
```ts
import { createCache } from '@alexmchan/memocache'
import { createTTLStore } from '@alexmchan/memocache/stores'
import { Time } from '@alexmchan/memocache/time'

const store = createTTLStore({
  defaultTTL: 5 * Time.Minute,
})

const cache = createCache({
  defaultFresh: 30 * Time.Second,
  stores: [store],
})

const { createCachedFunction } = cache

// your fetch function
function fetchSomething(arg) {
  return `Result for ${arg}`
}

// Create a cached version of a function
const cachedFetchSomething = createCachedFunction(fetchSomething)

// Use the cached function
console.log(await cachedFetchSomething('example')) // fetchSomething is called once
console.log(await cachedFetchSomething('example')) // fetchSomething is not called; the cached value is returned
```
Data is read from the stores; if it is not found, the function is called to get the data.
If the data exists in the store but is stale, the stored value is returned immediately and the function is called in the background to fetch fresh data.
If the data is past its time to live, it is expired from the store.
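The freshness decision above can be sketched as follows (an illustrative sketch, not the package's actual code; `fresh` and `ttl` are in milliseconds):

```ts
type CacheDecision = 'miss' | 'fresh' | 'stale-revalidate' | 'expired'

// Decide what to do with a cache entry given when it was stored.
function decide(
  storedAt: number | undefined,
  now: number,
  fresh: number,
  ttl: number,
): CacheDecision {
  if (storedAt === undefined) return 'miss' // not in store: call the function
  const age = now - storedAt
  if (age >= ttl) return 'expired' // past TTL: treat as gone
  if (age < fresh) return 'fresh' // fresh: return the stored value
  return 'stale-revalidate' // stale: return stored value, refresh in background
}
```

With `defaultFresh: 30 * Time.Second` and `defaultTTL: 5 * Time.Minute`, an entry is served as-is for 30 seconds, served stale with a background refresh for the next 4.5 minutes, and recomputed synchronously after that.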
Using a cache typically requires a lot of wrapper code: generating a key, checking the cache, calling the underlying function on a miss, and writing the result back. One may also want to write back to multiple stores, such as an in-memory TTL cache, a local SQLite instance, or Redis. This package provides a simple-to-use cache that supports all of these features, including stale-while-revalidate.
We use the stable stringified hash popularized by react-query to generate the cache key, together with a sha256 hash of the function code. This makes it easy to generate a cache key from the function signature and arguments, and therefore to memoize functions.
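A minimal sketch of this kind of key derivation, assuming a react-query-style stable stringify plus a sha256 of the function body (the package's real implementation may differ in detail):

```ts
import { createHash } from 'node:crypto'

// Stable stringify: sort object keys so { a, b } and { b, a } produce the
// same string, and therefore the same cache key.
function stableStringify(value: unknown): string {
  if (value === null || typeof value !== 'object') return JSON.stringify(value)
  if (Array.isArray(value)) return `[${value.map(stableStringify).join(',')}]`
  const entries = Object.keys(value as object)
    .sort()
    .map(
      (k) =>
        `${JSON.stringify(k)}:${stableStringify((value as Record<string, unknown>)[k])}`,
    )
  return `{${entries.join(',')}}`
}

// Derive a cache key from a hash of the function body plus its arguments.
function cacheKey(fn: (...args: any[]) => unknown, args: unknown[]): string {
  const fnHash = createHash('sha256').update(fn.toString()).digest('hex')
  return `${fnHash}:${stableStringify(args)}`
}
```

Because the function body is part of the key, changing the function automatically invalidates its old entries.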
This also supports the stale-while-revalidate pattern, returning stale data while fresh data is fetched in the background.
```ts
// with memocache
const { createCachedFunction } = createCache({
  stores: [createTTLStore()],
})

// Create a cached version of a function
const cachedFunction = createCachedFunction(async ({ id, name }) => {
  // some expensive operation or fetch
  return `Result for ${id} and ${name}`
})
```
```ts
// The old way, without memocache to help
async function doSomething({ id, name }) {
  // check the cache
  const key = JSON.stringify({ id, name })
  const cachedValue = await cache.get(key)
  if (cachedValue) {
    return cachedValue
  }

  // some expensive operation or fetch
  const result = await doSomethingVeryExpensive(`Result for ${id} and ${name}`)

  // we have to wait here, or we need to find a way to signal to the platform
  // for serverless that the write is still pending
  try {
    await cache.set(key, result, timeToLive)
  } catch (e) {
    //...
  }

  // now repeat for all available caches, and we still need
  // to add stale-while-revalidate
  // and serverless support
  return result
}
```
With this method it's easy to wrap a function and have it read/write from multiple stores.
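A rough sketch of what multi-store read-through could look like, assuming stores are checked in order and written back after a recompute (illustrative only, not the package's code):

```ts
interface SimpleStore {
  get(key: string): Promise<any>
  set(key: string, value: any): Promise<any>
}

// Check each store in order; on a miss everywhere, compute the value
// and write it back to every store.
async function readThrough(
  stores: SimpleStore[],
  key: string,
  compute: () => Promise<any>,
): Promise<any> {
  for (const store of stores) {
    const value = await store.get(key)
    if (value !== undefined) return value
  }
  const value = await compute()
  await Promise.all(stores.map((s) => s.set(key, value)))
  return value
}
```

A fuller version would also promote values found in a slower store back into the faster stores ahead of it.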
createCache(options: CacheOptions)

Creates a new cache instance.

- options.stores: An array of CacheStore instances
- options.defaultTTL: Default Time-To-Live for cache entries
- options.defaultFresh: Revalidate stale data after this time
- options.context: (Optional) A custom context for managing async operations

Returns an object with the following methods:

- createCachedFunction<T>(fn, options): Creates a memoized version of a function
- cacheQuery<T>({ queryFn, queryKey, options }): Executes a cache query
- dispose(): Disposes of the cache and its stores

createCachedFunction(fn, options)

Creates a memoized version of a function.

- fn: The function to memoize
- options (optional): CacheQueryOptions for the memoized function
- options.cachePrefix: A prefix is auto-generated from the function contents for convenience and adds only fractions of a millisecond; for very large functions this might be a concern, so a prefix such as /api/todos can be specified to scope the function to a known value. Note that any change to the function code will also change the cache key unless this value is set.
- options.ttl: Time-To-Live for cache entries in ms
- options.fresh: Revalidate stale data after this time in ms

🟡 Make sure that any identifiers that might be inferred from auth, such as customerId, are passed in as arguments to the function. This keeps the cache key unique per user, so that one user doesn't see another user's data. Alternatively, specify your own cache keys with cacheQuery.
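To see why this matters, here is a minimal hypothetical memoizer keyed on its arguments; if customerId were captured from a closure instead of passed as an argument, every user would share one cache entry:

```ts
// Minimal hypothetical memoizer, keyed on the JSON of its arguments.
function memoize<A extends unknown[], R>(fn: (...args: A) => R): (...args: A) => R {
  const cache = new Map<string, R>()
  return (...args: A) => {
    const key = JSON.stringify(args)
    if (!cache.has(key)) cache.set(key, fn(...args))
    return cache.get(key)!
  }
}

// Good: customerId is an argument, so it becomes part of the cache key
// and each user gets their own entry.
const getOrders = memoize((customerId: string) => `orders for ${customerId}`)
```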
cacheQuery({ queryFn, queryKey, options })

Executes a cache query with a specific set of keys. This resembles the useQuery API. As an added bonus, the react-query eslint plugin will also help validate that external values are included in the queryKey.
```ts
function exampleFunction({ storeId, customerId }) {
  return cacheQuery({
    queryKey: ['/items', { storeId, customerId }],
    queryFn: async () => {
      return fetchItems({ storeId, customerId })
    },
  })
}
```
To invalidate a cache entry, there's a .invalidate()
method on the memoized function that can be called with the same signature as the original function.
```ts
const { createCachedFunction } = createCache({
  stores: [sqliteStore],
})

const myCachedFunction = createCachedFunction(async ({ example }) => {
  return `Result for ${example}`
})

await myCachedFunction({ example: 'example' })
await myCachedFunction.invalidate({ example: 'example' })
```
createTTLStore(options)

Creates an in-memory TTL store based on @isaacs/ttl-cache.

```ts
const store = createTTLStore({
  defaultTTL: 5 * Time.Minute,
})
```
createSqliteStore(options)

Creates a libSql SQLite store.

- options.sqliteClient: An instance of @libsql/client
- options.defaultTTL: Default Time-To-Live for cache entries
- options.cleanupInterval: Interval for cleaning up expired entries

```ts
import { createCache } from '@alexmchan/memocache'
import { createSqliteStore } from '@alexmchan/memocache/stores/sqlite'
import { Time } from '@alexmchan/memocache/time'
import { createClient } from '@libsql/client'

const sqliteClient = createClient({
  url: 'file::memory:', // or file:./cache.db
})

const sqliteStore = createSqliteStore({
  sqliteClient,
  cleanupInterval: 5 * Time.Minute,
  defaultTTL: 10 * Time.Minute,
})

const cache = createCache({
  stores: [sqliteStore],
  defaultFresh: 1 * Time.Minute,
  defaultTTL: 5 * Time.Minute,
})

// Use cache as in the previous example
```
createRedisStore

An ioredis based store.

```sh
pnpm install @alexmchan/memocache-store-redis
```

```ts
import { createRedisStore } from '@alexmchan/memocache-store-redis'
import { Time } from '@alexmchan/memocache/time'
import { Redis } from 'ioredis'

const redisStore = createRedisStore({
  redisClient: new Redis({
    host: 'localhost',
    port: 6379,
  }),
  defaultTTL: 5 * Time.Minute,
})
```
createUpstashRedisStore

An Upstash Redis store (also works with @vercel/kv, since it is a proxy of Upstash).

```ts
import { createUpstashRedisStore } from '@alexmchan/memocache-store-redis'
import { Time } from '@alexmchan/memocache/time'
import { Redis } from '@upstash/redis'

const redisRestStore = createUpstashRedisStore({
  redisClient: new Redis({
    url: process.env.UPSTASH_REDIS_REST_URL,
    token: process.env.UPSTASH_REDIS_REST_TOKEN,
  }),
  defaultTTL: 5 * Time.Minute,
})
```
Middleware wraps the store definition and returns a new store that can be used in the cache. Middleware can be used to add additional functionality to the store, such as logging, metrics, or encryption. See the encryption middleware for an example of how to use middleware.
Attach encryption to any store. This middleware encrypts the value before storing it and decrypts it when retrieving it.
A hash of the key/salt is used to encrypt the value and forms part of the cache key, so changing the key or salt effectively invalidates the cached values.
```ts
const ttlStore = createTTLStore({ defaultTTL: 60 * Time.Second })

const encryptedStore = createEncryptedStore({
  key: 'this is secret sauce',
  salt: 'this is salty',
  store: ttlStore,
})

export const { createCachedFunction, cacheQuery } = createCache({
  stores: [encryptedStore],
})
```
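Internally, this sort of middleware might derive a cipher key from the key/salt pair and encrypt values symmetrically. A sketch under assumed details (AES-256-GCM with an scrypt-derived key; the package's actual cipher and KDF are not documented here):

```ts
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from 'node:crypto'

// Derive a 32-byte key from the configured key and salt (assumed KDF).
const derivedKey = scryptSync('this is secret sauce', 'this is salty', 32)

// Encrypt a value before it goes into the underlying store.
function encryptValue(plaintext: string): string {
  const iv = randomBytes(12)
  const cipher = createCipheriv('aes-256-gcm', derivedKey, iv)
  const encrypted = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()])
  // base64 never contains '.', so it is a safe separator
  return [iv, cipher.getAuthTag(), encrypted].map((b) => b.toString('base64')).join('.')
}

// Decrypt a value read back from the underlying store.
function decryptValue(payload: string): string {
  const [iv, tag, data] = payload.split('.').map((p) => Buffer.from(p, 'base64'))
  const decipher = createDecipheriv('aes-256-gcm', derivedKey, iv)
  decipher.setAuthTag(tag)
  return Buffer.concat([decipher.update(data), decipher.final()]).toString('utf8')
}
```

Changing either the key or the salt changes `derivedKey`, which is why existing cache values become unreadable and are effectively invalidated.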
Attach metrics to any store. This middleware logs the time taken to get, set, and delete values from the store.
```ts
const ttlStore = createTTLStore({ defaultTTL: 60 * Time.Second })

const metricsStore = createMetricsStore({
  store: ttlStore,
})

// output to the logger
// Metric {
//   metric: "cache.read",
//   key: "[\"hello/80c56980e62840587ea4c2f103f23f08e042bd8cea808025219e4e7d1b7c996d\",[{\"message\":\"world\"}]]",
//   hit: true,
//   latency: 1,
// }
```
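A metrics middleware of this shape can be sketched as a wrapper around a store's get() that records hit/miss and latency (a hypothetical helper, not the package's createMetricsStore):

```ts
type Metric = { metric: string; key: string; hit: boolean; latency: number }

// Wrap a store's get() and emit one Metric per read (illustrative sketch;
// a full version would also time set() and delete()).
function withMetrics(
  store: { get(key: string): Promise<any> },
  log: (m: Metric) => void,
) {
  return {
    async get(key: string) {
      const start = Date.now()
      const value = await store.get(key)
      log({
        metric: 'cache.read',
        key,
        hit: value !== undefined,
        latency: Date.now() - start,
      })
      return value
    },
  }
}
```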
For serverless functions, the context object can be used to manage asynchronous operations. The context object has a waitUntil
method that can be used to enqueue asynchronous tasks to be performed during the lifecycle of the request.
The job of the context is to wait on any asynchronous operations that need to complete before the function returns; it is left to the implementer to decide what to do with the context. The context will be provided with the promise(s) that need to be completed.
As described in the Vercel documentation:

> The waitUntil() method enqueues an asynchronous task to be performed during the lifecycle of the request. You can use it for anything that can be done after the response is sent, such as logging, sending analytics, or updating a cache, without blocking the response from being sent. waitUntil() is available in the Node.js and Edge Runtime. Promises passed to waitUntil() will have the same timeout as the function itself. If the function times out, the promises will be cancelled.
>
> To use waitUntil() in your function, import the waitUntil() method from the @vercel/functions package. For more information, see the @vercel/functions reference.
```ts
export interface Context {
  waitUntil: (p: Promise<unknown>) => void
}
```
```ts
import { Context } from './context'
import { waitUntil } from '@vercel/functions'

createCache({
  context: {
    waitUntil,
    [Symbol.asyncDispose]() {
      // cleanup
    },
  },
  // ...
})
```
Vendor specific documentation:
An as-yet-untested implementation of a cache-flushing waitable context:
```ts
/* ----------------------------------------------------
 * This is a simple context, intended only for serverless
 * environments where the list of waitables won't grow
 * indefinitely.
 * ---------------------------------------------------- */
function createSimpleContext() {
  const waitables: Promise<unknown>[] = []
  const context = {
    waitables,
    waitUntil(p: Promise<unknown>) {
      waitables.push(p)
      if (waitables.length > 1000) {
        this.flushCache()
      }
    },
    async flushCache() {
      await Promise.allSettled(waitables)
      waitables.length = 0
    },
    [Symbol.asyncDispose]() {
      return this.flushCache()
    },
  }
  return context
}
```
```ts
async function handler(event, context) {
  await using simpleContext = createSimpleContext()
  await using cache = createCache({
    stores: [store],
    context: simpleContext,
  })
  // do work; ideally return a streaming response, otherwise the user
  // response will wait on flushCache
  // without `using` we have to wait for all promises to finish manually:
  // await simpleContext.flushCache()
}
```
Stores are the underlying data structure that the cache uses to store the data. The cache uses the store to get, set, and delete data. The store can be anything that implements the Store
interface and could be an in-memory store, an SQLite store, a Redis store, etc.
The time to live is resolved in order: the per-function TTL, then the store's default TTL, then the cache's default TTL.
This allows overriding the TTL per function, while different stores can still have different TTLs, so that a store with larger capacity (such as a disk store) can keep entries longer than a memory store.
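Sketched as a lookup, assuming the precedence described above (per-function option first, then store default, then cache default):

```ts
// Resolve the effective TTL for a store write (assumed precedence;
// all values are in milliseconds, undefined means "not configured").
function resolveTTL(
  functionTTL?: number,
  storeDefaultTTL?: number,
  cacheDefaultTTL?: number,
): number | undefined {
  return functionTTL ?? storeDefaultTTL ?? cacheDefaultTTL
}
```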
Time Constants

Constants for time units in milliseconds:

- Time.Millisecond
- Time.Second
- Time.Minute
- Time.Hour
- Time.Day
- Time.Week

Usage: `5 * Time.Minute` or `10 * Time.Second`; this mirrors Go's time durations.
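Assuming the constants are plain millisecond multipliers, they would look like this (a sketch; the actual exported values live in @alexmchan/memocache/time):

```ts
// Time units expressed in milliseconds, in the style of Go's time package.
const Time = {
  Millisecond: 1,
  Second: 1000,
  Minute: 60 * 1000,
  Hour: 60 * 60 * 1000,
  Day: 24 * 60 * 60 * 1000,
  Week: 7 * 24 * 60 * 60 * 1000,
} as const
```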
Or choose any time library that produces millisecond durations:

```ts
import { Duration } from 'effect'

const defaultTTL = Duration.decode('10 minutes').value.millis
```
The store interface is the following:

```ts
export interface CacheStore extends AsyncDisposable {
  /** Set a value in the store, ttl in milliseconds */
  set(key: string, value: any, ttl?: number): Promise<any>
  get(key: string): Promise<any>
  delete(key: string): Promise<unknown>
  /** Remove all values from the store */
  clear?(): Promise<any>
  /** Dispose of any resources or connections when the cache is no longer in use */
  dispose?(): Promise<any>
}
```
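As an illustration, a minimal in-memory store satisfying this shape might look like the following (a sketch only; the package ships real TTL, SQLite, and Redis stores):

```ts
// Minimal in-memory CacheStore sketch with per-entry expiry.
class MapStore {
  private entries = new Map<string, { value: any; expiresAt: number }>()

  async set(key: string, value: any, ttl = 60_000): Promise<void> {
    this.entries.set(key, { value, expiresAt: Date.now() + ttl })
  }

  async get(key: string): Promise<any> {
    const entry = this.entries.get(key)
    if (!entry) return undefined
    if (Date.now() > entry.expiresAt) {
      // lazily expire entries that are past their TTL
      this.entries.delete(key)
      return undefined
    }
    return entry.value
  }

  async delete(key: string): Promise<void> {
    this.entries.delete(key)
  }

  async clear(): Promise<void> {
    this.entries.clear()
  }

  async dispose(): Promise<void> {
    this.entries.clear()
  }
}
```

A real implementation would also provide `[Symbol.asyncDispose]` so it works with `await using`.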
Although it will work, it is better to set up and export the memoized function outside of the function that uses it, to avoid recreating the cache key on every call, since we hash function.toString() to generate a unique cache key. Hashing is normally hardware accelerated and should add only fractions of a millisecond (less than the encryption numbers in the benchmarks, and only when the function is called for the first time).
```ts
// ok, but slower
export function useExampleFn() {
  const exampleFn = () => 'example'
  const memoizedFn = createCachedFunction(exampleFn)
  return memoizedFn()
}

// good, especially with many memoized functions
export const memoizedFn = createCachedFunction(exampleFn)

// good, doesn't require any function hashing
const prefixedMemoizedFn = createCachedFunction(exampleFn, {
  cachePrefix: '/api/todos',
})
```
The created cache's cacheQuery function is an API that allows finer tuning of the cache key. This can be useful if the default cache key is not sufficient, or if additional keys are needed to help invalidate or scope the cache.
This also allows different queries to update the same cache key.
```ts
import { cacheQuery } from 'your/path/to/cache'

function exampleGetItems({ customerId, storeId }) {
  return cacheQuery({
    queryKey: ['/items', { customerId, storeId }],
    queryFn: async () => {
      return fetchItems({ storeId })
    },
  })
}
```
Note that a similar behaviour could be achieved by wrapping the memoized function with the extra keys needed to invalidate the cache.
```ts
const memoizedFn = createCachedFunction(({ storeId, customerId }) =>
  exampleFn({ storeId }),
)
```
To bypass the cache just call the original function directly.
```ts
const exampleOriginalFn = async (parameter) => {
  return `Result for ${parameter}`
}

const memoizedFn = createCachedFunction(exampleOriginalFn)

exampleOriginalFn('example') // bypasses the cache
```
If there are large payloads in the memoized function calls, these are stored as part of the cache key. In that case, some middleware could be used to hash the key payload and store the hash as the key instead.
The encryption middleware hashes the key by default and is an option for larger keys.
The cache and its stores support automatic disposal with the `using` keyword from TypeScript >= 5.2. This is useful for cleaning up resources when the cache is no longer needed, rather than calling dispose manually.
```ts
async function main() {
  // initialize stores and dispose of them when done
  await using cache = createCache({
    stores: [store],
  })

  // we can also manually dispose of all stores if the `using` keyword is not available
  await cache.dispose()
}
```
This was inspired by the APIs and code of @unkey/cache and react-query. Laravel also has a similar caching API.
The primary difference from @unkey/cache is that this package focuses on an even simpler API, so that each called function doesn't need to create its own store, and on per-function cache invalidation and configuration of stale and expiry times.
Run the src/__tests__/benchmark.ts script to get a rough idea of the performance of the different stores on your platform. It performs a read/write of the same key multiple times and outputs the average time taken.
```
TTL store average time: 0.000ms
SQLite in memory store average time: 0.037ms
Memoized encrypted TTL store average time: 0.034ms
Non-memoized encrypted TTL store average time: 0.039ms // mostly the overhead of the encryption
Redis store average time: 0.241ms // Redis on localhost
SQLite disk store average time: 0.531ms // SQLite on disk
Upstash redis from local to remote: 118.711ms // Redis on Upstash over https from local -- hopefully faster from AWS
```