async-deco
async-deco is a collection of decorators for asynchronous functions (functions returning a promise). It lets you add features such as timeout, retry, dedupe, limit and much more. The decorators can be combined using the included "compose" function.
Each decorator is described in its own section below.
Every module is available in two ECMAScript editions: ES5 and ES2015 (native).
The individual modules can be required either with named imports or by importing the specific submodule you need. Named imports pull in the entire library, so use them only when bundle size is not a concern (Node.js) or when using the ES2015+ module versions in combination with webpack 3+ or rollup.
Here are some examples:
// es5 is default
const log = require('async-deco').log;
import { log } from 'async-deco';
// es5
const log = require('async-deco/es5/log');
import { log } from 'async-deco/es5';
// es2015
const log = require('async-deco/es2015/log');
import { log } from 'async-deco/es2015';
Note: file names are all lowercase and dash-separated. Module names are camelCase.
All decorators are designed to work on both node.js and browsers.
"decorator" is a pattern where you run a function on an object (in this case a function) to extend or change its behaviour. For example:
// decorator
const addHello = (func) =>
(name) => `hello ${func(name)}!`;
// function to extend
const echo = (name) => name;
const helloEcho = addHello(echo);
Where addHello is a decorator that enhances the "echo" function.
Let's see another example:
const memoize = (func) => {
const previousResults = new Map();
return (n) => {
if (previousResults.has(n)) {
return previousResults.get(n);
}
const result = func(n);
previousResults.set(n, result);
return result
}
}
The memoize decorator stores previous results of an expensive function call and can make a function much faster.
const fastFunction = memoize(slowFunction);
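To make this concrete, here is the memoize decorator from above applied to a counted stand-in for an expensive function (slowSquare and the call counter are invented for the example):

```javascript
// the memoize decorator from above
const memoize = (func) => {
  const previousResults = new Map();
  return (n) => {
    if (previousResults.has(n)) {
      return previousResults.get(n);
    }
    const result = func(n);
    previousResults.set(n, result);
    return result;
  };
};

// a stand-in for an expensive computation, counting how often it runs
let calls = 0;
const slowSquare = (n) => {
  calls += 1;
  return n * n;
};

const fastSquare = memoize(slowSquare);
fastSquare(4); // computes: calls === 1
fastSquare(4); // served from the cache: calls is still 1
```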
The decorator pattern lets you extract a feature into a reusable function.
This library aims to provide a set of decorators that add useful features to asynchronous functions. For example, this decorator waits a second before executing the decorated function:
const delay1sec = (func) => {
return (...args) =>
new Promise((resolve, reject) => {
setTimeout(() => {
func(...args)
.then(resolve)
.catch(reject);
}, 1000);
})
}
const delayedMyFunction = delay1sec(myfunction);
delayedMyFunction() // this takes an extra second to execute ...
Logging what happens inside a decorator, especially one working asynchronously, is a bit tricky.
To do it, I use a "logging context" added using the decorator returned by add-logger. This logger takes as its argument a function used for logging:
import { addLogger } from 'async-deco';
const logger = addLogger((evt, payload, timestamp, id) => {
// ...
});
This decorator should wrap all the others.
logger(decorator2(decorator1(myfunction))) // You can also use compose as explained below.
The log function is called with the following arguments: the event name (evt), a payload, a timestamp and a call id.
To show how this works, here's an example of a decorator that uses logging:
import { getLogger } from 'async-deco';
function exampleDecorator(func) {
// note: you have to use a named function because you need "this"
return function (...args) {
const logger = getLogger(this);
return func(...args)
.then((response) => {
logger('response-successful', { response });
return response
});
}
}
And then you decorate the function and enable the logging:
logger(exampleDecorator(myfunction))
Decorators can be composed using the provided compose function. So instead of writing:
const retry = retryDecorator();
const timeout = timeoutDecorator({ ms: 20 })
const myNewFunction = retry(timeout(myfunction));
You can:
import { compose } from 'async-deco';
const decorator = compose(
retryDecorator(),
timeoutDecorator({ ms: 20 })
);
const myNewFunction = decorator(myfunction);
Note: compose applies the decorators right to left!
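The right-to-left order is easiest to see with a minimal stand-alone compose (a sketch, not the library's implementation), applied to two decorators that tag the result:

```javascript
// a minimal compose: reduces right to left, so the rightmost
// decorator wraps the function first (innermost)
const compose = (...decorators) => (func) =>
  decorators.reduceRight((acc, decorator) => decorator(acc), func);

// two decorators that tag the result, to reveal the order
const tagA = (func) => (x) => `A(${func(x)})`;
const tagB = (func) => (x) => `B(${func(x)})`;

const decorated = compose(tagA, tagB)((x) => x);
decorated('x'); // → 'A(B(x))': tagB applied first, tagA outermost
```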
It enables the logging for the whole chain of decorators. Read the description in the Logging section.
The decorated function can't be called concurrently: each call waits to acquire a lock, runs, and then releases the lock so the next call can proceed.
The default locking mechanism is very simple and works within a single process. Its interface is compatible with node-redlock, a distributed locking mechanism backed by Redis.
import { atomic } from 'async-deco';
const atomicDecorator = atomic(options);
Options:
In Node.js, you can use a distributed locking mechanism to ensure that only one instance of a function is executed across many processes/services. Here's how, using redlock:
import { atomic } from 'async-deco';
import redis from 'redis';
import Redlock from 'redlock';
const client = redis.createClient();
const redlock = new Redlock([client]);
const atomicDecorator = atomic({ lock: redlock, ttl: 1000 });
event | payload |
---|---|
atomic-lock-error | { err } |
This decorator distributes the load among a group of functions, which should all take the same arguments.
import { balance } from 'async-deco';
const balanceDecorator = balance();
const func = balanceDecorator([...list of functions]);
You can initialise the decorator with different policies:
import { balance, policyRoundRobin, policyRandom, policyIdlest } from 'async-deco';
const balanceDecorator = balance(policyRoundRobin);
There are three policies available: policyRoundRobin, policyRandom and policyIdlest.
You can also define your own policy:
const mypolicy = (counter, loads, args) => {
  // "counter" is the number of times the function has been called
  // "loads" is an array, one entry per function, containing how many
  // concurrent calls are currently running for that function
  // "args" is an array containing the arguments of the current call
  // the policy should return the index of the function to run
};
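As an illustration, a hand-rolled round-robin policy (equivalent in spirit to policyRoundRobin, but written from scratch for the example) could look like this:

```javascript
// a custom round-robin policy: cycles through the functions using
// the call counter, ignoring the current loads and the arguments
const myRoundRobin = (counter, loads, args) => counter % loads.length;

// with 3 functions, successive calls pick index 0, 1, 2, 0, 1, ...
myRoundRobin(0, [0, 0, 0]); // → 0
myRoundRobin(4, [0, 0, 0]); // → 1
```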
event | payload |
---|---|
balance-execute | {loads, executing } |
This decorator adds a caching layer to a function. It can use multiple caching engines. You can follow the respective README for their configuration:
memoize-cache (version >= 6.0.2)
import CacheRAM from 'memoize-cache/cache-ram';
This is a full featured in-RAM implementation. It works in any environment. It can take any value as cache key (other engines might have different constraints).
memoize-cache-redis (version >= 2.0.1)
import CacheRedis from 'memoize-cache-redis';
For use with Node.js, backed by Redis. It accepts only ASCII strings as cache keys.
memoize-cache-manager (version >= 2.0.2).
import CacheManager from 'memoize-cache-manager';
It uses the library cache-manager to support multiple backends (Node.js). It supports all features except "tags". It accepts only ASCII strings as cache keys.
If you don't specify the "cache" object, CacheRAM will be used and the arguments will be used for its configuration. Here are a couple of examples:
import { cache } from 'async-deco'
// the result of the function will be cached in RAM forever,
// no matter the arguments used
const cacheDecorator = cache();
// the getKey function will take the same arguments passed
// to the decorated function and will return the key used as cache key
const cacheDecorator = cache({ getKey });
Here's a list of all arguments:
If you define the cache engine externally you can share it between multiple decorators (cache, purgeCache, fallbackCache). These are equivalent:
const cacheDecorator = cache({ getKey, maxLen: 100 });
import CacheRAM from 'memoize-cache/cache-ram';
const cacheRAM = new CacheRAM({ getKey, maxLen: 100 });
const cacheDecorator = cache({ cache: cacheRAM });
If a function fails, the error will not be cached.
event | payload |
---|---|
cache-error | { err } |
cache-hit | { key, info } |
cache-miss | { key, info } |
cache-set | { key, tags } |
It manages multiple concurrent calls to the same function, calling the decorated function only once. With a "getKey" function, it executes the function once per distinct key.
import { dedupe } from 'async-deco';
const dedupeDecorator = dedupe(options);
Options:
Using the Redis-backed versions of functionBus and lock, it is possible to implement a distributed version of the deduplication. Example:
import { dedupe } from 'async-deco';
import redis from 'redis';
import Lock from 'redis-redlock';
import FunctionBus from 'function-bus-redis';
const lock = new Lock([redis.createClient()]);
const functionBus = new FunctionBus({
pub: redis.createClient(),
sub: redis.createClient()
});
const dedupeDecorator = dedupe({
lock: lock,
functionBus: functionBus
});
event | payload |
---|---|
dedupe-execute | { key, len } |
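To illustrate the idea behind dedupe (this sketch is not the library's implementation), concurrent calls with the same key can share one in-flight promise:

```javascript
// minimal dedupe sketch: concurrent calls with the same key share a
// single pending promise; the key defaults to the first argument
const simpleDedupe = (getKey = (...args) => String(args[0])) => (func) => {
  const inFlight = new Map();
  return (...args) => {
    const key = getKey(...args);
    if (inFlight.has(key)) {
      return inFlight.get(key); // join the pending call
    }
    const promise = func(...args).finally(() => inFlight.delete(key));
    inFlight.set(key, promise);
    return promise;
  };
};

// a slow asynchronous function, counting how often it really runs
let executions = 0;
const fetchUser = (id) =>
  new Promise((resolve) => {
    executions += 1;
    setTimeout(() => resolve(`user-${id}`), 10);
  });

const dedupedFetch = simpleDedupe()(fetchUser);
const p1 = dedupedFetch(42);
const p2 = dedupedFetch(42); // same promise, no second execution
```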
If a function fails, it calls another function as a fallback, or resolves with a fallback value.
import { fallback } from 'async-deco';
const fallbackDecorator = fallback({ func, value });
It takes either of these two arguments: func (a fallback function) or value (a fallback value).
event | payload |
---|---|
fallback | { err } |
If the decorated function throws an error, it tries to use a previous cached result. This uses the same cache objects used by the cache decorator.
import { fallbackCache } from 'async-deco';
const fallbackDecorator = fallbackCache();
Just like the cache decorator, it can either take a cache object, or the CacheRAM engine will be used with the options provided. So these two are equivalent:
const fallbackDecorator = fallbackCache({ getKey, maxLen: 100 });
import CacheRAM from 'memoize-cache/cache-ram';
const cacheRAM = new CacheRAM({ getKey, maxLen: 100 });
const fallbackDecorator = fallbackCache({ cache: cacheRAM });
If you use this decorator together with the cache decorator you might want to use two additional options: maxAge and maxValidity.
For example:
const cacheRAM = new CacheRAM({ getKey, maxAge: 600, maxValidity: 3600 });
const fallbackDecorator = fallbackCache({ cache: cacheRAM });
const cacheDecorator = cache({ cache: cacheRAM });
const cachedFunction = fallbackDecorator(cacheDecorator(myfunction))
This cached function will cache values for 10 minutes, but if for any reason the decorated function fails, it will use a stale value from the cache (up to one hour old).
event | payload |
---|---|
fallback-cache-error | { err, cacheErr } |
fallback-cache-hit | { key, info } |
fallback-cache-miss | { key, info } |
fallback-cache-set | { key, tags } |
It executes a "check function" before the decorated function. If it returns an error it will use this error as the return value of the decorated function. It is useful if you want to run a function only if it passes some condition (access control).
import { guard } 'async-deco';
const guardDecorator = guard({ check });
It takes one argument: check.
event | payload |
---|---|
guard-denied | { err } |
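A minimal sketch of the idea (assuming the check function receives the call's arguments and returns an error to deny, or a falsy value to allow; the adminOnly example is invented):

```javascript
// guard sketch: run a check before the function; if the check
// returns an error, reject with it instead of calling the function
const simpleGuard = ({ check }) => (func) => (...args) => {
  const err = check(...args);
  return err ? Promise.reject(err) : func(...args);
};

// hypothetical access control: only admins may call the function
const adminOnly = simpleGuard({
  check: (user) => (user.isAdmin ? null : new Error('denied')),
});

const deleteAll = adminOnly((user) => Promise.resolve('deleted'));
// deleteAll({ isAdmin: true }) resolves; otherwise it rejects with 'denied'
```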
It limits the concurrency of a function: every call that exceeds the limit is queued. If the maximum queue size is reached, the call at the bottom of the queue returns an error (LimitError).
import { limit, LimitError } from 'async-deco';
const limitTwo = limit({ concurrency: 2 });
You can initialise the decorator with an options object. An optional comparator can be used to order the queue; it has this form:
const comparator = (a, b) => {
  // a.func, b.func are the functions in the queue
  // a.args, b.args are arrays with their arguments
  // return a negative number to run "a" before "b", positive otherwise
};
event | payload |
---|---|
limit-queue | { key } |
limit-drop | { key } |
It logs when a function starts, ends or fails. It requires the addLogger decorator.
import { log, addLogger } from 'async-deco';
const logger = addLogger((evt, payload, ts, id) => {
console.log(evt, payload);
});
const addLogs = log();
const loggedfunc = logger(addLogs(myfunc));
Running "myfunc" you will have:
log-start {}
log-end { res } // "res" is the output of myfunc
or
log-start {}
log-error { err } // "err" is the exception returned by myfunc
When using multiple decorators, it can be useful to attach this decorator more than once, to give insight into when the original function starts/ends and when the decorated function is called. To tell the logs apart you can add a prefix. For example:
import { log, addLogger, cache } from 'async-deco';
const logger = addLogger((evt, payload, ts, id) => {
console.log(evt, payload);
});
const addInnerLogs = log('inner-');
const addOuterLogs = log('outer-');
const cacheDecorator = cache();
const decoratedFunc = logger(addOuterLogs(cacheDecorator(addInnerLogs(myfunc))));
In this example outer-log-start and outer-log-end (or outer-log-error) will always be called; the inner logs only in case of a cache miss.
The decorated function is executed only when called with a new set of arguments; the results are cached and reused. Results are cached against the arguments, which are compared by reference (strict equality). Promise rejections are cached as well.
The decorator uses an LRU algorithm to decide which cached value to evict, and also supports a time-to-live for cached values. Note that stale cache entries are not removed until the size of the cache exceeds its length; this keeps the running time of every operation constant (O(1)). The cache is local to the process, so multiple processes store separate caches.
The decorator factory takes two options: len (the maximum number of cached results) and ttl (their time to live, in milliseconds):
import { memoize } from 'async-deco';
const memoizeDecorator = memoize({ len: 10, ttl: 10000 })
memoizeDecorator(() => ...)
The decorator doesn't provide logging.
It executes this function on the result once it is fulfilled.
import { onFulfilled } from 'async-deco';
onFulfilled((res) => ...)
This
const multiplyBy2 = onFulfilled((res) => res * 2)
const myNewFunc = multiplyBy2(myfunc)
is equivalent to:
myfunc(...args)
  .then((res) => res * 2)
But being wrapped in a decorator helps to abstract away the logic.
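Under the hood such a decorator is a thin wrapper over .then. A minimal version (a sketch under the name onResolved, not the library's code) could look like:

```javascript
// onFulfilled-style decorator sketch: run a transform on the
// resolved value of the decorated asynchronous function
const onResolved = (transform) => (func) => (...args) =>
  func(...args).then(transform);

const multiplyBy2 = onResolved((res) => res * 2);
const double = multiplyBy2((n) => Promise.resolve(n));
// double(21) resolves to 42
```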
It executes this function on the error, once the promise is rejected.
import { onRejected } from 'async-deco';
onRejected((err) => ...)
This
const removeReferenceErrors = onRejected((err) => {
if (err instanceof ReferenceError) {
return null
}
throw err
});
const myNewFunc = removeReferenceErrors(myfunc)
is equivalent to:
myfunc(...args)
.catch((err) => {
if (err instanceof ReferenceError) {
return null
}
throw err
})
But being wrapped in a decorator helps to abstract away the logic.
When the decorated function succeeds, it purges the corresponding cache entry or entries.
import { purgeCache } from 'async-deco';
const purgeCacheDecorator = purgeCache({ cache, getKeys, getTags });
It takes three arguments: cache, getKeys and getTags.
You should use at least one of getKeys or getTags.
event | payload |
---|---|
purge-cache-error | { err } |
purge-cache | { keys, tags } |
If a function fails, it retries it.
import { retry } from 'async-deco';
const retryTenTimes = retry({ times: 10, interval: 1000 });
You can initialise the decorator with two arguments: times (the number of attempts) and interval (the pause between attempts, in milliseconds).
event | payload |
---|---|
retry | { times, err } |
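The mechanics can be sketched with a minimal stand-alone version (this is an illustration, not the library's implementation; times counts the total attempts here):

```javascript
// retry sketch: re-run a rejected promise up to `times` attempts,
// waiting `interval` milliseconds between attempts
const simpleRetry = ({ times, interval = 0 }) => (func) => (...args) => {
  const attempt = (remaining) =>
    func(...args).catch((err) => {
      if (remaining <= 1) throw err; // out of attempts: give up
      return new Promise((resolve) =>
        setTimeout(() => resolve(attempt(remaining - 1)), interval));
    });
  return attempt(times);
};

// a flaky function that fails twice, then succeeds
let attempts = 0;
const flaky = () => {
  attempts += 1;
  return attempts < 3 ? Promise.reject(new Error('boom')) : Promise.resolve('ok');
};

const resilient = simpleRetry({ times: 3, interval: 5 })(flaky);
// resilient() resolves to 'ok' on the third attempt
```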
If a function takes too long, it returns a timeout exception.
import { timeout, TimeoutError } from 'async-deco';
const timeoutOneSec = timeout({ ms: 1000 });
This will wait one second before returning a TimeoutError. It takes one argument: ms, the number of milliseconds to wait.
event | payload |
---|---|
timeout | { ms } |
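Conceptually, a timeout decorator races the decorated function against a timer. A minimal sketch (not the library's implementation, using a plain Error instead of TimeoutError):

```javascript
// timeout sketch: whichever settles first wins the race;
// if the timer fires first, the call rejects
const simpleTimeout = ({ ms }) => (func) => (...args) =>
  Promise.race([
    func(...args),
    new Promise((resolve, reject) =>
      setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms)),
  ]);

const never = () => new Promise(() => {}); // never settles
const fast = () => Promise.resolve('done');

const guarded = simpleTimeout({ ms: 50 });
// guarded(fast)() resolves with 'done'; guarded(never)() rejects after 50 ms
```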
Using "cache" on an asynchronous function has a conceptual flaw. Let's say for example I have a function with 100ms latency. I call this function every 10 ms:
executed ⬇⬇⬇⬇⬇⬇⬇⬇⬇⬇
------------------------------
requested ⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆
What happens is that while I am still waiting for the first result (to cache), I execute another 9 calls. What if I compose cache with dedupe?
const decorator = compose(dedupe(), cache());
const newfunc = decorator(...);
dedupe should fill the gap:
executed ⬇
------------------------------
requested ⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆⬆
Imagine a case in which you want to be sure you did everything to get a result, and if that is not possible, you want to return a sensible fallback:
const decorator = compose(
fallback({ value: null }), // last resort fallback
fallbackCache(), // try to use a previous cached output
retry({ times: 3 }), // retries 3 times
timeout({ ms: 5000 })); // times out after 5 seconds
const newfunc = decorator(...);
If you want to preserve the sequence in which a function is called, for example when sending commands to a service and making sure they are executed in the right order:
const queue = limit({ concurrency: 1 });
const myfunc = queue(...);
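The same serialization effect can be sketched without the library by chaining every call onto the previous one (an illustration of what concurrency 1 achieves, not the library's implementation):

```javascript
// serialize sketch: each call starts only after the previous one
// settles, the same effect as limit({ concurrency: 1 })
const serialize = (func) => {
  let last = Promise.resolve();
  return (...args) => {
    const result = last.then(() => func(...args));
    last = result.catch(() => {}); // keep the chain alive on failure
    return result;
  };
};

// commands complete in call order even with different latencies
const done = [];
const command = (name, ms) =>
  new Promise((resolve) =>
    setTimeout(() => { done.push(name); resolve(name); }, ms));

const serialCommand = serialize(command);
serialCommand('first', 30);
serialCommand('second', 5);
const lastCall = serialCommand('third', 1);
// without serialization 'second' and 'third' would finish before 'first'
```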