# @InBatches(📦,📦,📦,...)
InBatches is a zero-dependency, generic TypeScript library that provides a convenient way to batch asynchronous executions.

It is designed to be used as part of your application's data-fetching layer to provide a consistent API over various backends and to reduce requests to those backends via batching.

This library is especially useful for scenarios where you need to perform multiple asynchronous operations efficiently, such as when making network requests or performing database queries.

Heavily inspired by [graphql/dataloader](https://github.com/graphql/dataloader), but simpler to use thanks to decorators (😜 really decoupled). Because of that, the rest of your application doesn't even need to know about the batching/dataloader; it just works!
## Table of Contents

- [Installation](#installation)
- [Usage](#usage)
- [API](#api)
- [Contributing](#contributing)
- [License](#license)
## Installation

```bash
npm install inbatches
```

or

```bash
yarn add inbatches
```
## Usage

### Basic usage with the `@InBatches` decorator

The simplest way to get up and running is to use the `@InBatches` decorator. It wraps your method and makes it batch-enabled, like magic! 🧙‍♂️
```typescript
import { InBatches } from 'inbatches';

class MyService {
  // overload signature: callers pass (and receive) a single key
  async fetch(key: number): Promise<User>;

  // the decorator collects individual calls and invokes this implementation
  // once per batch with an array of keys (`this.db` and `User` stand in for
  // your own data-access layer)
  @InBatches()
  async fetch(keys: number | number[]): Promise<User | User[]> {
    if (Array.isArray(keys)) return await this.db.getMany(keys);

    throw new Error('It will never be called with a single key 😉');
  }
}
```
Profit! 🤑

```typescript
const service = new MyService();

// all five calls below are collected and dispatched as a single batched execution
const promises = [1, 2, 3, 4, 5].map(async id => {
  return await service.fetch(id);
});

Promise.all(promises).then(results => {
  console.log(results);
});
```
### Advanced usage with a custom `Batcher` class

Another way to use the library is to create a class that extends the `Batcher` class and implements the `run` method. The resulting class provides an `enqueue` method that you can use to enqueue keys for batched execution.
```typescript
import { Batcher } from 'inbatches';

class MyBatcher extends Batcher<number, string> {
  // called once per batch with all the keys that were enqueued together
  // (`this.db` stands in for your own data-access layer)
  async run(ids: number[]): Promise<string[]> {
    return this.db.getMany(ids);
  }
}
```
Then:

```typescript
const batcher = new MyBatcher();

const promises = [1, 2, 3, 4, 5].map(async id => {
  return await batcher.enqueue(id);
});

Promise.all(promises).then(results => {
  console.log(results);
});
```
## API

### `BatcherOptions`

An interface to specify options for the batcher.

- `maxBatchSize`: The maximum number of keys to batch together. Default is `25`.
- `delayWindowInMs`: (not recommended) The delay window in milliseconds before dispatching the batch. Default is `undefined`, in which case `process.nextTick` is used to dispatch the batch, which is highly efficient and fast. Only use this option if you really want to accumulate calls over a window of time before dispatching the batch.
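For illustration, here is a minimal sketch of how these options might be passed, assuming `BatcherOptions` is exported and accepted both by the `@InBatches` decorator and by the `Batcher` base-class constructor (check the library's typings for the exact signatures):

```typescript
import { Batcher, BatcherOptions, InBatches } from 'inbatches';

// hypothetical data source, used only to keep this sketch self-contained
const db = {
  async getMany(ids: number[]): Promise<string[]> {
    return ids.map(id => `user-${id}`);
  },
};

// assumption: the decorator accepts an optional BatcherOptions argument
class UserService {
  async fetch(key: number): Promise<string>;

  @InBatches({ maxBatchSize: 100 })
  async fetch(keys: number | number[]): Promise<string | string[]> {
    if (Array.isArray(keys)) return db.getMany(keys);
    throw new Error('never called with a single key');
  }
}

// assumption: a Batcher subclass can forward options to the base constructor
class UserBatcher extends Batcher<number, string> {
  constructor(options?: BatcherOptions) {
    super(options);
  }

  async run(ids: number[]): Promise<string[]> {
    return db.getMany(ids);
  }
}

const batcher = new UserBatcher({ maxBatchSize: 50, delayWindowInMs: 10 });
```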
## Contributing

Contributions are welcome! Feel free to open issues or submit pull requests on the GitHub repository.

## License

This project is licensed under the MIT License - see the LICENSE file for details.