limit-concurrency-decorator

Decorator to limit concurrency of async functions


Similar to other concurrency-limiting libraries, but usable as a decorator.

Also similar to p-concurrency, but here the limit can be enforced across multiple functions.
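Under the hood the idea is a counting semaphore: calls beyond the limit wait in a FIFO queue until a slot frees up. A minimal, self-contained sketch of that idea (illustrative only, not the library's actual implementation):

```javascript
// Illustrative sketch only — not the library's implementation.
// A counting semaphore: calls beyond `limit` wait in a FIFO queue.
function makeLimiter(limit) {
  let active = 0;
  const queue = [];
  const release = () => {
    const resume = queue.shift();
    if (resume !== undefined) {
      resume(); // hand the freed slot directly to the next queued call
    } else {
      active--;
    }
  };
  return (fn) =>
    async function (...args) {
      if (active >= limit) {
        // no slot free: park this call until one is handed over
        await new Promise((resolve) => {
          queue.push(resolve);
        });
      } else {
        active++;
      }
      try {
        return await fn.apply(this, args);
      } finally {
        release();
      }
    };
}
```

Because the counter lives in the limiter itself, wrapping several functions with the same limiter shares the limit between them.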

Install

Installation of the npm package:

> npm install --save limit-concurrency-decorator

Usage

Simply apply the decorator to a method:

import { limitConcurrency } from "limit-concurrency-decorator";

class HttpClient {
  @limitConcurrency(2)
  get(url) {
    // ...
  }
}

const client = new HttpClient();

// these calls will run in parallel
client.get("http://example.net/");
client.get("http://example2.net/");

// this call will wait for one of the 2 previous to finish
client.get("http://example3.net/");

Or a simple function as a wrapper:

import httpRequest from "http-request-plus";

const httpRequestLimited = limitConcurrency(2)(httpRequest);

// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");

// this call will wait for one of the 2 previous to finish
httpRequestLimited("http://example3.net/");

Or even as a call limiter:

const limiter = limitConcurrency(2)(/* nothing */);

// these calls will run in parallel
limiter(asyncFn, param1, ...);
limiter.call(thisArg, asyncFn, param1, ...);

// this call will wait for one of the 2 previous to finish
limiter.call(thisArg, methodName, param1, ...);

The limit can be shared:

const myLimit = limitConcurrency(2);

class HttpClient {
  @myLimit
  post() {
    // ...
  }

  @myLimit
  put() {
    // ...
  }
}

With FAIL_ON_QUEUE you can fail early instead of waiting:

import { FAIL_ON_QUEUE } from "limit-concurrency-decorator";

try {
  await httpRequestLimited(FAIL_ON_QUEUE, "http://example2.net");
} catch (error) {
  error.message; // 'no available place in queue'
}
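To make the fail-fast behaviour concrete, here is a hedged sketch of how such a sentinel can work. Only the `FAIL_ON_QUEUE` name matches the library's export; the limiter below is an illustrative stand-in, not its source:

```javascript
// Illustrative stand-in — not the library's implementation.
// When the sentinel is passed as first argument and no slot is free,
// the call rejects immediately instead of waiting in the queue.
const FAIL_ON_QUEUE = Symbol("FAIL_ON_QUEUE");

function makeFailFastLimiter(limit) {
  let active = 0;
  const queue = [];
  return (fn) =>
    async function (...args) {
      const failFast = args[0] === FAIL_ON_QUEUE;
      if (failFast) args = args.slice(1);
      if (active >= limit) {
        if (failFast) {
          throw new Error("no available place in queue");
        }
        await new Promise((resolve) => queue.push(resolve));
      } else {
        active++;
      }
      try {
        return await fn.apply(this, args);
      } finally {
        const resume = queue.shift();
        if (resume !== undefined) resume();
        else active--;
      }
    };
}
```

Note that the fail-fast rejection happens before a slot is taken, so it never disturbs the calls already running or queued.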

Custom termination:

const httpRequestLimited = limitConcurrency(2, async (promise) => {
  const stream = await promise;
  await new Promise((resolve, reject) => {
    stream.on("end", resolve);
    stream.on("error", reject);
  });
})(httpRequest);

// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");

// this call will wait for one of the 2 previous responses to have been read entirely
httpRequestLimited("http://example3.net/");
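The second argument changes what "finished" means: the slot is released only once the termination callback's promise settles, not when the wrapped function's own promise resolves. A hypothetical sketch of that mechanism (names and structure are illustrative, not the library's source):

```javascript
// Illustrative sketch — not the library's source.
// The slot is released when `getTermination`'s promise settles,
// not when the wrapped function's own promise resolves.
function makeLimiterWithTermination(limit, getTermination = (p) => p) {
  let active = 0;
  const queue = [];
  const release = () => {
    const resume = queue.shift();
    if (resume !== undefined) resume();
    else active--;
  };
  return (fn) =>
    async function (...args) {
      if (active >= limit) {
        await new Promise((resolve) => queue.push(resolve));
      } else {
        active++;
      }
      // start the call; the caller gets this promise immediately
      const result = (async () => fn.apply(this, args))();
      // release the slot only once the custom termination settles
      (async () => {
        try {
          await getTermination(result);
        } catch (_) {}
        release();
      })();
      return result;
    };
}
```

The caller still sees the wrapped function's result right away; only the internal bookkeeping waits for the custom termination.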

Contributions

Contributions are very welcome, whether to the documentation or to the code.

You may:

  • report any issue you've encountered;
  • fork and create a pull request.

License

ISC © Julien Fontanet


Package last updated on 18 Jan 2024
