limit-concurrency-decorator


Decorator to limit concurrency of async functions

Similar to these libraries, but usable as a decorator:

  • throat
  • p-limit

Also similar to p-concurrency, but the limit can be enforced over multiple functions.
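For intuition, a limiter of this kind can be sketched as a counter plus a FIFO queue of pending calls. This is a minimal illustration only, assuming nothing about the library's internals; `makeLimiter` is a hypothetical name:

```javascript
// Conceptual sketch only, not the library's actual implementation.
// Calls start immediately while fewer than `limit` are running;
// otherwise they wait in a FIFO queue for a slot to free up.
function makeLimiter(limit) {
  let running = 0;
  const queue = [];
  const release = () => {
    running -= 1;
    const next = queue.shift();
    if (next !== undefined) next();
  };
  return (fn) =>
    (...args) =>
      new Promise((resolve) => {
        const start = () => {
          running += 1;
          // the slot is released when the wrapped call settles
          resolve(Promise.resolve(fn(...args)).finally(release));
        };
        if (running < limit) start();
        else queue.push(start);
      });
}

// demo: four tasks through a limit of 2; at most 2 bodies run at once
const limit2 = makeLimiter(2);
let active = 0;
let maxActive = 0;
const task = limit2(async () => {
  active += 1;
  maxActive = Math.max(maxActive, active);
  await new Promise((r) => setTimeout(r, 20));
  active -= 1;
});
const allDone = Promise.all([task(), task(), task(), task()]);
```

Sharing one `makeLimiter(2)` wrapper across several functions is what lets a single limit span multiple functions, which is the same idea the decorator form exposes.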

Install

Installation of the npm package:

> npm install --save limit-concurrency-decorator

Usage

Simply apply the decorator to a method:

import { limitConcurrency } from "limit-concurrency-decorator";

class HttpClient {
  @limitConcurrency(2)
  get(url) {
    // ...
  }
}

const client = new HttpClient();

// these calls will run in parallel
client.get("http://example.net/");
client.get("http://example2.net/");

// this call will wait for one of the 2 previous to finish
client.get("http://example3.net/");

Or apply it to a plain function as a wrapper:

import httpRequest from "http-request-plus";

const httpRequestLimited = limitConcurrency(2)(httpRequest);

// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");

// this call will wait for one of the 2 previous to finish
httpRequestLimited("http://example3.net/");

The limit can be shared:

const myLimit = limitConcurrency(2);

class HttpClient {
  @myLimit
  post() {
    // ...
  }

  @myLimit
  put() {
    // ...
  }
}

With FAIL_ON_QUEUE you can fail early instead of waiting:

import { FAIL_ON_QUEUE } from "limit-concurrency-decorator";

try {
  await httpRequestLimited(FAIL_ON_QUEUE, "http://example2.net");
} catch (error) {
  error.message; // 'no available place in queue'
}

Custom termination:

const httpRequestLimited = limitConcurrency(2, async promise => {
  const stream = await promise;
  await new Promise((resolve, reject) => {
    stream.on("end", resolve);
    stream.on("error", reject);
  });
})(httpRequest);

// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");

// this call will wait for one of the 2 previous responses to have been read entirely
httpRequestLimited("http://example3.net/");
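To make the termination hook's contract concrete, here is that callback in isolation, run against a stand-in stream (`fakeStream` is invented for illustration; only its `on("end")` / `on("error")` handlers matter):

```javascript
// The termination callback from the example above: the concurrency
// slot is held until the stream emits "end" (or "error").
const waitForEnd = async (promise) => {
  const stream = await promise;
  await new Promise((resolve, reject) => {
    stream.on("end", resolve);
    stream.on("error", reject);
  });
};

// Hypothetical stand-in for a response stream: supports
// on("end" | "error", cb) and emits "end" after `ms` milliseconds.
function fakeStream(ms) {
  const handlers = {};
  setTimeout(() => handlers.end && handlers.end(), ms);
  return {
    on(event, cb) {
      handlers[event] = cb;
    },
  };
}

// resolves only once the fake stream has "ended"
const finished = waitForEnd(Promise.resolve(fakeStream(20)));
```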

Contributions

Contributions are very welcome, whether to the documentation or to the code.

You may:

  • report any issue you've encountered;
  • fork and create a pull request.

License

ISC © Julien Fontanet

Package last updated on 18 May 2021
