limit-concurrency-decorator

Decorator to limit concurrency of async functions

Similar to these libraries, but can be used as a decorator.

Also similar to p-concurrency, but the limit can be enforced over multiple functions.

Install

Installation of the npm package:

> npm install --save limit-concurrency-decorator

Usage

Simply apply the decorator to a method:

    import { limitConcurrency } from "limit-concurrency-decorator";

    class HttpClient {
      @limitConcurrency(2)
      get() {
        // ...
      }
    }

    const client = new HttpClient();

    // these calls will run in parallel
    client.get("http://example.net/");
    client.get("http://example2.net/");

    // this call will wait for one of the 2 previous to finish
    client.get("http://example3.net/");

Or a simple function as a wrapper:

    import { limitConcurrency } from "limit-concurrency-decorator";
    import httpRequest from "http-request-plus";

    const httpRequestLimited = limitConcurrency(2)(httpRequest);

    // these calls will run in parallel
    httpRequestLimited("http://example.net/");
    httpRequestLimited("http://example2.net/");

    // this call will wait for one of the 2 previous to finish
    httpRequestLimited("http://example3.net/");
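
As with the decorated method, the wrapped function still returns a promise for the underlying result, so the limited calls can be awaited as usual. A minimal sketch, assuming it runs inside an async function:

    const responses = await Promise.all([
      httpRequestLimited("http://example.net/"),
      httpRequestLimited("http://example2.net/"),
      // queued until one of the first two finishes
      httpRequestLimited("http://example3.net/"),
    ]);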

The limit can be shared:

    const myLimit = limitConcurrency(2);

    class HttpClient {
      @myLimit
      post() {
        // ...
      }

      @myLimit
      put() {
        // ...
      }
    }
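
For illustration, here is what sharing the limit means in practice (the request paths are made up for the example): post() and put() draw from the same pool of 2 slots, so no more than 2 of them run at the same time.

    const client = new HttpClient();

    // these calls will run in parallel
    client.post("/articles");
    client.put("/articles/1");

    // these calls will wait for one of the 2 previous to finish,
    // because post() and put() share the same limit
    client.post("/articles");
    client.put("/articles/2");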

With FAIL_ON_QUEUE you can fail early instead of waiting:

    import { FAIL_ON_QUEUE } from "limit-concurrency-decorator";

    try {
      await httpRequestLimited(FAIL_ON_QUEUE, "http://example2.net");
    } catch (error) {
      error.message; // 'no available place in queue'
    }
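
For instance, a hypothetical helper (not part of the library) could use the documented error message to fail fast when every slot is busy and only fall back to queueing when the caller explicitly allows it:

    // hypothetical helper, not provided by the library
    async function getIfIdle(url, { queueIfBusy = false } = {}) {
      try {
        return await httpRequestLimited(FAIL_ON_QUEUE, url);
      } catch (error) {
        if (queueIfBusy && error.message === "no available place in queue") {
          // every slot is busy: fall back to waiting in the queue
          return httpRequestLimited(url);
        }
        throw error;
      }
    }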

Custom termination: by default a slot is freed as soon as the promise returned by the wrapped function settles; a second argument lets you decide when a call should be considered finished instead:

    const httpRequestLimited = limitConcurrency(2, async promise => {
      const stream = await promise;
      await new Promise((resolve, reject) => {
        stream.on("end", resolve);
        stream.on("error", reject);
      });
    })(httpRequest);

    // these calls will run in parallel
    httpRequestLimited("http://example.net/");
    httpRequestLimited("http://example2.net/");

    // this call will wait for one of the 2 previous responses to have been read entirely
    httpRequestLimited("http://example3.net/");

Contributions

Contributions are very welcome, whether to the documentation or to the code.

You may:

  • report any issue you've encountered;
  • fork and create a pull request.

License

ISC © Julien Fontanet
