concurrent-promise-queue
Comparing version 1.0.2 to 1.0.3
```diff
 {
   "name": "concurrent-promise-queue",
-  "version": "1.0.2",
+  "version": "1.0.3",
   "description": "Allows promises to be queued up and executed at a maximum rate defined by time or max concurrency",
-  "keywords": ["queue", "concurrency", "concurrent", "parralel", "job", "task", "async", "asynchronous", "promise"],
+  "keywords": [
+    "queue",
+    "concurrency",
+    "concurrent",
+    "parralel",
+    "job",
+    "task",
+    "async",
+    "asynchronous",
+    "promise"
+  ],
   "repository": "github:doo-gl/concurrent-promise-queue",
@@ -15,3 +25,7 @@ "main": "dist/index.js",
   "devDependencies": {
+    "@jest/globals": "^29.7.0",
+    "@types/jest": "^29.5.11",
     "@types/uuid": "^8.3.0",
+    "jest": "^29.7.0",
+    "ts-jest": "^29.1.2",
     "typescript": "^4.3.3"
   },
```
# concurrent-promise-queue
It can cap the number of promises running concurrently, for example running a list of promises 1 or 2 at a time.
It can also run promises under a time-based rate limit, for example at most 1 promise every second, or 10 promises every 5 seconds.
## installation
```shell
npm install --save concurrent-promise-queue
```
This utility was written because I was calling a lot of external APIs from a small Node server and found that
starting 5000 promises that each perform an API call would crash the server, and also trigger the external APIs
to return 429 Too Many Requests responses.
Generally, it is useful when you have a lot of resources to fetch but do not want to overload your server by requesting them all at the same time.
By throttling how many promises run at once, the memory footprint of performing thousands of promises can be
controlled, and the number of calls to external APIs can be kept within their rate limits.
For example, if a small server makes 100 HTTP requests all at the same time, it is likely to run out of memory;
processing them through the queue, say 5 at a time, keeps memory use bounded.
## Usage
```
0s - Called /book/1
1s - Called /book/2
2s - Called /book/3
3s - Called /book/4
4s - Called /book/5
5s - Called /book/6
```
The queue will start new promises to maintain the concurrency limit.
```
0s - Called /book/1
0s - Called /book/2
1s - Called /book/3
1s - Called /book/4
2s - Called /book/5
2s - Called /book/6
```
### Execute promises with a time-based rate limit
```js
import {ConcurrentPromiseQueue} from "concurrent-promise-queue";

// Setting unitOfTimeMillis to 4000
// and maxThroughputPerUnitTime to 2
// means that at most, 2 promises will be executed per 4 seconds
const queue = new ConcurrentPromiseQueue({
  unitOfTimeMillis: 4000,
  maxThroughputPerUnitTime: 2,
});

return Promise.all([
  queue.addPromise(() => callApi('/book/1')),
  queue.addPromise(() => callApi('/book/2')),
  queue.addPromise(() => callApi('/book/3')),
  queue.addPromise(() => callApi('/book/4')),
  queue.addPromise(() => callApi('/book/5')),
  queue.addPromise(() => callApi('/book/6')),
])
  .then(results => {
    // do something with the results
    return results
  })
```
The queue will start new promises while staying within the time-based rate limit.
```
0s - Called /book/1
0s - Called /book/2
4s - Called /book/3
4s - Called /book/4
8s - Called /book/5
8s - Called /book/6
```
```diff
@@ -45,3 +45,3 @@
   private promiseCompletedTimesLog:Date[];
-  private reattemptTimeoutId:number|null;
+  private reattemptTimeoutId:NodeJS.Timeout|null;
@@ -110,3 +110,3 @@ constructor(options:QueueOptions) {
   const numberOfPromisesLeftInConcurrencyLimit:number = this.maxNumberOfConcurrentPromises - numberOfPromisesBeingExecuted;
-  const numberOfPromisesLeftInRateLimit:number = this.maxThroughputPerUnitTime - numberOfPromisesFinishedInLastUnitTime;
+  const numberOfPromisesLeftInRateLimit:number = this.maxThroughputPerUnitTime - numberOfPromisesFinishedInLastUnitTime - numberOfPromisesBeingExecuted;
   const numberOfPromisesToStart:number = Math.min(numberOfPromisesLeftInConcurrencyLimit, numberOfPromisesLeftInRateLimit);
@@ -113,0 +113,0 @@ if (numberOfPromisesToStart <= 0) {
```
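The second hunk tightens the rate-limit calculation: in-flight promises now count against the per-unit-time budget, so the queue can no longer start extra work that would exceed the rate limit once running promises complete. A self-contained sketch of the scheduling arithmetic (variable names mirror the diff; the values are illustrative):

```js
// Illustrative values: rate limit of 2 per unit time, concurrency limit of 5.
const maxNumberOfConcurrentPromises = 5;
const maxThroughputPerUnitTime = 2;
const numberOfPromisesBeingExecuted = 1;
const numberOfPromisesFinishedInLastUnitTime = 1;

// Before the fix: in-flight promises did not count against the rate budget,
// so one more promise would start - a third within the same unit of time.
const before = Math.min(
  maxNumberOfConcurrentPromises - numberOfPromisesBeingExecuted,
  maxThroughputPerUnitTime - numberOfPromisesFinishedInLastUnitTime
);

// After the fix: the in-flight promise consumes part of the budget,
// so nothing new starts until it finishes or the window rolls over.
const after = Math.min(
  maxNumberOfConcurrentPromises - numberOfPromisesBeingExecuted,
  maxThroughputPerUnitTime - numberOfPromisesFinishedInLastUnitTime - numberOfPromisesBeingExecuted
);

console.log(before, after); // 1 0
```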