The 'queue' npm package is a fast, robust, and extensible queue implementation for processing a list of tasks sequentially. It supports asynchronous task processing, concurrency control, per-task timeouts, and pausing and resuming the queue. The package is particularly useful for rate-limiting tasks or for operations that must run in order but produce results asynchronously, such as API calls, file processing, or any work that requires throttling.
Basic Queue Functionality
This demonstrates how to create a basic queue, add tasks to it, and start processing. Each task is a function that accepts a callback, which must be called upon completion.
const queue = require('queue');

const q = queue();

q.push(function (cb) {
  console.log('Hello');
  cb();
});

q.push(function (cb) {
  console.log('World');
  cb();
});

q.start(function (err) {
  console.log('All tasks finished.');
});
Concurrency Control
This example shows how to set a concurrency limit, allowing up to 2 tasks to be processed simultaneously.
const queue = require('queue');

const q = queue({ concurrency: 2 });

// Add tasks to q here with q.push(...)

q.start(function (err) {
  console.log('All tasks processed with a maximum of 2 tasks running concurrently.');
});
Timeout for Tasks
This code sets a timeout for each task in the queue. If a task does not call its callback within the specified timeout, the queue will move on to the next task.
const queue = require('queue');

const q = queue();
q.timeout = 1000; // 1 second timeout for each task

q.push(function (cb) {
  setTimeout(function () {
    console.log('This task will time out');
    cb();
  }, 1500); // this task takes longer than the timeout
});

q.start();
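To react when a task times out, rather than just letting the queue continue, you can listen for the timeout event. A minimal sketch, assuming the older CommonJS releases' EventEmitter-style API with a (next, job) handler signature (newer ESM releases use addEventListener, as shown in the README example further below):

const queue = require('queue');

const q = queue();
q.timeout = 1000; // 1 second timeout for each task

// assumption: EventEmitter-style 'timeout' event with (next, job);
// calling next() lets the queue continue with the remaining tasks
q.on('timeout', function (next, job) {
  console.log('A task timed out');
  next();
});

q.push(function (cb) {
  setTimeout(cb, 1500); // exceeds the 1 second timeout
});

q.start();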
The 'async' package provides a wide array of functionality for working with asynchronous JavaScript, including queue management. Compared to 'queue', 'async' offers more comprehensive flow-control utilities but can be more complex for simple queue needs.
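For comparison, a minimal sketch of a worker queue built with async.queue (the worker function and concurrency argument are part of async's documented API; the task objects and log messages here are made up for illustration):

const async = require('async');

// a worker queue that processes at most 2 tasks at a time
const q = async.queue(function (task, callback) {
  console.log('processing', task.name);
  callback();
}, 2);

// hypothetical task payloads
q.push({ name: 'one' });
q.push({ name: 'two' });

// async v3 style: drain() accepts a callback invoked when all tasks finish
q.drain(function () {
  console.log('all tasks have been processed');
});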
Bull is a Redis-backed queue package for handling distributed jobs and messages in Node.js. It's more suited for scenarios requiring robustness, such as background processing or job scheduling, and offers features like prioritization, repeatable jobs, and event listeners. It's more complex and feature-rich compared to 'queue', which is simpler and doesn't require Redis.
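A rough sketch of the same idea with Bull, assuming a Redis server running at the default local address (the queue name and job payload are made up for illustration):

const Queue = require('bull');

// requires a running Redis instance
const jobs = new Queue('jobs', 'redis://127.0.0.1:6379');

// register a processor; Bull pulls jobs from Redis and runs them here
jobs.process(async job => {
  console.log('processing job', job.id, job.data);
});

// add a job; it is persisted in Redis until a worker picks it up
jobs.add({ url: 'https://example.com' });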
p-queue is a promise-based queue with concurrency control, similar to 'queue' but leveraging Promises for task handling. It provides an easy-to-use API for managing asynchronous tasks with more modern JavaScript syntax. It's a good alternative if you prefer working with Promises over callbacks.
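A minimal sketch of the same pattern with p-queue's promise-based API (the fetched URL is a placeholder, and the global fetch assumes Node 18+ in an ESM module):

import PQueue from 'p-queue';

const pq = new PQueue({ concurrency: 2 });

// add() takes a function returning a promise and resolves with its result
const status = await pq.add(() => fetch('https://example.com').then(res => res.status));

await pq.onIdle(); // resolves once the queue is empty and all work has settled
console.log(status);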
queue
Asynchronous function queue with adjustable concurrency.
This module exports a class Queue that implements most of the Array API. Pass async functions (ones that accept a callback or return a promise) to an instance's additive array methods. Processing begins when you call q.start().
Do npm run example or npm run dev and open the example directory (and your console) to run the following program:
import Queue from 'queue'

const q = new Queue({ results: [] })

// add jobs using the familiar Array API
q.push(cb => {
  const result = 'two'
  cb(null, result)
})

q.push(
  cb => {
    const result = 'four'
    cb(null, result)
  },
  cb => {
    const result = 'five'
    cb(null, result)
  }
)

// jobs can accept a callback or return a promise
q.push(() => {
  return new Promise((resolve, reject) => {
    const result = 'one'
    resolve(result)
  })
})

q.unshift(cb => {
  const result = 'one'
  cb(null, result)
})

q.splice(2, 0, cb => {
  const result = 'three'
  cb(null, result)
})

// use the timeout feature to deal with jobs that
// take too long or forget to execute a callback
q.timeout = 100

q.addEventListener('timeout', e => {
  console.log('job timed out:', e.detail.job.toString().replace(/\n/g, ''))
  e.detail.next()
})

q.push(cb => {
  setTimeout(() => {
    console.log('slow job finished')
    cb()
  }, 200)
})

q.push(cb => {
  console.log('forgot to execute callback')
})

// jobs can also override the queue's timeout
// on a per-job basis
function extraSlowJob (cb) {
  setTimeout(() => {
    console.log('extra slow job finished')
    cb()
  }, 400)
}

extraSlowJob.timeout = 500
q.push(extraSlowJob)

// jobs can also opt-out of the timeout altogether
function superSlowJob (cb) {
  setTimeout(() => {
    console.log('super slow job finished')
    cb()
  }, 1000)
}

superSlowJob.timeout = null
q.push(superSlowJob)

// get notified when jobs complete
q.addEventListener('success', e => {
  console.log('job finished processing:', e.detail.toString().replace(/\n/g, ''))
  console.log('The result is:', e.detail.result)
})

// begin processing, get notified on end / failure
q.start(err => {
  if (err) throw err
  console.log('all done:', q.results)
})
npm install queue
yarn add queue
npm test
npm run dev # for testing in a browser, open the test directory (and your console)
const q = new Queue([opts])
Constructor. opts may contain initial values for the following properties (a brief example follows the list):
q.concurrency
q.timeout
q.autostart
q.results
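For example, a brief sketch constructing a queue with all four options set (the values are arbitrary):

import Queue from 'queue'

const q = new Queue({
  concurrency: 2,  // run at most two jobs at once
  timeout: 2000,   // give each job 2 seconds to call its callback
  autostart: true, // begin processing as soon as jobs are pushed
  results: []      // collect each job's results here
})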
q.start([cb])
Explicitly starts processing jobs and provides feedback to the caller when the queue empties or an error occurs. If cb is not passed, a promise will be returned.
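Both forms, sketched briefly (the log messages are illustrative):

// callback style
q.start(err => {
  if (err) throw err
  console.log('queue drained')
})

// or, promise style: when no callback is passed, start() returns a promise
// q.start().then(() => console.log('queue drained'))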
q.stop()
Stops the queue. It can be resumed with q.start().
q.end([err])
Stops and empties the queue immediately.
Array
Mozilla's MDN documentation describes how these Array methods work. Note that slice does not copy the queue.
q.push(element1, ..., elementN)
q.unshift(element1, ..., elementN)
q.splice(index, howMany[, element1[, ...[, elementN]]])
q.pop()
q.shift()
q.slice(begin[, end])
q.reverse()
q.indexOf(searchElement[, fromIndex])
q.lastIndexOf(searchElement[, fromIndex])
q.concurrency
Max number of jobs the queue should process concurrently, defaults to Infinity.
q.timeout
Milliseconds to wait for a job to execute its callback. This can be overridden by specifying a timeout property on a per-job basis.
q.autostart
Ensures the queue is always running if jobs are available. Useful in situations where you are using a queue only for concurrency control.
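A brief sketch of that pattern, using the queue purely as a concurrency limiter (the urls array and the fetch calls are placeholder assumptions):

import Queue from 'queue'

const q = new Queue({ concurrency: 5, autostart: true })

// with autostart, jobs begin as soon as they are pushed;
// no explicit q.start() call is needed
const urls = ['https://example.com/a', 'https://example.com/b'] // placeholder data
for (const url of urls) {
  q.push(() => fetch(url)) // jobs may return promises
}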
q.results
An array to store each job's callback arguments (its results) in.
q.length
Jobs pending + jobs to process (readonly).
q.dispatchEvent(new QueueEvent('start', { job }))
Immediately before a job begins to execute.
q.dispatchEvent(new QueueEvent('success', { result: [...result], job }))
After a job executes its callback.
q.dispatchEvent(new QueueEvent('error', { err, job }))
After a job passes an error to its callback.
q.dispatchEvent(new QueueEvent('timeout', { next, job }))
After q.timeout
milliseconds have elapsed and a job has not executed its callback.
q.dispatchEvent(new QueueEvent('end', { err }))
After all jobs have been processed.
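These events can be observed with addEventListener; a brief sketch of listeners for the start, error, and end events (the log messages are illustrative):

q.addEventListener('start', e => {
  console.log('job starting:', e.detail.job)
})

q.addEventListener('error', e => {
  console.error('job failed:', e.detail.err)
})

q.addEventListener('end', e => {
  console.log('queue finished', e.detail.err ? 'with an error' : 'cleanly')
})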
The latest stable release is published to npm. Abbreviated changelog below:
- Add start event before a job begins (@joelgriffith)
- Add timeout property on a job to override the queue's timeout (@joelgriffith)
- Change default concurrency to Infinity
- Allow q.start() to accept an optional callback executed on q.emit('end')
FAQs
asynchronous function queue with adjustable concurrency
The npm package queue receives a total of 4,098,699 weekly downloads. As such, queue's popularity is classified as popular.
We found that queue demonstrates an unhealthy version release cadence and low project activity, because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.