promise-stream-queue
Promise Stream. Queue promises and retrieve the resolved/rejected ones in the inserted order.
This module serializes promises while still allowing them to execute asynchronously. Unlike queues where promises are executed serially, one after another, this module inserts promises into a queue in a given order, lets them execute concurrently, and emits the results in the same order in which they were inserted. As soon as the promise at the head of the queue is resolved, it is released and the queue moves on to the next one.
If a promise never resolves or rejects, the stream's execution timeout releases it from the queue.
This module is compatible with any Promises/A+ library.
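The core idea, ordered output from concurrently running promises, can be sketched in plain JavaScript. This is an illustration of the concept only (the hypothetical makeOrderedQueue helper is not part of the library, which additionally handles timeouts, kill/drain, and events):

```javascript
// Minimal sketch of ordered output: results come back in push order,
// even though the promises may settle in any order.
function makeOrderedQueue(onResult) {
  const queue = [];
  let draining = false;
  async function drain() {
    if (draining) return;
    draining = true;
    // Always await the head of the queue, so output order == insertion order.
    while (queue.length) {
      const head = queue.shift();
      try {
        onResult(null, await head);
      } catch (err) {
        onResult(err);
      }
    }
    draining = false;
  }
  return { push(promise) { queue.push(promise); drain(); } };
}

// Usage: the second promise settles first, but "A" is reported before "B".
const q = makeOrderedQueue((err, res) => console.log(res));
q.push(new Promise(r => setTimeout(() => r("A"), 30)));
q.push(new Promise(r => setTimeout(() => r("B"), 5)));
```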
npm install promise-stream-queue
Suppose we execute 10 asynchronous tasks:
var arr = [1,2,3,4,5,6,7,8,9,10];
var promises = arr.map(i=>{
return new Promise((resolve,reject)=>{
setTimeout(()=>{
resolve("Promise => "+i);
},Math.floor(Math.random()*1000)); // Math.random() takes no arguments; multiply to get 0-1000 ms
});
});
The above code creates 10 promises, each resolved after a random timeout of up to 1 second.
Then we could wait for all of them to be resolved with Promise.all, and then iterate over the results in the same order as the "promises" array:
Promise.all(promises).then(results=>{
results.forEach(res=>console.log(res));
});
That's OK, but we had to wait for all the promises to be resolved. What if we need to iterate over the results as soon as they are available?
We could simply take the head element of the array, wait for its resolution, and then move on to the next:
function iterate() {
  if (promises.length === 0) return; // stop when the queue is empty
  var promise = promises.shift();
  promise.then(res=>{
    // Process response
    doSomething(res);
    // Move to next
    iterate();
  });
}
Here the problem is that the array of promises is fixed, and we can't handle issues such as promises that never resolve or reject. What we want is a continuous stream into which promises are pushed, and from which results are retrieved in the same order they were inserted:
const Stream = require("promise-stream-queue");
// Creates the stream with max execution of 5 secs per promise
var stream = new Stream(5000); // Execution timeout of 5000 ms
var i = 0;
setInterval(()=>{
  var n = ++i; // capture the current counter value for this promise
  stream.push(new Promise((resolve,reject)=>{
    var randomTimeout = Math.floor(Math.random()*1000);
    setTimeout(()=>resolve("Promise => "+n),randomTimeout);
  }));
},100); // push a new promise every 100 ms
stream.forEach((err,res,ex)=>{
console.log(res);
});
Now the stream.forEach method will asynchronously iterate over the stream of ordered promises. The iteration never ends as long as promises are being pushed to the stream.
Creates a new stream with a maximum execution time of timeout milliseconds per promise. If a promise fails to resolve or reject within this timeout, it is discarded and rejected with a timeout error.
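The timeout behavior can be sketched with Promise.race. This is an illustration under stated assumptions, not the library's actual implementation (the withTimeout name and the "timeout" error message are hypothetical):

```javascript
// Sketch: guard a promise with an execution timeout, as the stream does
// for promises that never settle.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((resolve, reject) => {
    timer = setTimeout(() => reject(new Error("timeout")), ms);
  });
  // Whichever settles first wins; clear the timer either way.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// A promise that never settles is rejected after 50 ms.
withTimeout(new Promise(() => {}), 50).catch(err => console.log(err.message));
```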
Pushes a promise to the stream.
Searches for a previously pushed promise and discards it. The promise is not removed from the stream, but it will be immediately rejected with a kill error.
Closes the stream. Newly pushed promises will be ignored and the stream will be marked as closed. Pending promises are still processed.
Drains the stream. Rejects all the pending promises of the queue.
Returns a snapshot of the stream as a static array of promises.
Iterates infinitely and asynchronously over the stream.
Same as before, but now the next argument is a function that must be called in order to move to the next element. This is useful when we want to wait for the callback to finish its processing before moving on to the next promise.
The callback function is invoked when the stream is closed and empty. Useful when the stream has been closed and we want to know when all its promises have been processed.
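The next-callback pattern can be sketched in plain JavaScript. The consumeQueue helper below is hypothetical, a conceptual stand-in for the library's forEach-with-next form, showing how the consumer paces the iteration:

```javascript
// Sketch of the next-callback pattern: the consumer controls pacing by
// invoking next() only when it has finished processing the current result.
function consumeQueue(promises, onItem, onDone) {
  function step() {
    if (!promises.length) return onDone();
    const head = promises.shift();
    head.then(
      res => onItem(null, res, step),  // consumer calls step() (next) to advance
      err => onItem(err, null, step)
    );
  }
  step();
}

// Usage: each result is processed, then next() releases the following one.
const jobs = [Promise.resolve("first"), Promise.resolve("second")];
consumeQueue(jobs, (err, res, next) => {
  console.log(res);
  next(); // move to the next promise
}, () => console.log("all done"));
```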
Returns a full implementation of a Node.js Duplex stream (Writable and Readable). The stream is in object mode by default, and can be given an optional transform callback before piping to another stream.
var stream = new Stream().toStream({readableObjectMode:false});
// Transform callback: serialize each resolved value as a JSON line
function trfn(data,callback) {
  callback(JSON.stringify(data)+"\n");
}
// Pipe stream to stdout
stream.transform(trfn).pipe(process.stdout);
// Write promise to stream
stream.write(new Promise((resolve,reject)=>{
setTimeout(()=>{
resolve({data:"This is a json data object"});
},100);
}));
Fired when the head promise is resolved
stream.on("resolve",res=>console.log(res));
Fired when the head promise is rejected
stream.on("reject",err=>console.log(err));
Fired when the head promise has thrown an error
stream.on("catch",ex=>console.log(ex));
Fired when the stream has been closed
stream.on("closed",()=>console.log('Closed'));
Fired when the stream is closed and all promises have been processed
stream.on("done",()=>console.log('Done'));