remote-web-streams
Web streams that work across web workers and <iframe>s.
Suppose you want to process some data that you've downloaded somewhere. The processing is quite CPU-intensive, so you want to do it inside a worker. No problem, the web has you covered with postMessage!
// main.js
(async () => {
  const response = await fetch('./some-data.txt');
  const data = await response.text();
  const worker = new Worker('./worker.js');
  worker.onmessage = (event) => {
    const output = event.data;
    const results = document.getElementById('results');
    results.appendChild(document.createTextNode(output)); // tadaa!
  };
  worker.postMessage(data);
})();
// worker.js
self.onmessage = (event) => {
  const input = event.data;
  const output = process(input); // do the actual work
  self.postMessage(output);
};
All is good: your processing does not block the main thread, so your web page remains responsive. However, it takes quite a long time before the results show up: first all of the data needs to be downloaded, then all that data needs to be processed, and finally everything is shown on the page. Wouldn't it be nice if we could already show something as soon as some of the data has been downloaded and processed?
Normally, you'd tackle this by reading the input as a stream, piping it through one or more transform streams, and finally displaying the results as they come in.
// main.js
(async () => {
  const response = await fetch('./some-data.txt');
  await response.body
    .pipeThrough(new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(process(chunk)); // do the actual work
      }
    }))
    .pipeTo(new WritableStream({
      write(chunk) {
        const results = document.getElementById('results');
        results.appendChild(document.createTextNode(chunk)); // tadaa!
      }
    }));
})();
Now you can see the first results as they come in, but your processing is blocking the main thread again! Can we get the best of both worlds: process data as it comes in, but off the main thread?
Enter: remote-web-streams. With this library, you can create pairs of readable and writable streams, where you write chunks to a writable stream inside one context and read those chunks from a readable stream inside a different context. Functionally, such a pair behaves just like an identity transform stream, and you can use and compose them just like any other stream.
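For comparison, here is what that identity behaviour looks like within a single context, using a plain TransformStream constructed with no transformer (standard Streams API; runnable in modern browsers and Node.js 18+):

```javascript
// A TransformStream constructed without a transformer is an identity pair:
// every chunk written to `writable` comes out of `readable` unchanged.
// A remote-web-streams pair fulfils the same contract, but with the two
// ends living in different contexts.
const { readable, writable } = new TransformStream();

const writer = writable.getWriter();
writer.write('hello');
writer.write('world');
writer.close();

const received = [];
(async () => {
  const reader = readable.getReader();
  let result;
  while (!(result = await reader.read()).done) {
    received.push(result.value);
  }
  console.log(received); // [ 'hello', 'world' ]
})();
```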
The basic steps for setting up a pair of linked streams are:
1. Construct a RemoteReadableStream. This returns two objects:
   - writablePort: a MessagePort which must be used to construct the linked WritableStream inside the other context
   - readable: a ReadableStream which will read the chunks written by that linked WritableStream
// main.js
import { RemoteReadableStream } from 'remote-web-streams';
const { readable, writablePort } = new RemoteReadableStream();
2. Transfer the writablePort to the other context, and instantiate the linked WritableStream in that context using fromWritablePort.
// main.js
const worker = new Worker('./worker.js');
worker.postMessage({ writablePort }, [writablePort]);
// worker.js
import { fromWritablePort } from 'remote-web-streams';
self.onmessage = (event) => {
  const { writablePort } = event.data;
  const writable = fromWritablePort(writablePort);
};
3. Use the streams. As soon as you write something to the writable inside one context, the readable in the other context will receive it.
// worker.js
const writer = writable.getWriter();
writer.write('hello');
writer.write('world');
writer.close();
// main.js
(async () => {
  const reader = readable.getReader();
  console.log(await reader.read()); // { done: false, value: 'hello' }
  console.log(await reader.read()); // { done: false, value: 'world' }
  console.log(await reader.read()); // { done: true, value: undefined }
})();
You can also create a RemoteWritableStream. This is the complement to RemoteReadableStream:
- You construct a RemoteWritableStream, which returns a readablePort together with a WritableStream (instead of a readable one).
- You transfer the readablePort to the other context, and instantiate the linked ReadableStream with fromReadablePort inside that context.
// main.js
import { RemoteWritableStream } from 'remote-web-streams';
const { writable, readablePort } = new RemoteWritableStream();
worker.postMessage({ readablePort }, [readablePort]);
const writer = writable.getWriter();
// ...
// worker.js
import { fromReadablePort } from 'remote-web-streams';
self.onmessage = (event) => {
  const { readablePort } = event.data;
  const readable = fromReadablePort(readablePort);
  const reader = readable.getReader();
  // ...
};
In the basic setup, we create one pair of streams and transfer one end to the worker. However, it's also possible to set up multiple pairs and transfer them all to a worker.
This opens up interesting possibilities. We can use a RemoteWritableStream to write chunks to a worker, let the worker transform them using one or more TransformStreams, and then read those transformed chunks back on the main thread using a RemoteReadableStream. This allows us to move one or more CPU-intensive TransformStreams off the main thread, and turn them into a "remote transform stream".
To demonstrate these "remote transform streams", we set one up to solve the original problem statement:
1. Create a RemoteReadableStream and a RemoteWritableStream on the main thread, and transfer their ports to the worker.
2. Inside the worker, connect the received readable to the writable by piping it through one or more TransformStreams.
3. On the main thread, write the input data to the writable and read the transformed data from the readable. Pro-tip: we can use .pipeThrough({ readable, writable }) for this!
// main.js
import { RemoteReadableStream, RemoteWritableStream } from 'remote-web-streams';
(async () => {
  const worker = new Worker('./worker.js');
  // create a stream to send the input to the worker
  const { writable, readablePort } = new RemoteWritableStream();
  // create a stream to receive the output from the worker
  const { readable, writablePort } = new RemoteReadableStream();
  // transfer the other ends to the worker
  worker.postMessage({ readablePort, writablePort }, [readablePort, writablePort]);
  const response = await fetch('./some-data.txt');
  await response.body
    // send the downloaded data to the worker
    // and receive the results back
    .pipeThrough({ readable, writable })
    // show the results as they come in
    .pipeTo(new WritableStream({
      write(chunk) {
        const results = document.getElementById('results');
        results.appendChild(document.createTextNode(chunk)); // tadaa!
      }
    }));
})();
// worker.js
import { fromReadablePort, fromWritablePort } from 'remote-web-streams';
self.onmessage = async (event) => {
  // create the input and output streams from the transferred ports
  const { readablePort, writablePort } = event.data;
  const readable = fromReadablePort(readablePort);
  const writable = fromWritablePort(writablePort);
  // process data
  await readable
    .pipeThrough(new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(process(chunk)); // do the actual work
      }
    }))
    .pipeTo(writable); // send the results back to main thread
};
With this setup, we achieve the desired goals: the results are shown as fast as possible, and your web page stays snappy. Great success! 🎉
The library works its magic by creating a MessageChannel between the WritableStream and the ReadableStream. The writable end sends a message to the readable end whenever a new chunk is written, so the readable end can enqueue it for reading. Similarly, the readable end sends a message to the writable end whenever it needs more data, so the writable end can release any backpressure.
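That protocol can be sketched in a single context with a hand-rolled pair of streams linked by a MessageChannel. This is an illustrative approximation of the design described above, not the library's actual source; the function name createRemotePair and the message shapes ('pull', { type: 'chunk' }, { type: 'close' }) are made up for the sketch (runnable in modern browsers and Node.js 18+):

```javascript
// Illustrative sketch only: a pair of streams linked by a MessageChannel,
// mimicking the described design. Not the library's real implementation.
function createRemotePair() {
  const { port1, port2 } = new MessageChannel();

  // Writable end: counts 'pull' signals from the readable end and only
  // posts a chunk once one is available, so backpressure propagates.
  let pulls = 0;
  let notifyPull = null;
  port1.onmessage = () => {
    pulls++;
    if (notifyPull) {
      const resolve = notifyPull;
      notifyPull = null;
      resolve();
    }
  };
  const writable = new WritableStream({
    async write(chunk) {
      if (pulls === 0) {
        // wait until the readable end asks for more data
        await new Promise((resolve) => { notifyPull = resolve; });
      }
      pulls--;
      port1.postMessage({ type: 'chunk', chunk });
    },
    close() {
      port1.postMessage({ type: 'close' });
    },
  });

  // Readable end: asks the writable end for a chunk whenever the stream
  // wants more data, and enqueues whatever comes back.
  const readable = new ReadableStream({
    pull(controller) {
      return new Promise((resolve) => {
        port2.onmessage = (event) => {
          if (event.data.type === 'chunk') {
            controller.enqueue(event.data.chunk);
          } else {
            controller.close();
            port2.close(); // closing the channel lets a Node.js process exit
          }
          resolve();
        };
        port2.postMessage('pull');
      });
    },
  });

  return { readable, writable };
}
```

Because the writable end only posts a chunk after receiving a 'pull' signal, a slow reader automatically throttles a fast writer, which is exactly the backpressure propagation described above.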
FAQs
Web streams that work across web workers and iframes.
The npm package remote-web-streams receives a total of 239 weekly downloads. As such, remote-web-streams is classified as not popular. We found that remote-web-streams demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.