
fast-async-zlib
Speed up zlib operations by running them using the sync APIs but in a Worker.
npm install fast-async-zlib
Works similarly to the core zlib module, except it uses a Worker to batch pending gzips, which can be quite a bit faster than using the normal zlib.gzip(data, cb) API.
const ZLibWorker = require('fast-async-zlib')

const z = new ZLibWorker({
  maxBatchBytes: 1024 * 1024 // how large a batch buffer should be used (1 MB default)
})

const buf = await z.gzip('some data')
console.log('gzipped:', buf)
There is a small benchmark included that compares three approaches to gzipping 100k ~1kb strings. On my laptop it produces the following result:
running bench
using core sync: 3.383s
using core async: 4.640s
using worker: 2.870s
re-running bench
using core sync: 3.873s
using core async: 4.843s
using worker: 2.929s
That is, worker.gzip is ~15-20% faster than zlib.gzipSync and ~40% faster than zlib.gzip(data, cb).
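The bundled bench script isn't reproduced here, but a comparison along these lines covers the same three cases. This is a minimal sketch: the payload shape, iteration count and timer labels are assumptions, not the actual bench code.

const zlib = require('zlib')
const ZLibWorker = require('fast-async-zlib')

// assumption: 100k ~1kb payloads, roughly matching the bench described above
const payloads = Array.from({ length: 100000 }, (_, i) => Buffer.alloc(1024, i & 0xff))

async function main () {
  console.time('using core sync')
  for (const p of payloads) zlib.gzipSync(p)
  console.timeEnd('using core sync')

  console.time('using core async')
  await Promise.all(payloads.map(p => new Promise((resolve, reject) => {
    zlib.gzip(p, err => err ? reject(err) : resolve())
  })))
  console.timeEnd('using core async')

  const z = new ZLibWorker()
  console.time('using worker')
  await Promise.all(payloads.map(p => z.gzip(p)))
  console.timeEnd('using worker')
  z.destroy()
}

main()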
const z = new ZLibWorker([options])
Create a new worker instance. Will use a Worker thread in the background to run the actual gzip, using a SharedArrayBuffer to pass data back and forth. Options include:
{
  maxBatch: 512,             // maximum number of entries to batch per trip to the worker
  maxBatchBytes: 1024 * 1024 // how much memory to use for the shared array buffer (1 MB)
}
Note that maxBatchBytes must be larger than the largest payload you pass to z.gzip(payload), otherwise that method will throw an exception. If this is a big problem for you, open an issue and we'll see if we can make the buffer autogrow easily.
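For example, a minimal sketch of constructing a worker with these options and hitting the size limit. Whether an oversized call throws synchronously or rejects the returned promise isn't specified above, so the try/catch around await is written to cover both.

const ZLibWorker = require('fast-async-zlib')

const z = new ZLibWorker({
  maxBatch: 512,             // at most 512 pending gzips per round trip
  maxBatchBytes: 1024 * 1024 // 1 MB shared buffer, so every payload must fit in 1 MB
})

async function main () {
  await z.gzip(Buffer.alloc(64 * 1024)) // fine, well under maxBatchBytes

  try {
    await z.gzip(Buffer.alloc(2 * 1024 * 1024)) // larger than the shared buffer
  } catch (err) {
    console.error('payload exceeds maxBatchBytes:', err.message)
  }

  z.destroy()
}

main()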
const buf = await z.gzip(inp)
Gzip a string or buffer using the worker.
z.destroy()
Fully destroy the worker. Only needed if you want to get rid of it for some reason while the program is running.
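A small usage sketch of these two calls. It assumes the worker's output is a regular gzip buffer, so core zlib can unpack it to sanity-check the result (which also works around the missing gunzip noted below).

const zlib = require('zlib')
const ZLibWorker = require('fast-async-zlib')

async function main () {
  const z = new ZLibWorker()

  const gzipped = await z.gzip('hello worker')
  // the result is an ordinary gzip buffer, so core zlib can unpack it
  console.log(zlib.gunzipSync(gzipped).toString()) // 'hello worker'

  // tear the worker thread down when it's no longer needed
  z.destroy()
}

main()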
const pool = ZLibWorker.pool(size, [options])
Make a simple worker pool of the given size. Has the same API as ZLibWorker but will use size workers behind the scenes to spread out the load.
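A minimal sketch of using a pool. The pool size of 4 and the inputs are arbitrary, and pool.destroy() is assumed to exist only because the pool mirrors the ZLibWorker API.

const ZLibWorker = require('fast-async-zlib')

async function main () {
  // spread the load across 4 workers; used just like a single ZLibWorker
  const pool = ZLibWorker.pool(4, { maxBatchBytes: 1024 * 1024 })

  const inputs = ['a', 'b', 'c', 'd'].map(s => s.repeat(1000))
  const results = await Promise.all(inputs.map(s => pool.gzip(s)))
  console.log(results.map(buf => buf.length))

  pool.destroy()
}

main()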
If you have a need for gunzip, inflate, deflate, etc., open an issue and we'll see about adding it.
License
MIT
FAQs
Speed up zlib operations by running them using the sync APIs but in a Worker.
We found that fast-async-zlib demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.