compressing
The missing compressing and uncompressing lib for node.
Currently supported: tar, gzip, tgz and zip.
npm install compressing
Compress a single file
Using gzip as an example; tar, tgz and zip work the same way.
promise style
const compressing = require('compressing');
// compress a file
compressing.gzip.compressFile('file/path/to/compress', 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);
// compress a file buffer
compressing.gzip.compressFile(buffer, 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);
// compress a stream
compressing.gzip.compressFile(stream, 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);
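The other formats share the same compressFile call shape; a minimal sketch using zip (paths and callbacks are placeholders, as above):

const compressing = require('compressing');

// same API as the gzip example, just a different namespace and extension
compressing.zip.compressFile('file/path/to/compress', 'path/to/destination.zip')
  .then(compressDone)
  .catch(handleError);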
stream style
const compressing = require('compressing');
new compressing.gzip.FileStream({ source: 'file/path/to/compress' })
.on('error', handleError)
.pipe(fs.createWriteStream('path/to/destination.gz'))
.on('error', handleError);
// It's a transform stream, so you can pipe to it
fs.createReadStream('file/path/to/compress')
.on('error', handleError)
.pipe(new compressing.gzip.FileStream())
.on('error', handleError)
.pipe(fs.createWriteStream('path/to/destination.gz'))
.on('error', handleError);
// Take care of stream errors; use pump to handle errors in one place
const pump = require('pump');
const sourceStream = fs.createReadStream('file/path/to/compress');
const gzipStream = new compressing.gzip.FileStream();
const destStream = fs.createWriteStream('path/to/destination.gz');
pump(sourceStream, gzipStream, destStream, handleError);
Compress a dir
Using tar as an example; tgz and zip work the same way.
Gzip only supports compressing a single file; if you want to compress a directory, use tgz instead (see the tgz sketch after the tar example below).
promise style
const compressing = require('compressing');
compressing.tar.compressDir('dir/path/to/compress', 'path/to/destination.tar')
.then(compressDone)
.catch(handleError);
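As noted above, gzip cannot compress a directory, so here is a minimal sketch of the tgz alternative (paths and callbacks are placeholders):

const compressing = require('compressing');

// tgz = tar + gzip: packs the directory into a tar and gzips the result
compressing.tgz.compressDir('dir/path/to/compress', 'path/to/destination.tgz')
  .then(compressDone)
  .catch(handleError);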
stream style
const compressing = require('compressing');
const tarStream = new compressing.tar.Stream();
tarStream.addEntry('dir/path/to/compress');
tarStream
.on('error', handleError)
.pipe(fs.createWriteStream('path/to/destination.tar'))
.on('error', handleError);
// Take care of stream errors; use pump to handle errors in one place
const pump = require('pump');
const tarStream = new compressing.tar.Stream();
tarStream.addEntry('dir/path/to/compress');
const destStream = fs.createWriteStream('path/to/destination.tar');
pump(tarStream, destStream, handleError);
Stream is very powerful: you can compress multiple entries with it:
const tarStream = new compressing.tar.Stream();
// dir
tarStream.addEntry('dir/path/to/compress');
// file
tarStream.addEntry('file/path/to/compress');
// buffer
tarStream.addEntry(buffer);
// stream
tarStream.addEntry(stream);
const destStream = fs.createWriteStream('path/to/destination.tar');
pump(tarStream, destStream, handleError);
Uncompress a file
promise style
const compressing = require('compressing');
// uncompress a file
compressing.tgz.uncompress('file/path/to/uncompress.tgz', 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);
// uncompress a file buffer
compressing.tgz.uncompress(buffer, 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);
// uncompress a stream
compressing.tgz.uncompress(stream, 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);
Note: tar, tgz and zip share the same uncompress API shown above, and the destination should be a directory path. Gzip is slightly different: its destination must be a file path or a writable stream.
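For gzip specifically, a minimal sketch where the destination is a single file rather than a directory (paths and callbacks are placeholders):

const compressing = require('compressing');

// gzip holds exactly one file, so uncompress writes to a file path, not a directory
compressing.gzip.uncompress('file/path/to/uncompress.gz', 'path/to/destination/file')
  .then(uncompressDone)
  .catch(handleError);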
And working with urllib is super easy. Let's download a tgz file and uncompress it to a directory:
const urllib = require('urllib');
const targetDir = require('os').tmpdir();
const compressing = require('compressing');
urllib.request('http://registry.npmjs.org/pedding/-/pedding-1.1.0.tgz', {
streaming: true,
followRedirect: true,
})
.then(result => compressing.tgz.uncompress(result.res, targetDir))
.then(() => console.log('uncompress done'))
.catch(console.error);
stream style
const compressing = require('compressing');
const fs = require('fs');
const path = require('path');
const mkdirp = require('mkdirp');

function onEntry(header, stream, next) {
stream.on('end', next);
// header.type => file | directory
// header.name => path name
if (header.type === 'file') {
stream.pipe(fs.createWriteStream(path.join(destDir, header.name)));
} else { // directory
mkdirp(path.join(destDir, header.name), err => {
if (err) return handleError(err);
stream.resume();
});
}
}
new compressing.tgz.UncompressStream({ source: 'file/path/to/uncompress.tgz' })
.on('error', handleError)
.on('finish', handleFinish) // uncompressing is done
.on('entry', onEntry);
// It's a writable stream, so you can pipe to it
fs.createReadStream('file/path/to/uncompress')
.on('error', handleError)
.pipe(new compressing.tgz.UncompressStream())
.on('error', handleError)
.on('finish', handleFinish) // uncompressing is done
.on('entry', onEntry);
Note: tar, tgz and zip share the same streaming uncompress API shown above: it's a writable stream, and entries are emitted one after another while uncompressing. Gzip is slightly different: gzip.UncompressStream is a transform stream, so no entry event is emitted and you can simply pipe it to another stream.
This constraint comes from the gzip format itself, which only supports compressing and uncompressing a single file.
new compressing.gzip.UncompressStream({ source: 'file/path/to/uncompress.gz' })
.on('error', handleError)
.pipe(fs.createWriteStream('path/to/dest/file'))
.on('error', handleError);
API
compressFile(source, dest, opts)
Use this API to compress a single file. This is a convenience method that wraps the FileStream API below, but lets you handle errors in one place.
Params
- source {String|Buffer|Stream} - the file to compress; as in the examples above, it can be a file path, a buffer or a readable stream.
- dest {String|Stream} - compression destination; can be a file path (e.g. /path/to/xx.tgz) or a writable stream.
Returns a promise object.
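Since dest may also be a writable stream, a minimal sketch of passing one (paths and callbacks are placeholders):

const fs = require('fs');
const compressing = require('compressing');

// write the compressed output into an already-created writable stream
compressing.gzip.compressFile('file/path/to/compress', fs.createWriteStream('path/to/destination.gz'))
  .then(compressDone)
  .catch(handleError);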
compressDir(source, dest, opts)
Use this API to compress a directory. This is a convenience method that wraps the Stream API below, but lets you handle errors in one place.
Note: gzip does not have a compressDir method; use tgz instead.
Params
- source {String} - the directory to compress.
- dest {String|Stream} - compression destination; can be a file path (e.g. /path/to/xx.tgz) or a writable stream.

uncompress(source, dest, opts)
Use this API to uncompress a file. This is a convenience method that wraps the UncompressStream API below, but lets you handle errors in one place. RECOMMENDED.
Params
- source {String|Buffer|Stream} - the file to uncompress; can be a file path, a buffer or a readable stream.
- dest {String} - uncompression destination; should be a directory path (e.g. /path/to/xx). When uncompressing gzip, it should be a file path or a writable stream.
- opts.zipFileNameEncoding {String} - only works for the zip format; default is 'utf8'. Set it to the encoding used for the entry names (e.g. GBK) when the archive's file names are not UTF-8.
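A minimal sketch of uncompressing a zip whose entry names are not UTF-8; the 'gbk' value here is just an example encoding:

const compressing = require('compressing');

// decode zip entry names with GBK instead of the default utf8
compressing.zip.uncompress('file/path/to/uncompress.zip', 'path/to/destination/dir', {
  zipFileNameEncoding: 'gbk',
})
  .then(uncompressDone)
  .catch(handleError);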
FileStream
The transform stream to compress a single file.
Note: If you are not very familiar with streams, just use the compressFile() API; errors can be handled in one place.
Common params:
Gzip params:
Tar params:
Tgz params:
tgz.FileStream is a combination of tar.FileStream and gzip.FileStream, so its params are the combination of the tar and gzip params.
Zip params:
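A minimal sketch of tgz.FileStream, which, as noted above, combines the tar and gzip behaviors; the source option follows the gzip.FileStream example earlier:

const fs = require('fs');
const compressing = require('compressing');

// produces a gzip-compressed tar containing the single source file
new compressing.tgz.FileStream({ source: 'file/path/to/compress' })
  .on('error', handleError)
  .pipe(fs.createWriteStream('path/to/destination.tgz'))
  .on('error', handleError);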
Stream
The readable stream to compress anything you need.
Note: If you are not very familiar with streams, just use the compressFile() and compressDir() APIs; errors can be handled in one place.
Gzip only supports compressing a single file, so gzip.Stream is not available.
Constructor
None of the constructors take options.
Instance methods
addEntry(entry, opts)
Params
- entry {String|Buffer|Stream} - the entry to add; as shown in the examples above, it can be a directory path, a file path, a buffer or a stream.
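A minimal sketch using zip.Stream with multiple entries; addEntry usage mirrors the tar example earlier (paths are placeholders):

const fs = require('fs');
const pump = require('pump');
const compressing = require('compressing');

const zipStream = new compressing.zip.Stream();
// entries can be directory paths or file paths, exactly as with tar.Stream
zipStream.addEntry('dir/path/to/compress');
zipStream.addEntry('file/path/to/compress');

const destStream = fs.createWriteStream('path/to/destination.zip');
// pump forwards errors from any stream in the pipeline to one callback
pump(zipStream, destStream, handleError);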
UncompressStream
The writable stream to uncompress anything you need.
Note: If you are not very familiar with streams, just use the uncompress() API; errors can be handled in one place.
Gzip only supports compressing and uncompressing a single file, so gzip.UncompressStream is a transform stream, which differs from the others.
Constructor
Common params:
- source - the source to uncompress; as in the examples above, it can be a file path, or you can pipe a readable stream into the UncompressStream directly.
CAUTION for zip.UncompressStream
Due to the design of the .zip file format, it's impossible to interpret a .zip file without loading all data into memory.
Although the API is in streaming style (to keep the interface consistent), it still loads all data into memory.
See: https://github.com/thejoshwolfe/yauzl#no-streaming-unzip-api
Contributors: shaoshuai0102, fengmk2, popomore, DiamondYuan, bytemain, Ryqsky, ShadyZOZ
This project follows the git-contributor spec, auto updated at Mon Jun 13 2022 13:26:08 GMT+0800.