multerator
Multerator (short for _multipart-iterator_) is a `multipart/form-data` parser for Node.js.
Compatible with Node.js versions >= 10.21.0.
This is an initial README; more documentation will be added over time.
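For context, a raw `multipart/form-data` body that such a parser consumes looks roughly like the following (the boundary value `AaB03x` and field names are illustrative, not taken from multerator's docs):

```txt
--AaB03x
Content-Disposition: form-data; name="my_text_field"

my text value
--AaB03x
Content-Disposition: form-data; name="my_file_field"; filename="image.jpg"
Content-Type: image/jpeg

<binary JPEG bytes>
--AaB03x--
```

Each part is delimited by the boundary token (prefixed with `--`), carries its own headers, and the final delimiter is suffixed with `--`.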
With npm:

```sh
npm install multerator
```

With yarn:

```sh
yarn add multerator
```
```js
const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream:
  const stream = getSomeMultipartStream();
  const boundary = '--------------------------120789128139917295588288';

  // Feed it to multerator:
  const streamParts = multerator({ input: stream, boundary });

  for await (const part of streamParts) {
    if (part.type === 'text') {
      console.log(
        `Got text field "${part.name}" with value "${part.data}"`
      );
    } else {
      console.log(
        `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
      );
      for await (const chunk of part.data) {
        console.log(`Received ${chunk.length} bytes`);
      }
    }
  }
})();
```
| Name | Type | Description |
| --- | --- | --- |
| `options` | `object` (required) | |
| `options.input` | `Readable \| AsyncIterable<Buffer>` (required) | A `Readable` stream or any async iterable of `Buffer` objects. |
| `options.boundary` | `string` (required) | The boundary token by which to separate parts across the contents of the given `options.input`. |
| `options.maxFileSize` | `number` | Default: none. Optional size limit (in bytes) for individual file part bodies. The moment this limit is reached, multerator immediately cuts the input data stream and yields an error of type `ERR_BODY_REACHED_SIZE_LIMIT`. |
| `options.maxFieldSize` | `number` | Default: none. Optional size limit (in bytes) for individual field part bodies. The moment this limit is reached, multerator immediately cuts the input data stream and yields an error of type `ERR_BODY_REACHED_SIZE_LIMIT`. This is a recommended general safety measure, since field part bodies are collected as complete strings in memory, which can be unsafe when dealing with an untrusted data source. |
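The size-limit behavior described above can be pictured as a byte-counting pass over a part's chunk stream. Here is a minimal standalone sketch of that idea — this is not multerator's actual implementation; the `limitBytes` helper name is made up, though the error code matches the one documented above:

```js
// Sketch of a byte-count limit over an async iterable of Buffers.
// Illustrative only; `limitBytes` is not part of multerator's API.
async function* limitBytes(chunks, maxBytes) {
  let total = 0;
  for await (const chunk of chunks) {
    total += chunk.length;
    if (total > maxBytes) {
      const err = new Error('Body reached size limit');
      err.code = 'ERR_BODY_REACHED_SIZE_LIMIT';
      throw err; // the real parser also stops pulling from the input stream
    }
    yield chunk;
  }
}

(async () => {
  // Simulated part body: two 600-byte chunks against a 1000-byte limit
  async function* fakeChunks() {
    yield Buffer.alloc(600);
    yield Buffer.alloc(600);
  }
  try {
    for await (const chunk of limitBytes(fakeChunks(), 1000)) {
      console.log(`got ${chunk.length} bytes`);
    }
  } catch (err) {
    console.log('error code:', err.code);
  }
})();
```

The first chunk passes through; the second pushes the running total over the limit and aborts the iteration with the size-limit error.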
```js
const fs = require('fs');
const { PassThrough } = require('stream');
const FormData = require('form-data');
const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream with help from the form-data package:
  const form = new FormData();
  form.append('my_text_field', 'my text value');
  form.append('my_file_field', fs.createReadStream(`${__dirname}/image.jpg`));

  // Convert the form data instance into a normalized Node.js stream,
  // which is async-iteration-friendly as required for multerator's input:
  const input = form.pipe(new PassThrough());
  const boundary = form.getBoundary();

  // Feed it to multerator:
  try {
    for await (const part of multerator({ input, boundary })) {
      if (part.type === 'text') {
        console.log(
          `Got text field "${part.name}" with value "${part.data}"`
        );
      } else {
        console.log(
          `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
        );
        for await (const chunk of part.data) {
          console.log(`Received ${chunk.length} bytes`);
        }
      }
    }
  } catch (err) {
    console.log('Multipart parsing failed:', err);
  }
})();
```
```js
const { createWriteStream } = require('fs');
const { pipeline } = require('stream');
const { promisify } = require('util');
const express = require('express');
const { multerator } = require('multerator');

const pipelinePromisified = promisify(pipeline);

const expressApp = express();

expressApp.post('/upload', async (req, res) => {
  const contentType = req.headers['content-type'];
  try {
    // Guard against a missing header before calling startsWith on it:
    if (!contentType || !contentType.startsWith('multipart/form-data')) {
      throw new Error(
        '😢 Only requests of type multipart/form-data are allowed'
      );
    }
    const boundary = contentType.split('boundary=')[1];
    const parts = multerator({ input: req, boundary });
    for await (const part of parts) {
      if (part.type === 'file') {
        console.log(
          `Incoming upload: field name: ${part.name}, filename: ${part.filename}, content type: ${part.contentType}`
        );
        await pipelinePromisified(
          part.data,
          createWriteStream(`${__dirname}/uploads/${part.filename}`)
        );
      }
    }
    res.status(200).send({ success: true });
  } catch (err) {
    res.status(500).send({ success: false, error: err.message });
  }
});

expressApp.listen(8080, () => console.log('Server listening on 8080'));
```
...callable with, e.g.:

```sh
curl \
  -F my_file_field=@image.jpg \
  http://127.0.0.1:8080/upload
```
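The Express example above extracts the boundary with a simple `split('boundary=')[1]`, which works for typical clients. Per RFC 2046 the boundary parameter may also be quoted, so a slightly more defensive extraction could look like the following — `extractBoundary` is an illustrative helper, not part of multerator's API:

```js
// Illustrative helper (not part of multerator's API): pull the boundary
// parameter out of a Content-Type header, handling optional quotes.
function extractBoundary(contentType) {
  const match = /;\s*boundary=(?:"([^"]+)"|([^;\s]+))/i.exec(contentType || '');
  return match ? match[1] || match[2] : null;
}

console.log(extractBoundary('multipart/form-data; boundary=--------xyz'));
console.log(extractBoundary('multipart/form-data; boundary="with spaces"'));
console.log(extractBoundary('text/plain')); // no boundary parameter
```

Returning `null` when no boundary is present lets the route reject the request before handing the stream to the parser.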