multerator

Multerator (short for _multipart-iterator_) is a `multipart/form-data` parser for Node.js.

Compatible with Node.js versions >= 10.21.0.

This is an initial README; more documentation will be added over time.

Installation

With npm:

npm install multerator

With yarn:

yarn add multerator

Synopsis

const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream:
  const stream = getSomeMultipartStream();
  const boundary = '--------------------------120789128139917295588288';

  // Feed it to multerator:
  const streamParts = multerator({ input: stream, boundary });

  for await (const part of streamParts) {
    if (part.type === 'text') {
      console.log(
        `Got text field "${part.name}" with value "${part.data}"`
      );
    } else {
      console.log(
        `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
      );
      for await (const chunk of part.data) {
        console.log(`Received ${chunk.length} bytes`);
      }
    }
  }
})();

API

Input parameters

| Name | Type | Description |
| --- | --- | --- |
| `options` | `object` (required) | Options object holding all of the parameters below. |
| `options.input` | `Readable` \| `AsyncIterable<Buffer>` (required) | A `Readable` stream or any async iterable of `Buffer` objects. |
| `options.boundary` | `string` (required) | The boundary token by which to separate parts across the contents of the given `options.input`. |
| `options.maxFileSize` | `number` | Default: none. Optional size limit (in bytes) for individual file part bodies. The moment this limit is reached, multerator immediately cuts off the input data stream and yields an error of type `ERR_BODY_REACHED_SIZE_LIMIT`. |
| `options.maxFieldSize` | `number` | Default: none. Optional size limit (in bytes) for individual field part bodies. The moment this limit is reached, multerator immediately cuts off the input data stream and yields an error of type `ERR_BODY_REACHED_SIZE_LIMIT`. Setting this is a recommended general safety measure, since field part bodies are collected as complete strings in memory, which can be risky when dealing with an untrusted data source. |
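
For instance, both size limits can be passed alongside the required `input` and `boundary` options. The following is a minimal sketch, reusing the placeholder `getSomeMultipartStream()` from the Synopsis; the concrete limit values are illustrative, and it assumes the `ERR_BODY_REACHED_SIZE_LIMIT` error surfaces as a throw from the async iteration:

const { multerator } = require('multerator');

(async () => {
  const input = getSomeMultipartStream(); // any Readable / async iterable of Buffers
  const boundary = 'some-boundary-token'; // illustrative boundary value

  // Cap individual file parts at 10 MB and text field parts at 64 KB:
  const parts = multerator({
    input,
    boundary,
    maxFileSize: 10 * 1024 * 1024,
    maxFieldSize: 64 * 1024,
  });

  try {
    for await (const part of parts) {
      // ...handle parts as shown in the Synopsis above...
    }
  } catch (err) {
    // Assumption: the size-limit violation is reported here as a thrown error of
    // type ERR_BODY_REACHED_SIZE_LIMIT; inspect `err` to see how the type is exposed.
    console.error('Multipart parsing aborted:', err);
  }
})();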

Usage examples

General usage:

const fs = require('fs');
const { PassThrough } = require('stream');
const FormData = require('form-data');
const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream with help from form-data package:
  const form = new FormData();
  form.append('my_text_field', 'my text value');
  form.append('my_file_field', fs.createReadStream(`${__dirname}/image.jpg`));
  const input = form.pipe(new PassThrough()); // Converting the form data instance into a normalized Node.js stream, which is async-iteration-friendly as required for multerator's input
  const boundary = form.getBoundary();

  // Feed it to multerator:
  try {
    for await (const part of multerator({ input, boundary })) {
      if (part.type === 'text') {
        console.log(
          `Got text field "${part.name}" with value "${part.data}"`
        );
      } else {
        console.log(
          `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
        );
        for await (const chunk of part.data) {
          console.log(`Received ${chunk.length} bytes`);
        }
      }
    }
  } catch (err) {
    console.log('Multipart parsing failed:', err);
  }
})();

A bare-bones file upload server with Express:

const { createWriteStream } = require('fs');
const { pipeline } = require('stream');
const { promisify } = require('util');
const express = require('express');
const { multerator } = require('multerator');

const pipelinePromisified = promisify(pipeline);

const expressApp = express();

expressApp.post('/upload', async (req, res) => {
  const contentType = req.headers['content-type'];

  try {
    if (!contentType.startsWith('multipart/form-data')) {
      throw new Error(
        '😢 Only requests of type multipart/form-data are allowed'
      );
    }

    const boundary = contentType.split('boundary=')[1];
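    // Note: the simple split above assumes the boundary parameter is unquoted and
    // comes last in the Content-Type header; a production server might parse the
    // header with a dedicated content-type parser instead.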

    const parts = multerator({ input: req, boundary });

    for await (const part of parts) {
      if (part.type === 'file') {
        console.log(
          `Incoming upload: field name: ${part.name}, filename: ${part.filename}, content type: ${part.contentType}`
        );
        await pipelinePromisified(
          part.data,
          createWriteStream(`${__dirname}/uploads/${part.filename}`)
        );
      }
    }

    res.status(200).send({ success: true });
  } catch (err) {
    res.status(500).send({ success: false, error: err.message });
  }
});

expressApp.listen(8080, () => console.log('Server listening on 8080'));

...callable with, e.g.:

curl \
  -F my_file_field=@image.jpg \
  http://127.0.0.1:8080/upload
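
...or, equivalently, from Node.js using the same `form-data` package shown in the general usage example above (a minimal sketch; the field name and file path simply mirror the curl call):

const fs = require('fs');
const FormData = require('form-data');

const form = new FormData();
form.append('my_file_field', fs.createReadStream(`${__dirname}/image.jpg`));

// form-data can submit itself over HTTP, setting the multipart headers and boundary:
form.submit('http://127.0.0.1:8080/upload', (err, res) => {
  if (err) throw err;
  console.log('Upload responded with status', res.statusCode);
  res.resume(); // drain the response stream
});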
