# fastify-multipart

Fastify plugin to parse the multipart content-type.
Under the hood it uses busboy.
## Install

```
npm i fastify-multipart --save
```
## Usage
```js
const fastify = require('fastify')()
const concat = require('concat-stream')
const fs = require('fs')
const pump = require('pump')

fastify.register(require('fastify-multipart'))

fastify.post('/', function (req, reply) {
  const mp = req.multipart(handler, function (err) {
    if (err) {
      reply.send(err)
      return
    }
    console.log('upload completed')
    reply.code(200).send()
  })

  mp.on('field', function (key, value) {
    console.log('form-data', key, value)
  })

  function handler (field, file, filename, encoding, mimetype) {
    pump(file, fs.createWriteStream('a-destination'))
  }
})

fastify.listen(3000, err => {
  if (err) throw err
  console.log(`server listening on ${fastify.server.address().port}`)
})
```
You can also pass optional arguments to busboy when registering with fastify. This is useful for setting limits on the content that can be uploaded. A full list of available options can be found in the busboy documentation.
```js
fastify.register(require('fastify-multipart'), {
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 1000000, // Max field value size in bytes
    fields: 10,         // Max number of non-file fields
    fileSize: 100,      // Max file size in bytes
    files: 1,           // Max number of file fields
    headerPairs: 2000   // Max number of header key=>value pairs
  }
})
```
If you do set upload limits, be sure to listen for limit events in the handler method. No error or exception is thrown when a limit is reached; instead, the stream is truncated. These events are documented in more detail in the busboy documentation.
```js
mp.on('partsLimit', () => console.log('Maximum number of form parts reached'))
mp.on('filesLimit', () => console.log('Maximum number of files reached'))
mp.on('fieldsLimit', () => console.log('Maximum number of fields reached'))

function handler (field, file, filename, encoding, mimetype) {
  file.on('limit', () => console.log('File size limit reached'))
}
```
Note: if the file size limit is exceeded, the file will not be attached to the body.
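As a concrete sketch of handling a truncated upload (assuming a `fileSize` limit was set at registration; the route path and destination file name below are only placeholders), you can remember that the `limit` event fired and reply accordingly once the upload finishes:

```js
const fs = require('fs')
const pump = require('pump')

fastify.post('/constrained-upload', function (req, reply) {
  let truncated = false

  req.multipart(handler, function (err) {
    if (err) return reply.send(err)
    // busboy truncates the stream instead of erroring, so check the flag here
    if (truncated) return reply.code(413).send(new Error('File too large'))
    reply.code(200).send()
  })

  function handler (field, file, filename, encoding, mimetype) {
    file.on('limit', () => { truncated = true })
    pump(file, fs.createWriteStream('a-destination'))
  }
})
```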
Additionally, you can pass per-request options to the `req.multipart` function:
```js
fastify.post('/', function (req, reply) {
  const options = { limits: { fileSize: 1000 } }

  const mp = req.multipart(handler, done, options)

  function done (err) {
    console.log('upload completed')
    reply.code(200).send()
  }

  function handler (field, file, filename, encoding, mimetype) {
    pump(file, fs.createWriteStream('a-destination'))
  }
})
```
You can also have all the parsed multipart fields and files added to the request body:
```js
const options = {
  addToBody: true, // Attach the parsed fields and files to the request body
  sharedSchemaId: 'MultipartFileType', // Optional shared schema id
  onFile: (fieldName, stream, filename, encoding, mimetype, body) => {
    // If you provide this handler you must consume the stream yourself
    stream.resume()
  },
  limits: { } // busboy limits, as shown above
}

fastify.register(require('fastify-multipart'), options)

fastify.post('/', function (req, reply) {
  console.log(req.body)
  reply.code(200).send()
})
```
The `onFile` and `sharedSchemaId` options are used only when `addToBody: true` is set.

The `onFile` option defines how the file streams are managed:

- if you don't set it, `req.body.<fieldName>[index].data` will be a Buffer with the file data loaded in memory
- if you set it to a function, you must consume the stream, and `req.body.<fieldName>[index].data` will be an empty array

Note: by default, fields containing files are arrays, so even if only one file is uploaded you access it via `req.body.<fieldName>[0].data`, as in the sketch below.
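For example, with `addToBody: true` and no `onFile` handler, a route could read the buffered upload like this. The `/inspect` route path and the `myFile` field name are only assumed examples:

```js
fastify.post('/inspect', function (req, reply) {
  // File fields are arrays, even when a single file is uploaded
  const upload = req.body.myFile[0]

  console.log(upload.filename, upload.encoding, upload.mimetype)
  console.log(Buffer.isBuffer(upload.data)) // true: the data is buffered in memory

  reply.code(200).send()
})
```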
The `sharedSchemaId` option must provide a string ID; a shared schema with that ID will be added to your Fastify instance, so you can use it to validate your route like this:
```js
fastify.post('/upload', {
  schema: {
    body: {
      type: 'object',
      required: ['myStringField', 'myFilenameField'],
      properties: {
        myStringField: { type: 'string' },
        myFilenameField: { type: 'array', items: 'MultipartFileType#' }
      }
    }
  }
}, function (req, reply) {
  reply.send('done')
})
```
The shared schema added will be like this:
```js
{
  type: 'object',
  properties: {
    encoding: { type: 'string' },
    filename: { type: 'string' },
    limit: { type: 'boolean' },
    mimetype: { type: 'string' }
  }
}
```
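For reference, a request against the `/upload` route above could be sent with the `form-data` package; this is just an illustrative sketch, where the field names mirror the schema and the file path is a placeholder:

```js
const fs = require('fs')
const FormData = require('form-data')

const form = new FormData()
form.append('myStringField', 'hello')
form.append('myFilenameField', fs.createReadStream('./example.txt')) // placeholder file

// submit() sends a multipart/form-data POST with the correct boundary header
form.submit('http://localhost:3000/upload', (err, res) => {
  if (err) throw err
  console.log('status code:', res.statusCode)
})
```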
## Acknowledgements

This project is kindly sponsored by:
## License

Licensed under MIT.