fastify-multipart
Fastify plugin to parse the multipart content-type. Supports:
- Async / Await
- Async iterator support to handle multiple parts
- Stream & Disk mode
- Accumulate whole file in memory
- Mode to attach all fields to the request body
- Tested across Linux/Mac/Windows
Under the hood it uses busboy.
Install
npm i fastify-multipart
npm i -D @types/busboy
Usage
If you are looking for the documentation for the legacy callback API, please see here.
const fastify = require('fastify')()
const fs = require('fs')
const util = require('util')
const path = require('path')
const { pipeline } = require('stream')
const pump = util.promisify(pipeline)
fastify.register(require('fastify-multipart'))
fastify.post('/', async function (req, reply) {
  // process a single file upload
  const data = await req.file()

  data.file // file stream
  data.fields // other parsed parts
  data.fieldname
  data.filename
  data.encoding
  data.mimetype

  // consume the stream, for example by writing it to disk
  await pump(data.file, fs.createWriteStream(data.filename))

  reply.send()
})
fastify.listen(3000, err => {
  if (err) throw err
  console.log(`server listening on ${fastify.server.address().port}`)
})
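To try the route above, here is a minimal client sketch. It assumes Node 18+ (for the global fetch, FormData, and Blob), that the server is listening on localhost:3000, and that a local picture.png exists; the field name upload is just a placeholder.

const { readFile } = require('fs/promises')

async function upload () {
  const form = new FormData()
  // append the file contents as a Blob, with a filename for the part
  form.append('upload', new Blob([await readFile('./picture.png')]), 'picture.png')
  const res = await fetch('http://localhost:3000/', { method: 'POST', body: form })
  console.log(res.status)
}

upload()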
You can also pass optional arguments to busboy when registering with Fastify. This is useful for setting limits on the content that can be uploaded. A full list of available options can be found in the busboy documentation.
Note: if the file stream provided by data.file is not consumed, as it is in the example above with the usage of pump, the promise will not be fulfilled at the end of the multipart processing. This behavior is inherited from busboy.
fastify.register(require('fastify-multipart'), {
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 100,     // Max field value size in bytes
    fields: 10,         // Max number of non-file fields
    fileSize: 1000000,  // For multipart forms, the max file size in bytes
    files: 1,           // Max number of file fields
    headerPairs: 2000   // Max number of header key=>value pairs
  }
});
Note: if you set a fileSize limit and you want to know whether the limit was reached, you can:
- listen to data.file.on('limit') (see the sketch after the examples below)
- or check the property data.file.truncated at the end of the stream
- or call data.file.toBuffer() and wait for the error to be thrown
const data = await req.file()
await pump(data.file, fs.createWriteStream(data.filename))
if (data.file.truncated) {
  // you may need to delete the part of the file that was written to disk
  // before the fileSize limit was reached
  reply.send(new fastify.multipartErrors.FilesLimitError());
}
// or
const data = await req.file()
try {
  const buffer = await data.toBuffer()
} catch (err) {
  // fileSize limit reached!
}
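For the first option, a minimal sketch that listens for busboy's limit event on the file stream, assuming the same data returned by req.file():

const data = await req.file()
data.file.on('limit', () => {
  // emitted by busboy when the fileSize limit is reached while the stream is consumed
  console.log('fileSize limit reached')
})
await pump(data.file, fs.createWriteStream(data.filename))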
Additionally, you can pass per-request options to the req.file, req.files, req.saveRequestFiles or req.multipartIterator function.
fastify.post('/', async function (req, reply) {
  const options = { limits: { fileSize: 1000 } };
  const data = await req.file(options)
  await pump(data.file, fs.createWriteStream(data.filename))
  reply.send()
})
Handle multiple file streams
fastify.post('/', async function (req, reply) {
  const parts = req.files()
  for await (const part of parts) {
    await pump(part.file, fs.createWriteStream(part.filename))
  }
  reply.send()
})
Handle multiple file streams and fields
fastify.post('/upload/raw/any', async function (req, reply) {
  const parts = req.parts()
  for await (const part of parts) {
    if (part.file) {
      await pump(part.file, fs.createWriteStream(part.filename))
    } else {
      // part is a regular form field
      console.log(part)
    }
  }
  reply.send()
})
Accumulate whole file in memory
fastify.post('/upload/raw/any', async function (req, reply) {
  const data = await req.file()
  // be careful: the whole file is loaded into memory
  const buffer = await data.toBuffer()
  reply.send()
})
Upload files to disk and work with temporary file paths
This stores all files in the operating system's default directory for temporary files. As soon as the response ends, all files are removed.
fastify.post('/upload/files', async function (req, reply) {
  // stores the files in the tmp dir and returns their metadata
  const files = await req.saveRequestFiles()
  files[0].filepath // path to the temporary file
  files[0].fieldname
  files[0].filename
  files[0].encoding
  files[0].mimetype
  files[0].fields // other parsed parts
  reply.send()
})
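Because the temporary files are removed when the response ends, copy or move them before replying if you need to keep them. A minimal sketch, assuming a writable ./uploads directory; the route path and destination are just placeholders:

const { copyFile } = require('fs/promises')
const path = require('path')

fastify.post('/upload/persist', async function (req, reply) {
  const files = await req.saveRequestFiles()
  for (const file of files) {
    // copy each temporary file to a permanent location before the response ends
    await copyFile(file.filepath, path.join('./uploads', file.filename))
  }
  reply.send()
})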
Handle file size limitation
If you set a fileSize limit, a RequestFileTooLargeError is thrown when the limit is reached.
fastify.post('/upload/files', async function (req, reply) {
  try {
    const file = await req.file({ limits: { fileSize: 17000 } })
    reply.send()
  } catch (error) {
    // error instanceof fastify.multipartErrors.RequestFileTooLargeError
  }
})
If you want to fall back to the handling before 4.0.0, you can disable the throwing behavior by passing throwFileSizeLimit.
Note: it will not affect the behavior of saveRequestFiles().
// disable globally
fastify.register(require('fastify-multipart'), { throwFileSizeLimit: false })

// or disable per request
fastify.post('/upload/file', async function (req, reply) {
  const file = await req.file({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  // the file stream gets truncated and no error is thrown
  reply.send()
})
Parse all fields and assign them to the body
This allows you to parse all fields automatically and assign them to request.body. By default, files are accumulated in memory (be careful!) as Buffer objects. Uncaught errors are handled by Fastify.
fastify.register(require('fastify-multipart'), { attachFieldsToBody: true })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = await req.body.upload.toBuffer() // access a file as a Buffer
  const fooValue = req.body.foo.value // other fields
  const body = Object.fromEntries(
    Object.keys(req.body).map((key) => [key, req.body[key].value])
  ) // request body as plain key-value pairs
})
You can also define an onFile handler to avoid accumulating all files in memory.
async function onFile(part) {
  await pump(part.file, fs.createWriteStream(part.filename))
}

fastify.register(require('fastify-multipart'), { attachFieldsToBody: true, onFile })

fastify.post('/upload/files', async function (req, reply) {
  const fooValue = req.body.foo.value // other fields
})
Note: if you assign all fields to the body and don't define an onFile handler, you won't be able to read the files through streams, as they are already read and their contents are accumulated in memory. You can only use the toBuffer method to read the content. If you try to read from a stream and pipe to a new file, you will obtain an empty new file.
JSON Schema body validation
If you enable attachFieldsToBody and set sharedSchemaId, a shared JSON Schema is added, which can be used to validate parsed multipart fields.
const opts = {
  attachFieldsToBody: true,
  sharedSchemaId: '#mySharedSchema'
}
fastify.register(require('fastify-multipart'), opts)

fastify.post('/upload/files', {
  schema: {
    body: {
      type: 'object',
      required: ['myField'],
      properties: {
        myField: { $ref: '#mySharedSchema' },
        myFiles: { type: 'array', items: fastify.getSchema('mySharedSchema') },
        hello: {
          properties: {
            value: {
              type: 'string',
              enum: ['male']
            }
          }
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})
Access all errors
We export all custom errors via a server decorator fastify.multipartErrors. This is useful if you want to react to specific errors. They are derived from fastify-error and include the correct statusCode property.
fastify.post('/upload/files', async function (req, reply) {
  const { FilesLimitError } = fastify.multipartErrors
})
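For example, a sketch of reacting to a specific multipart error in a Fastify error handler; the 413 status code and the reply message are just illustrative choices:

const { RequestFileTooLargeError } = fastify.multipartErrors

fastify.setErrorHandler(function (error, request, reply) {
  if (error instanceof RequestFileTooLargeError) {
    // handle the file-size error specifically
    reply.code(413).send({ error: 'uploaded file is too large' })
    return
  }
  reply.send(error)
})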
Acknowledgements
This project is kindly sponsored by:
License
Licensed under MIT.