What is fetch-blob?
The fetch-blob npm package is a module that allows you to work with Blob data in a way that is consistent with the browser's Fetch API. It provides a way to create, read, and manipulate binary data in Node.js, which can be useful for tasks such as file uploads, image processing, and other operations that involve handling raw binary data.
What are fetch-blob's main functionalities?
Creating a Blob
This feature allows you to create a new Blob object from raw data. The example code creates a Blob containing the text 'Hello, world!' with a MIME type of 'text/plain'.
const { Blob } = require('fetch-blob');
const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
Reading a Blob as text
This feature allows you to read the contents of a Blob as text. The example code reads the text from the Blob and logs it to the console.
const { Blob } = require('fetch-blob');
const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
blob.text().then((text) => {
console.log(text); // Outputs: Hello, world!
});
Reading a Blob as a Buffer
This feature allows you to read the contents of a Blob as an ArrayBuffer, which can then be converted to a Node.js Buffer. The example code demonstrates how to convert the ArrayBuffer to a Buffer and log it to the console.
const { Blob } = require('fetch-blob');
const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
blob.arrayBuffer().then((buffer) => {
const nodeBuffer = Buffer.from(buffer);
console.log(nodeBuffer); // Outputs: <Buffer 48 65 6c 6c 6f 2c 20 77 6f 72 6c 64 21>
});
Other packages similar to fetch-blob
blob
The 'blob' package is another implementation of the Blob object for Node.js. It is similar to fetch-blob but may have differences in API and supported features.
form-data
The 'form-data' package allows you to create `multipart/form-data` streams that can be used for submitting forms and file uploads in Node.js. It is similar to fetch-blob in that it deals with binary data, but it is more focused on form submission and multipart data.
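For comparison, Node.js 18+ ships global FormData and Blob implementations (via undici) that cover similar ground for many cases; a minimal sketch:

```javascript
// Node.js 18+ provides FormData and Blob as globals (via undici),
// similar in spirit to the form-data package.
const form = new FormData();
form.append('field', 'value');
form.append('file', new Blob(['hello'], { type: 'text/plain' }), 'hello.txt');

console.log(form.get('field')); // Outputs: value
console.log(form.get('file').name); // Outputs: hello.txt
```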
buffer
The 'buffer' package is a userland mirror of the Node.js core `buffer` module, primarily for use in browsers. While not a direct alternative to fetch-blob, it is often used alongside other modules to handle binary data in Node.js and browser applications.
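A short sketch of handling raw bytes with Buffer (standard Node.js API):

```javascript
// Buffers hold raw binary data, mirroring what a Blob stores internally.
const buf = Buffer.from('Hello, world!', 'utf8');
console.log(buf.length); // Outputs: 13
console.log(buf.subarray(0, 5).toString()); // Outputs: Hello
// Buffers are Uint8Array subclasses, so they work directly as blob parts.
console.log(buf instanceof Uint8Array); // Outputs: true
```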
fetch-blob
A Blob implementation in Node.js, originally from node-fetch.
Installation
npm install fetch-blob
Upgrading from 2x to 3x
Updating from 2.x to 3.x should be a breeze, since there are not many changes to the Blob specification itself. The main reason for a major release is a change in coding standards:
- Internal WeakMaps were replaced with private fields
- Internal Buffer.from was replaced with TextEncoder/TextDecoder
- Internal Buffers were replaced with Uint8Arrays
- CommonJS was replaced with ESM
- The Node.js stream returned by calling blob.stream() was replaced with a simple generator function that yields Uint8Arrays (breaking change; read "Differences from other Blobs" for more info)

All of these changes have made the package free of any core Node.js module dependencies, so it is possible to import it straight from a CDN via an http import, without any bundling.
Differences from other Blobs
- Unlike Node.js's buffer.Blob (added in v15.7.0) and the browser's native Blob, this polyfilled version can't be sent via postMessage.
- This Blob version is more permissive: it can be constructed with blob parts that aren't instances of itself; a part only has to look and behave like a blob to be accepted as a blob part.
  - The benefit of this is that you can create other kinds of blob-like items that hold no internal data and have to be read in other ways, such as the BlobDataItem created in from.js, which wraps a file path in a blob-like item and reads it lazily (Node.js plans to implement this as well).
- The blob.stream() method is the most noticeable difference. It returns an AsyncGenerator that yields Uint8Arrays.

The reasoning behind Blob.prototype.stream() is that Node.js readable streams aren't spec-compatible with WHATWG streams, and we didn't want to import the whole WHATWG stream polyfill for Node.js, or browserify Node.js streams for the browsers, picking one flavor over the other. So we opted out of any stream and implemented just the bare minimum of what both streams have in common: the async iterator that yields Uint8Arrays. This is the most isomorphic approach, used with for-await-of loops. It would also be redundant to convert anything to WHATWG streams and then convert it back to Node.js streams when you are working inside of Node.js.

It will probably stay like this until Node.js gets native support for WHATWG streams [1][https://github.com/nodejs/whatwg-stream] and the WHATWG streams spec adds the equivalent of Readable.from(iterable).
But for now, if you really need a Node.js stream, you can get one using this transformation:
import {Readable} from 'stream'
const stream = Readable.from(blob.stream())
But if you don't need a stream, you can just use the async-iterator part, which is isomorphic:
for await (const chunk of blob.stream()) {
console.log(chunk)
}
If you need feature detection to shim this different behavior:
if (Blob.prototype.stream?.constructor?.name === 'AsyncGeneratorFunction') {
  const orig = Blob.prototype.stream
  // Wrap the async generator in a WHATWG ReadableStream
  Blob.prototype.stream = function () {
    const iterator = orig.call(this)
    return new ReadableStream({
      async pull (ctrl) {
        const next = await iterator.next()
        return next.done ? ctrl.close() : ctrl.enqueue(next.value)
      }
    })
  }
}
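The wrapping pattern in the shim can also be exercised standalone; a sketch assuming Node.js 18+ (where ReadableStream and TextDecoder are globals), with illustrative names:

```javascript
// Stand-in for blob.stream(): an async generator of Uint8Arrays
async function * parts () {
  yield new Uint8Array([104, 105]) // "hi"
  yield new Uint8Array([33]) // "!"
}

// Same pattern as the shim: pull chunks from the iterator on demand
function toWhatwgStream (iterator) {
  return new ReadableStream({
    async pull (ctrl) {
      const next = await iterator.next()
      return next.done ? ctrl.close() : ctrl.enqueue(next.value)
    }
  })
}

// Drain the WHATWG stream with its reader and decode the bytes
async function drain (stream) {
  const reader = stream.getReader()
  const decoder = new TextDecoder()
  let text = ''
  for (let r = await reader.read(); !r.done; r = await reader.read()) {
    text += decoder.decode(r.value, { stream: true })
  }
  return text + decoder.decode()
}

drain(toWhatwgStream(parts())).then(text => console.log(text)) // "hi!"
```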
A possible future WHATWG alternative: ReadableStream.from(iterator). It's also possible to delete this method and instead rely on .slice() and .arrayBuffer(), since the implementation has both a public and a private stream method.
Usage

// Default exports
import Blob from 'fetch-blob'
import File from 'fetch-blob/file.js'

// Named exports
import {Blob} from 'fetch-blob'
import {File} from 'fetch-blob/file.js'

// Dynamic import
const {Blob} = await import('fetch-blob')

// Use it like a spec-compliant Blob
const blob = new Blob(['hello, world'])
await blob.text()
await blob.arrayBuffer()
for await (let chunk of blob.stream()) { ... }

// Convert the async iterator to a stream when needed
stream.Readable.from(blob.stream())
globalThis.ReadableStream.from(blob.stream())
Blob part backed by the filesystem

fetch-blob/from.js comes packed with tools to convert any file path into either a Blob or a File.
It will not read the content into memory; it only stats the file for its last-modified date and size.
// The default export is blobFromSync
import blobFromSync from 'fetch-blob/from.js'
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'

// Synchronous and asynchronous variants are available
const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
const fsBlob = await blobFrom('./2-GiB-file.mp4')

// Blob parts can mix filesystem-backed and in-memory sources
const blob = new Blob([fsFile, fsBlob, 'memory', new Uint8Array(10)])
console.log(blob.size)
blobFrom|blobFromSync|fileFrom|fileFromSync(path, [mimetype])
Creating Blobs backed up by other async sources
Our Blob & File classes are more generic than other polyfills, in that they can accept any blob look-alike item. An example of this is that our Blob implementation can be constructed with parts coming from BlobDataItem (i.e. a wrapped file path) or from buffer.Blob. A part does not have to implement all of the Blob methods, just enough that it can be read and understood by our Blob implementation. The minimum requirement is that it has Symbol.toStringTag, size, slice(), and either a stream() or an arrayBuffer() method. If you then wrap it in our Blob or File, as new Blob([blobDataItem]), you get all of the other methods that a Blob or File should implement. An example use case could be a file- or blob-like item backed by a remote HTTP request or a database.
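A hypothetical minimal blob part satisfying those requirements might look like this (the RemotePart class is illustrative, not part of fetch-blob):

```javascript
// Minimal blob look-alike: Symbol.toStringTag, size, slice() and
// arrayBuffer() are all fetch-blob needs to accept it as a blob part
class RemotePart {
  #bytes
  constructor (bytes) { this.#bytes = bytes }
  get [Symbol.toStringTag] () { return 'Blob' }
  get size () { return this.#bytes.length }
  slice (start = 0, end = this.size) {
    return new RemotePart(this.#bytes.subarray(start, end))
  }
  async arrayBuffer () {
    // A real implementation could fetch these bytes lazily from an
    // HTTP endpoint or a database instead of keeping them in memory
    const { buffer, byteOffset, byteLength } = this.#bytes
    return buffer.slice(byteOffset, byteOffset + byteLength)
  }
}

const part = new RemotePart(new TextEncoder().encode('lazy bytes'))
console.log(part.size) // 10
```

Wrapping such a part with fetch-blob's Blob, as new Blob([part]), then provides text(), stream(), and the rest for free.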
See the MDN documentation and the tests for more details on how to use Blob.