fetch-blob
The fetch-blob npm package is a module that allows you to work with Blob data in a way that is consistent with the browser's Fetch API. It provides a way to create, read, and manipulate binary data in Node.js, which can be useful for tasks such as file uploads, image processing, and other operations that involve handling raw binary data.
Creating a Blob
This feature allows you to create a new Blob object from raw data. The example code creates a Blob containing the text 'Hello, world!' with a MIME type of 'text/plain'.
import { Blob } from 'fetch-blob';

const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
Reading a Blob as text
This feature allows you to read the contents of a Blob as text. The example code reads the text from the Blob and logs it to the console.
import { Blob } from 'fetch-blob';

const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
blob.text().then((text) => {
  console.log(text); // Outputs: Hello, world!
});
Reading a Blob as a Buffer
This feature allows you to read the contents of a Blob as an ArrayBuffer, which can then be converted to a Node.js Buffer. The example code demonstrates how to convert the ArrayBuffer to a Buffer and log it to the console.
import { Blob } from 'fetch-blob';

const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
blob.arrayBuffer().then((buffer) => {
  const nodeBuffer = Buffer.from(buffer);
  console.log(nodeBuffer); // Outputs: <Buffer 48 65 6c 6c 6f 2c 20 77 6f 72 6c 64 21>
});
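Blobs can also be sliced without copying the underlying data. A minimal sketch, assuming the Node.js 18+ global Blob (fetch-blob exposes the same slice() method):

```javascript
// slice() returns a new Blob viewing a byte range of the original.
const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
const hello = blob.slice(0, 5, 'text/plain');
console.log(hello.size); // 5
hello.text().then((text) => {
  console.log(text); // Outputs: Hello
});
```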
The 'blob' package is another implementation of the Blob object for Node.js. It is similar to fetch-blob but may have differences in API and supported features.
The 'form-data' package allows you to create `multipart/form-data` streams that can be used for submitting forms and file uploads in Node.js. It is similar to fetch-blob in that it deals with binary data, but it is more focused on form submission and multipart data.
The 'buffer' package is a Node.js core module that provides a way to handle binary data. While not a direct alternative to fetch-blob, it is often used in conjunction with other modules to handle binary data in Node.js applications.
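Since the buffer module and fetch-blob are often used together, here is a short sketch of round-tripping binary data between a Buffer and a Blob (assuming the Node.js 18+ global Blob; fetch-blob behaves the same):

```javascript
// A Buffer is a Uint8Array, so it can be used directly as a Blob part.
const buf = Buffer.from('Hello, world!');
const blob = new Blob([buf], { type: 'application/octet-stream' });
console.log(blob.size); // 13

// And back again: arrayBuffer() yields the raw bytes.
blob.arrayBuffer().then((ab) => {
  console.log(Buffer.from(ab).equals(buf)); // true
});
```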
A Blob implementation in Node.js, originally from node-fetch.
Use the built-in Blob in Node.js 18 and later.
npm install fetch-blob
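On Node.js 18 and later, the global Blob covers the basic use cases without installing anything; a minimal sketch:

```javascript
// No import needed: Blob is a global on Node.js 18+.
const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
console.log(blob.size); // 13
console.log(blob.type); // text/plain
blob.text().then((text) => {
  console.log(text); // Outputs: Hello, world!
});
```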
Updating from v2 to v3 should be a breeze, since there are not many changes to the Blob specification. The main reason for the major release is a change in coding standards:
- internal WeakMaps were replaced with private fields
- internal Buffer.from was replaced with TextEncoder/TextDecoder
- internal Buffers were replaced with Uint8Arrays
- CommonJS was replaced with ESM
- the Node.js stream returned by blob.stream() was replaced with a WHATWG stream (read "Differences from other Blobs" for more info)
Differences from other Blobs
- Unlike buffer.Blob (added in Node.js v15.7.0) and the browser-native Blob, this polyfilled version can't be sent via postMessage.
- This Blob can be constructed from any blob-like part, such as the BlobDataItem created in from.js, which wraps a file path into a blob-like item that is read lazily (Node.js plans to implement this as well).
- blob.stream() is the most noticeable difference: it now returns a WHATWG stream. To keep it as a Node.js stream you would have to do:

import { Readable } from 'stream'
const stream = Readable.from(blob.stream())
// Ways to import
import { Blob } from 'fetch-blob'
import { File } from 'fetch-blob/file.js'
const { Blob } = await import('fetch-blob')
// Ways to read the blob:
const blob = new Blob(['hello, world'])
await blob.text()
await blob.arrayBuffer()
for await (let chunk of blob.stream()) { ... }
blob.stream().getReader().read()
blob.stream().getReader({mode: 'byob'}).read(view)
fetch-blob/from.js comes packed with tools to convert any filepath into either a Blob or a File. It will not read the content into memory; it only stats the file for its size and last-modified date.
// The sync variants use fs.statSync to retrieve size & last modified
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'
const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
const fsBlob = await blobFrom('./2-GiB-file.mp4')
// Not a 4 GiB memory snapshot, just holds references
// points to where data is located on the disk
const blob = new Blob([fsFile, fsBlob, 'memory', new Uint8Array(10)])
console.log(blob.size) // ~4 GiB
blobFrom|blobFromSync|fileFrom|fileFromSync(path, [mimetype])
(requires FinalizationRegistry - node v14.6)
createTemporaryBlob and createTemporaryFile write data to the OS's temporary folder. The data argument can be anything that fsPromises.writeFile supports; Node.js v14.17.0+ also supports writing (async) iterables and passing an AbortSignal, so both Node.js streams and WHATWG streams are accepted. Once the file has been written, you get back a Blob/File handle that references this temporary location on disk. When you no longer hold any reference to the Blob/File and it has been garbage collected, the file is automatically deleted. These files are also unlinked when the process exits.
import { createTemporaryBlob, createTemporaryFile } from 'fetch-blob/from.js'
const req = new Request('https://httpbin.org/image/png')
const res = await fetch(req)
const type = res.headers.get('content-type')
const signal = req.signal
let blob = await createTemporaryBlob(res.body, { type, signal })
// const file = await createTemporaryFile(res.body, 'img.png', { type, signal })
blob = undefined // losing all references will delete the file from disk
createTemporaryBlob(data, { type, signal })
createTemporaryFile(data, FileName, { type, signal, lastModified })
Our Blob & File classes are more generic than other polyfills in that they can accept any blob-look-a-like item. For example, our Blob implementation can be constructed with parts coming from BlobDataItem (aka a filepath) or from buffer.Blob. A part does not have to implement all the methods - just enough that it can be read/understood by our Blob implementation. The minimum requirement is that it has Symbol.toStringTag, size, slice() and stream() methods (the stream method can be as simple as a sync or async iterator that yields Uint8Arrays). If you then wrap it in our Blob or File with new Blob([blobDataItem]), you get all of the other methods a blob or file should implement (text(), arrayBuffer(), type, and a ReadableStream). An example of this could be a file- or blob-like item backed by a remote HTTP request, or by a database.
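As a sketch of that minimum interface, here is a hypothetical in-memory blob-like part (the class name is illustrative, not part of fetch-blob): it implements only Symbol.toStringTag, size, slice() and stream(), yet could be passed as a part to fetch-blob's Blob constructor:

```javascript
// Hypothetical blob look-a-like: just enough surface for fetch-blob's
// Blob to read it. stream() is a sync generator yielding Uint8Arrays.
class LazyBytesItem {
  #bytes;
  constructor (bytes, type = '') {
    this.#bytes = bytes;
    this.type = type;
  }
  get size () { return this.#bytes.length; }
  get [Symbol.toStringTag] () { return 'Blob'; }
  slice (start = 0, end = this.size) {
    return new LazyBytesItem(this.#bytes.subarray(start, end), this.type);
  }
  * stream () { yield this.#bytes; }
}

const part = new LazyBytesItem(new TextEncoder().encode('lazy data'));
console.log(part.size); // 9
// With fetch-blob you could now do: new Blob([part]) and get
// text(), arrayBuffer() and a ReadableStream for free.
```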
See the MDN documentation and tests for more details of how to use the Blob.
FAQs
Blob & File implementation in Node.js, originally from node-fetch.
The npm package fetch-blob receives a total of 4,349,075 weekly downloads; as such, fetch-blob's popularity is classified as popular.
We found that fetch-blob demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 3 open source maintainers collaborating on the project.