
@bugsplat/fetch-blob
Blob & File implementation in Node.js, originally from node-fetch.
Forked from node-fetch/fetch-blob, a CommonJS Blob implementation in Node.js, originally from node-fetch.
Use the built-in Blob in Node.js 18 and later.
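For example, on Node.js 18 and later the built-in Blob is available as a global and is also exported from node:buffer; a minimal sketch:
// Built-in Blob, no polyfill needed on Node.js 18+
import { Blob } from 'node:buffer'
const blob = new Blob(['hello, world'], { type: 'text/plain' })
console.log(await blob.text()) // 'hello, world'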
npm install fetch-blob
Updating from 2 to 3 should be a breeze since there are not many changes to the Blob specification. The main reason for the major release is coding standards:
- internal WeakMaps were replaced with private fields
- internal Buffer.from was replaced with TextEncoder/TextDecoder
- internal buffers were replaced with Uint8Arrays
- CommonJS was replaced with ESM
- the Node stream returned by calling blob.stream() was replaced with WHATWG streams
- (Read "Differences from other blobs" for more info.)
Differences from other blobs:
- Unlike Node.js buffer.Blob (added in v15.7.0) and the browser-native Blob, this polyfilled version can't be sent via postMessage.
- BlobDataItem, created in from.js, wraps a file path into a blob-like item that is read lazily (Node.js plans to implement this as well).
- blob.stream() is the most noticeable difference: it now returns a WHATWG stream. To keep it as a Node stream you would have to do:
import { Readable } from 'stream'
const stream = Readable.from(blob.stream())
// Ways to import
import { Blob } from 'fetch-blob'
import { File } from 'fetch-blob/file.js'
const { Blob } = await import('fetch-blob')
// Ways to read the blob:
const blob = new Blob(['hello, world'])
await blob.text()
await blob.arrayBuffer()
for await (let chunk of blob.stream()) { ... }
blob.stream().getReader().read()
blob.stream().getReader({mode: 'byob'}).read(view)
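In the BYOB read above, view is a buffer you supply for the reader to fill; a small sketch (the buffer size is arbitrary):
// BYOB ("bring your own buffer") read into a caller-supplied Uint8Array
const reader = blob.stream().getReader({ mode: 'byob' })
const view = new Uint8Array(1024)
const { value, done } = await reader.read(view) // value is the filled view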
fetch-blob/from.js comes packed with tools to convert any filepath into either a Blob or a File.
It will not read the content into memory. It will only stat the file for last modified date and file size.
// The default export is sync and uses fs.stat to retrieve size & last modified for the blob
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'
const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
const fsBlob = await blobFrom('./2-GiB-file.mp4')
// Not a 4 GiB memory snapshot, just holds references
// points to where data is located on the disk
const blob = new Blob([fsFile, fsBlob, 'memory', new Uint8Array(10)])
console.log(blob.size) // ~4 GiB
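Because the file-backed parts are read lazily, pulling out a slice only touches those bytes on disk; a sketch building on the blob above:
// Reads ~1 KiB from disk instead of the whole ~4 GiB blob
const firstKiB = await blob.slice(0, 1024).arrayBuffer()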
blobFrom|blobFromSync|fileFrom|fileFromSync(path, [mimetype])
(requires FinalizationRegistry, Node v14.6+)
Both createTemporaryBlob and createTemporaryFile write data to the OS's temporary folder. The arguments can be anything that fsPromises.writeFile supports. Node.js v14.17.0+ also supports writing (async) iterables and passing in an AbortSignal, so both Node.js streams and WHATWG streams work. Once the file has been written, you get back a Blob/File handle that references this temporary location on disk. When you no longer hold a reference to that Blob/File and it has been garbage collected, the file is automatically deleted. These files are also unlinked when the process exits.
import { createTemporaryBlob, createTemporaryFile } from 'fetch-blob/from.js'
const req = new Request('https://httpbin.org/image/png')
const res = await fetch(req)
const type = res.headers.get('content-type')
const signal = req.signal
let blob = await createTemporaryBlob(res.body, { type, signal })
// const file = await createTemporaryFile(res.body, 'img.png', { type, signal })
blob = undefined // losing all references will delete the file from disk
createTemporaryBlob(data, { type, signal })
createTemporaryFile(data, FileName, { type, signal, lastModified })
Our Blob & File classes are more generic than other polyfills in that they can accept any blob-look-a-like item as a part.
An example of this is that our Blob implementation can be constructed with parts coming from BlobDataItem (aka a filepath) or from buffer.Blob. A part does not have to implement all the methods, just enough that it can be read/understood by our Blob implementation. The minimum requirement is that it has Symbol.toStringTag, size, slice() and stream() members (the stream method can be as simple as a sync or async iterator that yields Uint8Arrays). If you then wrap it in our Blob or File with new Blob([blobDataItem]), you get all of the other methods that should be implemented in a blob or file (aka: text(), arrayBuffer(), type and a ReadableStream).
An example of this could be to create a file- or blob-like item coming from a remote HTTP request, or from a database.
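A minimal sketch of such a blob-look-a-like part, based on the requirements above (the object and its contents are made up for illustration):
// Hypothetical blob-like part: just enough for this Blob implementation to read it
const helloPart = {
  [Symbol.toStringTag]: 'Blob',
  size: 5,
  slice () { return this },   // enough for this sketch
  async * stream () {         // sync or async iterator yielding Uint8Arrays
    yield new TextEncoder().encode('hello')
  }
}
const blob = new Blob([helloPart, ', world'])
console.log(await blob.text()) // 'hello, world'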
See the MDN documentation and the tests for more details on how to use the Blob.
FAQs
The npm package @bugsplat/fetch-blob receives a total of 0 weekly downloads. As such, its popularity was classified as not popular.
We found that @bugsplat/fetch-blob demonstrated an unhealthy version release cadence and low project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.