
ipfs-unixfs-engine
JavaScript implementation of the layout and chunking mechanisms used by IPFS
With npm installed, run
$ npm install ipfs-unixfs-engine
Let's create a little directory to import:
$ cd /tmp
$ mkdir foo
$ echo 'hello' > foo/bar
$ echo 'world' > foo/quux
And write the importing logic:
// Dependencies to create a DAG Service (where the dir will be imported into)
const memStore = require('abstract-blob-store')
const Repo = require('ipfs-repo')
const BlockService = require('ipfs-block-service')
const MerkleDag = require('ipfs-merkle-dag')
const fs = require('fs')
const repo = new Repo('', { stores: memStore })
const blockService = new BlockService(repo)
const dagService = new MerkleDag.DAGService(blockService)
const Importer = require('ipfs-unixfs-engine').Importer
const filesAddStream = new Importer(dagService)
// An array to hold the nested file/dir info returned by the importer
// A root DAG Node is received upon completion
const res = []
// Create read streams for the files to import
const rs = fs.createReadStream('/tmp/foo/bar')
const rs2 = fs.createReadStream('/tmp/foo/quux')
const input = { path: '/tmp/foo/bar', content: rs }
const input2 = { path: '/tmp/foo/quux', content: rs2 }
// Listen for the data event from the importer stream
filesAddStream.on('data', (info) => {
res.push(info)
})
// The end event of the stream signals that the importer is done
filesAddStream.on('end', () => {
console.log('Finished importing files!')
})
// Write the file/object tuples into the importer stream
filesAddStream.write(input)
filesAddStream.write(input2)
filesAddStream.end()
When run, the stats of each DAG Node are output on every data event, up to the root:
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
size: 39243,
path: '/tmp/foo/bar' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
size: 59843,
path: '/tmp/foo/quux' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
size: 93242,
path: '/tmp/foo' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
size: 94234,
path: '/tmp' }
const Importer = require('ipfs-unixfs-engine').Importer
The importer is an object Transform stream that accepts objects of the form
{
path: 'a name',
content: (Buffer or Readable stream)
}
The stream will output IPFS DAG Node stats for the nodes as they are added to the DAG Service. When stats on a node are emitted they are guaranteed to have been written into the DAG Service's storage mechanism.
The input's file paths and directory structure will be preserved in the DAG Nodes.
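The tuples themselves are plain JavaScript objects. A minimal sketch (the paths here are made up, and treating an entry without a `content` key as a directory is an assumption, not documented behavior):

```javascript
// A file entry: content may be a Buffer or a Readable stream
const fileEntry = {
  path: 'foo/bar.txt',
  content: Buffer.from('hello\n')
}

// A directory entry: assumed to carry only a path, with no content key
const dirEntry = {
  path: 'foo'
}

console.log(Buffer.isBuffer(fileEntry.content)) // prints true
```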
const memStore = require('abstract-blob-store')
const Repo = require('ipfs-repo')
const BlockService = require('ipfs-block-service')
const MerkleDag = require('ipfs-merkle-dag')
const Exporter = require('ipfs-unixfs-engine').Exporter
const repo = new Repo('', { stores: memStore })
const blockService = new BlockService(repo)
const dagService = new MerkleDag.DAGService(blockService)
// Create an export readable object stream with the hash you want to export and a DAG Service
const filesStream = Exporter(<multihash>, dagService)
// Pipe each exported file's content to stdout
filesStream.on('data', (file) => {
file.content.pipe(process.stdout)
})
const Exporter = require('ipfs-unixfs-engine').Exporter
Uses the given DAG Service to fetch IPFS UnixFS objects by their multihash.
Creates a new readable stream in object mode that outputs objects of the form
{
path: 'a name',
content: (Buffer or Readable stream)
}
Errors are reported as on any normal stream: listen for the 'error' event.
Feel free to join in. All welcome. Open an issue!
This repository falls under the IPFS Code of Conduct.