IPFS unixFS Engine


JavaScript implementation of the layout and chunking mechanisms used by IPFS

Table of Contents

- Install
- Usage
- Contribute
- License

Install

With npm installed, run

$ npm install ipfs-unixfs-engine

Usage

Example Importer

Let's create a little directory to import:

$ cd /tmp
$ mkdir foo
$ echo 'hello' > foo/bar
$ echo 'world' > foo/quux

And write the importing logic:

// Dependencies to create a DAG Service (where the dir will be imported into)
const memStore = require('abstract-blob-store')
const Repo = require('ipfs-repo')
const Block = require('ipfs-block')
const BlockService = require('ipfs-block-service')
const MerkleDag = require('ipfs-merkle-dag')
const fs = require('fs')

const repo = new Repo('', { stores: memStore })
const blockService = new BlockService(repo)
const dagService = new MerkleDag.DAGService(blockService)


const Importer = require('ipfs-unixfs-engine').Importer
const filesAddStream = new Importer(dagService)

// An array to hold the file/dir stats emitted by the importer
// The stats for the root DAG Node are received last, upon completion

const res = []

// Import the files /tmp/foo/bar and /tmp/foo/quux

const rs = fs.createReadStream('/tmp/foo/bar')
const rs2 = fs.createReadStream('/tmp/foo/quux')
const input = { path: '/tmp/foo/bar', content: rs }
const input2 = { path: '/tmp/foo/quux', content: rs2 }

// Listen for the data event from the importer stream

filesAddStream.on('data', (info) => {
	res.push(info)
})

// The end event of the stream signals that the importer is done

filesAddStream.on('end', () => {
	console.log('Finished importing files!')
})

// Call write on the importer to add the file/content tuples

filesAddStream.write(input)
filesAddStream.write(input2)
filesAddStream.end()

When run, the stats of each DAG Node are output on a data event, ending with the root:

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 39243,
  path: '/tmp/foo/bar' }

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 59843,
  path: '/tmp/foo/quux' }

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 93242,
  path: '/tmp/foo' } 

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 94234,
  path: '/tmp' }   
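
Because the stats for the root DAG Node are emitted last, its multihash can be read off once the stream ends. A small sketch, reusing the res array from the example above:

filesAddStream.on('end', () => {
	// The last stat emitted belongs to the root DAG Node ('/tmp' here)
	const rootStat = res[res.length - 1]
	console.log('root multihash:', rootStat.multihash)
})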

Importer API

const Importer = require('ipfs-unixfs-engine').Importer

const add = new Importer(dagService)

The importer is an object-mode Transform stream that accepts objects of the form

{
  path: 'a name',
  content: (Buffer or Readable stream)
}

The stream will output IPFS DAG Node stats for the nodes as they are added to the DAG Service. When stats on a node are emitted, they are guaranteed to have been written to the DAG Service's storage mechanism.

The input's file paths and directory structure will be preserved in the DAG Nodes.
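
Content can also be provided as a Buffer instead of a Readable stream. A minimal sketch, reusing the dagService from the example above (the file name here is only for illustration):

const Importer = require('ipfs-unixfs-engine').Importer

const adder = new Importer(dagService)

adder.on('data', (info) => {
	// Stats arrive only after the node has been persisted
	console.log(info.path, info.multihash)
})

// A Buffer works in place of a Readable stream
adder.write({ path: 'hello.txt', content: new Buffer('hello world') })
adder.end()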

Example Exporter

const memStore = require('abstract-blob-store')
const Repo = require('ipfs-repo')
const Block = require('ipfs-block')
const BlockService = require('ipfs-block-service')
const MerkleDag = require('ipfs-merkle-dag')
const Exporter = require('ipfs-unixfs-engine').Exporter

const repo = new Repo('', { stores: memStore })
const blockService = new BlockService(repo)
const dagService = new MerkleDag.DAGService(blockService)

// Create an export readable object stream with the hash you want to export and a dag service

const filesStream = new Exporter(<multihash>, dagService)

// Pipe each exported file's content to stdout

filesStream.on('data', (file) => {
	file.content.pipe(process.stdout)
})

Exporter API

const Exporter = require('ipfs-unixfs-engine').Exporter

new Exporter(hash, dagService)

Uses the given DAG Service to fetch IPFS UnixFS objects by their multihash.

Creates a new readable stream in object mode that outputs objects of the form

{
  path: 'a name',
  content: (Buffer or Readable stream)
}

Errors are received as with any stream: listen for the 'error' event.
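
A guarded export might look like the following sketch, assuming dagService and a valid multihash hash are in scope:

const Exporter = require('ipfs-unixfs-engine').Exporter

const files = new Exporter(hash, dagService)

files.on('error', (err) => {
	console.error('export failed:', err)
})

files.on('data', (file) => {
	// Guard in case an entry (e.g. a directory) carries no content stream
	if (file.content) {
		file.content.pipe(process.stdout)
	}
})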

Contribute

Feel free to join in. All welcome. Open an issue!

This repository falls under the IPFS Code of Conduct.

License

MIT
