# IPFS unixFS Engine
Import and export data to/from an IPFS DAG Service.

## Example Importer
Let's create a little directory to import:

```sh
$ cd /tmp
$ mkdir foo
$ echo 'hello' > foo/bar
$ echo 'world' > foo/quux
```
And write the importing logic:

```js
const memStore = require('abstract-blob-store')
const ipfsRepo = require('ipfs-repo')
const ipfsBlockService = require('ipfs-block-service')
const ipfsMerkleDag = require('ipfs-merkle-dag')
const fs = require('fs')

// Set up an in-memory repo, a block service on top of it,
// and the DAG service the importer will write to
const repo = new ipfsRepo('', { stores: memStore })
const blocks = new ipfsBlockService(repo)
const dag = new ipfsMerkleDag.DAGService(blocks)

const Importer = require('ipfs-unixfs-engine').importer
const add = new Importer(dag)
const res = []

// One { path, stream } entry per file to import
const rs = fs.createReadStream('/tmp/foo/bar')
const rs2 = fs.createReadStream('/tmp/foo/quux')
const input = { path: '/tmp/foo/bar', stream: rs }
const input2 = { path: '/tmp/foo/quux', stream: rs2 }

add.on('data', (info) => {
  res.push(info)
})

add.on('end', () => {
  console.log('Finished adding files!')
})

add.write(input)
add.write(input2)
add.end()
```
When run, the stats of each DAG node are emitted on the `data` event, one per file, followed by those of the enclosing directories up to the root:
```
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 39243,
  path: '/tmp/foo/bar' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 59843,
  path: '/tmp/foo/quux' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 93242,
  path: '/tmp/foo' }
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 94234,
  path: '/tmp' }
```
## API
```js
const Importer = require('ipfs-unixfs-engine').importer
const add = new Importer(dag)
```
The importer is a duplex stream in object mode. Write objects of the form
`{ path, stream }`, where `path` is the file's path and `stream` is a readable
stream of its data. You can stream an array of files to the importer; call the
`end()` method to signal that you are done adding files. Listen to the `data`
event for the returned information (`multihash`, `size`, and `path`) of each
file added, and to the `end` event to know when the importer has finished.
Input file paths that contain a directory structure will have that hierarchy
preserved in the resulting DAG nodes. Nodes are persisted through the DAG
Service instance (`dag` above) passed to the constructor.
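
As a quick sketch (assuming the `dag` service and the files under `/tmp/foo` from the example above), streaming several files looks like this:

```js
const fs = require('fs')
const Importer = require('ipfs-unixfs-engine').importer

const add = new Importer(dag)
add.on('data', (file) => console.log(file.path, file.size))
add.on('end', () => console.log('all files imported'))

// One write() per { path, stream } entry, then end() to close the input side
const files = ['/tmp/foo/bar', '/tmp/foo/quux']
files.forEach((path) => {
  add.write({ path: path, stream: fs.createReadStream(path) })
})
add.end()
```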
## Example Exporter
```js
const memStore = require('abstract-blob-store')
const ipfsRepo = require('ipfs-repo')
const ipfsBlockService = require('ipfs-block-service')
const ipfsMerkleDag = require('ipfs-merkle-dag')
const Exporter = require('ipfs-unixfs-engine').exporter

const repo = new ipfsRepo('', { stores: memStore })
const blocks = new ipfsBlockService(repo)
const dag = new ipfsMerkleDag.DAGService(blocks)

// Create an export readable object stream with the hash you want to export
// and a dag service ('hash' is the multihash of the node to export)
const exportEvent = Exporter(hash, dag)

// Pipe each exported file's content to stdout
exportEvent.on('data', (result) => {
  result.stream.pipe(process.stdout)
})
```
## API
```js
const Exporter = require('ipfs-unixfs-engine').exporter
```
The exporter is a readable stream in object mode. Given the multihash of a
file stored in the dag service, it emits objects of the form `{ stream, path }`,
where `stream` is a readable stream of the file's content and `path` is its path.
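
For example, a minimal sketch (assuming the `dag` service set up as above; the hash below is a placeholder for a real multihash, such as one emitted by the importer's `data` events), writing each exported file's content to disk:

```js
const fs = require('fs')
const Exporter = require('ipfs-unixfs-engine').exporter

// Placeholder: substitute the multihash of a node you imported earlier
const hash = 'Qm...'

const exporter = Exporter(hash, dag)
exporter.on('data', (file) => {
  // file.path mirrors the imported path; save content under its base name
  const name = file.path.split('/').pop()
  file.stream.pipe(fs.createWriteStream('./' + name))
})
exporter.on('end', () => console.log('done exporting'))
```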
## Install

With npm installed, run

```sh
$ npm install ipfs-unixfs-engine
```
## License

ISC