pino-abstract-transport
The pino-abstract-transport package provides a way to create custom transports for Pino, a fast JSON logger for Node.js. These transports can be used to process or forward log messages to different destinations or services in a structured and efficient manner. The package simplifies the development of these transports by abstracting common tasks and providing a standard interface for implementation.
Creating a custom transport
This feature allows developers to create a custom transport for Pino logs. The code sample demonstrates how to use the `build` function from `pino-abstract-transport` to create a simple transport that prints log messages to the console.
"use strict";\nconst { build } = require('pino-abstract-transport');\nasync function myTransport(opts) {\n return async function (source) {\n for await (const obj of source) {\n console.log(obj);\n }\n };\n}\nmodule.exports = build(myTransport);"
Pino-pretty is a module that can be used to format Pino log messages in a more human-readable way. Unlike pino-abstract-transport, which is focused on creating custom transports, pino-pretty is specifically designed for pretty-printing log messages in development environments.
Pino-elasticsearch is a transport for Pino that forwards log messages to Elasticsearch. It provides a more specialized functionality compared to pino-abstract-transport, which offers a generic interface for building various types of transports.
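For comparison, pino-pretty is typically wired in during development through pino's transport option. A minimal sketch (the colorize option is just one common choice, not something required by this package):
const pino = require('pino')

// Route logs through pino-pretty for human-readable output in development.
const logger = pino({
  transport: {
    target: 'pino-pretty',
    options: { colorize: true }
  }
})

logger.info('hello world')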
Write Pino transports easily.
npm i pino-abstract-transport
import build from 'pino-abstract-transport'
export default async function (opts) {
return build(async function (source) {
for await (let obj of source) {
console.log(obj)
}
})
}
or in CommonJS and streams:
'use strict'
const build = require('pino-abstract-transport')
module.exports = function (opts) {
return build(function (source) {
source.on('data', function (obj) {
console.log(obj)
})
})
}
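A transport written this way is loaded by pino itself through its transport option. A minimal sketch, where the path to the transport module and the empty options object are assumptions:
const { join } = require('path')
const pino = require('pino')

const logger = pino({
  transport: {
    // hypothetical path to a file exporting the result of build(), as above
    target: join(__dirname, 'my-transport.js'),
    // everything in options is forwarded to the exported function as opts
    options: {}
  }
})

logger.info('routed through the custom transport')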
Install the type definitions for node. Make sure the major version of the type definitions matches the node version you are using.
npm i -D @types/node@16
The build function creates a split2 instance and returns it. This same instance is also passed to the given function, which is called synchronously.
If opts.enablePipelining is true, pino-abstract-transport will wrap the split2 instance and the returned stream using duplexify, so they can be concatenated into multiple transports.
In addition to all events emitted by a Readable stream, it emits the following events:
unknown: emitted when an unparsable line is found; both the line and an optional error are emitted.
The build function also accepts the following options:
parse: an option to change the data format passed to the build function. When this option is set to lines, the data is passed as a string; otherwise, the data is passed as an object. Default: undefined.
close(err, cb): a function that is called to shut down the transport. It is called on both error and non-error shutdowns. It can also return a promise; in this case, discard the cb argument. (See the sketch after this list.)
parseLine(line): a function used to parse each line received from pino.
expectPinoConfig: a boolean that indicates if the transport expects Pino to add some of its configuration to the stream. Default: false.
You can allow a custom parseLine from users while providing a simple and safe default parseLine.
'use strict'
const build = require('pino-abstract-transport')
function defaultParseLine (line) {
const obj = JSON.parse(line)
// property foo will be added on each line
obj.foo = 'bar'
return obj
}
module.exports = function (opts) {
const parseLine = typeof opts.parseLine === 'function' ? opts.parseLine : defaultParseLine
return build(function (source) {
source.on('data', function (obj) {
console.log(obj)
})
}, {
parseLine: parseLine
})
}
You can pipeline multiple transports:
const build = require('pino-abstract-transport')
const { Transform, pipeline } = require('stream')
function buildTransform () {
return build(function (source) {
return new Transform({
objectMode: true,
autoDestroy: true,
transform (line, enc, cb) {
line.service = 'bob'
cb(null, JSON.stringify(line))
}
})
}, { enablePipelining: true })
}
function buildDestination () {
return build(function (source) {
source.on('data', function (obj) {
console.log(obj)
})
})
}
pipeline(process.stdin, buildTransform(), buildDestination(), function (err) {
console.log('pipeline completed!', err)
})
Setting expectPinoConfig to true will make the transport wait for pino to send its configuration before starting to process logs. It will add levels, messageKey and errorKey to the stream.
When used with an incompatible version of pino, the stream will immediately error.
import build from 'pino-abstract-transport'
export default function (opts) {
return build(async function (source) {
for await (const obj of source) {
console.log(`[${source.levels.labels[obj.level]}]: ${obj[source.messageKey]}`)
}
}, {
expectPinoConfig: true
})
}
MIT
FAQs
Write Pino transports easily
The npm package pino-abstract-transport receives a total of 6,851,967 weekly downloads. As such, pino-abstract-transport is classified as a popular package.
We found that pino-abstract-transport demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer.