
pino-abstract-transport


Write Pino transports easily



Package description

What is pino-abstract-transport?

The pino-abstract-transport package provides a way to create custom transports for Pino, a fast JSON logger for Node.js. These transports can be used to process or forward log messages to different destinations or services in a structured and efficient manner. The package simplifies the development of these transports by abstracting common tasks and providing a standard interface for implementation.

What are pino-abstract-transport's main functionalities?

Creating a custom transport

This feature allows developers to create a custom transport for Pino logs. The code sample demonstrates how to use the `build` function from `pino-abstract-transport` to create a simple transport that prints log messages to the console.

"use strict";\nconst { build } = require('pino-abstract-transport');\nasync function myTransport(opts) {\n  return async function (source) {\n    for await (const obj of source) {\n      console.log(obj);\n    }\n  };\n}\nmodule.exports = build(myTransport);"


Readme


pino-abstract-transport


Write Pino transports easily.

Install

npm i pino-abstract-transport

Usage

import build from 'pino-abstract-transport'

export default async function (opts) {
  return build(async function (source) {
    for await (let obj of source) {
      console.log(obj)
    }
  })
}

or in CommonJS and streams:

'use strict'

const build = require('pino-abstract-transport')

module.exports = function (opts) {
  return build(function (source) {
    source.on('data', function (obj) {
      console.log(obj)
    })
  })
}

TypeScript usage

Install the type definitions for Node.js. Make sure the major version of the type definitions matches the Node.js version you are using.

Node 16
npm i -D @types/node@16
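
With the definitions installed, a transport can be written directly in TypeScript. A minimal sketch; the shape of opts is only an illustrative assumption, since pino passes through whatever options the application supplies:

import build from 'pino-abstract-transport'

// the opts shape is hypothetical, for illustration only
export default async function (opts: { destination?: string }) {
  return build(async function (source) {
    for await (const obj of source) {
      console.log(obj)
    }
  })
}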

API

build(fn, opts) => Stream

Creates a split2 instance and returns it. This same instance is also passed to the given function, which is called synchronously.

If opts.enablePipelining is true, pino-abstract-transport will wrap the split2 instance and the stream returned by the function using duplexify, so they can be concatenated into multiple transports.

Events emitted

In addition to all events emitted by a Readable stream, it emits the following events:

  • unknown, emitted when an unparsable line is found; both the line and the optional error are passed to the listener (see the sketch below).
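
For instance, a transport can listen for this event instead of silently dropping unparsable input; a minimal sketch:

'use strict'

const build = require('pino-abstract-transport')

module.exports = function (opts) {
  return build(function (source) {
    // emitted by the stream when a line cannot be parsed
    source.on('unknown', function (line, error) {
      console.error('skipping unparsable line:', line, error)
    })
    source.on('data', function (obj) {
      console.log(obj)
    })
  })
}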
Options

  • parse an option to change the data format passed to the build function. When this option is set to lines, each chunk is passed as a raw string; otherwise it is parsed as JSON and passed as an object. Default: undefined.

  • close(err, cb) a function that is called to shut down the transport. It's called on both error and non-error shutdowns, and it can also return a promise; in that case, the cb argument should be ignored (a sketch using this option follows the list).

  • parseLine(line) a function that is used to parse each line received from pino.

  • expectPinoConfig a boolean that indicates if the transport expects Pino to add some of its configuration to the stream. Default: false.
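
A minimal sketch combining the parse and close options (the shutdown logic here is hypothetical; a real transport would flush or release whatever resources it holds):

'use strict'

const build = require('pino-abstract-transport')

module.exports = function (opts) {
  return build(function (source) {
    source.on('data', function (line) {
      // with parse: 'lines', each chunk is the raw string written by pino
      process.stdout.write(line + '\n')
    })
  }, {
    parse: 'lines',
    async close (err) {
      // flush buffers or close handles here; because this function
      // returns a promise, the cb argument can be ignored
    }
  })
}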

Example

custom parseLine

You can accept a custom parseLine from users while providing a simple and safe default parseLine.

'use strict'

const build = require('pino-abstract-transport')

function defaultParseLine (line) {
  const obj = JSON.parse(line)
  // property foo will be added on each line
  obj.foo = 'bar'
  return obj
}

module.exports = function (opts) {
  const parseLine = typeof opts.parseLine === 'function' ? opts.parseLine : defaultParseLine
  return build(function (source) {
    source.on('data', function (obj) {
      console.log(obj)
    })
  }, {
    parseLine: parseLine
  })
}

Stream concatenation / pipeline

You can pipeline multiple transports:

const build = require('pino-abstract-transport')
const { Transform, pipeline } = require('stream')

function buildTransform () {
  return build(function (source) {
    return new Transform({
      objectMode: true,
      autoDestroy: true,
      transform (line, enc, cb) {
        line.service = 'bob'
        cb(null, JSON.stringify(line))
      }
    })
  }, { enablePipelining: true })
}

function buildDestination () {
  return build(function (source) {
    source.on('data', function (obj) {
      console.log(obj)
    })
  })
}

pipeline(process.stdin, buildTransform(), buildDestination(), function (err) {
  console.log('pipeline completed!', err)
})

Using pino config

Setting expectPinoConfig to true will make the transport wait for pino to send its configuration before starting to process logs. It will add levels, messageKey and errorKey to the stream.

When used with an incompatible version of pino, the stream will immediately error.

import build from 'pino-abstract-transport'

export default function (opts) {
  return build(async function (source) {
    for await (const obj of source) {
      console.log(`[${source.levels.labels[obj.level]}]: ${obj[source.messageKey]}`)
    }
  }, {
    expectPinoConfig: true
  })
}

License

MIT
