pino - npm Package Compare versions

Comparing version 9.5.0 to 9.6.0


docs/api.md

@@ -378,2 +378,19 @@ # API

<a id="streamWrite"></a>
##### `streamWrite`
Allows for manipulating the _stringified_ JSON log data just before writing to various transports.
The method receives the stringified JSON and must return valid stringified JSON.
For example:
```js
const hooks = {
streamWrite (s) {
return s.replaceAll('sensitive-api-key', 'XXX')
}
}
```
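Conceptually, the hook sits between serialization and the destination stream. A self-contained sketch of that behavior (the `applyStreamWrite` helper is illustrative, not part of Pino's API):

```js
// Illustrative helper (not Pino's internals): apply a configured
// streamWrite hook to the stringified log line before it is written.
function applyStreamWrite (hooks, line) {
  return hooks && typeof hooks.streamWrite === 'function'
    ? hooks.streamWrite(line)
    : line
}

const hooks = {
  streamWrite (s) {
    return s.replaceAll('sensitive-api-key', 'XXX')
  }
}

const line = JSON.stringify({ level: 30, msg: 'token=sensitive-api-key' })
console.log(applyStreamWrite(hooks, line))
// → {"level":30,"msg":"token=XXX"}
```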
<a id="opt-formatters"></a>

@@ -380,0 +397,0 @@ #### `formatters` (Object)


docs/ecosystem.md

@@ -19,6 +19,6 @@ # Pino Ecosystem

log requests within [Koa](https://koajs.com/).
+ [`restify-pino-logger`](https://github.com/pinojs/restify-pino-logger): use
Pino to log requests within [restify](http://restify.com/).
+ [`rill-pino-logger`](https://github.com/pinojs/rill-pino-logger): use Pino as
the logger for the [Rill framework](https://rill.site/).

@@ -47,6 +47,6 @@ ### Utilities

logs into traditional [HTTPD](https://httpd.apache.org/) style request logs.
+ [`pino-mongodb`](https://github.com/pinojs/pino-mongodb): store Pino logs
in a MongoDB database.
+ [`pino-multi-stream`](https://github.com/pinojs/pino-multi-stream): send
logs to multiple destination streams (slow!).
+ [`pino-noir`](https://github.com/pinojs/pino-noir): redact sensitive information

@@ -64,6 +64,6 @@ in logs.

based upon log levels.
+ [`pino-test`](https://github.com/pinojs/pino-test): a set of utilities for
verifying logs generated by the Pino logger.
+ [`pino-toke`](https://github.com/pinojs/pino-toke): reformat Pino logs
according to a given format string.

@@ -74,14 +74,15 @@

+ [`@google-cloud/pino-logging-gcp-config`](https://www.npmjs.com/package/@google-cloud/pino-logging-gcp-config): Config helper and formatter to output [Google Cloud Platform Structured Logging](https://cloud.google.com/logging/docs/structured-logging)
+ [`@newrelic/pino-enricher`](https://github.com/newrelic/newrelic-node-log-extensions/blob/main/packages/pino-log-enricher): a log customization to add New Relic context to use [Logs In Context](https://docs.newrelic.com/docs/logs/logs-context/logs-in-context/)
+ [`cloud-pine`](https://github.com/metcoder95/cloud-pine): transport that provides abstraction and compatibility with [`@google-cloud/logging`](https://www.npmjs.com/package/@google-cloud/logging).
+ [`cls-proxify`](https://github.com/keenondrums/cls-proxify): integration of pino and [CLS](https://github.com/jeff-lewis/cls-hooked). Useful for creating dynamically configured child loggers (e.g. with added trace ID) for each request.
+ [`crawlee-pino`](https://github.com/imyelo/crawlee-pino): use Pino to log within Crawlee
+ [`pino-colada`](https://github.com/lrlna/pino-colada): cute ndjson formatter for pino.
+ [`pino-dev`](https://github.com/dnjstrom/pino-dev): simple prettifier for pino with built-in support for common ecosystem packages.
+ [`pino-fluentd`](https://github.com/davidedantonio/pino-fluentd): send Pino logs to Elasticsearch,
MongoDB, and many [others](https://www.fluentd.org/dataoutputs) via Fluentd.
+ [`pino-lambda`](https://github.com/FormidableLabs/pino-lambda): log transport for cloudwatch support inside aws-lambda
+ [`pino-pretty-min`](https://github.com/unjello/pino-pretty-min): a minimal
prettifier inspired by the [logrus](https://github.com/sirupsen/logrus) logger.
+ [`pino-rotating-file`](https://github.com/homeaway/pino-rotating-file): a hapi-pino log transport for splitting logs into separate, automatically rotating files.
+ [`pino-tiny`](https://github.com/holmok/pino-tiny): a tiny (and extensible?) little log formatter for pino.

@@ -128,3 +128,3 @@ # Help

not recommended for performance reasons, multiple destinations can be targeted
- by using [`pino.multistream`](/doc/api.md#pino-multistream).
+ by using [`pino.multistream`](/docs/api.md#pino-multistream).

@@ -264,2 +264,28 @@ In this example, we use `stderr` for `error` level logs and `stdout` as default

A library that configures Pino for
[Google Cloud Structured Logging](https://cloud.google.com/logging/docs/structured-logging)
is available at:
[@google-cloud/pino-logging-gcp-config](https://www.npmjs.com/package/@google-cloud/pino-logging-gcp-config)
This library has the following features:
+ Converts Pino log levels to Google Cloud Logging log levels, as above
+ Uses `message` instead of `msg` for the message key, as above
+ Adds a millisecond-granularity timestamp in the
[structure](https://cloud.google.com/logging/docs/agent/logging/configuration#timestamp-processing)
recognised by Google Cloud Logging, e.g.: \
`"timestamp":{"seconds":1445470140,"nanos":123000000}`
+ Adds a sequential
[`insertId`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#FIELDS.insert_id)
to ensure log messages with identical timestamps are ordered correctly.
+ Logs including an `Error` object have the
[`stack_trace`](https://cloud.google.com/error-reporting/docs/formatting-error-messages#log-error)
property set so that the error is forwarded to Google Cloud Error Reporting.
+ Includes a
[`ServiceContext`](https://cloud.google.com/error-reporting/reference/rest/v1beta1/ServiceContext)
object in the logs for Google Cloud Error Reporting, auto-detected from the
environment if not specified.
+ Maps the OpenTelemetry properties `span_id`, `trace_id`, and `trace_flags`
to the equivalent Google Cloud Logging fields.
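The timestamp structure described above can be derived from a millisecond epoch value; a minimal sketch (the `toGcpTimestamp` helper is hypothetical, not the library's implementation):

```js
// Hypothetical helper: split a millisecond epoch timestamp into the
// {seconds, nanos} structure recognised by Google Cloud Logging.
function toGcpTimestamp (epochMs) {
  return {
    seconds: Math.floor(epochMs / 1000),
    nanos: (epochMs % 1000) * 1e6
  }
}

console.log(JSON.stringify({ timestamp: toGcpTimestamp(1445470140123) }))
// → {"timestamp":{"seconds":1445470140,"nanos":123000000}}
```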
<a id="grafana-loki"></a>

@@ -321,2 +347,2 @@ ## Using Grafana Loki to evaluate pino logs in a kubernetes cluster

See [`pino-test`](https://github.com/pinojs/pino-test).

@@ -225,3 +225,3 @@ # Transports

- const logger = pino(pino.transport({ targets: transports })
+ const logger = pino(pino.transport({ targets: transports }))
```
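The `transports` variable in the snippet above is a `targets` array; a minimal sketch using the built-in `pino/file` target (the routing choices are illustrative):

```js
// Illustrative targets array: info-and-above to stdout (fd 1),
// error-and-above to stderr (fd 2), both via the built-in pino/file target.
const transports = [
  { target: 'pino/file', level: 'info', options: { destination: 1 } },
  { target: 'pino/file', level: 'error', options: { destination: 2 } }
]
```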

@@ -430,2 +430,3 @@

+ [@macfja/pino-fingers-crossed](#macfja-pino-fingers-crossed)
+ [@openobserve/pino-openobserve](#pino-openobserve)
+ [pino-airbrake-transport](#pino-airbrake-transport)

@@ -530,3 +531,32 @@ + [pino-axiom](#pino-axiom)

```
<a id="pino-openobserve"></a>
### @openobserve/pino-openobserve
[@openobserve/pino-openobserve](https://github.com/openobserve/pino-openobserve) is a
Pino v7+ transport that will send logs to an
[OpenObserve](https://openobserve.ai) instance.
```js
const pino = require('pino');

const logger = pino({
  level: 'info',
  transport: {
    // pino resolves the transport by module name
    target: '@openobserve/pino-openobserve',
    options: {
      url: 'https://your-openobserve-server.com',
      organization: 'your-organization',
      streamName: 'your-stream',
      auth: {
        username: 'your-username',
        password: 'your-password',
      },
    },
  },
});
```
For full documentation check the [README](https://github.com/openobserve/pino-openobserve).
<a id="pino-airbrake-transport"></a>

@@ -659,3 +689,3 @@ ### pino-airbrake-transport

- [pino-datadog-transport]: https://github.com/theogravity/pino-datadog-transport
+ [pino-datadog-transport]: https://github.com/theogravity/datadog-transports
[Datadog]: https://www.datadoghq.com/

@@ -662,0 +692,0 @@

'use strict'
- module.exports = { version: '9.5.0' }
+ module.exports = { version: '9.6.0' }

@@ -30,3 +30,4 @@ 'use strict'

stringifiersSym,
- msgPrefixSym
+ msgPrefixSym,
+ hooksSym
} = require('./symbols')

@@ -189,2 +190,3 @@ const {

let obj
const streamWriteHook = this[hooksSym].streamWrite

@@ -219,3 +221,3 @@ if (_obj === undefined || _obj === null) {

}
- stream.write(s)
+ stream.write(streamWriteHook ? streamWriteHook(s) : s)
}

@@ -222,0 +224,0 @@

{
"name": "pino",
- "version": "9.5.0",
+ "version": "9.6.0",
"description": "super fast, all natural json logger",

@@ -66,4 +66,4 @@ "main": "pino.js",

"@types/tap": "^15.0.6",
- "@yao-pkg/pkg": "5.15.0",
- "airtap": "4.0.4",
+ "@yao-pkg/pkg": "6.1.1",
+ "airtap": "5.0.0",
"benchmark": "^2.1.4",

@@ -102,3 +102,3 @@ "bole": "^5.0.5",

"tsd": "^0.31.0",
- "typescript": "^5.1.3",
+ "typescript": "~5.7.2",
"winston": "^3.7.2"

@@ -105,0 +105,0 @@ },

@@ -644,2 +644,8 @@ // Project: https://github.com/pinojs/pino.git, http://getpino.io

logMethod?: (this: Logger, args: Parameters<LogFn>, method: LogFn, level: number) => void;
/**
* Allows for manipulating the stringified JSON log output just before writing to various transports.
* This function must return a string containing valid stringified JSON.
*/
streamWrite?: (s: string) => string;
};

@@ -646,0 +652,0 @@

@@ -73,3 +73,4 @@ 'use strict'

hooks: {
- logMethod: undefined
+ logMethod: undefined,
+ streamWrite: undefined
},

@@ -76,0 +77,0 @@ timestamp: epochTime,

@@ -98,1 +98,22 @@ 'use strict'

})
tap.test('streamWrite hook', t => {
t.test('gets invoked', async t => {
t.plan(1)
const stream = sink()
const logger = pino({
hooks: {
streamWrite (s) {
return s.replaceAll('redact-me', 'XXX')
}
}
}, stream)
const o = once(stream, 'data')
logger.info('hide redact-me in this string')
t.match(await o, { msg: 'hide XXX in this string' })
})
t.end()
})

@@ -9,3 +9,3 @@ 'use strict'

const { platform } = require('node:process')
- const exec = promisify(require('node:child_process').exec)
+ const execFile = promisify(require('node:child_process').execFile)

@@ -28,3 +28,3 @@ /**

const configPath = join(__dirname, 'pkg.config.json')
- const { stderr } = await exec(`npx pkg ${filePath} --config ${configPath}`)
+ const { stderr } = await execFile('npx', ['pkg', filePath, '--config', configPath], { shell: true })

@@ -47,3 +47,3 @@ // there should be no error when packaging

- const { stderr } = await exec(executablePath)
+ const { stderr } = await execFile(executablePath)

@@ -50,0 +50,0 @@ // check if there were no errors

@@ -212,2 +212,6 @@ import { IncomingMessage, ServerResponse } from "http";

},
streamWrite(s) {
expectType<string>(s);
return s.replaceAll('secret-key', 'xxx');
},
},

@@ -214,0 +218,0 @@ });
