pino - npm Package Compare versions

Comparing version 9.0.0 to 9.1.0

test/pkg/index.js

browser.js

@@ -305,3 +305,3 @@ 'use strict'

if (opts.serialize && !opts.asObject) {
if (opts.serialize && !opts.transmit) {
applySerializers(args, this._serialize, this.serializers, this._stdErrSerialize)

@@ -334,6 +334,5 @@ }

const {
level: levelFormatter = () => logger.levels.values[level],
level: levelFormatter,
log: logObjectFormatter = (obj) => obj
} = formatters
if (logger._serialize) applySerializers(args, logger._serialize, logger.serializers, logger._stdErrSerialize)
const argsCloned = args.slice()

@@ -345,4 +344,10 @@ let msg = argsCloned[0]

}
logObject.level = levelFormatter(level, logger.levels.values[level])
if (levelFormatter) {
const formattedLevel = levelFormatter(level, logger.levels.values[level])
Object.assign(logObject, formattedLevel)
} else {
logObject.level = logger.levels.values[level]
}
let lvl = (logger._childLevel | 0) + 1

@@ -367,5 +372,5 @@ if (lvl < 1) lvl = 1

args[i] = pino.stdSerializers.err(args[i])
} else if (typeof args[i] === 'object' && !Array.isArray(args[i])) {
} else if (typeof args[i] === 'object' && !Array.isArray(args[i]) && serialize) {
for (const k in args[i]) {
if (serialize && serialize.indexOf(k) > -1 && k in serializers) {
if (serialize.indexOf(k) > -1 && k in serializers) {
args[i][k] = serializers[k](args[i][k])

@@ -392,2 +397,3 @@ }

)
logger._logEvent.ts = ts

@@ -394,0 +400,0 @@ logger._logEvent.messages = args.filter(function (arg) {

@@ -244,9 +244,9 @@ # API

As of pino 7.x, when the `mixin` is used with the [`nestedKey` option](#opt-nestedkey),
the object returned from the `mixin` method will also be nested. Prior versions would mix
this object into the root.
```js
const logger = pino({
nestedKey: 'payload',
mixin() {

@@ -594,3 +594,3 @@ return { requestId: requestId.currentId() }

The `onChild` function is a synchronous callback that will be called on each creation of a new child, passing the child instance as its first argument.
Any error thrown inside the callback will be uncaught and should be handled inside the callback.
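The `onChild` contract can be sketched without pino itself; `makeLogger` below is an illustrative stand-in, not pino's API, and only models the hook's two properties: the callback runs synchronously at child creation, and errors it throws propagate uncaught to the `child()` caller.

```javascript
// Illustrative stand-in for pino's logger; only models the onChild hook.
function makeLogger ({ onChild } = {}) {
  return {
    child (bindings) {
      const child = { bindings }
      // Called synchronously on each child creation; errors are NOT caught,
      // so the callback should handle them itself.
      if (onChild) onChild(child)
      return child
    }
  }
}

const seen = []
const logger = makeLogger({ onChild: (child) => seen.push(child.bindings.module) })
logger.child({ module: 'db' })
// seen is now ['db']
```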

@@ -614,3 +614,3 @@ ```js

An ordinary Node.js `stream` file descriptor can be passed as the
destination (such as the result
of `fs.createWriteStream`) but for peak log writing performance, it is strongly

@@ -1007,3 +1007,3 @@ recommended to use `pino.destination` to create the destination stream.

* See [`bindings` parameter in `logger.child`](#logger-child-bindings)

@@ -1246,2 +1246,26 @@ <a id="flush"></a>

Multiple transports can now be defined to include pipelines:
```js
const pino = require('pino')
const transport = pino.transport({
targets: [{
level: 'info',
target: 'pino-pretty' // must be installed separately
}, {
level: 'trace',
target: 'pino/file',
options: { destination: '/path/to/store/logs' }
}, {
pipeline: [{
target: 'pino-syslog' // must be installed separately
}, {
target: 'pino-socket' // must be installed separately
}]
}
]
})
pino(transport)
```
If `WeakRef`, `WeakMap`, and `FinalizationRegistry` are available in the current runtime (v14.5.0+), then the thread

@@ -1284,3 +1308,3 @@ will be automatically terminated in case the stream or logger goes out of scope.

* `worker`: [Worker thread](https://nodejs.org/api/worker_threads.html#worker_threads_new_worker_filename_options) configuration options. Additionally, the `worker` option supports `worker.autoEnd`. If this is set to `false` logs will not be flushed on process exit. It is then up to the developer to call `transport.end()` to flush logs.
* `targets`: May be specified instead of `target`. Must be an array of transport configurations. Transport configurations include the aforementioned `options` and `target` options plus a `level` option which will send only logs above a specified level to a transport.
* `targets`: May be specified instead of `target`. Must be an array of transport configurations and/or pipelines. Transport configurations include the aforementioned `options` and `target` options plus a `level` option which will send only logs above a specified level to a transport.
* `pipeline`: May be specified instead of `target`. Must be an array of transport configurations. Transport configurations include the aforementioned `options` and `target` options. All intermediate steps in the pipeline _must_ be `Transform` streams and not `Writable`.
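The interaction between `targets`, `pipeline`, and the pipeline-wide `level` option can be sketched as a small normalization step; `splitTargets` is an illustrative name, not pino's internal API, and it mirrors how a pipeline's `level` is copied onto each of its steps:

```javascript
// Split a mixed targets array into plain targets and pipelines, copying the
// pipeline-level `level` option onto every step of that pipeline
// (names are illustrative, not pino internals).
function splitTargets (targets) {
  const plain = targets.filter((dest) => dest.target)
  const pipelines = targets
    .filter((dest) => dest.pipeline)
    .map((dest) => dest.pipeline.map((step) => ({ ...step, level: dest.level })))
  return { plain, pipelines }
}

const { plain, pipelines } = splitTargets([
  { target: 'pino/file', level: 'info' },
  { pipeline: [{ target: 'pino-syslog' }, { target: 'pino-socket' }], level: 'error' }
])
// plain has one entry; pipelines[0] has two steps, both at level 'error'
```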

@@ -1287,0 +1311,0 @@ * `dedupe`: See [pino.multistream options](#pino-multistream)

@@ -147,3 +147,3 @@ # Browser API

For more info on serializers see https://github.com/pinojs/pino/blob/master/docs/api.md#parameters.
For more info on serializers see https://github.com/pinojs/pino/blob/master/docs/api.md#mergingobject.

@@ -150,0 +150,0 @@ ### `transmit` (Object)

@@ -10,3 +10,2 @@ # Bundling

* `lib/worker.js`
* `lib/worker-pipeline.js`
* Any transport used by the user (like `pino-pretty`)

@@ -26,3 +25,2 @@

'pino-worker': pinoWebpackAbsolutePath('./pino-worker.js'),
'pino-pipeline-worker': pinoWebpackAbsolutePath('./pino-pipeline-worker.js'),
'pino-pretty': pinoWebpackAbsolutePath('./pino-pretty.js'),

@@ -32,3 +30,3 @@ };

Note that `pino/file`, `pino-worker`, `pino-pipeline-worker`, and `thread-stream-worker` are required identifiers. Other identifiers are possible based on the user configuration.
Note that `pino/file`, `pino-worker` and `thread-stream-worker` are required identifiers. Other identifiers are possible based on the user configuration.

@@ -35,0 +33,0 @@ ## Webpack Plugin

@@ -46,4 +46,4 @@ ## Long Term Support

| :------ | :----------- | :-------------- | :------------------- |
| 9.x | TBD. | TBD | 18, 20 |
| 8.x | 2022-06-01 | TBD | 14, 16, 18 |
| 9.x | 2024-04-26 | TBD | 18, 20, 22 |
| 8.x | 2022-06-01 | 2024-10-26 | 14, 16, 18, 20 |
| 7.x | 2021-10-14 | 2023-06-01 | 12, 14, 16 |

@@ -63,4 +63,4 @@ | 6.x | 2020-03-07 | 2022-04-14 | 10, 12, 14, 16 |

|---------|------------------------|--------------|
| Linux | `ubuntu-latest` | 18, 20, 21 |
| Windows | `windows-latest` | 18, 20, 21 |
| MacOS | `macos-latest` | 18, 20, 21 |
| Linux | `ubuntu-latest` | 18, 20, 22 |
| Windows | `windows-latest` | 18, 20, 22 |
| MacOS | `macos-latest` | 18, 20, 22 |

@@ -239,22 +239,32 @@ # Web Frameworks

```sh
npm install pino-http
npm install pino-http h3
```
Save as `server.mjs`:
```js
import { createServer } from 'http'
import { createApp } from 'h3'
import { createApp, createRouter, eventHandler, fromNodeMiddleware } from "h3";
import pino from 'pino-http'
const app = createApp()
export const app = createApp();
app.use(pino())
const router = createRouter();
app.use(router);
app.use(fromNodeMiddleware(pino()))
app.use('/', (req) => {
req.log.info('something')
app.use(eventHandler((event) => {
event.node.req.log.info('something')
return 'hello world'
})
}))
createServer(app).listen(process.env.PORT || 3000)
router.get(
"/",
eventHandler((event) => {
return { path: event.path, message: "Hello World!" };
}),
);
```
Execute `npx --yes listhen -w --open ./server.mjs`.
See the [pino-http README](https://npm.im/pino-http) for more info.
'use strict'
module.exports = { version: '9.0.0' }
module.exports = { version: '9.1.0' }

@@ -16,3 +16,3 @@ 'use strict'

try {
const toLoad = 'file://' + target
const toLoad = target.startsWith('file://') ? target : 'file://' + target
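The change above can be isolated into a small helper (the name is hypothetical) that avoids double-prefixing targets that are already `file://` URLs:

```javascript
// Prefix a transport target with file:// only when it is not already
// a file URL (helper name is illustrative).
function toFileUrl (target) {
  return target.startsWith('file://') ? target : 'file://' + target
}

toFileUrl('/abs/path/worker.js')        // → 'file:///abs/path/worker.js'
toFileUrl('file:///abs/path/worker.js') // → unchanged
```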

@@ -35,4 +35,6 @@ if (toLoad.endsWith('.ts') || toLoad.endsWith('.cts')) {

fn = realRequire(target)
} else if (error.code === undefined) {
} else if (error.code === undefined || error.code === 'ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING') {
// When bundled with pkg, an undefined error is thrown when called with realImport
// When bundled with pkg and using node v20, an ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING error is thrown when called with realImport
// More info at: https://github.com/pinojs/thread-stream/issues/143
fn = realRequire(decodeURIComponent(target))

@@ -39,0 +41,0 @@ } else {

@@ -90,3 +90,3 @@ 'use strict'

target = bundlerOverrides['pino-worker'] || join(__dirname, 'worker.js')
options.targets = targets.map((dest) => {
options.targets = targets.filter(dest => dest.target).map((dest) => {
return {

@@ -97,5 +97,14 @@ ...dest,

})
options.pipelines = targets.filter(dest => dest.pipeline).map((dest) => {
return dest.pipeline.map((t) => {
return {
...t,
level: dest.level, // duplicate the pipeline `level` property defined in the upper level
target: fixTarget(t.target)
}
})
})
} else if (pipeline) {
target = bundlerOverrides['pino-pipeline-worker'] || join(__dirname, 'worker-pipeline.js')
options.targets = pipeline.map((dest) => {
target = bundlerOverrides['pino-worker'] || join(__dirname, 'worker.js')
options.pipelines = [pipeline.map((dest) => {
return {

@@ -105,3 +114,3 @@ ...dest,

}
})
})]
}

@@ -108,0 +117,0 @@

'use strict'
const EE = require('events')
const { pipeline, PassThrough } = require('stream')
const pino = require('../pino.js')

@@ -12,32 +14,140 @@ const build = require('pino-abstract-transport')

module.exports = async function ({ targets, levels, dedupe }) {
targets = await Promise.all(targets.map(async (t) => {
const fn = await loadTransportStreamBuilder(t.target)
const stream = await fn(t.options)
return {
level: t.level,
stream
}
}))
return build(process, {
parse: 'lines',
metadata: true,
close (err, cb) {
let expected = 0
for (const transport of targets) {
expected++
transport.stream.on('close', closeCb)
transport.stream.end()
/*
* > Multiple targets & pipelines
*
*
* ┌─────────────────────────────────────────────────┐ ┌─────┐
* │ │ │ p │
* │ │ │ i │
* │ target │ │ n │
* │ │ ────────────────────────────────┼────┤ o │
* │ targets │ target │ │ . │
* │ ────────────► │ ────────────────────────────────┼────┤ m │ source
* │ │ target │ │ u │ │
* │ │ ────────────────────────────────┼────┤ l │ │write
* │ │ │ │ t │ ▼
* │ │ pipeline ┌───────────────┐ │ │ i │ ┌────────┐
* │ │ ──────────► │ PassThrough ├───┼────┤ s ├──────┤ │
* │ │ └───────────────┘ │ │ t │ write│ Thread │
* │ │ │ │ r │◄─────┤ Stream │
* │ │ pipeline ┌───────────────┐ │ │ e │ │ │
* │ │ ──────────► │ PassThrough ├───┼────┤ a │ └────────┘
* │ └───────────────┘ │ │ m │
* │ │ │ │
* └─────────────────────────────────────────────────┘ └─────┘
*
*
*
* > One single pipeline or target
*
*
* source
* │
* ┌────────────────────────────────────────────────┐ │write
* │ │ ▼
* │ │ ┌────────┐
* │ targets │ target │ │ │
* │ ────────────► │ ──────────────────────────────┤ │ │
* │ │ │ │ │
* │ ├──────┤ │
* │ │ │ │
* │ │ │ │
* │ OR │ │ │
* │ │ │ │
* │ │ │ │
* │ ┌──────────────┐ │ │ │
* │ targets │ pipeline │ │ │ │ Thread │
* │ ────────────► │ ────────────►│ PassThrough ├─┤ │ Stream │
* │ │ │ │ │ │ │
* │ └──────────────┘ │ │ │
* │ │ │ │
* │ OR │ write│ │
* │ │◄─────┤ │
* │ │ │ │
* │ ┌──────────────┐ │ │ │
* │ pipeline │ │ │ │ │
* │ ──────────────►│ PassThrough ├────────────────┤ │ │
* │ │ │ │ │ │
* │ └──────────────┘ │ └────────┘
* │ │
* │ │
* └────────────────────────────────────────────────┘
*/
module.exports = async function ({ targets, pipelines, levels, dedupe }) {
const targetStreams = []
// Process targets
if (targets && targets.length) {
targets = await Promise.all(targets.map(async (t) => {
const fn = await loadTransportStreamBuilder(t.target)
const stream = await fn(t.options)
return {
level: t.level,
stream
}
}))
function closeCb () {
if (--expected === 0) {
cb(err)
targetStreams.push(...targets)
}
// Process pipelines
if (pipelines && pipelines.length) {
pipelines = await Promise.all(
pipelines.map(async (p) => {
let level
const pipeDests = await Promise.all(
p.map(async (t) => {
// the level assigned to the pipeline is duplicated across all its targets; just store it
level = t.level
const fn = await loadTransportStreamBuilder(t.target)
const stream = await fn(t.options)
return stream
}
))
return {
level,
stream: createPipeline(pipeDests)
}
})
)
targetStreams.push(...pipelines)
}
// Skip building the multistream step if only a single pipeline or target is defined,
// and return the stream instance directly back to ThreadStream.
// This is equivalent to defining either:
//
// pino.transport({ target: ... })
//
// OR
//
// pino.transport({ pipeline: ... })
if (targetStreams.length === 1) {
return targetStreams[0].stream
} else {
return build(process, {
parse: 'lines',
metadata: true,
close (err, cb) {
let expected = 0
for (const transport of targetStreams) {
expected++
transport.stream.on('close', closeCb)
transport.stream.end()
}
function closeCb () {
if (--expected === 0) {
cb(err)
}
}
}
}
})
})
}
// TODO: Why wasn't split2 used for pipelines?
function process (stream) {
const multi = pino.multistream(targets, { levels, dedupe })
const multi = pino.multistream(targetStreams, { levels, dedupe })
// TODO manage backpressure

@@ -55,2 +165,32 @@ stream.on('data', function (chunk) {

}
/**
* Creates a pipeline from the provided streams and returns a `PassThrough`
* stream instance that acts as the pipeline's source.
*
* @param {(TransformStream|WritableStream)[]} streams An array of streams.
* All intermediate streams in the array *MUST* be `Transform` streams and only the last one `Writable`.
* @returns A `PassThrough` stream instance representing the source stream of the pipeline
*/
function createPipeline (streams) {
const ee = new EE()
const stream = new PassThrough({
autoDestroy: true,
destroy (_, cb) {
ee.on('error', cb)
ee.on('closed', cb)
}
})
pipeline(stream, ...streams, function (err) {
if (err && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
ee.emit('error', err)
return
}
ee.emit('closed')
})
return stream
}
}
{
"name": "pino",
"version": "9.0.0",
"version": "9.1.0",
"description": "super fast, all natural json logger",

@@ -66,2 +66,3 @@ "main": "pino.js",

"@types/tap": "^15.0.6",
"@yao-pkg/pkg": "5.11.5",
"airtap": "4.0.4",

@@ -87,3 +88,3 @@ "benchmark": "^2.1.4",

"midnight-smoker": "1.1.1",
"pino-pretty": "^10.2.1",
"pino-pretty": "^11.0.0",
"pre-commit": "^1.2.2",

@@ -101,3 +102,3 @@ "proxyquire": "^2.1.3",

"ts-node": "^10.9.1",
"tsd": "^0.30.4",
"tsd": "^0.31.0",
"typescript": "^5.1.3",

@@ -111,3 +112,3 @@ "winston": "^3.7.2"

"pino-abstract-transport": "^1.2.0",
"pino-std-serializers": "^6.0.0",
"pino-std-serializers": "^7.0.0",
"process-warning": "^3.0.0",

@@ -117,4 +118,4 @@ "quick-format-unescaped": "^4.0.3",

"safe-stable-stringify": "^2.3.1",
"sonic-boom": "^3.7.0",
"thread-stream": "^2.6.0"
"sonic-boom": "^4.0.1",
"thread-stream": "^3.0.0"
},

@@ -121,0 +122,0 @@ "tsd": {

@@ -262,6 +262,7 @@ // Project: https://github.com/pinojs/pino.git, http://getpino.io

pipeline: TransportSingleOptions<TransportOptions>[]
level?: LevelWithSilentOrString
}
interface TransportMultiOptions<TransportOptions = Record<string, any>> extends TransportBaseOptions<TransportOptions>{
targets: readonly TransportTargetOptions<TransportOptions>[],
targets: readonly (TransportTargetOptions<TransportOptions>|TransportPipelineOptions<TransportOptions>)[],
levels?: Record<string, number>

@@ -431,2 +432,13 @@ dedupe?: boolean

asObject?: boolean;
formatters?: {
/**
* Changes the shape of the log level.
* The default shape is { level: number }.
*/
level?: (label: string, number: number) => object;
/**
* Changes the shape of the log object.
*/
log?: (object: Record<string, unknown>) => Record<string, unknown>;
}
/**

@@ -433,0 +445,0 @@ * Instead of passing log messages to `console.log` they can be passed to a supplied function. If `write` is
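The browser `formatters` option typed above can be sketched in plain JavaScript; `formatLog` below is a simplified model of what `browser.js` does when `asObject` is set, not the actual pino code: the `level` formatter's return value is merged into the log object (so it fully controls the shape of the level field), and the `log` formatter then reshapes the whole object.

```javascript
// Simplified model of the browser formatter handling (not the full pino code).
function formatLog (formatters, label, levelValue, obj) {
  const logObject = { ...obj }
  if (formatters.level) {
    // The level formatter's result is merged in, replacing the default
    // { level: number } shape entirely.
    Object.assign(logObject, formatters.level(label, levelValue))
  } else {
    logObject.level = levelValue
  }
  return formatters.log ? formatters.log(logObject) : logObject
}

const out = formatLog(
  {
    level: (label, n) => ({ severity: label, levelValue: n }),
    log: (o) => ({ ...o, app: 'demo' })
  },
  'info', 30, { msg: 'hi' }
)
// out: { msg: 'hi', severity: 'info', levelValue: 30, app: 'demo' }
```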

@@ -146,3 +146,3 @@ ![banner](pino-banner.png)

See the [CONTRIBUTING.md](https://github.com/pinojs/pino/blob/master/CONTRIBUTING.md) file for more details.
See the [CONTRIBUTING.md](https://github.com/pinojs/pino/blob/main/CONTRIBUTING.md) file for more details.

@@ -153,2 +153,3 @@ <a name="acknowledgments"></a>

This project was kindly sponsored by [nearForm](https://nearform.com).
This project is kindly sponsored by [Platformatic](https://platformatic.dev).

@@ -155,0 +156,0 @@ Logo and identity designed by Cosmic Fox Design: https://www.behance.net/cosmicfox.

@@ -287,3 +287,2 @@ 'use strict'

const parent = pino({
test: 'this',
browser: {

@@ -305,3 +304,2 @@ serialize: true,

const parent = pino({
test: 'this',
serializers: parentSerializers,

@@ -308,0 +306,0 @@ browser: {

@@ -169,3 +169,3 @@ 'use strict'

test('opts.browser.formatters logs pino-like object to console', ({ end, ok, is }) => {
test('opts.browser.formatters (level) logs pino-like object to console', ({ end, ok, is }) => {
const info = console.info

@@ -193,3 +193,3 @@ console.info = function (o) {

test('opts.browser.formatters logs pino-like object to console', ({ end, ok, is }) => {
test('opts.browser.formatters (log) logs pino-like object to console', ({ end, ok, is }) => {
const info = console.info

@@ -218,18 +218,45 @@ console.info = function (o) {

test('opts.browser.write func log single string', ({ end, ok, is }) => {
const instance = pino({
test('opts.browser.serialize and opts.browser.transmit only serializes log data once', ({ end, ok, is }) => {
const instance = require('../browser')({
serializers: {
extras (data) {
return { serializedExtras: data }
}
},
browser: {
serialize: ['extras'],
transmit: {
level: 'info',
send (level, o) {
is(o.messages[0].extras.serializedExtras, 'world')
}
}
}
})
instance.info({ extras: 'world' }, 'test')
end()
})
test('opts.browser.serialize and opts.asObject only serializes log data once', ({ end, ok, is }) => {
const instance = require('../browser')({
serializers: {
extras (data) {
return { serializedExtras: data }
}
},
browser: {
serialize: ['extras'],
asObject: true,
write: function (o) {
is(o.level, 30)
is(o.msg, 'test')
ok(o.time)
is(o.extras.serializedExtras, 'world')
}
}
})
instance.info('test')
instance.info({ extras: 'world' }, 'test')
end()
})
test('opts.browser.write func string joining', ({ end, ok, is }) => {
test('opts.browser.write func log single string', ({ end, ok, is }) => {
const instance = pino({

@@ -239,3 +266,3 @@ browser: {

is(o.level, 30)
is(o.msg, 'test test2 test3')
is(o.msg, 'test')
ok(o.time)

@@ -245,3 +272,3 @@ }

})
instance.info('test %s %s', 'test2', 'test3')
instance.info('test')

@@ -251,6 +278,5 @@ end()

test('opts.browser.write func string joining when asObject is true', ({ end, ok, is }) => {
test('opts.browser.write func string joining', ({ end, ok, is }) => {
const instance = pino({
browser: {
asObject: true,
write: function (o) {

@@ -257,0 +283,0 @@ is(o.level, 30)

@@ -98,31 +98,1 @@ 'use strict'

})
test('pino.transport with worker-pipeline destination overridden by bundler', async ({ same, teardown }) => {
globalThis.__bundlerPathsOverrides = {
'pino-pipeline-worker': join(__dirname, '..', '..', 'lib/worker-pipeline.js')
}
const destination = file()
const transport = pino.transport({
pipeline: [
{
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination }
}
]
})
teardown(transport.end.bind(transport))
const instance = pino(transport)
instance.info('hello')
await watchFileCreated(destination)
const result = JSON.parse(await readFile(destination))
delete result.time
same(result, {
pid,
hostname,
level: 30,
msg: 'hello'
})
globalThis.__bundlerPathsOverrides = undefined
})

@@ -137,3 +137,3 @@ 'use strict'

level: 'info',
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
target: 'file://' + join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination: dest1 }

@@ -140,0 +140,0 @@ }, {

@@ -9,2 +9,3 @@ 'use strict'

const pino = require('../../')
const { DEFAULT_LEVELS } = require('../../lib/constants')

@@ -33,3 +34,3 @@ const { pid } = process

hostname,
level: 30,
level: DEFAULT_LEVELS.info,
msg: 'hello',

@@ -39,1 +40,99 @@ service: 'pino' // this property was added by the transform

})
test('pino.transport with targets containing pipelines', async ({ same, teardown }) => {
const destinationA = file()
const destinationB = file()
const transport = pino.transport({
targets: [
{
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination: destinationA }
},
{
pipeline: [
{
target: join(__dirname, '..', 'fixtures', 'transport-transform.js')
},
{
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination: destinationB }
}
]
}
]
})
teardown(transport.end.bind(transport))
const instance = pino(transport)
instance.info('hello')
await watchFileCreated(destinationA)
await watchFileCreated(destinationB)
const resultA = JSON.parse(await readFile(destinationA))
const resultB = JSON.parse(await readFile(destinationB))
delete resultA.time
delete resultB.time
same(resultA, {
pid,
hostname,
level: DEFAULT_LEVELS.info,
msg: 'hello'
})
same(resultB, {
pid,
hostname,
level: DEFAULT_LEVELS.info,
msg: 'hello',
service: 'pino' // this property was added by the transform
})
})
test('pino.transport with targets containing pipelines with levels defined and dedupe', async ({ same, teardown }) => {
const destinationA = file()
const destinationB = file()
const transport = pino.transport({
targets: [
{
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination: destinationA },
level: DEFAULT_LEVELS.info
},
{
pipeline: [
{
target: join(__dirname, '..', 'fixtures', 'transport-transform.js')
},
{
target: join(__dirname, '..', 'fixtures', 'to-file-transport.js'),
options: { destination: destinationB }
}
],
level: DEFAULT_LEVELS.error
}
],
dedupe: true
})
teardown(transport.end.bind(transport))
const instance = pino(transport)
instance.info('hello info')
instance.error('hello error')
await watchFileCreated(destinationA)
await watchFileCreated(destinationB)
const resultA = JSON.parse(await readFile(destinationA))
const resultB = JSON.parse(await readFile(destinationB))
delete resultA.time
delete resultB.time
same(resultA, {
pid,
hostname,
level: DEFAULT_LEVELS.info,
msg: 'hello info'
})
same(resultB, {
pid,
hostname,
level: DEFAULT_LEVELS.error,
msg: 'hello error',
service: 'pino' // this property was added by the transform
})
})
import { expectAssignable, expectType, expectNotAssignable } from "tsd";
import pino from "../../";
import type {LevelWithSilent, Logger, LogFn, P, DestinationStreamWithMetadata, Level, LevelOrString, LevelWithSilentOrString, LoggerExtras } from "../../pino";
import type {LevelWithSilent, Logger, LogFn, P, DestinationStreamWithMetadata, Level, LevelOrString, LevelWithSilentOrString, LoggerExtras, LoggerOptions } from "../../pino";

@@ -49,1 +49,17 @@ // NB: can also use `import * as pino`, but that form is callable as `pino()`

}
const loggerOptions:LoggerOptions = {
browser: {
formatters: {
log(obj) {
return obj
},
level(label, number) {
return { label, number}
}
}
}
}
expectType<LoggerOptions>(loggerOptions)

