# Logger

## Synopsis

Provides a pino logger that ships its logs to Elasticsearch via pino-elasticsearch.

First, run:

```shell
npm install -g pino-elasticsearch
```
Use the `HydraExpressLogger` plugin for Hydra Express apps:

```javascript
const HydraExpressLogger = require('fwsp-logger').HydraExpressLogger;
hydraExpress.use(new HydraExpressLogger());
hydraExpress.init(...);
```
with a corresponding entry in `config.json` under `hydra.plugins`:

```json
"hydra": {
  "plugins": {
    "logger": {
      "serviceName": "optional - will default to hydra.serviceName",
      "logPath": "optional - will default to service/servicename.log",
      "toConsole": false,
      "noFile": true,
      "logRequests": true,
      "redact": ["password"],
      "elasticsearch": {
        "host": "localhost",
        "port": 9200,
        "index": "local-dev"
      }
    }
  }
}
```
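The `redact` option names fields that should be masked before log entries are written or shipped, so secrets like passwords never reach Elasticsearch. As a hedged, plain-JavaScript sketch of the idea only (the `redact` helper and `[Redacted]` placeholder below are illustrative, not fwsp-logger's or pino's actual implementation):

```javascript
// Illustrative sketch of field redaction: replace every occurrence of the
// listed keys, at any depth in the log object, with a placeholder string.
function redact(obj, keys, mask = '[Redacted]') {
  if (obj === null || typeof obj !== 'object') {
    return obj;
  }
  const out = Array.isArray(obj) ? [] : {};
  for (const [key, value] of Object.entries(obj)) {
    out[key] = keys.includes(key) ? mask : redact(value, keys, mask);
  }
  return out;
}

const entry = { user: 'alice', password: 'hunter2', details: { password: 'x' } };
console.log(redact(entry, ['password']));
// → { user: 'alice', password: '[Redacted]', details: { password: '[Redacted]' } }
```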
Or, use the `HydraLogger` plugin for Hydra services:

```javascript
const HydraLogger = require('fwsp-logger').HydraLogger;
hydra.use(new HydraLogger());
hydra.init(...);
```
General usage:

```javascript
const PinoLogger = require('fwsp-logger').PinoLogger,
  logger = new PinoLogger({
    serviceName: 'my-service',
    logPath: '/custom/log-file.log',
    toConsole: true,
    elasticsearch: {
      host: 'your.elasticsearch.host.com',
      port: 9200,
      index: 'local-dev'
    }
  });

const appLogger = logger.getLogger();

appLogger.error('An error happened');
appLogger.info({
  message: 'Something else happened',
  details: {
    foo: 'bar',
    answer: 42
  }
});
```
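Under the hood, pino writes each log entry as a single line of JSON (newline-delimited JSON), which is what lands in the log file and in Elasticsearch. A hedged sketch of what one such line might contain and how to inspect it — the exact fields (`level`, `time`, `msg`, and so on) vary by pino version:

```javascript
// A pino-style log line is one JSON object per line; parsing it back
// recovers the structured fields that were logged.
const line = JSON.stringify({
  level: 30,            // pino encodes 'info' as numeric level 30
  time: 1500000000000,  // timestamp in epoch milliseconds
  msg: 'Something else happened',
  details: { foo: 'bar', answer: 42 }
});

const entry = JSON.parse(line);
console.log(entry.msg, entry.details.answer); // → Something else happened 42
```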
## Testing

To make sure logs are getting shipped to Elasticsearch, you can spin up Docker containers with Elasticsearch and Kibana using the `docker-compose.yml` file in this repository.

You will need docker and docker-compose installed; then, in this project folder, run `docker-compose up`.
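The repository's `docker-compose.yml` defines the Elasticsearch and Kibana services for you. If you are curious what such a file typically contains, a minimal sketch looks like the following — the image versions, ports, and service names here are illustrative assumptions, not the repository's actual file:

```yaml
version: '2'
services:
  elasticsearch:
    image: elasticsearch:5.6
    ports:
      - "9200:9200"   # HTTP API that pino-elasticsearch ships logs to
  kibana:
    image: kibana:5.6
    ports:
      - "5601:5601"   # Kibana web UI
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    depends_on:
      - elasticsearch
```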
You'll need to set up an Elasticsearch index pattern in Kibana before you'll be able to view logs. The index name should be the value of `logger.elasticsearch.index` (`local-dev` in the examples above), or `pino` by default.

If you don't have any index patterns set up, Kibana won't let you proceed without adding one. Otherwise, to add additional indices, go to Settings -> Indices.
## License

Licensed under MIT.