
# bunyan-elasticsearch-bulk
A [Bunyan](https://github.com/trentm/node-bunyan) stream for saving logs into Elasticsearch.

- Saves logs in memory instead of sending them to the Elasticsearch server one by one.
- Logs accumulate in memory until a certain count is reached (or a timeout elapses), then they are submitted in one bulk request.
```shell
npm i bunyan-elasticsearch-bulk
```
```javascript
const bunyan = require('bunyan')
const createESStream = require('bunyan-elasticsearch-bulk')

const config = {
  name: 'Application Name',
  streams: [{
    level: 'debug',
    stream: createESStream({
      indexPattern: '[my-app-]YYYY.MM.DD',
      node: 'http://localhost:9200'
    })
  }]
}

const log = bunyan.createLogger(config)

// From here, you can log things according to the best practices of Bunyan.
// Please familiarize yourself with it here: https://github.com/trentm/node-bunyan
log.info('Log this message!')
log.info({ otherInfo: 'What else do you need to log here?' }, 'Log this message!')
```
These parameters are passed to the Elasticsearch client unchanged. Only the following parameter is required; the rest are optional.
Field | Required? | Example | Description |
---|---|---|---|
`node` or `nodes` | Yes | `http://localhost:9200` | The Elasticsearch endpoint to use. No trailing slash! |
Field | Default | Description |
---|---|---|
`indexPattern` | `[my-app-]YYYY[-]MM[-]DD` | Pattern for the daily segmentation of log indices. This is passed to `moment.utc().format(indexPattern)` from moment.js. |
`type` | `logs` | Type of the logs, for Elasticsearch servers prior to v7.x. |
`limit` | `100` | How many logs to collect before submitting them to Elasticsearch. |
`interval` | `5000` | Time in milliseconds before submitting the collected logs even if their count has not reached `limit`. |
`client` | `Client(options)` | If you don't want to use the bundled v7.x client, you can configure your own and pass it here. |
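To make the interplay of `limit` and `interval` concrete, here is a minimal, self-contained sketch of the batching idea — not the library's actual code, just an illustration of "flush when the count hits `limit`, or when `interval` milliseconds pass, whichever comes first":

```javascript
// Illustrative sketch (not the library's implementation) of bulk buffering.
class BulkBuffer {
  constructor({ limit = 100, interval = 5000, flush }) {
    this.limit = limit
    this.interval = interval
    this.flush = flush // called with (records, reason)
    this.records = []
    this.timer = null
  }

  write(record) {
    this.records.push(record)
    if (this.records.length >= this.limit) {
      // Count reached the limit: submit immediately.
      this.drain('limit_exceeded')
    } else if (!this.timer) {
      // Otherwise make sure a timed flush is pending.
      this.timer = setTimeout(() => this.drain('timeout'), this.interval)
    }
  }

  drain(reason) {
    clearTimeout(this.timer)
    this.timer = null
    if (this.records.length === 0) return
    // splice(0) empties the buffer and hands all records to flush.
    this.flush(this.records.splice(0), reason)
  }
}

// With limit 3, the third write triggers an immediate bulk flush.
const batches = []
const buf = new BulkBuffer({
  limit: 3,
  interval: 1000,
  flush: (records, reason) => batches.push({ count: records.length, reason })
})
buf.write({ msg: 'a' })
buf.write({ msg: 'b' })
buf.write({ msg: 'c' })
// batches[0] → { count: 3, reason: 'limit_exceeded' }
```

The reason strings `limit_exceeded` and `timeout` mirror the ones the stream reports in its `log_submitted` event.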
Check `server.js` for example usages:
Event | Fires when... |
---|---|
`log_received` | a log is received |
`log_submitted` | logs are bulk-submitted to the Elasticsearch server, with a reason of `limit_exceeded` or `timeout` |
`error` | the whole bulk submission fails, or saving an individual message fails |
As a consumer of this module, you no longer need the `new` keyword: a factory function is exported instead. The example above has been updated accordingly.
A `docker-compose.yaml` file has been added to make it easier to develop and test locally.

## License

MIT