Data Theft Repackaged: A Case Study in Malicious Wrapper Packages on npm
The Socket Research Team breaks down a malicious wrapper package that uses obfuscation to harvest credentials and exfiltrate sensitive data.
streamroller
The streamroller npm package is a file-based logging utility designed to help manage log files by supporting automatic rolling of logs based on size or date. It is particularly useful for applications that generate a lot of log data and need to manage disk space efficiently.
Rolling file streams based on size
This feature allows you to create a log file that rolls over when it reaches a certain size. In the example, a new log file is created when the current log file reaches 10 MB. The '3' indicates that a maximum of three backup files are kept.
const StreamRoller = require('streamroller');
// Roll over when the log reaches 10 MB; keep at most 3 backup files.
const stream = new StreamRoller.RollingFileStream('example.log', 1024 * 1024 * 10, 3);
stream.write('This is a log entry');
stream.end();
Rolling file streams based on date
This feature allows for log files to be rolled over based on date patterns. The 'yyyy-MM-dd' pattern means the log file will roll over daily. The 'daysToKeep' option specifies that logs older than 10 days should be deleted.
const StreamRoller = require('streamroller');
// Roll over daily; delete backups older than 10 days.
const stream = new StreamRoller.DateRollingFileStream('example.log', 'yyyy-MM-dd', { daysToKeep: 10 });
stream.write('This is a log entry');
stream.end();
Winston is a multi-transport async logging library for Node.js. Similar to streamroller, it supports file-based logging with log rotation, but it also offers more flexibility with multiple logging transports like console, file, and HTTP, and it supports custom log levels.
Bunyan is a simple and fast JSON logging library for Node.js services. Like streamroller, it supports log rotation, but it focuses on JSON log entries and provides a more structured logging solution that is ideal for large-scale applications.
Node.js file streams that roll over when they reach a maximum size, or a date/time boundary.
npm install streamroller
var rollers = require('streamroller');
// Roll 'myfile' when it reaches 1 KB; keep up to 3 backups.
var stream = new rollers.RollingFileStream('myfile', 1024, 3);
stream.write("stuff");
stream.end();
The streams behave the same as standard node.js streams, except that when certain conditions are met they will rename the current file to a backup and start writing to a new file.
new RollingFileStream(filename[, maxSize, numBackups, options])

- filename <string>
- maxSize <integer> - defaults to 0 - the size in bytes to trigger a rollover. If not specified or 0, then no log rolling will happen.
- numBackups <integer> - defaults to 1 - the number of old files to keep (excluding the hot file)
- options <Object>
  - encoding <string> - defaults to 'utf8'
  - mode <integer> - defaults to 0o600 (see node.js file modes)
  - flags <string> - defaults to 'a' (see node.js file flags)
  - compress <boolean> - defaults to false - compress the backup files using gzip (backup files will have a .gz extension)
  - keepFileExt <boolean> - defaults to false - preserve the file extension when rotating log files (file.log becomes file.1.log instead of file.log.1)
  - fileNameSep <string> - defaults to '.' - the filename separator when rolling, e.g. abc.log.1 or abc.1.log (with keepFileExt)

This returns a WritableStream. When the current file being written to (given by filename) gets up to or larger than maxSize, the current file will be renamed to filename.1 and a new file will start being written to. Up to numBackups of old files are maintained, so if numBackups is 3 then there will be 4 files:

filename
filename.1
filename.2
filename.3

When filename size >= maxSize then:

filename -> filename.1
filename.1 -> filename.2
filename.2 -> filename.3
filename.3 gets overwritten
filename is a new file
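The rename cascade above can be sketched in plain JavaScript. This is a simplified model of the documented naming scheme, not streamroller's actual implementation:

```javascript
// Model of the backup rename cascade: returns the [from, to] renames
// performed when `filename` reaches maxSize with numBackups kept.
function rollNames(filename, numBackups) {
  const renames = [];
  // The oldest backup (filename.numBackups) is overwritten, so each
  // backup shifts up by one, starting from the highest index.
  for (let i = numBackups - 1; i >= 1; i--) {
    renames.push([`${filename}.${i}`, `${filename}.${i + 1}`]);
  }
  // Finally the hot file becomes the first backup.
  renames.push([filename, `${filename}.1`]);
  return renames;
}

console.log(rollNames('example.log', 3));
// [ [ 'example.log.2', 'example.log.3' ],
//   [ 'example.log.1', 'example.log.2' ],
//   [ 'example.log', 'example.log.1' ] ]
```

Renames run oldest-first so that no backup is clobbered before it has been shifted out of the way.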
new DateRollingFileStream(filename[, pattern, options])

- filename <string>
- pattern <string> - defaults to yyyy-MM-dd - the date pattern to trigger rolling (see below)
- options <Object>
  - encoding <string> - defaults to 'utf8'
  - mode <integer> - defaults to 0o600 (see node.js file modes)
  - flags <string> - defaults to 'a' (see node.js file flags)
  - compress <boolean> - defaults to false - compress the backup files using gzip (backup files will have a .gz extension)
  - keepFileExt <boolean> - defaults to false - preserve the file extension when rotating log files (file.log becomes file.2017-05-30.log instead of file.log.2017-05-30)
  - fileNameSep <string> - defaults to '.' - the filename separator when rolling, e.g. abc.log.2013-08-30 or abc.2013-08-30.log (with keepFileExt)
  - alwaysIncludePattern <boolean> - defaults to false - extend the initial file with the pattern
  - numBackups (formerly daysToKeep) <integer> - defaults to 1 - the number of old files that match the pattern to keep (excluding the hot file)
  - maxSize <integer> - defaults to 0 - the size in bytes to trigger a rollover. If not specified or 0, then no size-based log rolling will happen.

This returns a WritableStream. When the current time, formatted as pattern, changes, the current file will be renamed to filename.formattedDate, where formattedDate is the result of processing the date through the pattern, and a new file will begin to be written. Streamroller uses date-format to format dates, and the pattern should use the date-format syntax. For example, with a pattern of "yyyy-MM-dd" and assuming today is August 29, 2013, writing to the stream today will just write to filename. At midnight (or, more precisely, at the next file write after midnight), filename will be renamed to filename.2013-08-29 and a new filename will be created. If options.alwaysIncludePattern is true, then the initial file will be filename.2013-08-29 and no renaming will occur at midnight, but a new file named filename.2013-08-30 will be written to. If maxSize is set, then when the current file being written to (given by filename) gets up to or larger than maxSize, the current file will be renamed to filename.pattern.1 and a new file will start being written to. Up to numBackups of old files are maintained, so if numBackups is 3 then there will be 4 files:

filename
filename.20220131.1
filename.20220131.2
filename.20220131.3
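How a rolled file's name is composed under the keepFileExt and fileNameSep options described above can be sketched as follows. This is a model of the documented naming scheme, not streamroller's internals:

```javascript
// Compose the name a rolled file receives for a given formatted date,
// honouring the keepFileExt and fileNameSep options.
function rolledName(filename, formattedDate, { keepFileExt = false, fileNameSep = '.' } = {}) {
  if (keepFileExt) {
    // Insert the date before the extension: file.log -> file.<date>.log
    const dot = filename.lastIndexOf('.');
    const base = dot === -1 ? filename : filename.slice(0, dot);
    const ext = dot === -1 ? '' : filename.slice(dot); // includes the '.'
    return `${base}${fileNameSep}${formattedDate}${ext}`;
  }
  // Default: append the date after the full name: file.log -> file.log.<date>
  return `${filename}${fileNameSep}${formattedDate}`;
}

console.log(rolledName('abc.log', '2013-08-30'));                        // abc.log.2013-08-30
console.log(rolledName('abc.log', '2013-08-30', { keepFileExt: true })); // abc.2013-08-30.log
```

The same composition applies to the size-based streams, with a backup index in place of the formatted date.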
FAQs
File streams that roll over when size limits or dates are reached.
The npm package streamroller receives a total of 3,950,369 weekly downloads. As such, streamroller popularity was classified as popular.
We found that streamroller demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 2 open source maintainers collaborating on the project.