
While working on a NodeJS project, I came across the problem of processing huge XML files. A Node process can hold only a limited amount of data, after which it throws an out-of-memory exception, so files beyond that limit were a pain for the application.
That is where I found a couple of npm packages, viz. xml-flow & xml-stream. These packages process XML files in chunks using streams (which is exactly what I was looking for).
However, I came across a few drawbacks in these packages, explained in this Stack Overflow question. So I started working on this package, which does exactly what those two packages do but emits raw XML instead of processed JSON, giving you the flexibility to choose the most compatible XML-to-JSON npm package.
One of the key points is that xtreamer has stream as its only dependency. As this package uses streams, memory consumption always stays under control. xtreamer itself is an extension of a Transform Stream, so it can be piped to any input Readable Stream, and xtreamer output can be piped to a Writable Stream.
Apart from the above points, xtreamer provides XML nodes in its response, which enables it to get hooked up with any XML-to-JSON parsing npm package as per requirement.
npm i xtreamer --save
xtreamer extends the Transform Stream class & provides an additional custom event xmldata which emits XML nodes.
xtreamer(node: string, options?: object) : stream.Transform
The xtreamer function initialises the transform stream. It accepts the following 2 arguments -
node: string
options: object (coming soon)
This function returns a transform stream which can be triggered by piping it with any readable stream.
Apart from the default stream events, xtreamer emits the xmldata event to emit individual XML nodes.
From version 0.1.3 onwards, xtreamer also supports the conventional data event to emit individual XML nodes.
const xtreamer = require("xtreamer");
const xtreamerTransform = xtreamer("XmlNode", options);
// listening to `xmldata` event here
xtreamerTransform.on("xmldata", (data) => { });
// OR
// `data` event also supported from version 0.1.3
xtreamerTransform.on("data", (data) => { });
max_xml_size is the maximum number of characters allowed to be held in memory. xtreamer raises an error event in case the in-memory XML string exceeds the specified limit. This ensures that the Node process doesn't get terminated because of excessive in-memory data collection.
The default value of this option restricts the amount of data held in memory to approximately 10 MB. The following snippet shows how to override the default value -
const xtreamer = require("xtreamer");
// overriding `max_xml_size` value here
const options = { max_xml_size: 30000000 };
// passing `options` object as second parameter
const xtreamerTransform = xtreamer("XmlNode", options);
Typically this value does not need to be overridden, as in most cases the XML in a single node will not exceed 10 MB.
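To illustrate why the limit exists, here is a toy sketch of the buffering idea (not xtreamer's actual implementation): incoming chunks accumulate in a string buffer, complete nodes are sliced out, and whatever is left over is checked against the limit:

```javascript
// Toy sketch of buffered node extraction; illustrative only, not
// xtreamer's real internals.
function extractNodes(buffer, node, maxXmlSize) {
    const open = `<${node}>`;
    const close = `</${node}>`;
    const nodes = [];
    let closeIndex;
    // Slice out every complete <node>...</node> found so far.
    while ((closeIndex = buffer.indexOf(close)) !== -1) {
        const openIndex = buffer.indexOf(open);
        nodes.push(buffer.slice(openIndex, closeIndex + close.length));
        buffer = buffer.slice(closeIndex + close.length);
    }
    // Anything left is an incomplete node; if it grows past the limit,
    // raise an error instead of letting memory balloon.
    if (buffer.length > maxXmlSize) {
        throw new Error("max_xml_size exceeded");
    }
    return { nodes, buffer };
}
```

If a single node never completes (for example, a malformed or enormous element), the leftover buffer keeps growing with every chunk; the size check is what turns that into an error event rather than an out-of-memory crash.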
This feature was introduced in version 1.1.0.
The transformer option allows you to transform the XML node output into the desired JSON structure by hooking it into the streaming pipeline. It becomes a very useful feature where a JSON parser needs to be injected dynamically into xtreamer.
The transformer function is supposed to accept an XML string as a parameter and should return a valid converted JSON object in response. The transformer function can also return a promise, which xtreamer will resolve before emitting the data event.
Note that the converted JSON is internally stringified before being sent back in the data event handler. So it is advised that the transformer function always returns a valid JSON object in response.
In case the transformer function encounters an error, xtreamer emits an error event and stops the XML conversion process.
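A minimal transformer sketch is below. The regex parsing is a toy for flat nodes only; a real setup would delegate to an XML-to-JSON package such as xml-js, and the option key `transformer` is assumed here from the option name above:

```javascript
// Toy transformer: pulls the tag name and inner text out of a flat
// XML node with a regex. Real code would use a proper XML-to-JSON
// parser instead.
function transformer(xml) {
    const match = /^<(\w+)>([\s\S]*)<\/\1>$/.exec(xml.trim());
    // Return a plain JSON object; xtreamer stringifies it internally
    // before emitting the `data` event.
    return match ? { tag: match[1], text: match[2] } : {};
}

// Hypothetical usage, assuming the option key is `transformer`:
// const xtreamerTransform = xtreamer("SampleNode", { transformer });
```

The same shape works for an async transformer, since xtreamer resolves a returned promise before emitting the data event.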
The following code snippet uses the request npm package as the input readable stream -
const request = require("request");
const xtreamer = require("xtreamer");
const sampleNode = "SampleNode";
const sampleUrl = "http://sample-xml.com/sample.xml";
let count = 0;
// input readable stream with event handlers
const readStream = request.get(sampleUrl);
// xtreamer transform stream with custom event handler
const xtreamerTransform = xtreamer(sampleNode)
.on("data", () => ++count % 100 || console.log(count))
.on("end", () => console.log(count))
.on("error", (error) => console.error(error));
// input | transform
readStream.pipe(xtreamerTransform);
As xtreamer is a transform stream, one can also pipe it to other streams -
const { Writable } = require("stream");
const request = require("request");
const xtreamer = require('xtreamer');
class XtreamerClient extends Writable {
    _write(chunk, encoding, next) {
        // do stuff
        next();
    }
}
const sampleNode = "SampleNode";
const sampleUrl = "http://sample-xml.com/sample.xml";
// input readable stream
const readStream = request.get(sampleUrl);
// xtreamer transform stream with custom event handler
const xtreamerTransform = xtreamer(sampleNode)
.on("error", (error) => console.error(error));
// input | transform | write
readStream.pipe(xtreamerTransform).pipe(new XtreamerClient());
Check the demo for more examples, which include -
demo.js - emits XML nodes
transformer-demo.js - emits stringified JSON & uses the xml-js package within the transformer function
transformer-promise-demo.js - emits stringified JSON & uses the xml-js package within a transformer function that returns a promise
npm i && npm start
npm i && npm run test
Author - Manoj Chalode (chalodem@gmail.com)
Copyright - github.com/manojc