
kafka-avro
Node.js bindings for librdkafka with Avro schema serialization.
The kafka-avro library is a wrapper that combines the node-rdkafka and avsc libraries to allow production and consumption of Kafka messages validated and serialized by Avro.
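To make the "validated and serialized by Avro" step concrete, here is a minimal stand-in sketch: a record is checked against a simple schema before being turned into a Buffer. The serializeRecord helper and the JSON encoding are illustrative assumptions, not the library's code; kafka-avro delegates the real binary Avro encoding to the avsc library.

```javascript
// Illustrative sketch only: validate a record against a toy schema,
// then encode it to a Buffer. Real Avro uses a compact binary
// encoding via avsc; JSON stands in for it here.
function serializeRecord(schema, record) {
  schema.fields.forEach(function (field) {
    if (typeof record[field.name] !== field.type) {
      throw new Error('Field "' + field.name + '" must be a ' + field.type);
    }
  });
  return Buffer.from(JSON.stringify(record));
}

var schema = { name: 'Test', fields: [{ name: 'value', type: 'string' }] };
var buf = serializeRecord(schema, { value: 'hello' });
```

A record that fails the schema check throws before anything is produced to Kafka, which is the property the Avro layer gives you.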
Install the module using NPM:
npm install kafka-avro --save
The kafka-avro library exposes getConsumer() and getProducer() methods, which both return instances of the corresponding constructors from the node-rdkafka library. The "node-rdkafka" instances returned by kafka-avro are patched so as to intercept produced and consumed messages and run them through the Avro de/serializer.
You are highly encouraged to read the "node-rdkafka" documentation, as it explains how you can use the Producer and Consumer instances as well as check out the available configuration options of node-rdkafka.
The Kafka.CODES enumeration of constant values provided by the "node-rdkafka" library is also available as a static var at:
var KafkaAvro = require('kafka-avro');
console.log(KafkaAvro.CODES);
var KafkaAvro = require('kafka-avro');
var kafkaAvro = new KafkaAvro({
kafkaBroker: 'localhost:9092',
schemaRegistry: 'localhost:8081',
});
// Query the Schema Registry for all topic schemas,
// fetch them and evaluate them.
kafkaAvro.init()
.then(function() {
console.log('Ready to use');
});
NOTICE: You need to initialize kafka-avro before you can produce or consume messages.
By invoking the kafkaAvro.getProducer()
method, kafka-avro will instantiate a Producer, make it connect and wait for it to be ready before the promise is resolved.
kafkaAvro.getProducer({
// Options listed below
})
// "getProducer()" returns a Bluebird Promise.
.then(function(producer) {
var topicName = 'test';
producer.on('disconnected', function(arg) {
console.log('producer disconnected. ' + JSON.stringify(arg));
});
// Create a Topic object with any options our Producer
// should use when producing to that topic.
var topic = producer.Topic(topicName, {
// Make the Kafka broker acknowledge our message (optional)
'request.required.acks': 1
});
var value = new Buffer('value');
var key = 'key';
// if partition is set to -1, librdkafka will use the default partitioner
var partition = -1;
producer.produce(topic, partition, value, key);
});
What kafka-avro basically does is wrap around node-rdkafka and intercept the produce method to validate and serialize the message before it is sent.
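The interception can be pictured with a short sketch (not kafka-avro's actual source): the original produce method is saved, then replaced with a version that serializes the value first. The stand-in producer and JSON serializer below are assumptions for illustration.

```javascript
// Sketch of intercepting a produce() call so the payload is
// serialized before it reaches the underlying producer.
function interceptProduce(producer, serialize) {
  var originalProduce = producer.produce.bind(producer);
  producer.produce = function (topic, partition, value, key) {
    // Run the value through the serializer, then delegate.
    return originalProduce(topic, partition, serialize(value), key);
  };
  return producer;
}

// Usage with a stand-in producer and a JSON "serializer":
var fakeProducer = {
  sent: [],
  produce: function (topic, partition, value, key) {
    this.sent.push({ topic: topic, value: value });
  }
};
interceptProduce(fakeProducer, function (v) {
  return Buffer.from(JSON.stringify(v));
});
fakeProducer.produce('test', -1, { hello: 'world' }, 'key');
```

Callers keep using the familiar produce() signature; the serialization happens transparently in the wrapper.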
NOTICE: You need to initialize kafka-avro before you can produce or consume messages.
By invoking the kafkaAvro.getConsumer()
method, kafka-avro will instantiate a Consumer, make it connect and wait for it to be ready before the promise is resolved.
kafkaAvro.getConsumer({
'group.id': 'librd-test',
'socket.keepalive.enable': true,
'enable.auto.commit': true,
})
// the "getConsumer()" method will return a bluebird promise.
.then(function(consumer) {
var topicName = 'test';
consumer.consume([topicName]);
consumer.on('data', function(rawData) {
console.log('data:', rawData);
});
});
kafkaAvro.getConsumer({
'group.id': 'librd-test',
'socket.keepalive.enable': true,
'enable.auto.commit': true,
})
// the "getConsumer()" method will return a bluebird promise.
.then(function(consumer) {
var topicName = 'test';
var stream = consumer.getReadStream(topicName, {
waitInterval: 0
});
stream.on('error', function() {
process.exit(1);
});
consumer.on('error', function(err) {
console.log(err);
process.exit(1);
});
stream.on('data', function(message) {
console.log('Received message:', message);
});
});
Same deal here: a thin wrapper around node-rdkafka that deserializes incoming messages before they reach your consuming method.
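The consume side can be sketched the same way (illustrative only, assuming a deserialize function): each raw message object gets an extra parsed property holding the deserialized payload, while the original fields are left untouched. The stand-in message and JSON deserializer below are assumptions.

```javascript
// Sketch: augment an incoming Kafka message object with a "parsed"
// property containing the deserialized payload.
function augmentWithParsed(message, deserialize) {
  message.parsed = deserialize(message.value);
  return message;
}

// Stand-in raw message and a JSON "deserializer":
var raw = {
  value: Buffer.from(JSON.stringify({ name: 'alice' })),
  topic: 'test',
  offset: 0,
  partition: 0
};
augmentWithParsed(raw, function (buf) {
  return JSON.parse(buf.toString());
});
```

Your 'data' handler then reads raw.parsed instead of decoding raw.value itself.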
kafka-avro intercepts all incoming messages and augments the object with one more property named parsed, which contains the Avro-deserialized object. Here is a breakdown of the properties included in the message object you receive when consuming messages:
value (Buffer) The raw message buffer from Kafka.
size (Number) The size of the message.
key (String|Number) Partitioning key used.
topic (String) The topic this message comes from.
offset (Number) The Kafka offset.
partition (Number) The Kafka partition used.
parsed (Object) The Avro-deserialized message as a JS Object literal.
grunt release
grunt release:minor for minor number jump.
grunt release:major for major number jump.
Copyright Waldo, Inc. All rights reserved.
FAQs
The npm package kafka-avro receives a total of 202 weekly downloads. As such, kafka-avro's popularity was classified as not popular.
We found that kafka-avro demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 2 open source maintainers collaborating on the project.