@ovotech/avro-stream
Serialize/deserialize kafka-node streams with avro data, using confluent schema-registry to hold the schemas.
yarn add @ovotech/avro-stream
In the example below, sourceStream is a node readable stream producing kafka-node produce objects, with an additional "schema" key holding the avro schema.
If the schema for the topic does not exist in the schema registry, it will be created. The topic itself, however, is not created automatically unless auto topic creation is enabled for kafka; you'll need to create it yourself.
import { AvroSerializer, AvroProduceRequest } from '@ovotech/avro-stream';
import { ReadableMock } from 'stream-mock';
import { ProducerStream } from 'kafka-node';
const data: AvroProduceRequest[] = [
  {
    topic: 'migration-completed',
    partition: 0,
    key: 'some-key',
    schema: {
      type: 'record',
      name: 'TestSchema',
      fields: [{ name: 'accountId', type: 'string' }],
    },
    messages: [{ accountId: '6709629' }, { accountId: '5709629' }],
  },
];
const sourceStream = new ReadableMock(data, { objectMode: true });
const producerStream = new ProducerStream({ kafkaClient: { kafkaHost: 'localhost:29092' } });
const serializer = new AvroSerializer('http://localhost:8081');
sourceStream.pipe(serializer).pipe(producerStream);
For deserializing avro kafka events:
import { AvroDeserializer } from '@ovotech/avro-stream';
import { WritableMock } from 'stream-mock';
import { ConsumerGroupStream } from 'kafka-node';
const consumerStream = new ConsumerGroupStream(
  {
    kafkaHost: 'localhost:29092',
    groupId: 'my-group',
    encoding: 'buffer',
    fromOffset: 'earliest',
  },
  ['migration-completed'],
);
const deserializer = new AvroDeserializer('http://localhost:8081');
const sinkStream = new WritableMock({ objectMode: true });
consumerStream.pipe(deserializer).pipe(sinkStream);
The way avro serialization for kafka works is to embed the schema id as the first 5 bytes of the buffer, so the buffer becomes <id><avro serialized buffer>. For that to work we need a resolver service that can map id->schema for deserializing and schema->id for serializing kafka events.
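As a rough illustration of that layout (a sketch only, not the library's actual implementation, and assuming the Confluent wire format of a zero magic byte followed by a 4-byte big-endian schema id), encoding and decoding a buffer with avsc looks something like this:
import { Type } from 'avsc';

// Encode: prefix the avro payload with the magic byte and the schema id.
const encode = (schemaId: number, type: Type, value: unknown): Buffer => {
  const prefix = Buffer.alloc(5);
  prefix.writeUInt8(0, 0); // magic byte
  prefix.writeUInt32BE(schemaId, 1); // schema id
  return Buffer.concat([prefix, type.toBuffer(value)]);
};

// Decode: read the schema id, resolve it to an avro type, decode the rest.
const decode = (buffer: Buffer, typeForId: (id: number) => Type): unknown => {
  const schemaId = buffer.readUInt32BE(1);
  return typeForId(schemaId).fromBuffer(buffer.slice(5));
};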
The default provided resolver is SchemaRegistryResolver, which uses the confluent schema-registry, but you can write your own:
import { AvroSerializer, AvroDeserializer, SchemaResolver } from '@ovotech/avro-stream';
class MyResolver implements SchemaResolver {
  async toId(topic: string, schema: Schema) {
    return ...
  }

  async fromId(id: number) {
    return ...
  }
}
const resolver = new MyResolver();
const serializer = new AvroSerializer(resolver);
const deserializer = new AvroDeserializer(resolver);
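For instance, a hypothetical in-memory resolver (handy for tests; this assumes the Schema type used by the interface is the one from avsc, and the class and field names below are illustrative only) might look like:
import { SchemaResolver } from '@ovotech/avro-stream';
import { Schema } from 'avsc';

// Hypothetical resolver that keeps schemas in memory, keyed by an incrementing id.
class InMemoryResolver implements SchemaResolver {
  private schemas = new Map<number, Schema>();
  private nextId = 1;

  async toId(topic: string, schema: Schema) {
    // Reuse the id if an identical schema has already been registered.
    for (const [id, known] of this.schemas) {
      if (JSON.stringify(known) === JSON.stringify(schema)) {
        return id;
      }
    }
    const id = this.nextId++;
    this.schemas.set(id, schema);
    return id;
  }

  async fromId(id: number) {
    const schema = this.schemas.get(id);
    if (!schema) {
      throw new Error(`Unknown schema id: ${id}`);
    }
    return schema;
  }
}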
Sometimes you'll want to pass options to the creation of the avro type from the schema, for example to pass in logical type resolvers. You can do that with the second argument to the constructors.
import { AvroSerializer, AvroDeserializer } from '@ovotech/avro-stream';
const serializer = new AvroSerializer('...', { logicalTypes: ... });
const deserializer = new AvroDeserializer('...', { logicalTypes: ... });
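As a concrete sketch (assuming these options are forwarded to avsc's type creation, as the logicalTypes key suggests; the DateType class below is purely illustrative), a logical type that maps timestamp-millis values to JavaScript Date objects could look like:
import { types } from 'avsc';
import { AvroSerializer } from '@ovotech/avro-stream';

// Hypothetical logical type converting avro timestamp-millis to/from Date objects.
class DateType extends types.LogicalType {
  _fromValue(val: number) {
    return new Date(val);
  }
  _toValue(date: unknown) {
    return date instanceof Date ? date.getTime() : undefined;
  }
}

const serializer = new AvroSerializer('http://localhost:8081', {
  logicalTypes: { 'timestamp-millis': DateType },
});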
AvroSerializer can emit an AvroSerializerError, and AvroDeserializer an AvroDeserializerError. Their properties are as follows:

AvroSerializerError:

Property | Description
---|---
message | Original error message
chunk | The event sent from the previous stream to be serialized (AvroProduceRequest)
encoding | The buffer encoding
originalError | The original error object that was triggered

AvroDeserializerError:

Property | Description
---|---
message | Original error message
chunk | The event from kafka-node sent from the previous stream to be deserialized
encoding | The buffer encoding
originalError | The original error object that was triggered
Example error handling:
import { AvroSerializer, AvroSerializerError } from '@ovotech/avro-stream';
const serializer = new AvroSerializer('...');

serializer.on('error', (error: AvroSerializerError) => {
  console.log(error.chunk);
});
A thing to be aware of is that node streams unpipe in the event of an error, which means you'll need to provide your own error handling and re-pipe the streams if you want the pipeline to be resilient to errors.
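For example, reusing the names from the serializer example above, a minimal (and deliberately blunt) sketch of that recovery might be:
// Sketch only: log the failed chunk, then re-establish the unpiped pipeline.
serializer.on('error', (error: AvroSerializerError) => {
  console.error('Failed to serialize', error.chunk);
  sourceStream.pipe(serializer).pipe(producerStream);
});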
The tests require a running schema registry service, kafka and zookeeper. This is easily set up with docker-compose:
docker-compose up
Then you can run the tests with:
yarn test
Style is maintained with prettier and tslint:
yarn lint
To deploy a new version, push to master and then create a new release. CircleCI will automatically build and deploy the version to the npm registry. All package versions are synchronized, but it will only publish the versions of the packages that have changed.
Have a bug? File an issue with a simple example that reproduces it so we can take a look & confirm.
Want to make a change? Submit a PR, explain why it's useful, and make sure you've updated the docs (this file) and the tests (see test/integration.spec.ts
).
This project is licensed under Apache 2 - see the LICENSE file for details.