@ovotech/kafka-avro-cli
A CLI for inspecting the Confluent Schema Registry, and for producing and consuming Avro Kafka events.
You can install it locally in your project:
yarn add @ovotech/kafka-avro-cli
yarn kac --help
Or globally:
yarn global add @ovotech/kafka-avro-cli
kac --help
kac <command>
Commands:
kac schema [subject] Search for a given schema
kac topic [name] Search for a given topic
kac create-topic <topic> Create kafka topic
kac produce [file] Produce events in kafka topic from a file
kac consume <topic> Consume events from kafka topic. Display them in the
console or save them to a file
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file [default: "kac.config.json"]
--help Show help [boolean]
copyright OVO Energy 2018
This CLI tool communicates with the schema registry and Kafka, both of which require credentials and addresses. To configure them, create a kac.config.json file or provide the path to one with the --config option.
The file should look like this:
{
"kafkaClient": {
"kafkaHost": "kafka.example.com:13581",
"sslOptions": {
"ca": "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----",
"key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----",
"cert": "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"
}
},
"schemaRegistry": "https://example-user:example-password@registry.example.com:13584"
}
kafkaClient
- options are passed directly to the KafkaClient from kafka-node
schemaRegistry
- url to the schema registry, with credentials information

To point the tool at a different configuration file:

kac --config my/config.json ...
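As a rough sketch of how these two settings end up being used (this is not the tool's exact code; the file path and variable names are illustrative):

import { readFileSync } from 'fs';
import { KafkaClient } from 'kafka-node';

// Load the same configuration file the CLI reads.
const config = JSON.parse(readFileSync('kac.config.json', 'utf8'));

// The kafkaClient options (kafkaHost, sslOptions, ...) are handed straight to kafka-node.
const client = new KafkaClient(config.kafkaClient);

// schemaRegistry is a plain URL string; the credentials travel inside it.
const registryUrl: string = config.schemaRegistry;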
kac schema [subject]
Used to search for, filter and get details of a particular schema in the schema
registry. [subject] is a partial name of a subject. If no subject is provided,
all subjects are returned.
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file [default: "kac.config.json"]
--help Show help [boolean]
--output-dir, -o Save the results into a folder. One file per version
If more than one subject is found, it will result in a list of partial matches.
$ kac schema migration
Searching for "migration" in localhost:8081
Found 4 matching "migration"
----------------------------------------
migration-scheduled-value
migration-completed-2-value
migration-completed-value
If only one is matched, it will display the full schemas of all the versions for this subject
$ kac schema migration-scheduled-value
Searching for "migration-scheduled-value" in localhost:8081
Subject migration-scheduled-value found [ 1 ] versions
Version: 1 ----------------------------------------
{ type: 'record',
name: 'TestSchema1',
fields: [ { name: 'accountId', type: 'string' } ] }
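The matched versions can also be saved to disk with the --output-dir option, one file per version (the folder name here is illustrative):

$ kac schema migration-scheduled-value --output-dir schemas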
kac topic [name]
Used to search for, filter and get details of a particular topic in the kafka
server. [name] is a partial name of a topic. If no name is provided, all topics
are returned.
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file [default: "kac.config.json"]
--help Show help [boolean]
If more than one topic is found, it will result in a list of partial matches.
$ kac topic migration
Searching for "migration" in localhost:29092
Found 3 matching "migration"
----------------------------------------
migration-scheduled
migration-completed-2
migration-completed
If only one is matched, it will display the partitions and offsets of a topic
$ kac topic migration-completed-2
Searching for "migration-completed-2" in localhost:29092
Partitions and their offsets for migration-completed-2
----------------------------------------
{ migration-completed-2:
{ '0': [ 143024920 ],
'1': [ 142607286 ],
'2': [ 142860803 ],
'3': [ 143160301 ] } }
kac create-topic <topic>
Creates a topic in the kafka server with configurable partitions count and
replication factor.
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file
[default: "kac.config.json"]
--help Show help [boolean]
--partitions Number of partitions for topic [number] [default: 1]
--replication-factor Replication factor for the topic [number] [default: 1]
For example:
$ kac create-topic test-topic --partitions 2
Topic created test-topic partitions 2 replication factor 1
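The replication factor is set the same way (the values here are illustrative):

$ kac create-topic test-topic --partitions 3 --replication-factor 2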
kac produce [file]
Produce events in kafka topic from a file
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file [default: "kac.config.json"]
--help Show help [boolean]
The file used to produce the events should look something like this:
{
"0": {
"topic": "migration-completed-2",
"schema": {
"type": "record",
"name": "TestSchema1",
"fields": [{ "name": "accountId", "type": "string" }]
},
"partition": 0,
"key": "some key",
"messages": [{ "accountId": "67096" }, { "accountId": "57096" }]
},
"1": {
"topic": "migration-scheduled",
"schema": {
"type": "record",
"name": "TestSchema1",
"fields": [{ "name": "customerId", "type": "string" }]
},
"partition": 1,
"key": "some other key",
"messages": [{ "customerId": "12321" }, { "customerId": "32131" }]
}
}
This should allow you to produce events in different partitions.
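Assuming the file above is saved as events.json (the name is illustrative), you would produce all of its messages with:

$ kac produce events.json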
kac consume <topic>
Consume events from kafka topic. Display them in the console or save them to a
file
Options:
--version Show version number [boolean]
--config, -c Path to the configuration file [default: "kac.config.json"]
--help Show help [boolean]
--group Consumer Group Id [default: a prefixed random name]
--latest Start listening from the latest offsets
[boolean] [default: false]
--tail Keep listening after all messages consumed
[boolean] [default: false]
--output-file, -o Save the results into a file, that can later be used to
produce those events
To display the contents of a topic, simply write the full topic name. The consumer group will terminate once all the messages have been consumed, but you can keep listening with the --tail option.
$ kac consume migration-completed-2
Consume messages in migration-completed-2
----------------------------------------
Offset 0/1 partition 0 key null
BalanceMigrationCompletedEvent {
accountId: '6666666' }
----------------------------------------
Offset 1/1 partition 0 key null
BalanceMigrationCompletedEvent {
accountId: '5555555' }
----------------------------------------
Offset 0/1 partition 1 key null
BalanceMigrationCompletedEvent {
accountId: '4444444' }
----------------------------------------
Offset 1/1 partition 1 key null
BalanceMigrationCompletedEvent {
accountId: '3333333' }
----------------------------------------
[============================================================] 100% Elapsed: 0.0s
Consumed 4 messages
- Partition 0 : 2 messages.
- Partition 1 : 2 messages.
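To skip the existing backlog and stay attached for new messages, you can combine the --latest and --tail options (an illustrative invocation):

$ kac consume migration-completed-2 --latest --tail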
You can also write the output to a file with the --output-file option. The contents of that file are in the same format as the one used to produce events with the produce command.
$ kac consume migration-completed-2 --output-file test2.json
Writing to file test2.json from topic migration-completed-2
----------------------------------------
[============================================================] 100% Elapsed: 0.0s
Consumed 8 messages
- Partition 0 : 4 messages.
- Partition 1 : 4 messages.
The tests require a running kafka server and schema-registry service. This is easily set up with docker-compose from the root project folder:
docker-compose up
Then you can run the tests with:
yarn test
Style is maintained with prettier and tslint:
yarn lint
Deployment is performed by lerna automatically on merge / push to master, but you'll need to bump the package version numbers yourself. Only updated packages with newer versions will be pushed to the npm registry.
Have a bug? File an issue with a simple example that reproduces it so we can take a look and confirm.
Want to make a change? Submit a PR, explain why it's useful, and make sure you've updated the docs (this file) and the tests (see test folder).
This project is licensed under Apache 2 - see the LICENSE file for details