What is cbor?
The 'cbor' npm package is used for encoding and decoding data in the Concise Binary Object Representation (CBOR) format. CBOR is a binary data serialization format that is designed to be small, fast, and suitable for constrained environments. It is often used in IoT, web, and mobile applications where efficiency is critical.
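As a quick sanity check of the "small" claim, the sketch below round-trips an object and compares the encoded size with the equivalent JSON. It assumes the synchronous helpers cbor.encode and cbor.decodeFirstSync that recent versions of the package expose; exact byte counts depend on the data.
const cbor = require('cbor');

const obj = { id: 1, name: 'sensor-7', ok: true };

const buf = cbor.encode(obj);            // encode to a Buffer
const back = cbor.decodeFirstSync(buf);  // decode the first item back

console.log(buf.length, Buffer.byteLength(JSON.stringify(obj))); // CBOR bytes vs. JSON bytes
console.log(back);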
What are cbor's main functionalities?
Encoding Data
This feature allows you to encode JavaScript objects into CBOR format. The example demonstrates encoding a simple object asynchronously.
const cbor = require('cbor');
const data = { key: 'value' };
cbor.encodeAsync(data).then(encoded => {
  console.log(encoded);
});
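CBOR can also carry types that plain JSON cannot express, such as raw binary data and dates. A small sketch of the same encodeAsync call with a Buffer and a Date (the field names are made up for illustration):
const cbor = require('cbor');

const record = {
  raw: Buffer.from([0x01, 0x02, 0x03]), // becomes a CBOR byte string
  when: new Date()                      // encoded as a tagged CBOR date value
};

cbor.encodeAsync(record).then(encoded => {
  console.log(encoded.toString('hex'));
});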
Decoding Data
This feature allows you to decode CBOR data back into JavaScript objects. The example shows how to decode a CBOR-encoded buffer.
const cbor = require('cbor');
const encodedData = Buffer.from('a1636b65796576616c7565', 'hex'); // CBOR for { key: 'value' }
cbor.decodeFirst(encodedData).then(decoded => {
  console.log(decoded);
});
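decodeFirst returns only the first item in a buffer. When several CBOR items are concatenated, the package's decodeAll helper resolves with all of them; a minimal sketch:
const cbor = require('cbor');

// Two independent CBOR items concatenated into one buffer.
const buffer = Buffer.concat([cbor.encode(1), cbor.encode('two')]);

cbor.decodeAll(buffer).then(items => {
  console.log(items); // [ 1, 'two' ]
});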
Streaming Encoding
This feature supports streaming encoding, which is useful for handling large datasets or continuous data streams. The example writes an object into an Encoder transform stream and receives the encoded bytes as 'data' events.
const cbor = require('cbor');
const encoder = new cbor.Encoder();
encoder.on('data', chunk => {
  console.log(chunk);
});
encoder.write({ key: 'value' });
encoder.end();
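Because the encoder is an ordinary Transform stream, its output can be piped to any writable destination, for example into a file (the file name here is only illustrative):
const cbor = require('cbor');
const fs = require('fs');

const encoder = new cbor.Encoder();
encoder.pipe(fs.createWriteStream('records.cbor'));

[{ id: 1 }, { id: 2 }, { id: 3 }].forEach(record => encoder.write(record));
encoder.end();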
Streaming Decoding
This feature supports streaming decoding, which is useful for processing large or continuous CBOR data streams. The example writes a CBOR-encoded buffer into a Decoder transform stream and receives each decoded value as a 'data' event.
const cbor = require('cbor');
const decoder = new cbor.Decoder();
decoder.on('data', obj => {
  console.log(obj);
});
decoder.write(Buffer.from('a1636b65796576616c7565', 'hex'));
decoder.end();
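The decoder works the same way in the other direction: pipe any readable source into it and handle each decoded item as it arrives (the file name is again illustrative):
const cbor = require('cbor');
const fs = require('fs');

const decoder = new cbor.Decoder();
decoder.on('data', obj => console.log(obj)); // one event per decoded top-level item
decoder.on('error', err => console.error(err));

fs.createReadStream('records.cbor').pipe(decoder);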
Other packages similar to cbor
msgpack5
The 'msgpack5' package provides MessagePack encoding and decoding. MessagePack is another compact binary serialization format designed for efficiency, making it suitable for similar use cases. Compared to CBOR, it has a slightly different data model: CBOR offers a standardized tag registry and is specified as an IETF standard (RFC 8949), while MessagePack relies on application-defined extension types.
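For a rough feel of the difference, a minimal msgpack5 round trip looks like this (a sketch based on msgpack5's factory-style API):
const msgpack = require('msgpack5')(); // the module exports a factory

const encoded = msgpack.encode({ key: 'value' });
const decoded = msgpack.decode(encoded);
console.log(decoded); // { key: 'value' }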
protobufjs
The 'protobufjs' package is used for encoding and decoding data in Protocol Buffers (protobuf) format, a language-neutral, platform-neutral, extensible mechanism for serializing structured data. Protobuf is often used in communication protocols and data storage. It is more complex and feature-rich than CBOR: every message requires a schema (.proto definition), which adds setup cost but gives strong typing and tighter control over the wire format.
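The schema-first workflow is the main practical difference from CBOR; a minimal protobufjs sketch (the KV message and its field are invented for illustration):
const protobuf = require('protobufjs');

// Unlike CBOR, a schema must be defined before anything can be encoded.
const root = protobuf.parse(
  'syntax = "proto3"; message KV { string key = 1; }'
).root;

const KV = root.lookupType('KV');
const message = KV.create({ key: 'value' });
const buffer = KV.encode(message).finish(); // Uint8Array
const decoded = KV.decode(buffer);
console.log(decoded.key); // 'value'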
avsc
The 'avsc' package provides support for Apache Avro, a binary serialization format that includes rich data structures and a compact, fast binary encoding. Avro is often used in big data applications and supports schema evolution. Compared to CBOR, Avro is more focused on data interchange and storage in distributed systems.
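Avro is likewise schema-driven; a minimal avsc sketch (the record definition is invented for illustration):
const avro = require('avsc');

// A schema is required up front; it is also what enables schema evolution.
const type = avro.Type.forSchema({
  type: 'record',
  name: 'KV',
  fields: [{ name: 'key', type: 'string' }]
});

const buf = type.toBuffer({ key: 'value' });
const val = type.fromBuffer(buf);
console.log(val.key); // 'value'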
Encode and parse CBOR documents.
See the documentation and test results.
Installation:
$ npm install cbor
From the command line:
$ bin/json2cbor package.json > package.cbor
$ bin/cbor2json package.cbor
$ bin/cbor2diag package.cbor
Example:
var assert = require('assert');
var cbor = require('cbor');
var encoded = cbor.encode(true);
cbor.decode(encoded, function(error, obj) {
  assert.ok(obj === true);
});
Allows streaming as well:
var cbor = require('cbor');
var fs = require('fs');
var d = new cbor.Decoder();
d.on('complete', function(obj){
  console.log(obj);
});
var s = fs.createReadStream('foo');
s.pipe(d);
var d2 = new cbor.Decoder({input: '00', encoding: 'hex'});
d2.on('complete', function(obj){
  console.log(obj);
});
d2.start();
And also a SAX-type mode (which the streaming mode wraps):
var cbor = require('cbor');
var fs = require('fs');
var parser = new cbor.Evented();
parser.on('value', function(val, tags, kind) {
  // called for every complete value that is decoded
  console.log(val);
});
parser.on('array-start', function(count, tags, kind) {
  // an array of `count` items begins
});
parser.on('array-stop', function(count, tags, kind) {
  // the current array ends
});
parser.on('map-start', function(count, tags, kind) {
  // a map of `count` key/value pairs begins
});
parser.on('map-stop', function(count, tags, kind) {
  // the current map ends
});
parser.on('stream-start', function(mt, tags, kind) {
  // an indefinite-length ("streaming") item of major type `mt` begins
});
parser.on('stream-stop', function(count, mt, tags, kind) {
  // the indefinite-length item ends
});
parser.on('end', function() {
  // all input has been consumed
});
parser.on('error', function(er) {
  // a parse error occurred
});
var s = fs.createReadStream('foo');
s.pipe(parser);
Test coverage is currently above 95%.