protobufjs
The protobufjs npm package provides a comprehensive suite of tools for working with Protocol Buffers (protobuf), a method of serializing structured data. It allows users to encode and decode protobuf messages, generate and work with static code, and handle dynamic message building and parsing.
Loading .proto files
This feature allows users to load .proto files and use the defined protobuf structures within their JavaScript code.
const protobuf = require('protobufjs');
protobuf.load('awesome.proto', function(err, root) {
if (err) throw err;
const AwesomeMessage = root.lookupType('awesomepackage.AwesomeMessage');
// ... use AwesomeMessage
});
Encoding and decoding messages
With protobufjs, users can encode JavaScript objects into binary protobuf format and decode binary messages into JavaScript objects.
const message = AwesomeMessage.create({ awesomeField: 'AwesomeString' });
const buffer = AwesomeMessage.encode(message).finish();
const decodedMessage = AwesomeMessage.decode(buffer);
Reflection and runtime message building
This feature allows users to work with protobuf messages dynamically at runtime using JSON descriptors, without the need for generated static code.
const root = protobuf.Root.fromJSON(jsonDescriptor);
const AwesomeMessage = root.lookupType('awesomepackage.AwesomeMessage');
const errMsg = AwesomeMessage.verify({ awesomeField: 'AwesomeString' });
if (errMsg) throw Error(errMsg);
const message = AwesomeMessage.create({ awesomeField: 'AwesomeString' });
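In practice the jsonDescriptor is usually produced by pbjs -t json; purely as a rough, hand-written sketch for the message above (the exact pbjs output may carry additional metadata), it could look like this:
// Shape accepted by protobuf.Root.fromJSON: namespaces nest under "nested",
// message types declare their fields with a type and a field id.
const jsonDescriptor = {
  nested: {
    awesomepackage: {
      nested: {
        AwesomeMessage: {
          fields: {
            awesomeField: { type: 'string', id: 1 }
          }
        }
      }
    }
  }
};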
Static code generation
Static code can be generated from .proto files with the bundled pbjs command line tool (see the command line section below). It requires only the minimal runtime and, combined with pbts, enables type-safe usage:
$> pbjs -t static-module -w commonjs -o compiled.js awesome.proto
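A rough sketch of consuming such a module, assuming the command above wrote compiled.js next to your code (generated classes are exposed under their package namespaces and carry no reflection information):
const root = require('./compiled.js'); // hypothetical output path from the pbjs call above
const AwesomeMessage = root.awesomepackage.AwesomeMessage; // generated class
const buffer = AwesomeMessage.encode({ awesomeField: 'AwesomeString' }).finish();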
Similar packages
@apollo/protobufjs is a fork of the original protobufjs package with some modifications. It is used within the Apollo tooling ecosystem but generally offers similar functionality to protobufjs.
google-protobuf is the official Protocol Buffers runtime library for JavaScript, provided by Google, and offers similar serialization and deserialization capabilities. However, it may not be as feature-rich or flexible as protobufjs in terms of dynamic message handling and may require more setup for code generation.
Pbf is a fast, lightweight Protocol Buffers implementation in JavaScript. It focuses on performance and is smaller in size than protobufjs, but it might not offer the same level of functionality, especially in terms of reflection and dynamic message building.
Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google.
protobuf.js is a pure JavaScript implementation for node and the browser. It efficiently encodes plain objects and custom classes and works out of the box with .proto files.
Recommended read: Changes in protobuf.js 6.0
Usage
How to include protobuf.js in your project.
Examples
A few examples to get you started.
Module Structure
A brief introduction to the structure of the exported module.
Documentation
A list of available documentation resources.
Command line
How to use the command line utility.
Building
How to build the library and its components yourself.
Performance
A few internals and a benchmark on performance.
Compatibility
Notes on compatibility regarding browsers and optional libraries.
$> npm install protobufjs
var protobuf = require("protobufjs");
Development:
<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.1.0/dist/protobuf.js"></script>
Production:
<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.1.0/dist/protobuf.min.js"></script>
NOTE: Remember to replace the version tag with the exact release your project depends upon.
Or download the library.
The protobuf namespace will always be available globally but also supports AMD.
// awesome.proto
syntax = "proto3";
package awesomepackage;
message AwesomeMessage {
string awesome_field = 1; // becomes awesomeField
}
protobuf.load("awesome.proto", function(err, root) {
if (err) throw err;
// Obtain a message type
var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");
// Create a new message
var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });
// Encode a message
var buffer = AwesomeMessage.encode(message).finish();
// ... do something with buffer
// Or, encode a plain object
var buffer = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
// ... do something with buffer
// Decode a buffer
var message = AwesomeMessage.decode(buffer);
// ... do something with message
// If your application uses length-delimited buffers, there is also encodeDelimited and decodeDelimited.
});
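For illustration, a brief sketch of the delimited variants, continuing the example above with the same AwesomeMessage type:
// encodeDelimited prefixes the payload with its varint-encoded length
var delimitedBuffer = AwesomeMessage.encodeDelimited({ awesomeField: "AwesomeString" }).finish();
// decodeDelimited reads the length prefix and decodes exactly one message
var delimitedMessage = AwesomeMessage.decodeDelimited(delimitedBuffer);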
You can also use promises by omitting the callback:
protobuf.load("awesome.proto")
.then(function(root) {
...
});
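For instance, a fuller sketch of the promise form with explicit error handling, reusing the message type from above:
protobuf.load("awesome.proto")
    .then(function(root) {
        var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");
        var buffer = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
        // ... do something with buffer
    })
    .catch(function(err) {
        // the promise rejects on file or parse errors
        console.error(err);
    });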
...
Message types can also be built at runtime using reflection only, without a .proto file:
var Root = protobuf.Root,
Type = protobuf.Type,
Field = protobuf.Field;
var AwesomeMessage = new Type("AwesomeMessage").add(new Field("awesomeField", 1, "string"));
var root = new Root().define("awesomepackage").add(AwesomeMessage);
// Continue at "Create a new message" above
...
...
Alternatively, custom classes can be registered for a reflected message type:
function AwesomeMessage(properties) {
protobuf.Message.call(this, properties);
}
protobuf.Class.create(root.lookup("awesomepackage.AwesomeMessage") /* or use reflection */, AwesomeMessage);
var message = new AwesomeMessage({ awesomeField: "AwesomeString" });
// Continue at "Encode a message" above
Custom classes are automatically populated with static encode, encodeDelimited, decode, decodeDelimited and verify methods and reference their reflected type via the $type property. Note that there are no methods (just $type) on instances by default as method names might conflict with field names.
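Continuing the custom class example, a quick sketch of those statics and $type in use:
var customBuffer = AwesomeMessage.encode(message).finish();   // static encode, as with reflected types
var decoded = AwesomeMessage.decode(customBuffer);            // yields an instance of AwesomeMessage
console.log(decoded.$type.name);                              // "AwesomeMessage"
var invalid = AwesomeMessage.verify({ awesomeField: 42 });    // returns an error message string, if any
if (invalid) throw Error(invalid);                            // a number is not a valid string field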
// greeter.proto
service Greeter {
rpc SayHello (HelloRequest) returns (HelloReply) {}
}
message HelloRequest {
string name = 1;
}
message HelloReply {
string message = 1;
}
...
var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(rpcImpl, false, false); // rpcImpl (see below), requestDelimited?, responseDelimited?
greeter.sayHello({ name: 'you' }, function(err, response) {
console.log('Greeting:', response.message);
});
To make this work, all you have to do is provide an rpcImpl, which is an asynchronous function that takes the reflected service method, the binary HelloRequest and a node-style callback as its parameters. For example:
function rpcImpl(method, requestData, callback) {
// perform the request using an HTTP request or a WebSocket for example
var responseData = ...;
// and call the callback with the binary response afterwards:
callback(null, responseData);
}
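As a concrete sketch, here is one possible rpcImpl over plain HTTP using node's http module; the host, port and the convention of POSTing the binary request to /MethodName are assumptions for illustration, not part of the library:
var http = require("http");
function rpcImpl(method, requestData, callback) {
    var req = http.request({
        hostname: "localhost",                       // hypothetical service host
        port: 8080,                                  // hypothetical service port
        path: "/" + method.name,                     // e.g. "/SayHello"
        method: "POST",
        headers: { "Content-Type": "application/octet-stream" }
    }, function(res) {
        var chunks = [];
        res.on("data", function(chunk) { chunks.push(chunk); });
        res.on("end", function() {
            // hand the binary reply back to the service stub for decoding
            callback(null, Buffer.concat(chunks));
        });
    });
    req.on("error", callback);
    req.write(requestData);
    req.end();
}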
There is also an example for streaming RPC.
/// <reference path="node_modules/protobufjs/types/protobuf.js.d.ts" />
import * as protobuf from "protobufjs";
...
The library exports a flat protobuf namespace including but not restricted to the following members, ordered by category:
load(filename: string|string[], [root: Root], [callback: function(err: Error, [root: Root])]): Promise|undefined [source]
Loads one or multiple .proto or preprocessed .json files into a common root namespace.
loadSync(filename: string|string[], [root: Root]): Root [source]
Synchronously loads one or multiple .proto or preprocessed .json files into a common root namespace (node only).
parse(source: string): Object [source]
Parses the given .proto source and returns an object with the parsed contents (a short sketch follows the member list below).
Writer [source]
Wire format writer using Uint8Array if available, otherwise Array.
Reader [source]
Wire format reader using Uint8Array if available, otherwise Array.
Namespace extends ReflectionObject [source]
Base class of all reflection objects containing nested objects.
Root extends Namespace [source]
Root namespace.
Type extends Namespace [source]
Reflected message type.
Field extends ReflectionObject [source]
Reflected message field.
MapField extends Field [source]
Reflected message map field.
Enum extends ReflectionObject [source]
Reflected enum.
Service extends Namespace [source]
Reflected service.
Method extends ReflectionObject [source]
Reflected service method.
Class [source]
Runtime class providing the tools to create your own custom classes.
Message [source]
Abstract runtime message.
For less common members, see the API documentation.
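As a quick sketch of parse on an inline .proto source (the message definition here is made up for illustration):
var parsed = protobuf.parse("syntax = \"proto3\"; message Ping { string payload = 1; }");
var Ping = parsed.root.lookup("Ping");                 // parse exposes the populated root
var pingBuffer = Ping.encode({ payload: "hi" }).finish();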
The pbjs command line utility can be used to bundle and translate between .proto and .json files.
Consolidates imports and converts between file formats.
-t, --target Specifies the target format. Also accepts a path to require a custom target.
json JSON representation
json-module JSON representation as a module (AMD, CommonJS, global)
proto2 Protocol Buffers, Version 2
proto3 Protocol Buffers, Version 3
static Static code without reflection
static-module Static code without reflection as a module (AMD, CommonJS, global)
-p, --path Adds a directory to the include path.
-o, --out Saves to a file instead of writing to stdout.
-w, --wrap Specifies the wrapper to use for *-module targets. Also accepts a path.
default Default wrapper supporting both CommonJS and AMD
commonjs CommonJS only wrapper
amd AMD only wrapper
-r, --root Specifies an alternative protobuf.roots name for *-module targets.
usage: pbjs [options] file1.proto file2.json ...
For production environments it is recommended to bundle all your .proto files to a single .json file, which reduces the number of network requests and parser invocations required:
$> pbjs -t json file1.proto file2.proto > bundle.json
Now, either include this file in your final bundle:
var root = protobuf.Root.fromJSON(require("./bundle.json"));
or load it the usual way:
protobuf.load("bundle.json", function(err, root) {
...
});
Likewise, the pbts command line utility can be used to generate TypeScript definitions from pbjs-generated static modules.
Generates TypeScript definitions from annotated JavaScript files.
-n, --name Specifies the module name.
-o, --out Saves to a file instead of writing to stdout.
usage: pbts [options] file1.js file2.js ...
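For example, a typical two-step workflow (the output file names are placeholders) might be:
$> pbjs -t static-module -w commonjs -o compiled.js file1.proto file2.proto
$> pbts -o compiled.d.ts compiled.js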
While .proto and JSON files require the full library (about 18kb gzipped), pretty much all code but the relatively short descriptors is shared.
Static code, on the other hand, requires just the minimal runtime (about 5.5kb gzipped), but generates relatively large code bases without any reflection features.
When new Function is supported (and it usually is), there is no difference performance-wise, as the code generated statically is the same as the code generated at runtime.
To build the library or its components yourself, clone it from GitHub and install the development dependencies:
$> git clone https://github.com/dcodeIO/protobuf.js.git
$> cd protobuf.js
$> npm install --dev
Building the development and production versions with their respective source maps to dist/:
$> npm run build
Building the documentation to docs/:
$> npm run docs
Building the TypeScript definition to types/:
$> npm run types
protobuf.js integrates into any browserify build-process. There are a few possible tweaks:
You can exclude the buffer module and let protobuf.js do its thing with Uint8Array/Array instead.
If you do not need 64 bit integer support, you can exclude the long module. It will be included otherwise.
If you are not relying on node-specific functionality, you can also exclude process, _process and fs.
The package includes a benchmark that tries to compare performance to native JSON as far as this is possible. On an i7-2600K running node 6.9.1 it yields:
benchmarking encoding performance ...
Type.encode to buffer x 481,172 ops/sec ±0.48% (92 runs sampled)
JSON.stringify to string x 307,509 ops/sec ±1.04% (92 runs sampled)
JSON.stringify to buffer x 164,463 ops/sec ±1.37% (89 runs sampled)
Type.encode to buffer was fastest
JSON.stringify to string was 36.4% slower
JSON.stringify to buffer was 66.1% slower
benchmarking decoding performance ...
Type.decode from buffer x 1,319,810 ops/sec ±0.71% (92 runs sampled)
JSON.parse from string x 298,578 ops/sec ±0.98% (90 runs sampled)
JSON.parse from buffer x 267,471 ops/sec ±0.81% (89 runs sampled)
Type.decode from buffer was fastest
JSON.parse from string was 77.4% slower
JSON.parse from buffer was 79.8% slower
benchmarking combined performance ...
Type to/from buffer x 262,728 ops/sec ±0.92% (92 runs sampled)
JSON to/from string x 129,405 ops/sec ±0.78% (94 runs sampled)
JSON to/from buffer x 89,523 ops/sec ±0.71% (89 runs sampled)
Type to/from buffer was fastest
JSON to/from string was 50.7% slower
JSON to/from buffer was 65.9% slower
benchmarking verifying performance ...
Type.verify x 5,833,382 ops/sec ±0.98% (85 runs sampled)
Type.verify was fastest
Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster? Instead of type checking each value while encoding and decoding, it relies on generated code and provides a verify method to check this manually instead - where applicable. Note that code generation requires new Function(...) (basically eval) support and that an equivalent but slower fallback will be used where unsupported.
You can also run the benchmark ...
$> npm run bench
and the profiler yourself (the latter requires a recent version of node):
$> npm run prof <encode|decode|encode-browser|decode-browser> [iterations=10000000]
Note that as of this writing, the benchmark suite performs significantly slower on node 7.2.0 than on 6.9.1 for reasons that have not been pinned down.
Because the internals of this package do not rely on google/protobuf/descriptor.proto, options are parsed and presented literally.
Reflection properties can also be accessed through the respective get, set or is methods directly (i.e. calling Type#getFieldsById() instead of accessing Type#fieldsById).
For working with 64 bit values (int64, uint64 etc.), install long.js alongside this library; 64 bit numbers will then be returned as a Long instance instead of a possibly unsafe JavaScript number.
License: Apache License, Version 2.0; bundled external libraries may have their own license.
Properly encode/decode map kv pairs as repeated messages (codegen and fallback), see #547
Make genVerifyKey actually generate conditions for 32bit values and bool, fixes #546
Fix to generation of verify methods for bytes
Take special care of oneofs when encoding (i.e. when explicitly set to defaults), see #542
Added Message#asJSON option for bytes conversion
Added Namespace#lookupType and Namespace#lookupService (throw instead of returning null), see #544
Provide prebuilt browser versions of the static runtime
Initial pbts CLI for generating TypeScript definitions, see #550
Refactored json/static-module targets to use common wrappers
Refactor cli to support multiple built-in wrappers, added named roots instead of always using global.root and added additionally necessary eslint comments, see #540
Annotate namespaces generated by static target, see #550
static target: Basic support for oneof fields, see #542
Fix to reflection documentation
README on minimal runtime / available downloads
Notes on descriptors vs static modules
A lot of minor optimizations to performance and gzip ratio
Minimized base64 tables
FAQs
Protocol Buffers for JavaScript (& TypeScript).
The npm package protobufjs receives a total of 17,456,590 weekly downloads. As such, protobufjs is classified as a popular package.
We found that protobufjs demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 2 open source maintainers collaborating on the project.