What is protobufjs?
The protobufjs npm package provides a comprehensive suite of tools for working with Protocol Buffers (protobuf), a method of serializing structured data. It allows users to encode and decode protobuf messages, generate and work with static code, and handle dynamic message building and parsing.
What are protobufjs's main functionalities?
Loading .proto files
This feature allows users to load .proto files and use the defined protobuf structures within their JavaScript code.
const protobuf = require('protobufjs');
protobuf.load('awesome.proto', function(err, root) {
  if (err) throw err;
  const AwesomeMessage = root.lookupType('awesomepackage.AwesomeMessage');
  // ... use AwesomeMessage
});
Encoding and decoding messages
With protobufjs, users can encode JavaScript objects into binary protobuf format and decode binary messages into JavaScript objects.
const message = AwesomeMessage.create({ awesomeField: 'AwesomeString' });
const buffer = AwesomeMessage.encode(message).finish();
const decodedMessage = AwesomeMessage.decode(buffer);
Reflection and runtime message building
This feature allows users to work with protobuf messages dynamically at runtime using JSON descriptors, without the need for generated static code.
const root = protobuf.Root.fromJSON(jsonDescriptor);
const AwesomeMessage = root.lookupType('awesomepackage.AwesomeMessage');
const errMsg = AwesomeMessage.verify({ awesomeField: 'AwesomeString' });
if (errMsg) throw Error(errMsg);
const message = AwesomeMessage.create({ awesomeField: 'AwesomeString' });
Static code generation
Protobufjs generates static code from .proto files via its bundled pbjs command line utility, which can be used for better performance and type safety.
// Generate a static CommonJS module first:
// $> pbjs -t static-module -w commonjs -o compiled.js awesome.proto
const root = require('./compiled.js');
const AwesomeMessage = root.awesomepackage.AwesomeMessage;
// ... use AwesomeMessage without loading or parsing .proto files at runtime
Other packages similar to protobufjs
@apollo/protobufjs
This is a fork of the original protobufjs package with some modifications. It is used within the Apollo tooling ecosystem but generally offers similar functionality to protobufjs.
google-protobuf
This is the official Protocol Buffers runtime library for JavaScript. It is provided by Google and offers similar serialization and deserialization capabilities. However, it may not be as feature-rich or flexible as protobufjs in terms of dynamic message handling and may require more setup for code generation.
pbf
Pbf is a fast, lightweight Protocol Buffers implementation in JavaScript. It focuses on performance and is smaller in size compared to protobufjs. However, it might not offer the same level of functionality, especially in terms of reflection and dynamic message building.
protobuf.js

Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google.
protobuf.js is a pure JavaScript implementation for node and the browser. It efficiently encodes plain objects and custom classes and works out of the box with .proto files.
Recommended read: Changes in protobuf.js 6.0
Features
Contents
- Usage: How to include protobuf.js in your project.
- Examples: A few examples to get you started.
- Module Structure: A brief introduction to the structure of the exported module.
- Documentation: A list of available documentation resources.
- Command line: How to use the command line utility.
- Building: How to build the library and its components yourself.
- Performance: A few internals and a benchmark on performance.
- Compatibility: Notes on compatibility regarding browsers and optional libraries.
Usage
node.js
$> npm install protobufjs
var protobuf = require("protobufjs");
Browsers
Development:
<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.X.X/dist/protobuf.js"></script>
Production:
<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.X.X/dist/protobuf.min.js"></script>
NOTE: Remember to replace the version tag with the exact release your project depends upon.
Or download the library.
The protobuf namespace will always be available globally; the library also supports AMD loaders.
Examples
Using .proto files
// awesome.proto
package awesomepackage;
syntax = "proto3";
message AwesomeMessage {
    string awesome_field = 1; // becomes awesomeField
}
protobuf.load("awesome.proto", function(err, root) {
    if (err) throw err;
    var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");
    var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });
    var buffer = AwesomeMessage.encode(message).finish();
    // ... or encode a plain object directly:
    var buffer = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
    var message = AwesomeMessage.decode(buffer);
});
You can also use promises by omitting the callback:
protobuf.load("awesome.proto")
    .then(function(root) {
        ...
    });
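This dual behavior is a common node API pattern: when the callback is omitted, the call wraps itself in a promise. A minimal, self-contained sketch of the pattern (the `loadConfig` function and its result object are made up for illustration; they are not part of the library):

```javascript
// Sketch of a load-style API that accepts an optional node-style
// callback and returns a Promise when the callback is omitted.
function loadConfig(name, callback) {
    if (!callback) {
        // No callback given: wrap the same call in a Promise.
        return new Promise(function(resolve, reject) {
            loadConfig(name, function(err, result) {
                if (err) reject(err); else resolve(result);
            });
        });
    }
    // Simulated asynchronous work (a real loader would read a file).
    setTimeout(function() {
        callback(null, { loaded: name });
    }, 0);
    return undefined; // callback style returns nothing
}

// Callback style:
loadConfig("awesome.proto", function(err, root) {
    if (err) throw err;
    console.log(root.loaded); // "awesome.proto"
});

// Promise style:
loadConfig("awesome.proto").then(function(root) {
    console.log(root.loaded); // "awesome.proto"
});
```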
Using reflection only
...
var Root = protobuf.Root,
    Type = protobuf.Type,
    Field = protobuf.Field;
var AwesomeMessage = new Type("AwesomeMessage").add(new Field("awesomeField", 1, "string"));
var root = new Root().define("awesomepackage").add(AwesomeMessage);
...
Using custom classes
...
function AwesomeMessage(properties) {
    protobuf.Message.call(this, properties);
}
protobuf.Class.create(root.lookup("awesomepackage.AwesomeMessage"), AwesomeMessage);
var message = new AwesomeMessage({ awesomeField: "AwesomeString" });
Custom classes are automatically populated with static encode, encodeDelimited, decode, decodeDelimited and verify methods and reference their reflected type via the $type property. Note that there are no methods (just $type) on instances by default, as method names might conflict with field names.
Using the Reader/Writer interface directly
While only useful for the adventurous cherishing an aversion to generated static code, it's also possible to use the Reader/Writer interface directly, using just the minimal runtime, to build custom encoders and decoders that work across modern to ancient browsers and, of course, node:
var writer = protobuf.Writer.create();
var buffer = writer
    .int32(1 << 3 | 2) // tag: field number 1, wire type 2 (length-delimited)
    .string("hello world!")
    .finish();

var reader = protobuf.Reader.create(buffer);
while (reader.pos < reader.len) {
    var tag = reader.int32();
    switch (tag >>> 3) { // field number
        case 1:
            console.log(reader.string());
            break;
        default:
            reader.skipType(tag & 7); // wire type
            break;
    }
}
Easy ways to obtain example code snippets are either setting protobuf.util.codegen.verbose = true, which logs generated code as it is created, or simply inspecting statically generated code.
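For orientation, the tag written above, 1 << 3 | 2, packs field number 1 with wire type 2 (length-delimited). Here is a rough, dependency-free sketch of the varint encoding that underlies tags and most scalar values on the wire; it is illustrative only and not the library's actual Reader/Writer code:

```javascript
// Encode an unsigned integer as a protobuf varint: 7 bits per byte,
// high bit set on every byte except the last.
function writeVarint(value, bytes) {
    while (value > 127) {
        bytes.push((value & 127) | 128);
        value >>>= 7;
    }
    bytes.push(value);
    return bytes;
}

// Decode a varint starting at offset, returning the value and the
// new offset.
function readVarint(bytes, offset) {
    var value = 0, shift = 0, byte;
    do {
        byte = bytes[offset++];
        value |= (byte & 127) << shift;
        shift += 7;
    } while (byte & 128);
    return { value: value, offset: offset };
}

// A field tag is itself a varint: (fieldNumber << 3) | wireType.
console.log(writeVarint(1 << 3 | 2, [])[0]); // 10 — field 1, wire type 2

// Values above 127 span multiple bytes:
console.log(readVarint(writeVarint(300, []), 0).value); // 300
```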
Using services
// greeter.proto
service Greeter {
    rpc SayHello (HelloRequest) returns (HelloReply) {}
}
message HelloRequest {
    string name = 1;
}
message HelloReply {
    string message = 1;
}
...
var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(rpcImpl, false, false);
greeter.sayHello({ name: 'you' }, function(err, response) {
    console.log('Greeting:', response.message);
});
To make this work, all you have to do is provide an rpcImpl, which is an asynchronous function that takes the reflected service method, the binary HelloRequest and a node-style callback as its parameters. For example:
function rpcImpl(method, requestData, callback) {
    var responseData = ...;
    callback(null, responseData);
}
There is also an example for streaming RPC.
Usage with TypeScript
import * as protobuf from "protobufjs";
import * as Long from "long";
...
See also: Generating your own TypeScript definitions
Additional configuration might be necessary when not utilizing node, e.g. referencing protobuf.js.d.ts and long.js.d.ts explicitly.
Module Structure
The library exports a flat protobuf namespace including, but not restricted to, the following members, ordered by category:
Parser
- load(filename: string|string[], [root: Root], [callback: function(err: Error, [root: Root])]): Promise|undefined [source]
  Loads one or multiple .proto or preprocessed .json files into a common root namespace.
- loadSync(filename: string|string[], [root: Root]): Root [source]
  Synchronously loads one or multiple .proto or preprocessed .json files into a common root namespace (node only).
- parse(source: string): Object [source]
  Parses the given .proto source and returns an object with the parsed contents.
Serialization
- Writer [source]
  Wire format writer using Uint8Array if available, otherwise Array.
- Reader [source]
  Wire format reader using Uint8Array if available, otherwise Array.
Reflection
- Namespace extends ReflectionObject [source]
  Base class of all reflection objects containing nested objects.
- Root extends Namespace [source]
  Root namespace.
- Type extends Namespace [source]
  Reflected message type.
- Field extends ReflectionObject [source]
  Reflected message field.
- MapField extends Field [source]
  Reflected message map field.
- Enum extends ReflectionObject [source]
  Reflected enum.
- Service extends Namespace [source]
  Reflected service.
- Method extends ReflectionObject [source]
  Reflected service method.
Runtime
Utility
- util [source]
  Various utility functions.
For less common members, see the API documentation.
Documentation
Command line
The pbjs command line utility can be used to bundle and translate between .proto and .json files.
Consolidates imports and converts between file formats.

usage: pbjs [options] file1.proto file2.json ...

  -t, --target     Specifies the target format. Also accepts a path to require a custom target.

                   json           JSON representation
                   json-module    JSON representation as a module
                   proto2         Protocol Buffers, Version 2
                   proto3         Protocol Buffers, Version 3
                   static         Static code without reflection
                   static-module  Static code without reflection as a module

  -p, --path       Adds a directory to the include path.

  -o, --out        Saves to a file instead of writing to stdout.

  Module targets only:

  -w, --wrap       Specifies the wrapper to use. Also accepts a path to require a custom wrapper.

                   default    Default wrapper supporting both CommonJS and AMD
                   commonjs   CommonJS only wrapper
                   amd        AMD only wrapper

  -r, --root       Specifies an alternative protobuf.roots name.

  Proto sources only:

  --keep-case      Keeps field casing instead of converting to camel case (not recommended).

  Static targets only:

  --no-create      Does not generate create functions used for runtime compatibility.
  --no-encode      Does not generate encode functions.
  --no-decode      Does not generate decode functions.
  --no-verify      Does not generate verify functions.
  --no-convert     Does not generate convert functions like asJSON and from.
  --no-delimited   Does not generate delimited encode/decode functions.
  --no-beautify    Does not beautify generated code.
  --no-comments    Does not output any JSDoc comments.
For production environments it is recommended to bundle all your .proto files to a single .json file, which reduces the number of network requests and parser invocations required:
$> pbjs -t json file1.proto file2.proto > bundle.json
Now, either include this file in your final bundle:
var root = protobuf.Root.fromJSON(require("./bundle.json"));
or load it the usual way:
protobuf.load("bundle.json", function(err, root) {
...
});
Generating TypeScript definitions from static modules
Likewise, the pbts command line utility can be used to generate TypeScript definitions from pbjs-generated static modules.
Generates TypeScript definitions from annotated JavaScript files.

usage: pbts [options] file1.js file2.js ...

  -n, --name      Wraps everything in a module of the specified name.
  -o, --out       Saves to a file instead of writing to stdout.
  -m, --main      Whether building the main library without any imports.
  -g, --global    Name of the global object in browser environments, if any.
  --no-comments   Does not output any JSDoc comments.
Using pbjs and pbts programmatically
Both utilities can be used programmatically by providing command line arguments and a callback to their respective main functions:
var pbjs = require("protobufjs/cli/pbjs");
pbjs.main([ "--target", "json-module", "path/to/myproto.proto" ], function(err, output) {
    if (err)
        throw err;
});
Descriptors vs. static modules
While .proto and JSON files require the full library (about 17.5kb gzipped), pretty much all code but the relatively short descriptors is shared and all features including reflection and the parser are available.
Static code, on the other hand, requires just the minimal runtime (about 5.5kb gzipped), but generates additional, albeit editable, source code without any reflection features.
There is no difference performance-wise as the code generated statically is pretty much the same as generated at runtime.
Additionally, JSON modules can be used with TypeScript definitions generated for their static counterparts as long as the following conditions are met:
- Always use SomeMessage.create(...) instead of new SomeMessage(...), because reflection does not provide such a constructor.
- Types, services and enums must start with an uppercase letter to become available on the reflected types as well.
- When using a TypeScript definition with code not generated by pbjs, resolveAll() must be called once on the root instance to populate these additional properties (JSON modules do this automatically).
Building
To build the library or its components yourself, clone it from GitHub and install the development dependencies:
$> git clone https://github.com/dcodeIO/protobuf.js.git
$> cd protobuf.js
$> npm install --dev
Building the development and production versions with their respective source maps to dist/:
$> npm run build
Building the documentation to docs/:
$> npm run docs
Building the TypeScript definition to index.d.ts:
$> npm run types
Browserify integration
By default, protobuf.js integrates into your browserify build process without requiring any optional modules. Hence:
- If you need int64 support, explicitly require the long module somewhere in your project; it will be excluded otherwise.
- If you have any special requirements, there is the bundler as a reference.
Performance
The package includes a benchmark that tries to compare performance to native JSON as far as this is possible. On an i7-2600K running node 6.9.1 it yields:
benchmarking encoding performance ...
Type.encode to buffer x 521,803 ops/sec ±0.84% (88 runs sampled)
JSON.stringify to string x 300,362 ops/sec ±1.11% (86 runs sampled)
JSON.stringify to buffer x 169,413 ops/sec ±1.49% (86 runs sampled)
Type.encode to buffer was fastest
JSON.stringify to string was 42.6% slower
JSON.stringify to buffer was 67.7% slower
benchmarking decoding performance ...
Type.decode from buffer x 1,325,308 ops/sec ±1.46% (88 runs sampled)
JSON.parse from string x 283,907 ops/sec ±1.39% (86 runs sampled)
JSON.parse from buffer x 255,372 ops/sec ±1.28% (88 runs sampled)
Type.decode from buffer was fastest
JSON.parse from string was 78.6% slower
JSON.parse from buffer was 80.7% slower
benchmarking combined performance ...
Type to/from buffer x 269,719 ops/sec ±0.87% (91 runs sampled)
JSON to/from string x 122,878 ops/sec ±1.59% (87 runs sampled)
JSON to/from buffer x 89,310 ops/sec ±1.01% (88 runs sampled)
Type to/from buffer was fastest
JSON to/from string was 54.8% slower
JSON to/from buffer was 66.9% slower
benchmarking verifying performance ...
Type.verify x 5,857,856 ops/sec ±0.82% (91 runs sampled)
Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster?
- The benchmark is somewhat flawed.
- Reader and writer interfaces configure themselves according to the environment to eliminate redundant conditionals.
- Node-specific reader and writer subclasses benefit from node's buffer binding.
- Reflection has built-in code generation that builds type-specific encoders, decoders and verifiers at runtime.
- Encoders and decoders do not implicitly call verify on messages, avoiding unnecessary overhead where messages are already known to be valid. It's up to the user to call verify where necessary.
- Quite a bit of V8-specific profiling is accountable for everything else.
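The code-generation point can be sketched in a few lines: rather than interpreting a field list on every call, a type-specific function is built once with new Function and then reused. This toy encoder is only an illustration of the technique, not the library's actual generated code (it also shows why CSP-restricted environments, which forbid eval-like constructs, need pregenerated static code instead):

```javascript
// Build a specialized serializer for a message "type" described by a
// list of field names, using new Function to avoid per-call loops.
function buildEncoder(fieldNames) {
    var body = "return [" +
        fieldNames.map(function(name) {
            return "msg." + name;
        }).join(",") + "];";
    // The generated body for ["x", "y"] is: return [msg.x,msg.y];
    return new Function("msg", body);
}

var encodePoint = buildEncoder(["x", "y"]);
console.log(encodePoint({ x: 3, y: 4 })); // [ 3, 4 ]
```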
You can also run the benchmark ...
$> npm run bench
and the profiler yourself (the latter requires a recent version of node):
$> npm run prof <encode|decode|encode-browser|decode-browser> [iterations=10000000]
Note that as of this writing, the benchmark suite performs significantly slower on node 7.2.0 compared to 6.9.1; the cause has not been pinned down yet.
Compatibility

- Because the internals of this package do not rely on google/protobuf/descriptor.proto, options are parsed and presented literally.
- If typed arrays are not supported by the environment, plain arrays will be used instead.
- Support for pre-ES5 environments (except IE8) can be achieved by using a polyfill.
- Support for Content Security Policy-restricted environments (like Chrome extensions without unsafe-eval) can be achieved by generating and using static code instead.
- If you need a proper way to work with 64 bit values (uint64, int64, etc.), you can install long.js alongside this library. All 64 bit numbers will then be returned as a Long instance instead of a possibly unsafe JavaScript number.
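The "possibly unsafe" caveat can be demonstrated without any library: JavaScript numbers are IEEE-754 doubles, so integers are only exact up to 2^53 - 1, and distinct 64 bit values can collapse onto the same number:

```javascript
// JavaScript numbers are 64-bit floats: integers are exact only up
// to Number.MAX_SAFE_INTEGER (2^53 - 1).
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// Beyond that, distinct 64-bit integers collapse onto one double:
console.log(9007199254740992 === 9007199254740993); // true (!)

// This is why a Long-style { low, high } representation (or, in
// modern engines, BigInt) is needed for full-range (u)int64 fields:
console.log(9007199254740992n === 9007199254740993n); // false
```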
License: BSD 3-Clause License
6.4.0 (release)
Breaking
- Dropped IE8 support
- Removed now unused util.longNeq which was used by early static code

Fixed
- Do not swallow errors in loadSync, also accept negative enum values in Enum#add, fixes #609
- Improved bytes field support, also fixes #606
- Fall back to browser Reader when passing an Uint8Array under node, fixes #605
- Respect optional properties when writing interfaces in tsd-jsdoc, fixes #598

New
- Instead of protobuf.parse.keepCase, fall back to protobuf.parse.defaults holding all possible defaults, see #608
- Added global ParseOptions#keepCase fallback as protobuf.parse.keepCase, see #608
- Converters use code generation and support custom implementations
- Be more verbose when throwing invalid wire type errors, see #602
- Added an asJSON-option to always populate array fields, even if defaults=false, see #597
- Attempt to improve TypeScript support by using explicit exports
- Copy-pasted typescript definitions to micro modules, see #599
- Emit an error on resolveAll() if any extension fields cannot be resolved, see #595 + test case

CLI
- Removed 'not recommended' label for --keep-case, see #608
- Added customizable linter configuration to pbjs
- Added stdin support to pbjs and pbts
- Static code no longer uses IE8 support utility
- Generated static code now supports asJSON/from
- Added support for TypeScript enums to pbts
- Added a few helpful comments to static code
- Slightly beautify statically generated code
- Do not wrap main definition as a module and export directly instead
- Generate prettier definitions with --no-comments
- Added variable arguments support to tsd-jsdoc
- Reference dependency imports as a module to prevent name collisions, see #596
- Removed now unnecessary comment lines in generated static code

Docs
- Added notes on CSP-restricted environments to README, see #593

Other
- Added test case for asJSON with arrays=true, see #597
- Added a tape adapter to assert message equality across browsers
- Refactored some internal utility away
- Reverted previous attempt on #597
- Minor tsd-jsdoc refactor
- Removed unused sandbox files
- Updated package.json of micro modules to reference types, see #599
- Reference dependencies as imports in generated typescript definitions, see #596
- Allow null values on optional long fields, see #590
- Various jsdoc improvements and a workaround for d.ts generation, see #592