google-protobuf
The google-protobuf npm package provides a library for serializing structured data. It is used to compile .proto files to construct and parse protocol buffers in JavaScript. Protocol buffers are Google's language-neutral, platform-neutral, extensible mechanism for serializing structured data.
Serialization and Deserialization
This feature allows you to serialize your structured data to a binary format and then deserialize it back to a JavaScript object. The code sample shows how to serialize and deserialize a message using the google-protobuf package.
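// './example_pb' is the module generated by the Protocol Compiler from example.proto (defined in the next section).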
const messages = require('./example_pb');
// Serialization
const message = new messages.ExampleMessage();
message.setSomeField('Hello, Protocol Buffers!');
const bytes = message.serializeBinary();
// Deserialization
const receivedMessage = messages.ExampleMessage.deserializeBinary(bytes);
console.log(receivedMessage.getSomeField());
Defining Message Structure
Before you can serialize and deserialize data, you need to define the structure of your messages using the Protocol Buffers language. This code sample is a .proto file that defines a simple message with one field.
// example.proto
syntax = "proto3";
package example;

message ExampleMessage {
  string some_field = 1;
}
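Generated modules like the example_pb required in the earlier snippet come from compiling this file with the Protocol Compiler. The exact invocation depends on your setup (covered in more detail below), but with CommonJS-style output it might look like:
$ protoc --js_out=import_style=commonjs,binary:. example.proto
This would produce example_pb.js in the current directory.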
Working with Enums
Protocol Buffers allow you to define enums within your .proto files. This code sample shows how to define an enum and use it within a message.
// example.proto
syntax = "proto3";
package example;

enum ExampleEnum {
  UNKNOWN = 0;
  STARTED = 1;
  COMPLETED = 2;
}

message ExampleMessage {
  ExampleEnum status = 1;
}
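A minimal sketch of how the generated accessors for this enum might be used, assuming example.proto has been compiled to example_pb.js with CommonJS imports (the ExampleEnum constants and the setStatus/getStatus accessors are produced by the code generator from the definitions above):
const messages = require('./example_pb');

const msg = new messages.ExampleMessage();
// Enum values are exposed as numeric constants on the generated module.
msg.setStatus(messages.ExampleEnum.STARTED);
console.log(msg.getStatus() === messages.ExampleEnum.STARTED); // true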
protobufjs is a pure JavaScript implementation of Protocol Buffers with a similar API to google-protobuf. It allows you to encode and decode message structures without relying on the official Google library. It can also work in the browser, which may not be fully supported by google-protobuf.
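As an illustration only (not taken from this package's documentation), a rough protobufjs equivalent of the serialization example above might look like the sketch below. It assumes example.proto is readable at runtime and relies on protobufjs's load/lookupType API; note that protobufjs camelCases field names by default, so some_field becomes someField.
const protobuf = require('protobufjs');

// Load the .proto definition at runtime instead of using protoc-generated code.
protobuf.load('example.proto').then((root) => {
  const ExampleMessage = root.lookupType('example.ExampleMessage');

  // Plain objects stand in for generated setter methods.
  const message = ExampleMessage.create({ someField: 'Hello, Protocol Buffers!' });
  const bytes = ExampleMessage.encode(message).finish();

  const decoded = ExampleMessage.decode(bytes);
  console.log(decoded.someField);
});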
FlatBuffers is an alternative serialization library developed by Google for performance-critical applications. It allows for direct access to serialized data without parsing/unpacking, which can be faster than Protocol Buffers for some use cases. However, it is not as widely adopted as Protocol Buffers.
Copyright 2008 Google Inc.
This directory contains the JavaScript Protocol Buffers runtime library.
The library is currently compatible with:
1. CommonJS-style imports (e.g. var protos = require('my-protos');)
2. Closure-style imports (e.g. goog.require('my.package.MyProto');)
Support for ES6-style imports is not implemented yet. Browsers can be supported by using Browserify, webpack, Closure Compiler, etc. to resolve imports at compile time.
To use Protocol Buffers with JavaScript, you need two main components:
1. The protobuf runtime library. You can install this with npm install google-protobuf, or use the files in this directory. If npm is not being used, as of 3.3.0, the files needed are located in the binary subdirectory: arith.js, constants.js, decoder.js, encoder.js, map.js, message.js, reader.js, utils.js, writer.js.
2. The Protocol Compiler, protoc. This translates .proto files into .js files. The compiler is not currently available via npm, but you can download a pre-built binary on GitHub (look for the protoc-*.zip files under Downloads).
First, obtain the Protocol Compiler. The easiest way is to download a pre-built binary from https://github.com/protocolbuffers/protobuf/releases.
If you want, you can compile protoc from source instead. To do this, follow the instructions in the top-level README.
Once you have protoc compiled, you can run the tests provided with this project to verify that your setup works. To do this, download the Protocol Buffers source code from the release page linked above, extract it, and navigate to the folder named js, which contains a package.json file and a series of test files. In this folder, you can run the commands below to run the tests automatically.
$ npm install
$ npm test
# If your protoc is located somewhere other than ../src/protoc, do this instead.
# But make sure your protoc is the same version as this (or compatible)!
$ PROTOC=/usr/local/bin/protoc npm test
This will run two separate copies of the tests: one that uses Closure Compiler style imports and one that uses CommonJS imports. You can see all the CommonJS files in commonjs_out/. If all of these tests pass, you know you have a working setup.
To use Protocol Buffers in your own project, you need to integrate the Protocol Compiler into your build system. The details are a little different depending on whether you are using Closure imports or CommonJS imports:
If you want to use Closure imports, your build should run a command like this:
$ protoc --js_out=library=myproto_libs,binary:. messages.proto base.proto
For Closure imports, protoc will generate a single output file (myproto_libs.js in this example). The generated file will goog.provide() all of the types defined in your .proto files. For example, for the unit tests the generated files contain many goog.provide statements like:
goog.provide('proto.google.protobuf.DescriptorProto');
goog.provide('proto.google.protobuf.DescriptorProto.ExtensionRange');
goog.provide('proto.google.protobuf.DescriptorProto.ReservedRange');
goog.provide('proto.google.protobuf.EnumDescriptorProto');
goog.provide('proto.google.protobuf.EnumOptions');
The generated code will also goog.require() many types in the core library, and they will require many types in the Google Closure library. So make sure that your goog.provide() / goog.require() setup can find all of your generated code, the core library .js files in this directory, and the Google Closure library itself.
Once you've done this, you should be able to import your types with statements like:
goog.require('proto.my.package.MyMessage');
var message = new proto.my.package.MyMessage();
If you are unfamiliar with Closure or its compiler, consider reviewing the Closure documentation.
If you want to use CommonJS imports, your build should run a command like this:
$ protoc --js_out=import_style=commonjs,binary:. messages.proto base.proto
For CommonJS imports, protoc will spit out one file per input file (so messages_pb.js and base_pb.js in this example). The generated code will depend on the core runtime, which should be in a file called google-protobuf.js. If you are installing from npm, this file should already be built and available. If you are running from GitHub, you need to build it first by running:
$ gulp dist
Once you've done this, you should be able to import your types with statements like:
var messages = require('./messages_pb');
var message = new messages.MyMessage();
The --js_out flag
The syntax of the --js_out flag is:
--js_out=[OPTIONS:]output_dir
Where OPTIONS are separated by commas. Options are either opt=val or just opt (for options that don't take a value). The available options are specified and documented in the GeneratorOptions struct in src/google/protobuf/compiler/js/js_generator.h.
Some examples:
--js_out=library=myprotos_lib.js,binary:. : this contains the options library=myprotos_lib.js and binary and outputs to the current directory. The import_style option is left to the default, which is closure.
--js_out=import_style=commonjs,binary:protos : this contains the options import_style=commonjs and binary and outputs to the directory protos.
import_style=commonjs_strict doesn't expose the output on the global scope.
The API is not well-documented yet. Here is a quick example to give you an idea of how the library generally works:
var message = new MyMessage();
message.setName("John Doe");
message.setAge(25);
message.setPhoneNumbers(["800-555-1212", "800-555-0000"]);
// Serializes to a Uint8Array.
var bytes = message.serializeBinary();
var message2 = MyMessage.deserializeBinary(bytes);
For more examples, see the tests. You can also look at the generated code to see what methods are defined for your generated messages.
FAQs
Protocol Buffers for JavaScript
The npm package google-protobuf receives a total of 824,592 weekly downloads. As such, google-protobuf is classified as popular.
We found that google-protobuf demonstrates a healthy version release cadence and project activity because the last version was released less than a year ago. It has 6 open source maintainers collaborating on the project.