# protoc-gen-grpc-gateway-es
Generate a TypeScript client for a gRPC API exposed via grpc-gateway. Powered by the protobuf-es framework and the bun toolchain.
## Philosophy
The plugin walks over the proto files and converts protobuf messages to TypeScript types and protobuf RPCs to `RPC` JavaScript classes. The RPCs are exported as `ServiceName_MethodName` and have the following signature:
```ts
class RPC<RequestMessage, ResponseMessage> {
  readonly method: `DELETE` | `GET` | `PATCH` | `POST` | `PUT`;
  readonly path: string;
  readonly bodyKey?: string;
  createRequest: (c: RequestConfig, m: RequestMessage) => Request;
  responseTypeId: (r: any) => ResponseMessage;
}
```
Unlike many other gRPC-to-TypeScript generators, this one tries to be as minimal as possible: we don't do any extra data de/serialization for you, but we use the TypeScript type system to provide more safety, and the `runtime.ts` file generated alongside your files contains a few conversion helpers.
- The well-known protobuf types `google.protobuf.Timestamp` and `google.protobuf.Duration` are typed as a `string`. You can pass the `Timestamp` string to the JS `Date` constructor. The `Duration` is a float with an `s` suffix for seconds; there is no appropriate JS type for that.
- The protobuf `bytes` type is converted to the `BytesString` type, which is a type alias for a `string` with a type-only symbol that prevents you from using it as a regular `string` in TypeScript. There are helper functions in the `runtime.ts` module for conversion from/to `Uint8Array`.
- The protobuf `int64` type is converted to the `BigIntString` type, which is a type alias for a `string` with a type-only symbol that prevents you from using it as a regular `string` in TypeScript. There are helper functions in the `runtime.ts` module for conversion from/to `BigInt`.
## Usage
The usage has two phases: first you generate the {Java,Type}Script files, and then you use them in your app.
### Generate {Java,Type}Script files
The package outputs an executable, which contains a fixed version of the `bun` runtime and all the needed source files. The executable is currently a little overweight (~60 MB), but this is a known issue of `bun`; hopefully they will fix it soon 🤞.
The executable is a standard `protoc` plugin, so you can use it with any `protoc`-based tool, but `buf generate` is highly recommended since it takes the burden of resolving proto dependencies off your shoulders. If you need an intro to `buf generate`, there is a tutorial on the buf website.
To run this plugin, copy the executable to your codebase and configure buf to use it in your `buf.gen.yaml` file, for example:
```yaml
version: v1
managed:
  enabled: true
  go_package_prefix:
    default: example/package/prefix
    except:
      - buf.build/googleapis/googleapis
      - buf.build/grpc-ecosystem/grpc-gateway
plugins:
  - name: es
    opt: target=ts
    out: gen/es
    path: ./path/to/protoc-gen-grpc-gateway-es
```
You need to change:

- `managed.go_package_prefix.default` to your package prefix,
- `plugins[0].out` to the output directory of your choice,
- `plugins[0].path` to the path of the `protoc-gen-grpc-gateway-es` executable.
Then run `buf generate` (assuming you have properly installed and configured `buf`) and it will generate the TypeScript files for you.
If you want to generate JavaScript instead, just change the plugin option to `opt: target=js`.
The list of all plugin options is here.
### Note on formatting
To simplify the development, this plugin is not concerned with pretty-printed output. The generated files are readable, but if your eyes are bleeding, run your favorite formatter after generating the files, e.g.

```sh
npx prettier --write gen/es/
```
### Usage in app code
The generated files rely on some browser APIs, i.e. it is anticipated that you will use them in the browser only. Usage in Node.js is not tested, but in theory it should work.
The generated files contain all the protobuf messages as TypeScript types and all the protobuf enums as TypeScript enums, and the protobuf methods are converted to `RPC` JavaScript classes. There is also a top-level `runtime.ts` file which contains the constructor of the `RPC` class and a few helper TypeScript types and functions.
The typical usage with `fetch` might look like this:
```ts
import { SomeService_SomeMethod, type SomeMethodRequest } from "./gen/es/example/package/prefix/some_service_pb.ts";
import { type RequestConfig } from "./gen/es/runtime.ts";

const getBearerToken = () => {
  return `XYZ`;
};

const requestConfig: RequestConfig = {
  basePath: "https://example.test/api/v1",
  bearerToken: getBearerToken,
};

const someMethodRequest: SomeMethodRequest = {
  flip: "flop",
};

const signal = AbortSignal.timeout(5_000);
const request = SomeService_SomeMethod.createRequest(requestConfig, someMethodRequest);
const serviceMethodCall = fetch(request, { signal }).then((response) => {
  if (response.ok) {
    return response.json().then(SomeService_SomeMethod.responseTypeId);
  }
  return Promise.reject(response);
});
```
You will likely create a wrapper function around the `RPC` class, since the logic will probably be the same for all RPCs. We don't provide this wrapper since it is app specific, but it might look something like this:
```ts
import { SomeService_SomeMethod } from "./gen/es/example/package/prefix/some_service_pb.ts";
import { type RPC, type RequestConfig } from "./gen/es/runtime.ts";

const requestConfig: RequestConfig = {
  basePath: "https://example.test/api/v1",
  bearerToken: getBearerToken, // same token getter as in the previous example
};

const fetchWrapRPC = <RequestMessage, ResponseMessage>(
  rpc: RPC<RequestMessage, ResponseMessage>
) => {
  return (
    variables: RequestMessage,
    { signal }: { signal?: AbortSignal } = {}
  ) => {
    return fetch(rpc.createRequest(requestConfig, variables), { signal }).then(
      (response) => {
        if (response.ok) {
          return response.json() as Promise<ResponseMessage>;
        }
        return Promise.reject(response);
      }
    );
  };
};

const someServiceSomeMethodAsyncFunction = fetchWrapRPC(SomeService_SomeMethod);

const abortController = new AbortController();
const responseJSON = await someServiceSomeMethodAsyncFunction(
  { flip: "flop" },
  { signal: abortController.signal }
);
```
### Usage caveats
- The generated TypeScript files import other files with the `.js` extension. This can be changed in the plugin configuration; I recommend setting `import_extension=none`, which produces extension-less imports, otherwise your bundler or test framework might fail to resolve the imports.
- The protobuf `oneof` fields are generated into TypeScript as a union, i.e. the message

  ```proto
  // flip.proto
  message Flip {
    string flap = 1;
    oneof toss {
      bool heads = 2;
      bool tails = 3;
    }
  }
  ```

  is generated as

  ```ts
  export type Flip = { flap?: string } & (
    | { heads?: boolean }
    | { tails?: boolean }
  );
  ```

  This captures the mutual exclusivity but is a little cumbersome to work with in TypeScript, because if you attempt to access `flip.heads` the compiler complains that `heads` might not be defined. This forces you to use the JavaScript `in` operator, which acts as a type guard. It is a little inconvenient to use it each time you want to access the `oneof` field, but it is the proper way to tackle this problem.

  ```ts
  let test = flip.heads; // compiler error, `heads` is not guaranteed to exist on `flip`
  if ("heads" in flip) {
    test = flip.heads; // OK, the `in` check narrowed the type
    test = flip.tails; // compiler error, `tails` cannot be present when `heads` is
  }
  ```
- We decided to generate the JavaScript `Request` object for you, which is neatly compatible with the fetch API. Unfortunately, the `Request` object has a few quirks inherited from the streaming nature of fetch. If you ever need to read the `body` of the `Request`, you'll find it is a `ReadableStream` object, and as such it is not easy to obtain its value. The most straightforward way to consume the stream is to pass it to a `Response` object and then read it asynchronously:

  ```ts
  const request = SomeService_SomeMethod.createRequest(config, variables);
  const requestBodyAsText = await new Response(request.body).text();
  ```

  You can also parse the body with the `Response#json` method you know from fetch API responses. After this call the readable stream is consumed and no longer available, so once you have read the `body` of the `Request` you can no longer use that `Request` object for the fetch API call 😒 You need to create a new `Request` object; the constructor allows you to clone the original request, and you can pass in the old body you have read as the new body.

  ```ts
  const requestClone = new Request(request, { body: requestBodyAsText });
  ```

  The fetch API has an identical signature, so you can pass the same parameters to `fetch`:

  ```ts
  const response = await fetch(request, { body: requestBodyAsText });
  ```

  But you will most likely read the `Request` in order to use a network library other than `fetch`. In any case, the `runtime.ts` library exports all of its internals, so you can use it as plumbing in case you don't like the `RPC#createRequest` method 😉. A sketch of this hand-off follows below.
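As an illustration of the last caveat, here is a rough sketch of handing the generated `Request` over to a non-`fetch` HTTP client. It only uses standard `Request` accessors plus the generated `createRequest`; the service, payload, and config values are the same hypothetical examples used earlier in this README.

```ts
import { SomeService_SomeMethod } from "./gen/es/example/package/prefix/some_service_pb.ts";
import { type RequestConfig } from "./gen/es/runtime.ts";

const config: RequestConfig = {
  basePath: "https://example.test/api/v1",
  bearerToken: () => "XYZ",
};

// Build the Request once, then unwrap it into plain values.
const request = SomeService_SomeMethod.createRequest(config, { flip: "flop" });

const url = request.url;
const method = request.method;
const headers = Object.fromEntries(request.headers.entries());
// Reading the body consumes the stream, which is fine here because this
// Request will not be passed to fetch (see the caveat above).
const body = request.body ? await new Response(request.body).text() : undefined;

// Hand url, method, headers, and body to the HTTP client of your choice.
```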
## Development
First, read the Protobuf-ES: Writing Plugins guide and familiarize yourself with the Bun toolkit.

Use test-driven development: first write a failing test for the feature you want to implement, and then change the code.
### Folders and their meaning
- `/options/` - when you want to read options (a.k.a. extensions) from the proto files, and the options are non-scalar, such as

  ```proto
  option (google.api.http) = {get: "/v1/{name_test=projects/*/documents/*}:customMethod"};
  ```

  where the value of `google.api.http` is an object, the protobuf-es framework requires you to have the object types prepared as `Message` JavaScript classes from `@bufbuild/protobuf`. Here we are using `buf` to convert all options commonly used in gRPC-gateway into the required message classes. There is a script, `bun run generateOptions`, which outputs the JavaScript classes into the `/options/` folder, from which we import the classes during generation.
- `/src/` - the main source code of this plugin:

  - `index.ts` - instantiates the plugin with the protobuf-es framework,
  - `generateTs.ts` - the main logic of the plugin,
  - `helpers.ts` - various helpers for the plugin,
  - `runtime.ts` - this file is not used during generation, but contains common code for the runtime; the file is copied by the plugin into the output folder and the other generated files import common logic from there.
- `/tests/` - the test files, written using the `bun` test API. You can run the tests from the CLI with `bun test` in the root of this repo, or use the bun plugin for your IDE to run and debug the tests selectively (highly recommended). There is an e2e setup which abstracts the `protoc` generation: the test framework accepts a string representing a proto file, uses `buf` "black magic" to resolve all proto dependencies, and prepares the `CodeGeneratorRequest`, which is then passed to our logic so you can assert the generated TypeScript code.

  The generated code and your asserts are compiled via `tsc`, so formatting nuances and comments are stripped away and the syntax of both values is verified before the test. Each test case is self-contained: you must pass in valid proto file content and you obtain all generated TypeScript files.
- `/tools/go` - the shell script for invoking Go libraries with pinned versions.
### Development caveats
- `bun` is also used as a package manager: use `bun add` for adding dependencies, `bun run` for running NPM scripts, and `bunx` instead of `npx`.
- Beware that `bun` and `buf` are two different things and are easy to confuse; it is easy to make a mistake like running a `bun` command with `buf` and vice versa.
- Because `buf` can only read proto files from the file-system, each test writes a temporary file into `/tests/proto/`. The name of the file passed to the `getCodeGeneratorRequest` function in the test is the actual name of the file created in the `/tests/proto/` directory, therefore each test must use a unique file name. I usually name the file loosely after the test case.