Envoy Node
This is a boilerplate to help you adopt Envoy.
There are multiple ways to configure Envoy. One convenient way to manage different egress traffic is to route it by hostname (using virtual hosts). By doing so, you can use a single egress port for all of your egress dependencies:
```yaml
static_resources:
  listeners:
  - name: egress_listener
    address:
      socket_address:
        address: 0.0.0.0
        port_value: 12345
    filter_chains:
    - filters:
      - name: envoy.http_connection_manager
        typed_config:
          "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
          codec_type: AUTO
          use_remote_address: true
          stat_prefix: http.test.egress
          route_config:
            name: egress_route_config
            virtual_hosts:
            - name: foo_service
              domains:
              - foo.service:8888
              routes:
              - match:
                  prefix: /
                route:
                  cluster: remote_foo_server
            - name: bar_service
              domains:
              - bar.service:8888
              routes:
              - match:
                  prefix: /
                route:
                  cluster: remote_bar_server
          http_filters:
          - name: envoy.router
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
              dynamic_stats: true
```
But this brings a new problem: your code becomes verbose. You now have to take care of:
- routing traffic to `127.0.0.1:12345`, where the egress port is listening
- setting the host header for each request
- propagating the tracing information
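For comparison, a manual call without this library might look roughly like the sketch below. This is illustrative only: it assumes `node-fetch`, reuses the hostname and ports from the config above, and hand-picks a few B3/Zipkin-style tracing headers.

```js
// Rough sketch of a manual egress call WITHOUT envoy-node (illustrative only).
const fetch = require("node-fetch"); // hypothetical HTTP client for this sketch

async function callFooManually(incomingHeaders) {
  return fetch("http://127.0.0.1:12345/path/to/rpc", {
    method: "POST",
    headers: {
      // tell Envoy which virtual host we actually want
      host: "foo.service:8888",
      "content-type": "application/json",
      // propagate tracing headers by hand (B3/Zipkin style, for illustration)
      "x-request-id": incomingHeaders["x-request-id"],
      "x-b3-traceid": incomingHeaders["x-b3-traceid"],
      "x-b3-spanid": incomingHeaders["x-b3-spanid"],
      "x-b3-sampled": incomingHeaders["x-b3-sampled"],
    },
    body: JSON.stringify({ message: "ping" }),
  });
}
```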
This library helps you deal with all of these things elegantly.
First, let's tell the library where the egress port is bound. The recommended way is to set this information on the ingress headers via `request_headers_to_add`:
```yaml
request_headers_to_add:
- header:
    key: x-tubi-envoy-egress-port
    value: "12345"
- header:
    key: x-tubi-envoy-egress-addr
    value: 127.0.0.1
```
You can also set this via the constructor parameters of `EnvoyContext`.
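For example, if the ingress headers do not carry the egress information, a minimal sketch (reusing the `127.0.0.1:12345` egress listener from the config above; the helper function name is hypothetical, the three-argument constructor is shown again in the low level API section) could pass it explicitly:

```js
const { EnvoyContext } = require("envoy-node");

function buildContext(req) {
  // Sketch: build a context from the incoming request headers, passing the
  // egress port/address explicitly instead of reading them from the
  // x-tubi-envoy-* headers.
  return new EnvoyContext(
    req.headers, // incoming HTTP headers (or gRPC metadata)
    12345,       // envoyEgressPort
    "127.0.0.1"  // envoyEgressAddr
  );
}
```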
High level APIs
HTTP
For HTTP, you can create the client like this:
```js
const { EnvoyHttpClient, HttpRetryOn } = require("envoy-node");

async function awesomeAPI(req, res) {
  const client = new EnvoyHttpClient(req.headers);
  const url = `http://foo.service:10080/path/to/rpc`;
  const request = {
    message: "ping",
  };
  const optionalParams = {
    timeout: 1000,
    retryOn: [HttpRetryOn.RETRIABLE_4XX],
    maxRetries: 3,
    perTryTimeout: 300,
    headers: {
      "x-extra-header-you-want": "value",
    },
  };
  const serializedJsonResponse = await client.post(url, request, optionalParams);
  res.send({ serializedJsonResponse });
  res.end();
}
```
gRPC
For gRPC, you can create the client like this:
General RPC
```js
const grpc = require("grpc");
const { envoyProtoDecorator, GrpcRetryOn } = require("envoy-node");

const PROTO_PATH = __dirname + "/ping.proto";
const Ping = grpc.load(PROTO_PATH).test.Ping;
const PingClient = envoyProtoDecorator(Ping);

async function awesomeAPI(call, callback) {
  const client = new PingClient("bar.service:10081", call.metadata);
  const request = {
    message: "ping",
  };
  const optionalParams = {
    timeout: 1000,
    retryOn: [GrpcRetryOn.DEADLINE_EXCEEDED],
    maxRetries: 3,
    perTryTimeout: 300,
    headers: {
      "x-extra-header-you-want": "value",
    },
  };
  const response = await client.pathToRpc(request, optionalParams);
  callback(undefined, { remoteResponse: response });
}
```
Streaming API
The streaming calls are also decorated to send the Envoy context. You can also specify the optional params (as the last argument) for features like `timeout` / `retryOn` / `maxRetries` / `perTryTimeout` provided by Envoy.

NOTE:
- The streaming APIs are not implemented with an `async` signature.
- The optional params (`timeout` etc.) are not tested for streaming, and Envoy does not document how it handles them for streaming calls.
Client streaming
```js
const stream = innerClient.clientStream((err, response) => {
  if (err) {
    return;
  }
  console.log("server responses:", response);
});
stream.write({ message: "ping" });
stream.write({ message: "ping again" });
stream.end();
```
Server streaming
```js
const stream = innerClient.serverStream({ message: "ping" });
stream.on("error", error => {
});
stream.on("data", data => {
  console.log("server sent:", data);
});
stream.on("end", () => {
});
```
Bidirectional streaming
```js
const stream = innerClient.bidiStream();
stream.write({ message: "ping" });
stream.write({ message: "ping again" });
stream.on("error", error => {
});
stream.on("data", data => {
  console.log("server sent:", data);
});
stream.on("end", () => {
});
stream.end();
```
Low level APIs
If you want more control over your code, you can also use the low level APIs of this library:
```js
const {
  envoyFetch,
  EnvoyContext,
  EnvoyHttpRequestParams,
  EnvoyGrpcRequestParams,
  envoyRequestParamsRefiner
} = require("envoy-node");

const context = new EnvoyContext(
  headerOrMetadata,
  envoyEgressPort, // optional, if not set in the header / metadata
  envoyEgressAddr  // optional, if not set in the header / metadata
);

// HTTP: build the request params and fetch through the Envoy egress listener
const params = new EnvoyHttpRequestParams(context, optionalParams);
envoyFetch(params, url, init /* the init you would pass to fetch, optional */)
  .then(res => {
    console.log("envoy tells:", res.overloaded, res.upstreamServiceTime);
    return res.json();
  })
  .then(body => {
    // use the parsed JSON body
  });

// or refine the params of the `request` library you are already using
const yourOldRequestParams = {}; // your original params for `request`
request(envoyRequestParamsRefiner(yourOldRequestParams, context));

// gRPC: point the client at the egress listener and assemble the request metadata
const client = new Ping(
  `${context.envoyEgressAddr}:${context.envoyEgressPort}`,
  grpc.credentials.createInsecure()
);
const grpcParams = new EnvoyGrpcRequestParams(context, optionalParams);
client.pathToRpc(
  request,
  grpcParams.assembleRequestMeta(),
  {
    host: "bar.service:10081"
  },
  (error, response) => {
    // handle the response
  }
);
```
Check out the detailed documentation if needed.
Context store
Are you finding it too painful to propagate the context information through your function calls' parameters?
If you are using Node.js v8+, here is a solution for you:
```js
import { envoyContextStore } from "envoy-node";

// enable the store first
envoyContextStore.enable();

// when a request arrives, save its context (exactly once per request, see below)
envoyContextStore.set(new EnvoyContext(req.headers));

// later, anywhere in the same request's call chain, read it back
envoyContextStore.get();
```
IMPORTANT
- According to the implementation, it strictly requires that the `set` method is called exactly once per request, or you will get an incorrect context (see the sketch below). Please check the documentation for more details. (TBD: We are working on a blog post about the details.)
- According to the `async_hooks` implementation, `destroy` is not called if the code is using HTTP keep-alive. Please use `setEliminateInterval` to set a time for deleting old context data, or you may have a memory leak. The default (5 minutes) is used if you don't set it.
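As an illustration only, here is a minimal sketch of wiring the store into an Express-style server so that `set` runs exactly once per incoming request. Express, the route, and the port are hypothetical; `enable` / `set` / `get`, `EnvoyContext`, and the `envoyEgressAddr` / `envoyEgressPort` fields come from the snippets above.

```js
const express = require("express"); // hypothetical server framework for this sketch
const { envoyContextStore, EnvoyContext } = require("envoy-node");

envoyContextStore.enable(); // enable the store before handling any traffic

const app = express();

// a single middleware, so set() runs exactly once per incoming request
app.use((req, res, next) => {
  envoyContextStore.set(new EnvoyContext(req.headers));
  next();
});

// somewhere deep in your code: no context parameter needs to be passed around
function whereIsEnvoy() {
  const context = envoyContextStore.get();
  return `${context.envoyEgressAddr}:${context.envoyEgressPort}`;
}

app.get("/hello", (req, res) => {
  res.send(`egress is listening at ${whereIsEnvoy()}`);
});

app.listen(3000);
```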
For dev and test, or migrating to Envoy
When developing the application, you probably do not have Envoy running, and you may want to call the services directly:
Either:
```js
new EnvoyContext({
  meta: grpcMetadata_Or_HttpHeader,
  directMode: true,
  envoyManagedHosts: new Set(["some-hostname:8080"])
});
```
or:
```shell
$ export ENVOY_DIRECT_MODE=true # 1 works as well
```
Contributing
For developing or running the tests of this library, you will probably need to:

- have an envoy binary in your `PATH`, or:

```shell
$ npm run download-envoy
$ export PATH=./node_modules/.bin/:$PATH
```

- commit your code change via:

```shell
$ git add .
$ npm run commit
```

- note that for each commit, the CI will auto-release based on the commit messages; to keep the version aligned with Envoy, let's use fix instead of feature unless we want to upgrade the minor version.
License
MIT
Credits