# fast-gateway

A super fast, framework agnostic Node.js API Gateway for the masses ❤️

Since v2.3.0, AWS Lambda proxying integration is supported via http-lambda-proxy 🔥

Since v3.1.0, WebSockets proxying is supported via faye-websocket 🔥

Also available for Docker: https://hub.docker.com/repository/docker/kyberneees/rproxy
## Install

```bash
npm i fast-gateway
```
## Usage

The following examples demonstrate proxying to HTTP and Lambda downstream services. For readability the examples are shown separately; however, a single gateway configuration supports any combination of route configurations.
### HTTP Proxying

#### Gateway

```js
const gateway = require('fast-gateway')
const server = gateway({
  routes: [{
    prefix: '/service',
    target: 'http://127.0.0.1:3000'
  }]
})

server.start(8080)
```
#### Remote Service

```js
const service = require('restana')()
service.get('/get', (req, res) => res.send('Hello World!'))

service.start(3000)
```
### Lambda Proxying

#### Gateway

```bash
npm i http-lambda-proxy
```

```js
const gateway = require('fast-gateway')
const server = gateway({
  routes: [{
    prefix: '/service',
    target: 'my-lambda-serverless-api',
    proxyType: 'lambda',
    proxyConfig: {
      region: 'eu-central-1'
    }
  }]
})

server.start(8080)
```
You might also want to read: Setting AWS Credentials in Node.js
#### Lambda Implementation

```js
const serverless = require('serverless-http')
const json = require('serverless-json-parser')
const query = require('connect-query')

const service = require('restana')()
service.use(query())
service.use(json())

service.get('/get', (req, res) => {
  res.send({ msg: 'Go Serverless!' })
})
service.post('/post', (req, res) => {
  res.send(req.body)
})

module.exports.handler = serverless(service)
```
## Configuration options explained

```js
{
  // Optional server instance; when omitted, a restana server is created
  server,
  // Optional restana configuration (https://www.npmjs.com/package/restana)
  restana: {},
  // Global middlewares, executed for all routes
  middlewares: [],
  // Global path matching pattern
  pathRegex: '/*',
  // Global request timeout in ms, 0 disables the timeout
  timeout: 0,
  // When defined, overrides the target of all routes
  targetOverride: 'https://yourdev.api-gateway.com',
  // Optional custom proxy factory
  proxyFactory: ({ proxyType, opts, route }) => { ... },

  routes: [{
    // Proxy type: 'http' (default), 'http-legacy', 'lambda' or 'websocket'
    proxyType: 'http',
    // Proxy-specific configuration
    proxyConfig: {},
    // Optional custom proxy handler
    proxyHandler: () => {},
    // Route-level path matching pattern
    pathRegex: '/*',
    // Route-level request timeout in ms
    timeout: 0,
    // Route prefix
    prefix: '/public',
    // Optional documentation metadata, exposed via GET /services.json
    docs: {
      name: 'Public Service',
      endpoint: '/api-docs',
      type: 'swagger'
    },
    // Rewrites the route prefix before proxying
    prefixRewrite: '',
    // Optional function to rewrite the request URL
    urlRewrite: (req) => req.url,
    // Downstream service target
    target: 'http://localhost:3000',
    // Supported HTTP methods
    methods: ['GET', 'POST', ...],
    // Route-level middlewares
    middlewares: [],
    // Proxy lifecycle hooks
    hooks: {
      async onRequest (req, res) {
        // we can optionally reply from here if required
      },
      onResponse (req, res, stream) {
        // do some post-processing here if required
      }
    }
  }]
}
```
### onResponse hooks default implementation

For developers' reference, the default hooks implementation is located in the lib/default-hooks.js file.
### The "GET /services.json" endpoint

Since version 1.3.5, the gateway exposes minimal documentation about the registered services at: GET /services.json

Example output:
```json
[
  {
    "prefix": "/public",
    "docs": {
      "name": "Public Service",
      "endpoint": "/swagger.json",
      "type": "swagger"
    }
  },
  {
    "prefix": "/admin"
  }
]
```
NOTE: Please see the docs configuration entry explained above.
## WebSockets support

WebSockets proxying is supported since v3.1.0.

Configuration example:

```js
gateway({
  routes: [{
    // ... other HTTP or Lambda routes
  }, {
    proxyType: 'websocket',
    prefix: '/echo',
    target: 'ws://ws.ifelse.io'
  }]
}).start(PORT)
```
## Timeouts and Unavailability

We can restrict request timeouts globally or at service level using the timeout configuration property.

You can also define endpoint-specific timeouts using the timeout property of the request object, normally inside a middleware:

```js
req.timeout = 500 // timeout in ms
```
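Putting both together, a sketch of a route with a service-level timeout and an endpoint-specific override set by a middleware (the route, target, and values below are illustrative, not from the project):

```js
routes: [{
  prefix: '/reports',              // illustrative route
  target: 'http://localhost:3000', // illustrative target
  timeout: 5000,                   // service-level timeout (ms)
  middlewares: [(req, res, next) => {
    // tighter timeout for a known-fast endpoint (illustrative path)
    if (req.url.startsWith('/ping')) req.timeout = 500
    return next()
  }]
}]
```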
NOTE: You might want to also check https://www.npmjs.com/package/middleware-if-unless
## Circuit Breakers

By using the proxyHandler hook, developers can optionally intercept and modify the default gateway routing behavior right before the origin request is proxied to the remote service. This makes it rather simple to connect advanced monitoring mechanisms such as circuit breakers.

Please see the demos/circuitbreaker.js example for more details on using the opossum library.
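To illustrate the idea, here is a minimal hand-rolled circuit breaker (not opossum) and, in the trailing comments, a hypothetical way it could be wired into a proxyHandler; all names and the wiring are assumptions for illustration only:

```js
// Minimal illustrative circuit breaker: opens after `threshold`
// consecutive failures and rejects further calls while open.
class MiniBreaker {
  constructor (action, { threshold = 3 } = {}) {
    this.action = action
    this.threshold = threshold
    this.failures = 0
  }

  get open () {
    return this.failures >= this.threshold
  }

  fire (...args) {
    if (this.open) throw new Error('circuit open')
    try {
      const result = this.action(...args)
      this.failures = 0 // reset on success
      return result
    } catch (err) {
      this.failures++
      throw err
    }
  }
}

// Hypothetical wiring (sketch only, see demos/circuitbreaker.js for the
// real opossum-based example):
// const breaker = new MiniBreaker(doProxyRequest)
// routes: [{ ..., proxyHandler: (...args) => breaker.fire(...args) }]
```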
## Rate Limiting

Rate limiting, as well as many other gateway-level features, can easily be implemented using fast-gateway:

```js
const rateLimit = require('express-rate-limit')
const requestIp = require('request-ip')

gateway({
  middlewares: [
    // acquire the request IP
    (req, res, next) => {
      req.ip = requestIp.getClientIp(req)
      return next()
    },
    // apply rate limiting
    rateLimit({
      windowMs: 1 * 60 * 1000, // 1 minute
      max: 60, // limit each IP to 60 requests per window
      handler: (req, res) => res.send('Too many requests, please try again later.', 429)
    })
  ],
  routes: [{
    prefix: '/public',
    target: 'http://localhost:3000'
  }, {
    // ... other routes
  }]
})
```
In this example we have used the express-rate-limit module.
## Hostnames support

We can also implement hostname-based routing with fast-gateway; essentially, hostnames are translated to prefixes:

```js
// ...
const hostnames2prefix = [{
  prefix: '/api',
  hostname: 'api.company.tld'
}]
const hostnamesHook = require('fast-gateway/lib/hostnames-hook')(hostnames2prefix)

const app = restana()
const server = http.createServer((req, res) => {
  hostnamesHook(req, res, () => {
    return app(req, res)
  })
})

gateway({
  server: app,
  routes: [{
    prefix: '/api',
    target: 'http://localhost:3000'
  }]
})
// ...
```
Afterwards:

```bash
curl --header "Host: api.company.tld:8080" http://127.0.0.1:8080/api-service-endpoint
```
Using micromatch patterns as hostname value:

```js
const hostnames2prefix = [{
  prefix: '/admin',
  hostname: '*.admin.company.tld'
}, {
  prefix: '/services',
  hostname: [
    'services.company.tld',
    '*.services.company.tld'
  ]
}]
```
For more details, please check out the basic-hostnames.js demo.
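As a rough illustration of how such patterns resolve to prefixes: the real implementation in lib/hostnames-hook.js relies on micromatch, while this simplified sketch treats `*` as matching a single hostname label.

```js
// Simplified hostname → prefix resolution (illustration only, not the
// actual fast-gateway implementation)
function matchPattern (pattern, hostname) {
  // translate a '*' wildcard into a RegExp matching one hostname label
  const re = new RegExp('^' + pattern
    .split('*')
    .map(part => part.replace(/[.+?^${}()|[\]\\]/g, '\\$&'))
    .join('[^.]+') + '$')
  return re.test(hostname)
}

function resolvePrefix (hostnames2prefix, hostname) {
  for (const { prefix, hostname: patterns } of hostnames2prefix) {
    for (const pattern of [].concat(patterns)) {
      if (matchPattern(pattern, hostname)) return prefix
    }
  }
  return null // no route matched
}
```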
## Gateway level caching

Caching support is provided by the http-cache-middleware module. Read more about it: https://github.com/jkyberneees/http-cache-middleware

### Why?

Because caching is the last mile for low latency distributed systems!

Enabling proper caching strategies at gateway level will drastically reduce the latency of your system, as it reduces network round-trips and remote services processing. We are talking here about improvements in response times from X ms to ~2ms, as an example.
### Setting up gateway level cache available for all services

#### Single node cache (memory):

```js
const cache = require('http-cache-middleware')()
const gateway = require('fast-gateway')
const server = gateway({
  middlewares: [cache],
  routes: [...]
})
```

Memory storage is recommended if there is only one gateway instance and you can tolerate losing cached data.
#### Multi nodes cache (redis):

```js
const CacheManager = require('cache-manager')
const redisStore = require('cache-manager-ioredis')
const redisCache = CacheManager.caching({
  store: redisStore,
  db: 0,
  host: 'localhost',
  port: 6379,
  ttl: 30
})

const cache = require('http-cache-middleware')({
  stores: [redisCache]
})
const gateway = require('fast-gateway')
const server = gateway({
  middlewares: [cache],
  routes: [...]
})
```

Redis storage is required when running more than one gateway instance.
### How to cache remote services endpoints response?

Please see: https://github.com/jkyberneees/http-cache-middleware#enabling-cache-for-service-endpoints

### How to invalidate caches?

Please see: https://github.com/jkyberneees/http-cache-middleware#invalidating-caches
### Custom cache keys

Cache keys are generated using: req.method + req.url. However, for indexing/segmenting requirements it makes sense to allow cache key extensions. Unfortunately, this feature can't be implemented at remote service level, because the gateway needs to know the entire lookup key when a request reaches the gateway.

To do this, we recommend using middlewares in the service configuration:
```js
routes: [{
  prefix: '/users',
  target: 'http://localhost:3000',
  middlewares: [(req, res, next) => {
    req.cacheAppendKey = (req) => req.user.id
    return next()
  }]
}]
```
In this example we also distinguish cache entries by user.id, a very common use case!
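Conceptually, the resulting lookup key could be composed as in this sketch (illustrative only; the actual key construction lives inside http-cache-middleware):

```js
// Illustrative composition of the cache lookup key:
// method + url, optionally extended by cacheAppendKey
function buildCacheKey (req) {
  const base = req.method + req.url
  return typeof req.cacheAppendKey === 'function'
    ? base + req.cacheAppendKey(req)
    : base
}
```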
### Disable cache for custom endpoints

You can also disable cache checks for certain requests programmatically:

```js
routes: [{
  prefix: '/users',
  target: 'http://localhost:3000',
  middlewares: [(req, res, next) => {
    req.cacheDisabled = true
    return next()
  }]
}]
```
## Benchmarks

Benchmark scripts can be found in the benchmark folder.

Laptop: MacBook Pro 2016, 2.7 GHz Intel Core i7, 16 GB 2133 MHz LPDDR3

```bash
wrk -t8 -c50 -d20s http://127.0.0.1:8080/service/get
```

- fast-gateway: 18069.77 reqs/sec
- k-fastify-gateway: 9763.61 reqs/sec

(INACTIVE) Kindly sponsored by ShareNow, a company that promotes innovation!
## Support / Donate 💚

You can support the maintenance of this project:
## Breaking Changes

### v3.x

- The fast-proxy-lite module is used by default to support the http proxy type 🔥. This means no undici or http2 support by default.
- The old fast-proxy module is available under the http-legacy proxy type, but the module is not installed by default.
- Proxy configuration is now generalized under the proxyConfig property.