fast-gateway - npm Package Compare versions

Comparing version 1.0.0 to 1.1.0

.travis.yml (3)

package.json
  {
    "name": "fast-gateway",
-   "version": "1.0.0",
+   "version": "1.1.0",
    "description": "A Node.js API Gateway for the masses!",

@@ -30,2 +30,3 @@ "main": "index.js",

    "fast-proxy": "^1.1.0",
+   "http-cache-middleware": "^1.0.0",
    "restana": "^3.0.2"

@@ -32,0 +33,0 @@ },

README.md

@@ -5,2 +5,7 @@ # fast-gateway

## Install
```bash
npm i fast-gateway
```
## Usage

@@ -79,2 +84,122 @@ ### Gateway

```
## Gateway level caching
### Why?
> Because `caching` is the last mile for low latency distributed systems!
Enabling proper caching strategies at the gateway level will drastically reduce the latency of your system,
as it cuts network round-trips and remote service processing.
As an example, we are talking about response times improving from `X ms` to `~2ms`.
### Setting up a gateway level cache available to all services
#### Single node cache (memory):
```js
// cache middleware
const cache = require('http-cache-middleware')()
// enable http cache middleware
const gateway = require('fast-gateway')
const server = gateway({
middlewares: [cache],
routes: [...]
})
```
> Memory storage is recommended if there is only one gateway instance and you are not afraid of losing cache data.
#### Multi-node cache (Redis):
```js
// redis setup
const CacheManager = require('cache-manager')
const redisStore = require('cache-manager-ioredis')
const redisCache = CacheManager.caching({
store: redisStore,
db: 0,
host: 'localhost',
port: 6379,
ttl: 30
})
// cache middleware
const cache = require('http-cache-middleware')({
stores: [redisCache]
})
// enable http cache middleware
const gateway = require('fast-gateway')
const server = gateway({
middlewares: [cache],
routes: [...]
})
```
> Required if there is more than one gateway instance
### Enabling cache for service endpoints
Although the gateway level cache acts as a centralized cache for all services behind it, it is the services
themselves that indicate which responses should be cached and for how long.
Cache entries will be created for every remote response that arrives with the `x-cache-timeout` header:
```js
res.setHeader('x-cache-timeout', '1 hour')
```
> Here we use the [`ms`](https://www.npmjs.com/package/ms) package to convert the timeout value into seconds. Please note that the `millisecond` unit is not supported!
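For illustration, here is how a few accepted header values are parsed by `ms` (a minimal sketch; the conversion into the cache TTL happens inside the caching middleware):
```js
const ms = require('ms')

// values are parsed to milliseconds, then translated into a TTL in seconds
ms('30 seconds') // 30000
ms('1 hour')     // 3600000
ms('2 days')     // 172800000
```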
Example of a remote service using `restana`:
```js
service.get('/numbers', (req, res) => {
res.setHeader('x-cache-timeout', '1 hour')
res.send([
1, 2, 3
])
})
```
### Invalidating cache
> Let's face it, gateway level cache invalidation was complex... until now!
Remote services can also expire cache entries on demand, i.e. when the data state changes. Here we use the `x-cache-expire` header to tell the gateway which cache entries to expire, using a matching pattern:
```js
res.setHeader('x-cache-expire', '*/numbers')
```
> Here we use the [`matcher`](https://www.npmjs.com/package/matcher) package to evaluate matching patterns.
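As a rough sketch of how such a pattern relates to the generated cache keys (`req.method + req.url`, see below), using the `matcher` package directly (depending on the installed version, `isMatch` may be a named export instead):
```js
const matcher = require('matcher')

// cache keys look like 'GET/numbers', so the '*/numbers' pattern matches them
matcher.isMatch('GET/numbers', '*/numbers') // true
matcher.isMatch('GET/letters', '*/numbers') // false
```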
Example of a remote service using `restana`:
```js
service.patch('/numbers', (req, res) => {
res.setHeader('x-cache-expire', '*/numbers')
// ...
res.send(200)
})
```
### Custom cache keys
Cache keys are generated as `req.method + req.url`; however, for indexing/segmenting requirements it makes sense to allow extending the cache keys.
Unfortunately, this feature can't be implemented at the remote service level, because the gateway needs to know the entire lookup key at the moment a request reaches it.
To do this, we simply recommend using middlewares on the service configuration:
```js
routes: [{
prefix: '/users',
target: 'http://localhost:3000',
middlewares: [(req, res, next) => {
req.cacheAppendKey = (req) => req.user.id // here cache key will be: req.method + req.url + req.user.id
return next()
}]
}]
```
> In this example we also distinguish cache entries by `user.id`, a very common use case!
### Disable cache for custom endpoints
You can also disable cache checks for certain requests programmatically:
```js
routes: [{
prefix: '/users',
target: 'http://localhost:3000',
middlewares: [(req, res, next) => {
req.cacheDisabled = true
return next()
}]
}]
```
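Putting the pieces together, a minimal end-to-end sketch: a `restana` service marks its responses as cacheable and the gateway serves repeated hits from its cache (the ports, the `/api` prefix and the sample endpoint are illustrative only):
```js
const gateway = require('fast-gateway')
const cache = require('http-cache-middleware')()
const restana = require('restana')

// remote service exposing a cacheable endpoint
const service = restana()
service.get('/numbers', (req, res) => {
  res.setHeader('x-cache-timeout', '1 minute')
  res.send([1, 2, 3])
})
service.start(3000)

// gateway with caching enabled, proxying /api/* to the service
gateway({
  middlewares: [cache],
  routes: [{
    prefix: '/api',
    target: 'http://localhost:3000'
  }]
}).start(8080)

// GET http://localhost:8080/api/numbers is proxied to the service;
// repeated requests within 1 minute are answered from the gateway cache
```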
## Want to contribute?

@@ -81,0 +206,0 @@ This is your repo ;)

@@ -5,3 +5,6 @@ const pump = require('pump')

    return {
-     middlewares: [require('cors')()],
+     middlewares: [
+       require('cors')(),
+       require('http-cache-middleware')()
+     ],

@@ -8,0 +11,0 @@ routes: [{
