
trell-next-boost
Add a cache layer for Next.js SSR pages. Use stale-while-revalidate to boost performance.
next-boost adds a cache layer to your SSR (Server-Side Rendering) applications. It was originally built for Next.js and should work with any Node.js http.Server-based application.
next-boost achieves great performance by rendering webpages in worker_threads while serving the cached pages on the main thread.
If you are familiar with Next.js, next-boost can be considered an implementation of Incremental Static Regeneration that works with getServerSideProps. It is not meant to be used with getStaticProps, for which Next.js handles the caching for you.
$ npm install next-boost --save
next-boost is a drop-in replacement for next start and uses worker_threads for SSR.

next-boost cli with Next.js

After installing the package, just change the start script from next start to next-boost. All of next start's command-line arguments, like -p for specifying the port, are compatible.
"scripts": {
...
"start": "next-boost", // previously `next start`
...
},
There's an example under examples/nodejs, which works with a plain http.Server. To use it with express.js and next.js, please check examples/with-express.
By using worker_threads, the CPU-heavy SSR rendering does not block the main process from serving the cache.
Here is a comparison using ApacheBench on a blog post fetched from a database. The HTML is prerendered and the db operation takes around 10~20ms. The page takes around 200ms for Next.js to render.
$ /usr/local/bin/ab -n 200 -c 8 http://127.0.0.1:3000/blog/posts/2020/3/postname
Not a scientific benchmark, but the improvements are visibly huge.
with next start (data fetched with getServerSideProps):
Document Length: 76424 bytes
Concurrency Level: 8
Time taken for tests: 41.855 seconds
Complete requests: 200
Failed requests: 0
Total transferred: 15325600 bytes
HTML transferred: 15284800 bytes
Requests per second: 4.78 [#/sec] (mean)
Time per request: 1674.185 [ms] (mean)
Time per request: 209.273 [ms] (mean, across all concurrent requests)
Transfer rate: 357.58 [Kbytes/sec] received
with the drop-in next-boost cli:
Document Length: 78557 bytes
Concurrency Level: 8
Time taken for tests: 0.149 seconds
Complete requests: 200
Failed requests: 0
Total transferred: 15747600 bytes
HTML transferred: 15711400 bytes
Requests per second: 1340.48 [#/sec] (mean)
Time per request: 5.968 [ms] (mean)
Time per request: 0.746 [ms] (mean, across all concurrent requests)
Transfer rate: 103073.16 [Kbytes/sec] received
It even outperforms Next.js's statically generated pages (getStaticProps), handling 2~2.5x the requests per second in my environment.
next-boost implements a server-side cache in the manner of stale-while-revalidate. When an expired (stale) page is accessed, the cache will be served and, at the same time, a background process will fetch the latest version (revalidate) of that page and save it to the cache.
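Conceptually, the flow looks roughly like the sketch below. This is an illustration only, not next-boost's actual code; renderPage and the cache/entry shapes are hypothetical.

// Simplified stale-while-revalidate flow (illustration only, not next-boost's source).
async function renderPage(req) {
  // Hypothetical stub: in next-boost the page is rendered on a worker thread.
  return '<html>…</html>'
}

async function handle(req, res, cache) {
  const key = req.url
  const entry = await cache.get(key) // e.g. { body, isStale }

  if (entry) {
    // Serve the cached copy immediately, even if it is stale.
    res.end(entry.body)
    if (entry.isStale) {
      // Revalidate in the background: re-render and refresh the cache.
      renderPage(req)
        .then((html) => cache.set(key, html))
        .catch(() => {}) // keep serving the stale copy if re-rendering fails
    }
    return
  }

  // Cache miss: render, store, then serve.
  const html = await renderPage(req)
  await cache.set(key, html)
  res.end(html)
}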
The following config will cache URIs matching ^/blog.*. Only pages matching rules will be handled by next-boost; there are no exclude rules.
module.exports = {
rules: [
{ regex: '^/blog.*', ttl: 300 },
],
}
(The config above only caches pages whose URLs start with /blog.)

There are 2 parameters to control the behavior of the cache:

ttl (time-to-live): after ttl seconds, the cache will be revalidated, and a cached page's ttl is updated whenever that page is revalidated.
tbd (time-before-deletion): when a page is not hit again within ttl + tbd seconds, it will be completely removed from the cache.
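Both parameters can also be set at the cache level of the config file. Here is a hypothetical .next-boost.js based on the HandlerConfig interface shown further below (treating the cache-level values as defaults is an assumption):

// Hypothetical .next-boost.js; values are arbitrary
module.exports = {
  cache: {
    ttl: 60,   // revalidate cached pages after 60 seconds (assumed to act as a default)
    tbd: 3600, // drop a page that is not hit again within ttl + tbd seconds
  },
  rules: [
    { regex: '^/blog.*', ttl: 300 }, // per-rule ttl, as in the example above
  ],
}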
By sending a GET request with the header x-cache-status:update to the URL, the cache will be revalidated. If the page doesn't exist anymore, the cache will be deleted.
curl -H x-cache-status:update https://the_server_name.com/path_a
If you want to delete multiple pages at once, you can run SQL on the cache directly:
sqlite3 /cache_path/cache.db "update cache set ttl=0 where key like '%/url/a%';"
This will force all URLs containing /url/a to be revalidated the next time they are accessed.
Deleting cache_path will remove all the caches.
By default, each page with a different URL will be cached separately. But in some cases you would like /path_a?utm_source=twitter to be served with the same content as /path_a. paramFilter is for filtering the query parameters.
// in .next-boost.js
{
...
paramFilter: (p) => p !== 'utm_source'
}
By default, the URL will be used as the key for cached pages. If you want to serve pages from different domains or by different user-agents, you can use this function to customize the cache key.

Notes: if the value returned is not a string, your server will crash.

// in .next-boost.js
{
...
cacheKey: (req) => (req.headers.host || '') + ':' + req.url
}
If available, .next-boost.js at the project root will be used. If you use next-boost programmatically, the filename can be changed in the options you pass to CachedHandler.

Tips: if you are using the next-boost cli with Next.js, you may want to use the config file.
And here's an example .next-boost.sample.js in the repo.
interface HandlerConfig {
filename?: string
quiet?: boolean
cache?: {
ttl?: number
tbd?: number
path?: string
}
rules?: Array<URLCacheRule>
paramFilter?: ParamFilter
cacheKey?: CacheKeyBuilder
}
interface URLCacheRule {
regex: string
ttl: number
}
type ParamFilter = (param: string) => boolean
type CacheKeyBuilder = (req: IncomingMessage) => string
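Putting it together, a config file touching most of these options could look like the sketch below. This is an illustration, not the repo's .next-boost.sample.js; the meaning of cache.path (the directory holding cache.db) is inferred from the cache-deletion notes above.

// Illustrative .next-boost.js sketch (not the repo's sample file)
module.exports = {
  quiet: false,
  cache: {
    ttl: 60,
    tbd: 3600,
    path: '/tmp/next-boost', // assumed: directory where the SQLite cache (cache.db) is stored
  },
  rules: [
    { regex: '^/blog.*', ttl: 300 },
  ],
  // Serve /path_a?utm_source=twitter from the same cache entry as /path_a
  paramFilter: (p) => p !== 'utm_source',
  // Include the host so pages from different domains are cached separately
  cacheKey: (req) => (req.headers.host || '') + ':' + req.url,
}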
Logging is enabled by default. If you use next-boost programmatically, you can disable logs by passing the quiet boolean flag as an option to CachedHandler.
...
const cached = await CachedHandler(args, { quiet: true });
...
There's also a --quiet flag if you are using the command line.
There are a few limitations to keep in mind:
- The cache of next-boost is limited: until a URL has been hit on every backend server, requests can still miss the cache. Use a reverse proxy with cache support (nginx, varnish, etc.) for that.
- GET and HEAD requests only.
- worker_threads is used, and it is a Node.js 12+ feature.

next-boost works as an in-place replacement for next start by using Next.js's custom server feature.
In the Next.js documentation for the custom server feature, you can see the following notice:
Before deciding to use a custom server please keep in mind that it should only be used when the integrated router of Next.js can't meet your app requirements. A custom server will remove important performance optimizations, like serverless functions and Automatic Static Optimization.
next-boost is meant to be used on cloud VPSs or in containers, so serverless functions are not an issue here. As for Automatic Static Optimization, because we are not doing any app.render calls here, it still works as well as ever.
Here's the article about when not to use SQLite. For next-boost's main purpose, super-fast SSR on low-cost VPSs, it is, as far as I know, the best choice.
MIT. Copyright 2020 Rakuraku Jyo.