crawler-prerender
This module makes it easier to manage SEO for SPAs (single-page applications). It solves two problems associated with SPA SEO management: serving content to search engine crawlers that do not execute JavaScript, and setting <meta> tags and the page title in advance for different routes.

In general, the package crawls your web pages, runs the JavaScript that generates their content, and saves the resulting HTML. When a search engine crawls your website, it is served the prerendered HTML; normal clients still receive the normal SPA content.
The npm package has two components:

prerender: this function generates HTML from your page's JavaScript and saves it to the file system. All you need to do is pass the path to the resource, and it will generate the HTML.

middleware: this Express middleware detects traffic from search engine crawlers and serves them the prerendered HTML rather than the SPA's JavaScript page. If the path is not yet prerendered, it returns an HTTP 503 status code and then prerenders the path.
NB: The middleware only works with Express.js
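Crawler detection is typically done by inspecting the User-Agent header. The package's actual detection logic is internal; the sketch below only illustrates the idea, using a hypothetical `isCrawler` helper and a deliberately incomplete bot list.

```javascript
// Hypothetical sketch of User-Agent-based crawler detection; the package's
// real logic may differ, and this bot list is deliberately incomplete.
const BOT_PATTERN = /googlebot|bingbot|yandex(bot)?|duckduckbot|baiduspider/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // false
```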
The front-end script consists of two functions:

initMetaData: sets the page title, meta keywords, and meta description.

sendRenderingCompleteEvent: the prerenderer waits for the JavaScript to finish rendering the page contents; this function notifies it when that happens.
$ npm install crawler-prerender
(async function () {
  const express = require('express');
  const crawlerPrerender = require('crawler-prerender');

  // getting the middleware
  const crawlerPrerenderOptions = { siteUrl: 'http://example.com:8080' };
  const { middleware } = await crawlerPrerender(crawlerPrerenderOptions);

  const app = express();

  // define your API routes and middlewares here

  // mount the static middleware before the crawler-prerender middleware;
  // set index: false to avoid issues prerendering the homepage
  app.use(express.static('/path/to/static/root/directory', { index: false }));

  app.get('*', middleware);

  // serving your SPA to normal clients
  app.get('*', function (req, res) {
    res.sendFile('/path/to/static/root/directory/index.html');
  });

  app.listen(8080);
})();
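The mounted middleware behaves roughly as described earlier: serve saved HTML when it exists, otherwise answer 503 and queue the path for prerendering. Below is a self-contained sketch of that flow; a `Map` stands in for prerendered files on disk, and all names here are hypothetical illustrations, not the package's internals.

```javascript
// Hypothetical sketch of the serve-or-503 flow described above; a Map stands
// in for prerendered HTML files on disk.
const prerenderedStore = new Map([['/', '<html>home</html>']]);
const prerenderQueue = [];

function crawlerMiddleware(req, res) {
  const html = prerenderedStore.get(req.path);
  if (html) {
    res.status(200).send(html); // already prerendered: serve it
  } else {
    res.status(503).send('Service Unavailable'); // not yet prerendered
    prerenderQueue.push(req.path); // prerender it for the next visit
  }
}
```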
By default, prerender will overwrite any previously prerendered contents of the path:
(async function () {
  const crawlerPrerender = require('crawler-prerender');

  const options = { siteUrl: 'www.example.com' };
  const { prerender } = await crawlerPrerender(options);

  const path = '/products/1234'; // absolute path
  await prerender(path);
})();
You can also access the prerender function as follows:

crawlerPrerender.prerender('/some-path');
NB: You can only access prerender that way after crawlerPrerender has been called with options at least once.
You can prevent the prerender function from overwriting a path's prerendered contents. This is useful when you want to make sure that all paths are prerendered every time you start up the application, but do not want to waste resources on paths that are already prerendered:
prerender(path, { overwrite: false });
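For example, a startup warm-up loop might call prerender with { overwrite: false } for every known route. The sketch below replaces prerender with a synchronous stub so it is self-contained; the real function is obtained from crawlerPrerender(options) and is asynchronous.

```javascript
// Startup warm-up sketch. `prerender` here is a synchronous stub standing in
// for the async function returned by crawler-prerender.
const alreadyRendered = new Set(['/about']); // pretend '/about' is on disk

function prerender(path, { overwrite = true } = {}) {
  if (!overwrite && alreadyRendered.has(path)) return false; // skipped
  alreadyRendered.add(path); // pretend we generated and saved the HTML
  return true;
}

const routes = ['/', '/about', '/products/1234'];
for (const route of routes) {
  prerender(route, { overwrite: false });
}
```

With { overwrite: false }, '/about' is skipped because it is already on disk, while the other routes are prerendered.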
| Option | Description |
| --- | --- |
| siteUrl | The base URL of your web app. Ex: www.example.com. Required. |
| basePath | The directory in which to save prerendered HTML. Default: `${process.cwd()}/crawler-prerender` |
| prerenderOnTimeout | The prerenderer waits up to 30s for the page to finish rendering. If the page has not rendered by then, it will either prerender without waiting (true) or raise an error (false). Default: false |
| prerenderOnError | A function called when an error occurs while prerendering: function(path, errors), where path is the path that was being prerendered and errors is an array of all the errors that occurred during retries. NB: this function is only called when prerendering is triggered by a crawler requesting a path that was not prerendered. Default: a function that just prints the errors to the console. It's not wise to rely on this in production, since you won't always be watching your console. |
Include the following script in your application's HTML:

<script defer src="https://cdn.jsdelivr.net/gh/xaviertm/crawler-prerender@v0.1.0/crawler-prerender.min.js"></script>
const title = 'My Page Title | My Site';
const description = 'My meta page description';
const keywords = 'seo, page, keywords';
const meta_data = { title, keywords, description };

// set the page title, meta keywords, and meta description
CrawlerPrerender.initMetaData(meta_data);

// call once the page contents have finished rendering
CrawlerPrerender.sendRenderingCompleteEvent();
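In a real SPA you would typically call sendRenderingCompleteEvent only after the route's data has loaded and the DOM has been updated. The sketch below stubs the CrawlerPrerender global so it runs on its own; in the browser it comes from the script tag, and the route handler here is a hypothetical example.

```javascript
// Stub of the global provided by the crawler-prerender script tag, so this
// sketch is self-contained; only the two documented methods are mimicked.
const CrawlerPrerender = {
  meta: null,
  done: false,
  initMetaData(meta) { this.meta = meta; },
  sendRenderingCompleteEvent() { this.done = true; },
};

// Hypothetical route handler: set the meta data for this route, update the
// DOM, then signal that rendering is complete.
function renderProductPage(product) {
  CrawlerPrerender.initMetaData({
    title: `${product.name} | My Site`,
    description: product.summary,
    keywords: product.tags.join(', '),
  });
  // ...update the DOM with the product details here...
  CrawlerPrerender.sendRenderingCompleteEvent();
}

renderProductPage({
  name: 'Blue Widget',
  summary: 'A very blue widget',
  tags: ['widget', 'blue'],
});
```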
The npm package crawler-prerender receives a total of 0 weekly downloads; as such, its popularity is classified as not popular. We found that crawler-prerender demonstrated an unhealthy version release cadence and low project activity, because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.