static-sitemap-cli
CLI to generate XML sitemaps for static sites from local filesystem.
Quick and easy CLI to generate XML or TXT sitemaps by searching your local filesystem for `.html` files. Automatically excludes files containing the `noindex` meta tag. Can also be used as a Node module.
NOTE: This is the V2 branch. If you're looking for the older version, see the V1 branch. V2 contains breaking changes. Find out what changed on the releases page.
$ npm i -g static-sitemap-cli
$ sscli -b https://example.com -r public
This trawls the `public/` directory for files matching `**/*.html`, then parses each file for the `noindex` robots meta tag - excluding that file if the tag exists - and finally generates both `sitemap.xml` and `sitemap.txt` into the `public/` root.
See below for more usage examples.
Usage: sscli [options]
CLI to generate XML sitemaps for static sites from local filesystem
Options:
-b, --base <url> base URL (required)
-r, --root <dir> root working directory (default: ".")
-i, --ignore <glob...> globs to ignore (default: ["404.html"])
-c, --changefreq <glob,changefreq...> comma-separated glob-changefreq pairs
-p, --priority <glob,priority...> comma-separated glob-priority pairs
--no-robots do not parse html files for noindex meta
--concurrent <max> concurrent number of html parsing ops (default: 128)
--no-clean do not use clean URLs
--slash add trailing slash to all URLs
-f, --format <format> sitemap format (choices: "xml", "txt", "both", default: "both")
-o, --stdout output sitemap to stdout instead
-v, --verbose be more verbose
-V, --version output the version number
-h, --help display help for command
By default, all matched files are piped through a fast HTML parser to detect if the `noindex` meta tag is set - typically in the form of `<meta name="robots" content="noindex" />` - in which case that file is excluded from the generated sitemap. To disable this behaviour, pass option `--no-robots`.
For better performance, file reads are streamed in 1kb chunks, and parsing stops immediately when either the `noindex` meta or the `</head>` closing tag is detected (the `<body>` is not parsed). This operation is performed concurrently with an async pool limit of 128. The limit can be tweaked using the `--concurrent` option.
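For example, the parsing behaviour can be tuned or skipped entirely (values here are illustrative):

$ sscli -b https://example.com -r public --concurrent 32
$ sscli -b https://example.com -r public --no-robots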
Hides the `.html` file extension in sitemaps like so:
./rootDir/index.html -> https://example.com/
./rootDir/foo/index.html -> https://example.com/foo
./rootDir/foo/bar.html -> https://example.com/foo/bar
Enabled by default; pass option `--no-clean` to disable.
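For example, to keep the `.html` extensions in the generated URLs:

$ sscli -b https://example.com -r public --no-clean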
Adds a trailing slash to all URLs like so:
./rootDir/index.html -> https://example.com/
./rootDir/foo/index.html -> https://example.com/foo/
./rootDir/foo/bar.html -> https://example.com/foo/bar/
Disabled by default; pass option `--slash` to enable.
NOTE: Cannot be used together with `--no-clean`. Also, trailing slashes are always added to root domains.
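For example:

$ sscli -b https://example.com -r public --slash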
The `-i` flag allows multiple entries. By default, it's set to `["404.html"]`. Change the glob ignore patterns to suit your use-case like so:
$ sscli ... -i '404.html' '**/ignore/**' 'this/other/specific/file.html'
The `-c` and `-p` flags allow multiple entries and accept `glob-*` pairs as input. A `glob-*` pair is a comma-separated pair of `<glob>,<value>`. For example, a glob-changefreq pair may look like this:
$ sscli ... -c '**,weekly' 'events/**,daily'
Latter entries override the former. In the above example, paths matching `events/**` have a `daily` changefreq, while the rest are set to `weekly`.
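Glob-priority pairs work the same way; for example (priority values here are illustrative):

$ sscli ... -p '**,0.5' 'events/**,0.8'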
Options can be passed through the `sscli` property in `package.json`, or through a `.ssclirc` JSON file, or through other standard conventions.
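For example, a `.ssclirc` file (or the equivalent `sscli` property in `package.json`) might look like this; the key names below are assumed to mirror the long option names shown above, and the values are illustrative:

{
  "base": "https://example.com",
  "root": "public",
  "ignore": ["404.html", "drafts/**"]
}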
$ sscli -b https://x.com -f txt -o
$ sscli -b https://x.com -r dist -f xml -o > www/sm.xml
$ sscli -b https://x.com/foo -r dist/foo -f txt -o > dist/sitemap.txt
`static-sitemap-cli` can also be used as a Node module.
import {
  generateUrls,
  generateXmlSitemap,
  generateTxtSitemap
} from 'static-sitemap-cli'

const options = {
  base: 'https://x.com',
  root: 'path/to/root',
  ignore: ['404.html'],
  changefreq: [],
  priority: [],
  robots: true,
  concurrent: 128,
  clean: true,
  slash: false
}

generateUrls(options).then((urls) => {
  const xmlString = generateXmlSitemap(urls)
  const txtString = generateTxtSitemap(urls)
  // use xmlString and txtString as needed
})
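From there, the strings can be written wherever you need them; for instance, a minimal sketch using Node's built-in `fs/promises` (output paths are illustrative):

import { writeFile } from 'node:fs/promises'

generateUrls(options).then(async (urls) => {
  // write both sitemap formats into the site root
  await writeFile('path/to/root/sitemap.xml', generateXmlSitemap(urls))
  await writeFile('path/to/root/sitemap.txt', generateTxtSitemap(urls))
})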
Using the XML sitemap generator by itself:
import { generateXmlSitemap } from 'static-sitemap-cli'
const urls = [
  { loc: 'https://x.com/', lastmod: '2022-02-22' },
  { loc: 'https://x.com/about', lastmod: '2022-02-22' }
  // ...more URL entries
]
const xml = generateXmlSitemap(urls)
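The TXT generator can be used on its own in the same way; a minimal sketch, assuming it accepts the same array of URL objects (only `loc` matters for the plain-text format):

import { generateTxtSitemap } from 'static-sitemap-cli'

// assuming the same URL-object shape as the XML generator
const txt = generateTxtSitemap([
  { loc: 'https://x.com/' },
  { loc: 'https://x.com/about' }
])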
Standard GitHub contribution workflow applies.
Test specs are at `test/spec.js`. To run the tests:
$ npm run test
ISC
Changes are logged on the releases page.