# metalsmith-robots


A metalsmith plugin for generating a robots.txt file


This plugin generates a robots.txt file. It accepts global options and can also be triggered from a file's frontmatter with the `public` and `private` keywords. It works well with metalsmith-mapsite, which likewise lets a page be marked private from its frontmatter.

For support questions please use stack overflow or the metalsmith slack channel.

## Installation

```bash
$ npm install metalsmith-robots
```

## Example

Configuration in `metalsmith.json`:

```json
{
  "plugins": {
    "metalsmith-robots": {
      "useragent": "googlebot",
      "allow": ["index.html", "about.html"],
      "disallow": ["404.html"],
      "sitemap": "https://www.site.com/sitemap.xml"
    }
  }
}
```

Which will generate the following robots.txt:

```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```
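The same configuration can also be expressed with the JavaScript API. A minimal build sketch, assuming `metalsmith` and `metalsmith-robots` are installed locally and a `src/` directory exists:

```javascript
const Metalsmith = require('metalsmith');
const robots = require('metalsmith-robots');

Metalsmith(__dirname)
  // Same options as the metalsmith.json example above.
  .use(robots({
    useragent: 'googlebot',
    allow: ['index.html', 'about.html'],
    disallow: ['404.html'],
    sitemap: 'https://www.site.com/sitemap.xml'
  }))
  .build((err) => {
    if (err) throw err;
  });
```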

## Options

You can pass options to metalsmith-robots with the JavaScript API or the CLI. The options are:

- `useragent`: the user agent - String, default: `*`
- `allow`: an array of the URL(s) to allow - Array of Strings
- `disallow`: an array of the URL(s) to disallow - Array of Strings
- `sitemap`: the sitemap URL - String
- `urlMangle`: mangle paths in `allow` and `disallow` - Function

Besides these options, setting `public: true` or `private: true` in a file's frontmatter will add that page to the `allow` or `disallow` option respectively. metalsmith-robots expects at least one of `allow`, `disallow`, or `sitemap`; without them it will not generate a robots.txt.
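For instance, a page can opt itself out of indexing from its own source file. A sketch of a hypothetical `secret.html` source file with such frontmatter:

```
---
private: true
---
<p>This page will be added to Disallow in robots.txt.</p>
```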

### urlMangle

To make sure paths start with a `/`, you can mangle the URLs provided via `allow` and `disallow`:

```javascript
.use(robots({
  urlMangle: (filepath) => {
    return (filepath.slice(0, 1) !== '/') ? `/${filepath}` : filepath;
  }
}))
```
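As a standalone sketch (plain Node, no Metalsmith required), the mangle above behaves like this:

```javascript
// Same logic as the urlMangle above: prefix a leading slash when one
// is missing, and leave already-rooted paths untouched.
const urlMangle = (filepath) =>
  filepath.slice(0, 1) !== '/' ? `/${filepath}` : filepath;

console.log(urlMangle('about.html'));  // '/about.html'
console.log(urlMangle('/index.html')); // '/index.html'
```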

## License

MIT

Package last updated on 03 Mar 2018