astro-robots-txt - npm Package Compare versions

Comparing version 0.1.9 to 0.1.10


package.json
{
"name": "astro-robots-txt",
"version": "0.1.9",
"version": "0.1.10",
"description": "Generate a robots.txt for Astro",

@@ -68,3 +68,3 @@ "keywords": [

},
"readme": "# astro-robots-txt\n\nThis **[Astro integration][astro-integration]** generates a _robots.txt_ for your Astro project during build.\n\n[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)\n\n------\nThe _robots.txt_ file informs search engines which pages on your website should be crawled. [See Google's own advice on robots.txt](https://developers.google.com/search/docs/advanced/robots/intro) to learn more.\n\n## Why astro-robots-txt?\n\nFor Astro project you usually create the _robots.txt_ in a text editor and place it to the `public/` directory.\nIn that case you must manually synchronize `site` option in _astro.config.*_ with `Sitemap:` record in _robots.txt_. \nIt brakes DRY principle. \n\nSometimes, especially during development, it's needed to prevent your site from being indexed. To achieve this you need place meta tag `<meta name=\"robots\" content=\"noindex\">` in the `<head>` section of pages or add `X-Robots-Tag: noindex` in HTTP header response, then add lines `User-agent: *` and `Disallow: \\` to _robots.txt_. \nAgain you do it manually in two separate places.\n\n**astro-robots-txt** could help in both two cases on the _robots.txt_ side. See details in the demo [repo](https://github.com/alextim/astro-robots-txt/tree/main/demo).\n\n------\n\n## Installation\n\nIf you run into any hiccups, [feel free to log an issue on my GitHub](https://github.com/alextim/astro-robots-txt/issues).\n\n### Install dependencies\n\nFirst, install the **astro-robots-txt** integration like so:\n\n```sh\n#npm\nnpm install --save-dev astro-robots-txt\n\n#yarn\nyarn add -D astro-robots-txt\n\n#pnpm\npnpm add -D astro-robots-txt\n```\n\n## Getting started\n\nThe `astro-robots-txt` integration requires a deployment / site URL for generation. Add your site's URL under your _astro.config.*_ using the `site` property. \n\n:exclamation: Provide the `experimental` property to your _astro.config.*_, because only official **@astrojs/\\*** integrations are currently supported by Astro. Set the `experimental.integrations` value to `true`. \n\nThen, apply this integration to your _astro.config.*_ file using the `integrations` property. \n\n**astro.config.mjs**\n\n```js\n// astro.config.mjs\nimport { defineConfig } from 'astro/config';\nimport robotsTxt from 'astro-robots-txt';\n\nexport default defineConfig({\n // ...\n site: 'https://example.com',\n // Important!\n // Only official '@astrojs/*' integrations are currently supported by Astro. \n // Add 'experimental.integrations: true' to make 'astro-robots-txt' working with 'astro build' command.\n experimental: {\n integrations: true,\n }, \n integrations: [robotsTxt()],\n});\n```\n\nNow, [build your site for production](https://docs.astro.build/en/reference/cli-reference/#astro-build) via the `astro build` command. 
You should find your _robots.txt_ under `dist/robots.txt`!\n\nThe _robots.txt_'s content will be\n\n```text\nUser-agent: *\nAllow: /\nSitemap: https://example.com/sitemap.xml\n```\n\nYou can also check our [Astro Integration Documentation][astro-integration] for more on integrations.\n\n## Configuration\n\n## Options\n\n| Name | Type | Default | Description |\n| :-------: | :-----------------------------: | :------------------------------: | :------------------------------------------------------------------: |\n| `host` | `String` | `` | Host of your site |\n| `sitemap` | `Boolean / String` / `String[]` | `true` | Resulting output will be `Sitemap: your-site-url/sitemap.xml` |\n| | | | If `sitemap: false` - no `Sitemap` line in the output |\n| | | | You could use for `sitemap` valid url string or array of url strings |\n| `policy` | `Policy[]` | [{ allow: '/', userAgent: '*' }] | List of `Policy` rules |\n\n### Policy\n\n| Name | Type | Required | Description |\n| :----------: | :-------------------: | :------: | :---------------------------------------------: |\n| `userAgent` | `String` | Yes | You must provide name of user agent or wildcard |\n| `disallow` | `String` / `String[]` | No | disallowed paths |\n| `allow` | `String` / `String[]` | No | allowed paths |\n| `crawlDelay` | `Number` | No | |\n| `cleanParam` | `String` / `String[]` | No | |\n\n**Sample of _astro.config.mjs_**\n\n```js\n// astro.config.mjs\nimport { defineConfig } from 'astro/config';\nimport robotsTxt from 'astro-robots-txt';\n\nexport default defineConfig({\n site: 'https://example.com',\n experimental: {\n integrations: true,\n }, \n integrations: [\n robotsTxt({\n host: 'example.com',\n sitemap: [\n 'https://example.com/main-sitemap.xml', \n 'https://example.com/images-sitemap.xml'\n ],\n policy: [\n {\n userAgent: 'Googlebot',\n allow: '/',\n disallow: ['/search'],\n crawlDelay: 2,\n },\n {\n userAgent: 'OtherBot',\n allow: ['/allow-for-all-bots', '/allow-only-for-other-bot'],\n disallow: ['/admin', '/login'],\n crawlDelay: 2,\n },\n {\n userAgent: '*',\n allow: '/',\n disallow: '/search',\n crawlDelay: 10,\n cleanParam: 'ref /articles/',\n },\n ],\n }),\n ],\n});\n```\n\n:exclamation: Important Notes\n\nOnly official **@astrojs/\\*** integrations are currently supported by Astro. \n\nThere are two possibilities to make **astro-robots-txt** integration working with current version of Astro. \n\nSet the `experimental.integrations` option to `true` in your _astro.config.*_.\n\n```js\n// astro.config.mjs\nexport default defineConfig({\n // ...\n experimental: {\n integrations: true,\n }, \n});\n```\n\nOr use the `--experimental-integrations` flag for build command. \n\n```sh\nastro build --experimental-integrations\n```\n\n[astro-integration]: https://docs.astro.build/en/guides/integrations-guide/\n\n**Inspirations:**\n\n- [gatsby-plugin-robots-txt](https://github.com/mdreizin/gatsby-plugin-robots-txt)\n- [generate-robotstxt](https://github.com/itgalaxy/generate-robotstxt)\n- [is-valid-hostname](https://github.com/miguelmota/is-valid-hostname)\n"
"readme": "# astro-robots-txt\n\nThis **[Astro integration][astro-integration]** generates a _robots.txt_ for your Astro project during build.\n\n[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)\n\n------\nThe _robots.txt_ file informs search engines which pages on your website should be crawled. [See Google's own advice on robots.txt](https://developers.google.com/search/docs/advanced/robots/intro) to learn more.\n\n## Why astro-robots-txt?\n\nFor Astro project you usually create the _robots.txt_ in a text editor and place it to the `public/` directory.\nIn that case you must manually synchronize `site` option in _astro.config.*_ with `Sitemap:` record in _robots.txt_. \nIt brakes DRY principle. \n\nSometimes, especially during development, it's needed to prevent your site from being indexed. To achieve this you need place meta tag `<meta name=\"robots\" content=\"noindex\">` in the `<head>` section of pages or add `X-Robots-Tag: noindex` in HTTP header response, then add lines `User-agent: *` and `Disallow: \\` to _robots.txt_. \nAgain you do it manually in two separate places.\n\n**astro-robots-txt** could help in both two cases on the _robots.txt_ side. See details in the demo [repo](https://github.com/alextim/astro-robots-txt/tree/main/demo).\n\n------\n\n## Installation\n\nIf you run into any hiccups, [feel free to log an issue on my GitHub](https://github.com/alextim/astro-robots-txt/issues).\n\n### Install dependencies\n\nFirst, install the **astro-robots-txt** integration like so:\n\n```sh\n#npm\nnpm install --save-dev astro-robots-txt\n\n#yarn\nyarn add -D astro-robots-txt\n\n#pnpm\npnpm add -D astro-robots-txt\n```\n\n## Getting started\n\nThe `astro-robots-txt` integration requires a deployment / site URL for generation. Add your site's URL under your _astro.config.*_ using the `site` property. \n\n:exclamation: Provide the `experimental` property to your _astro.config.*_, because only official **@astrojs/\\*** integrations are currently supported by Astro. Set the `experimental.integrations` value to `true`. \n\nThen, apply this integration to your _astro.config.*_ file using the `integrations` property. \n\n**astro.config.mjs**\n\n```js\n// astro.config.mjs\nimport { defineConfig } from 'astro/config';\nimport robotsTxt from 'astro-robots-txt';\n\nexport default defineConfig({\n // ...\n site: 'https://example.com',\n // Important!\n // Only official '@astrojs/*' integrations are currently supported by Astro. \n // Add 'experimental.integrations: true' to make 'astro-robots-txt' working\n // with 'astro build' command.\n experimental: {\n integrations: true,\n }, \n integrations: [robotsTxt()],\n});\n```\n\nNow, [build your site for production](https://docs.astro.build/en/reference/cli-reference/#astro-build) via the `astro build` command. 
You should find your _robots.txt_ under `dist/robots.txt`!\n\nThe _robots.txt_'s content will be\n\n```text\nUser-agent: *\nAllow: /\nSitemap: https://example.com/sitemap.xml\n```\n\nYou can also check our [Astro Integration Documentation][astro-integration] for more on integrations.\n\n## Configuration\n\n## Options\n\n| Name | Type | Default | Description |\n| :-------: | :-----------------------------: | :------------------------------: | :------------------------------------------------------------------: |\n| `host` | `String` | `` | Host of your site |\n| `sitemap` | `Boolean / String` / `String[]` | `true` | Resulting output will be `Sitemap: your-site-url/sitemap.xml` |\n| | | | If `sitemap: false` - no `Sitemap` line in the output |\n| | | | You could use for `sitemap` valid url string or array of url strings |\n| `policy` | `Policy[]` | [{ allow: '/', userAgent: '*' }] | List of `Policy` rules |\n\n### Policy\n\n| Name | Type | Required | Description |\n| :----------: | :-------------------: | :------: | :---------------------------------------------: |\n| `userAgent` | `String` | Yes | You must provide name of user agent or wildcard |\n| `disallow` | `String` / `String[]` | No | disallowed paths |\n| `allow` | `String` / `String[]` | No | allowed paths |\n| `crawlDelay` | `Number` | No | |\n| `cleanParam` | `String` / `String[]` | No | |\n\n**Sample of _astro.config.mjs_**\n\n```js\n// astro.config.mjs\nimport { defineConfig } from 'astro/config';\nimport robotsTxt from 'astro-robots-txt';\n\nexport default defineConfig({\n site: 'https://example.com',\n experimental: {\n integrations: true,\n }, \n integrations: [\n robotsTxt({\n host: 'example.com',\n sitemap: [\n 'https://example.com/main-sitemap.xml', \n 'https://example.com/images-sitemap.xml'\n ],\n policy: [\n {\n userAgent: 'Googlebot',\n allow: '/',\n disallow: ['/search'],\n crawlDelay: 2,\n },\n {\n userAgent: 'OtherBot',\n allow: ['/allow-for-all-bots', '/allow-only-for-other-bot'],\n disallow: ['/admin', '/login'],\n crawlDelay: 2,\n },\n {\n userAgent: '*',\n allow: '/',\n disallow: '/search',\n crawlDelay: 10,\n cleanParam: 'ref /articles/',\n },\n ],\n }),\n ],\n});\n```\n\nif you want your _robots.txt_ without `Sitemap: ...` record please set `sitemap` option to `false`.\n\n```js\n// astro.config.mjs\nimport { defineConfig } from 'astro/config';\nimport robotsTxt from 'astro-robots-txt';\n\nexport default defineConfig({\n // ...\n site: 'https://example.com',\n experimental: {\n integrations: true,\n }, \n integrations: [\n robotsTxt({\n sitemap: false,\n // ...\n }),\n ],\n});\n```\n\n:exclamation: Important Notes\n\nOnly official **@astrojs/\\*** integrations are currently supported by Astro. \n\nThere are two possibilities to make **astro-robots-txt** integration working with current version of Astro. \n\nSet the `experimental.integrations` option to `true` in your _astro.config.*_.\n\n```js\n// astro.config.mjs\nexport default defineConfig({\n // ...\n experimental: {\n integrations: true,\n }, \n});\n```\n\nOr use the `--experimental-integrations` flag for build command. \n\n```sh\nastro build --experimental-integrations\n```\n\n[astro-integration]: https://docs.astro.build/en/guides/integrations-guide/\n\n**Inspirations:**\n\n- [gatsby-plugin-robots-txt](https://github.com/mdreizin/gatsby-plugin-robots-txt)\n- [generate-robotstxt](https://github.com/itgalaxy/generate-robotstxt)\n- [is-valid-hostname](https://github.com/miguelmota/is-valid-hostname)\n"
}
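Aside from the version bump, the visible change in `package.json` is the updated `readme` field; the hunks below show those README edits in readable form. To pick up the new release, a typical upgrade command would be the following (npm shown; the README also lists yarn and pnpm equivalents):

```sh
# Upgrade the dev dependency to the documented release
npm install --save-dev astro-robots-txt@0.1.10
```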

@@ -62,3 +62,4 @@ # astro-robots-txt

  // Only official '@astrojs/*' integrations are currently supported by Astro.
- // Add 'experimental.integrations: true' to make 'astro-robots-txt' working with 'astro build' command.
+ // Add 'experimental.integrations: true' to make 'astro-robots-txt' working
+ // with 'astro build' command.
  experimental: {

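The rewrapped comment still points at the `experimental.integrations` requirement. Per the README, the same effect can also be achieved at build time with a CLI flag instead of the config option:

```sh
# Alternative to setting 'experimental.integrations: true' in astro.config.*
astro build --experimental-integrations
```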
@@ -150,2 +151,24 @@ integrations: true,

If you want your _robots.txt_ without a `Sitemap: ...` record, set the `sitemap` option to `false`.
```js
// astro.config.mjs
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';
export default defineConfig({
// ...
site: 'https://example.com',
experimental: {
integrations: true,
},
integrations: [
robotsTxt({
sitemap: false,
// ...
}),
],
});
```
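With `sitemap: false` as above, the generated `dist/robots.txt` should contain no `Sitemap:` line; assuming the default policy, the expected output would be roughly:

```text
User-agent: *
Allow: /
```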
:exclamation: Important Notes

@@ -152,0 +175,0 @@
