# astro-robots-txt

This **Astro integration** generates a `robots.txt` for your Astro project during build.

The `robots.txt` file informs search engines which pages on your website should be crawled. See Google's own advice on robots.txt to learn more.

## Why astro-robots-txt?
For an Astro project you usually create the `robots.txt` in a text editor and place it in the `public/` directory. In that case you must manually keep the `site` option in `astro.config.*` in sync with the `Sitemap:` record in `robots.txt`. This breaks the DRY principle.
Sometimes, especially during development, you need to prevent your site from being indexed. To achieve this you have to place the meta tag `<meta name="robots" content="noindex">` in the `<head>` section of your pages or add `X-Robots-Tag: noindex` to the HTTP response headers, and then also add the lines `User-agent: *` and `Disallow: /` to `robots.txt`. Again, you do this manually in two separate places.
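For reference, the "disallow everything" variant of `robots.txt` is just these two lines:

```txt
User-agent: *
Disallow: /
```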
`astro-robots-txt` can help with the `robots.txt` side of both cases. See details in the demo repo.
## Installation
If you run into any hiccups, feel free to log an issue on my GitHub.
### Install dependencies

First, install the `astro-robots-txt` integration like so:

```sh
# npm
npm install --save-dev astro-robots-txt

# yarn
yarn add -D astro-robots-txt

# pnpm
pnpm add -D astro-robots-txt
```
## Getting started

The `astro-robots-txt` integration requires a deployment / site URL for generation. Add your site's URL to your `astro.config.*` using the `site` property.

:exclamation: Provide the `experimental` property in your `astro.config.*`, because only official `@astrojs/*` integrations are currently supported by Astro. Set the `experimental.integrations` value to `true`.

Then, apply this integration to your `astro.config.*` file using the `integrations` property.
**astro.config.mjs**

```js
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';

export default defineConfig({
  site: 'https://example.com',
  experimental: {
    integrations: true,
  },
  integrations: [robotsTxt()],
});
```
Now, build your site for production via the `astro build` command. You should find your `robots.txt` under `dist/robots.txt`!

The content of `robots.txt` will be:

```txt
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```
You can also check our Astro Integration Documentation for more on integrations.
## Configuration

### Options

| Name | Type | Default | Description |
| :--- | :--- | :--- | :--- |
| `host` | `String` | `` | Host of your site |
| `sitemap` | `Boolean / String / String[]` | `true` | Resulting output will be `Sitemap: your-site-url/sitemap.xml`.<br/>If `sitemap: false` - no `Sitemap` line in the output.<br/>You could use a valid URL string or an array of URL strings for `sitemap`. |
| `policy` | `Policy[]` | `[{ allow: '/', userAgent: '*' }]` | List of `Policy` rules |
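For example, if your sitemap lives at a non-default location, you can pass a single URL string instead of the default `true` (a minimal sketch; the sitemap URL is just a placeholder):

```js
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';

export default defineConfig({
  site: 'https://example.com',
  experimental: {
    integrations: true,
  },
  integrations: [
    // `sitemap` also accepts an array of URL strings.
    robotsTxt({ sitemap: 'https://example.com/custom-sitemap.xml' }),
  ],
});
```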
### Policy

| Name | Type | Required | Description |
| :--- | :--- | :--- | :--- |
| `userAgent` | `String` | Yes | You must provide the name of the user agent or a wildcard |
| `disallow` | `String / String[]` | No | Disallowed paths to crawl |
| `allow` | `String / String[]` | No | Allowed paths to crawl |
| `crawlDelay` | `Number` | No | Minimum interval (in seconds) a crawler should wait between requests (`Crawl-delay` directive) |
| `cleanParam` | `String / String[]` | No | URL query parameters the crawler should ignore (Yandex-specific `Clean-param` directive) |
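Roughly speaking, each `Policy` entry becomes one user-agent group in the generated file. For instance, a policy such as `{ userAgent: 'Googlebot', allow: '/', disallow: '/search', crawlDelay: 2 }` should produce a group along these lines (the exact directive order may vary):

```txt
User-agent: Googlebot
Allow: /
Disallow: /search
Crawl-delay: 2
```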
### Sample of `astro.config.mjs`

```js
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';

export default defineConfig({
  site: 'https://example.com',
  experimental: {
    integrations: true,
  },
  integrations: [
    robotsTxt({
      host: 'example.com',
      sitemap: [
        'https://example.com/main-sitemap.xml',
        'https://example.com/images-sitemap.xml',
      ],
      policy: [
        {
          userAgent: 'Googlebot',
          allow: '/',
          disallow: ['/search'],
          crawlDelay: 2,
        },
        {
          userAgent: 'OtherBot',
          allow: ['/allow-for-all-bots', '/allow-only-for-other-bot'],
          disallow: ['/admin', '/login'],
          crawlDelay: 2,
        },
        {
          userAgent: '*',
          allow: '/',
          disallow: '/search',
          crawlDelay: 10,
          cleanParam: 'ref /articles/',
        },
      ],
    }),
  ],
});
```
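For orientation, the configuration above should yield a `robots.txt` roughly like the following (the exact ordering and formatting of the records may differ between versions):

```txt
User-agent: Googlebot
Allow: /
Disallow: /search
Crawl-delay: 2

User-agent: OtherBot
Allow: /allow-for-all-bots
Allow: /allow-only-for-other-bot
Disallow: /admin
Disallow: /login
Crawl-delay: 2

User-agent: *
Allow: /
Disallow: /search
Crawl-delay: 10
Clean-param: ref /articles/

Host: example.com
Sitemap: https://example.com/main-sitemap.xml
Sitemap: https://example.com/images-sitemap.xml
```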
If you want your `robots.txt` without the `Sitemap: ...` record, set the `sitemap` option to `false`.
```js
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';

export default defineConfig({
  site: 'https://example.com',
  experimental: {
    integrations: true,
  },
  integrations: [
    robotsTxt({
      sitemap: false,
    }),
  ],
});
```
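The generated `robots.txt` should then contain only the policy section, i.e. for the default policy:

```txt
User-agent: *
Allow: /
```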
## :exclamation: Important Notes

Only official `@astrojs/*` integrations are currently supported by Astro. There are two ways to make the `astro-robots-txt` integration work with the current version of Astro.

Set the `experimental.integrations` option to `true` in your `astro.config.*`:
```js
export default defineConfig({
  experimental: {
    integrations: true,
  },
});
```
Or use the `--experimental-integrations` flag with the build command:

```sh
astro build --experimental-integrations
```
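For example, you could bake the flag into the build script of your `package.json` (the script name here is just the usual convention):

```json
{
  "scripts": {
    "build": "astro build --experimental-integrations"
  }
}
```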
Inspirations: