Latest Updates! 🎉 See the change log for details.
astro-robots
astro-robots is a reliable robots.txt generator for Astro projects that simplifies SEO management, offering zero-config setup and Verified Bots support.

Installation
This package is compatible with Astro 3.0.0 and above, due to the use of AstroIntegrationLogger.
To automate the installation, use the astro add command-line tool. You can run it via npx, yarn, or pnpm, depending on your preference:
npx astro add astro-robots
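If your project uses yarn or pnpm instead, the same astro add command can be run through that package manager (assuming it is already set up in the project):

yarn astro add astro-robots
pnpm astro add astro-robots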
Usage
Manual Configuration
Alternatively, you can manually install it by running the following command in your terminal:
npm install astro-robots
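The equivalent commands for yarn and pnpm are:

yarn add astro-robots
pnpm add astro-robots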
To use this integration, add it to your astro.config.* file using the integrations property:
import { defineConfig } from "astro/config";
import robots from "astro-robots";

export default defineConfig({
  site: "https://example.com",
  integrations: [robots()],
});
After installing, run npm run build or yarn build in your terminal:
npm run build
This will output robots.txt to the dist folder with the default rules:
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap-index.xml
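The default Sitemap entry points to sitemap-index.xml, which is the index file the official @astrojs/sitemap integration produces. If you want that URL to resolve on your site, one option is to run both integrations together; the snippet below is a minimal sketch that assumes @astrojs/sitemap is installed:

import { defineConfig } from "astro/config";
import sitemap from "@astrojs/sitemap";
import robots from "astro-robots";

export default defineConfig({
  site: "https://example.com",
  // @astrojs/sitemap emits sitemap-index.xml at build time, which matches
  // the default Sitemap line written by astro-robots.
  integrations: [sitemap(), robots()],
});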
Getting Started with Reference
To configure the integration, pass an object to the robots() function in your astro.config.* file:
import { defineConfig } from "astro/config";
import robots from "astro-robots";

export default defineConfig({
  integrations: [
    robots({
      host: "https://example.com",
      sitemap: [
        "https://example.com/sitemap.xml",
        "https://www.example.com/sitemap.xml",
      ],
      policy: [
        {
          userAgent: [
            "Applebot",
            "Googlebot",
            "bingbot",
            "Yandex",
            "Yeti",
            "Baiduspider",
            "360Spider",
            "*",
          ],
          allow: ["/"],
          disallow: ["/admin", "/login"],
          crawlDelay: 5,
          cleanParam: ["sid /", "s /forum/showthread"],
        },
        {
          userAgent: "BLEXBot",
          disallow: ["/assets", "/uploads/1989-08-21/*jpg$"],
        },
      ],
    }),
  ],
});
With the above configuration, the generated robots.txt file will look like this:
User-agent: Applebot
User-agent: Googlebot
User-agent: bingbot
User-agent: Yandex
User-agent: Yeti
User-agent: Baiduspider
User-agent: 360Spider
User-agent: *
Allow: /
Disallow: /admin
Disallow: /login
Crawl-delay: 5
Clean-param: sid /
Clean-param: s /forum/showthread
User-agent: BLEXBot
Disallow: /assets
Disallow: /uploads/1989-08-21/*jpg$
Sitemap: https://example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap.xml
Host: example.com
Note: Some directives, such as Host, Clean-param, and Crawl-delay, may not be supported by all crawlers. For example, Yandex has ignored Crawl-delay since February 2018. To control Yandex's crawl rate, use the Site crawl rate setting in Yandex Webmaster.
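If you would rather not emit directives that a particular crawler ignores, one approach is to split the policy so that only the user agents which honor Crawl-delay receive it. The following is a sketch built from the options shown above, not a prescribed configuration:

import { defineConfig } from "astro/config";
import robots from "astro-robots";

export default defineConfig({
  site: "https://example.com",
  integrations: [
    robots({
      policy: [
        // Yandex ignores Crawl-delay, so its group omits the directive.
        { userAgent: "Yandex", disallow: ["/admin", "/login"] },
        // All other crawlers keep the delay.
        { userAgent: "*", allow: ["/"], crawlDelay: 5 },
      ],
    }),
  ],
});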
Contributing
Submit your issues or feedback through our GitHub repository.
License
MIT