
astro-robots
Latest Updates! 🎉 See the change log for details.
A lightweight robots.txt generator designed specifically for Astro integrations, with zero-config support, an intuitive JSDoc API, and up-to-date Verified Bots support.
Features · Installation · Usage · Configuration · Change Log
A robots.txt file is simple yet crucial for your website's SEO. A single misstep can prevent search engines from accessing important content. With astro-robots, you can avoid common misconfigurations and ensure optimal SEO performance.
This package is compatible with Astro 3.0.0 and above, due to the use of AstroIntegrationLogger.
To automate the installation, use the astro add command-line tool. You can run it with npx, yarn, or pnpm, depending on your preference:
npx astro add astro-robots
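If you prefer yarn or pnpm, the equivalent commands would be:
yarn astro add astro-robots
pnpm astro add astro-robots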
Alternatively, you can manually install it by running the following command in your terminal:
npm install astro-robots
To use this integration, add it to your astro.config.* file using the integrations property:
// astro.config.mjs
import { defineConfig } from "astro/config";
import robots from "astro-robots"; // Add code manually

export default defineConfig({
  site: "https://example.com",
  integrations: [robots()], // Add code manually
});
After installing, run npm run build or yarn build in your terminal:
npm run build
This will output robots.txt to the dist folder with the default rules:
User-agent: *
Allow: /
# crawling rule(s) for above bots
Sitemap: https://example.com/sitemap-index.xml
To configure the integration, pass an object to the robots() function in your astro.config.* file:
//....
import robots from "astro-robots";

export default defineConfig({
  //...
  integrations: [
    robots({
      host: "https://example.com",
      sitemap: [
        "https://example.com/sitemap.xml",
        "https://www.example.com/sitemap.xml",
      ],
      policy: [
        {
          userAgent: [
            "Applebot",
            "Googlebot",
            "bingbot",
            "Yandex",
            "Yeti",
            "Baiduspider",
            "360Spider",
            "*",
          ],
          allow: ["/"],
          disallow: ["/admin", "/login"],
          crawlDelay: 5,
          cleanParam: ["sid /", "s /forum/showthread"],
        },
        {
          userAgent: "BLEXBot",
          disallow: ["/assets", "/uploads/1989-08-21/*jpg$"],
        },
      ],
    }),
  ],
});
With the above configuration, the generated robots.txt file will look like this:
User-agent: Applebot
User-agent: Googlebot
User-agent: bingbot
User-agent: Yandex
User-agent: Yeti
User-agent: Baiduspider
User-agent: 360Spider
User-agent: *
Allow: /
Disallow: /admin
Disallow: /login
Crawl-delay: 5
Clean-param: sid /
Clean-param: s /forum/showthread
User-agent: BLEXBot
Disallow: /assets
Disallow: /uploads/1989-08-21/*jpg$
# crawling rule(s) for above bots
Sitemap: https://example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap.xml
Host: example.com
Note: Some directives like Host, Clean-param, and Crawl-delay may not be supported by all crawlers. For example, Yandex has ignored Crawl-delay since February 2018. To control Yandex's crawl rate, use the Site crawl rate setting in Yandex Webmaster.
The examples above should give you a good sense of how astro-robots works. Now, let's take a closer look at its interface.
Name | Type | Required | Default value | Directive |
---|---|---|---|---|
host | Boolean / String | No | false | Host |
sitemap | Boolean / String / String[] | No | true | Sitemap |
policy[] | Policy[] | No | [{ userAgent: '*', allow: '/' }] | - |
policy[{userAgent}] | UserAgentType [4] | Yes | - | User-agent |
policy[{allow}] | String / String[] | * | - | Allow |
policy[{disallow}] | String / String[] | * | - | Disallow |
policy[{crawlDelay}] | Number | Optional | - | Crawl-delay |
policy[{cleanParam}] | String / String[] | Optional | - | Clean-param |
* [ Optional ] At least one or more allow or disallow entries per rule.
- [ Undefined ] There is no initial value in the default configuration.
type: UserAgentType is a union type containing the latest verified bots; (UserAgentType)[] accepts the same values in array form, so multiple user agents can be listed per rule.
If you still have questions, don't worry! We have powerful JSDoc support, which makes it easy for both SEO experts and novices to manage.
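For reference, here is a rough TypeScript sketch of the options shape implied by the table above. It is illustrative only; the interface names (RobotsConfig, PolicyItem) are assumptions, and the typings shipped with the package are the source of truth.

// Illustrative sketch only; RobotsConfig and PolicyItem are assumed names, not exported by the package.
interface PolicyItem {
  userAgent: string | string[];    // User-agent: verified bot name(s) or "*"
  allow?: string | string[];       // Allow: at least one allow or disallow per rule
  disallow?: string | string[];    // Disallow
  crawlDelay?: number;             // Crawl-delay (ignored by some crawlers)
  cleanParam?: string | string[];  // Clean-param (Yandex-specific)
}

interface RobotsConfig {
  host?: boolean | string;               // Host directive; defaults to false
  sitemap?: boolean | string | string[]; // Sitemap URL(s); defaults to true
  policy?: PolicyItem[];                 // Defaults to [{ userAgent: "*", allow: "/" }]
}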
I'm planning to release more practical examples to cater to the mainstream search engine market.
Submit your issues or feedback on our GitHub channel.
Check out the CHANGELOG.md file for a history of changes to this integration.