# Block AI bots with `robots.txt`
## Prerequisites
You will need to have an Astro project set up. If you don’t have one yet, you can follow the “Install Astro” guide to create one.
## Installation
1. `astro-ai-robots-txt` is an Astro integration. Install it by running the command for your package manager in your terminal:

   ```sh
   npx astro add astro-ai-robots-txt
   ```

   ```sh
   pnpm astro add astro-ai-robots-txt
   ```

   ```sh
   yarn astro add astro-ai-robots-txt
   ```

2. That’s it. When you build your site, the integration will add rules to `robots.txt` to disallow a known list of AI crawlers.
## How does this work?
This integration sources a list of known AI crawlers from the ai-robots-txt project on GitHub.
These are added to a `robots.txt` file in your site’s build output, telling those crawlers you don’t want them to access your site.
If you already have a `robots.txt` file in your Astro project, the blocklist will be appended to it.
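For illustration, the generated file might contain entries like the following. (GPTBot and CCBot are examples of crawler names from the ai-robots-txt list; the actual list is longer and changes over time.)

```
User-agent: GPTBot
User-agent: CCBot
Disallow: /
```

Each `User-agent` line names a crawler, and the shared `Disallow: /` rule asks all of them not to access any path on the site.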
## Does this work?
Yes and no.
A `robots.txt` file cannot guarantee that no one will scrape your site, but it is a standard mechanism for controlling crawler behaviour that some AI bots do respect.
The ai-robots-txt repository documents which crawlers respect `robots.txt` and what else you can do to block crawlers.
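Because `robots.txt` is purely advisory, crawlers that ignore it can only be stopped at the server level. As one hedged sketch of that approach (the user-agent names here are illustrative examples, not a complete list), an nginx configuration might reject matching requests outright:

```
# Illustrative nginx sketch: return 403 for requests whose User-Agent
# matches known AI crawler names (example names only; see the
# ai-robots-txt project for a maintained list).
if ($http_user_agent ~* "(GPTBot|CCBot|Google-Extended)") {
    return 403;
}
```

Unlike `robots.txt`, this blocks the request regardless of whether the crawler chooses to cooperate, at the cost of maintaining the user-agent list yourself.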