Give AI company scraper bots a gentle "no" with this Hugo module. Uses the Dark Visitors API to import the latest robots.txt.
- Grab your Dark Visitors API key from the Projects page
- Add the API key to the `HUGO_DARKVISITORS` environment variable (see the CI sketch after this list)
- Import the module
- Tell Hugo to generate `robots.txt`
- Configure API options (optional)
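If you build in CI, the API key can come from a secret. A minimal GitHub Actions sketch, assuming a repository secret named DARKVISITORS_API_KEY (the secret name and workflow are illustrative, not part of this module):

```yaml
# .github/workflows/build.yml — illustrative sketch; the secret name
# DARKVISITORS_API_KEY is an example, not something this module defines
name: Build
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest # ships with Go, which Hugo Modules require
    steps:
      - uses: actions/checkout@v4
      - uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
      - name: Build site
        run: hugo --minify
        env:
          # Hugo exposes HUGO_-prefixed environment variables to templates
          HUGO_DARKVISITORS: ${{ secrets.DARKVISITORS_API_KEY }}
```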
Import the module in your Hugo config:
```yaml
module:
  imports:
    - path: github.com/lkhrs/hugo-dark-visitors
```
Tell Hugo to generate `robots.txt` by adding this line to your Hugo config:
```yaml
enableRobotsTXT: true
```
By default, the module uses the "AI Data Scraper" category. The API supports more categories that you can request by adding them to your Hugo config:
```yaml
params:
  darkVisitors:
    - AI Assistant
    - AI Data Scraper
    - AI Search Crawler
```
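Each category expands into blocking rules for the agents Dark Visitors currently tracks in it, so the output changes over time. An illustrative excerpt of what the generated file might contain (the agent name is an example, not a fixed list):

```text
# AI Data Scraper
User-agent: GPTBot
Disallow: /
```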
You can override the template provided by this module and use a partial to add the rules after your custom directives. Add `layouts/robots.txt` to the root of your Hugo site and import the partial inside `robots.txt`:
```go-html-template
{{ partial "dark-visitors.html" . }}
```
Add your custom directives before or after the partial.
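For example, a minimal sketch of a custom `layouts/robots.txt`; the user-agent rule and sitemap line are placeholders, not module output:

```go-html-template
User-agent: *
Disallow: /drafts/

{{ partial "dark-visitors.html" . }}

Sitemap: {{ "sitemap.xml" | absURL }}
```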
Here are two methods for reducing API calls in Hugo, both sketched after this list:
- Hugo's file cache (see Configure file caches)
- Only import the module on production builds (see Configuration directory and Issue #1)
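For the file cache, a minimal sketch, assuming the module fetches the rules through Hugo's remote resource cache (`getresource`); the 24h value is an arbitrary example:

```yaml
caches:
  getresource:
    # Re-use the cached API response for a day instead of refetching on every build
    maxAge: 24h
```

For production-only imports, one approach using Hugo's split configuration directory: keep shared settings in `config/_default/` and move the module import into `config/production/`, so development builds skip the API entirely:

```yaml
# config/production/hugo.yaml — applied only when the environment is production
module:
  imports:
    - path: github.com/lkhrs/hugo-dark-visitors
```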