# hugo-dark-visitors

Give AI company scraper bots a gentle "no" with this Hugo module. It uses the Dark Visitors API to fetch the latest robots.txt rules.

## Installing

1. Grab your Dark Visitors API key from the Projects page
2. Add the API key to the `HUGO_DARKVISITORS` environment variable
3. Import the module
4. Tell Hugo to generate `robots.txt`
5. Configure API options (optional)
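Step 2 can be done in your shell before running Hugo. The key value below is a placeholder; substitute your real API key:

```shell
# Make the Dark Visitors API key available to Hugo at build time.
# "your-api-key-here" is a placeholder, not a real key.
export HUGO_DARKVISITORS="your-api-key-here"
```

You can also set this variable in your CI provider's secrets settings so production builds pick it up automatically.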

Import the module in your Hugo config:

```yaml
module:
  imports:
    - path: github.com/lkhrs/hugo-dark-visitors
```

Tell Hugo to generate `robots.txt` by adding this line to your Hugo config:

```yaml
enableRobotsTXT: true
```

## API options

By default, the module uses the "AI Data Scraper" category. The API supports additional categories, which you can request by adding them to your Hugo config:

```yaml
params:
  darkVisitors:
    - AI Assistant
    - AI Data Scraper
    - AI Search Crawler
```

## Customizing robots.txt

You can override the template provided by this module and use a partial to place the imported rules alongside your own directives. Create `layouts/robots.txt` in the root of your Hugo site and include the partial inside it:

```
{{ partial "dark-visitors.html" . }}
```

Add your custom directives before or after the partial.
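For example, a `layouts/robots.txt` that puts custom directives ahead of the imported rules might look like this. The `Disallow` path and sitemap line are illustrative, not part of the module:

```
Sitemap: {{ "sitemap.xml" | absURL }}

User-agent: *
Disallow: /drafts/

{{ partial "dark-visitors.html" . }}
```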

## Reducing API calls

Here are two methods for reducing API calls in Hugo:

  1. Hugo's file cache (see Configure file caches)
  2. Only import the module on production builds (see Configuration directory and Issue #1)
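As a sketch of the first method: the relevant file-cache key depends on how the module fetches the API response. Assuming it uses `resources.GetRemote`, the `getresource` cache applies; adjust the key (e.g. to `getjson`) if the module fetches differently:

```yaml
caches:
  getresource:
    # Reuse the cached Dark Visitors response for up to a day
    # instead of calling the API on every build.
    maxAge: 24h
```

For the second method, one approach is to move the `module.imports` block into `config/production/hugo.yaml`, so that development builds skip the module (and its API call) entirely.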
