Robots for Kirby CMS
Python binding for Google's robots.txt parser C++ library
TypeScript robots.txt parser with support for wildcard (*) matching (see the matching sketch after this list).
The repository contains a Google-based robots.txt parser and matcher as a C++ library (compliant with C++17).
User-Agent parser for robots.txt, X-Robots-Tag, and robots meta tag rule sets
grobotstxt is a native Go port of Google's robots.txt parser and matcher library.
PowerShell module for reading robots.txt files
Lightweight R wrapper around rep-cpp for robots.txt (Robots Exclusion Protocol) parsing and path testing in R
Parse robots.txt and sitemaps using .NET
Parsers for robots.txt (aka Robots Exclusion Standard / Robots Exclusion Protocol), Robots Meta Tag, and X-Robots-Tag
Manage the robots.txt from the Kirby config file
RFC 9309 spec-compliant robots.txt builder and parser. 🦾 No dependencies, fully typed.
Generate an XML sitemap for a GitHub Pages site using GitHub Actions
Alternative robots parser module for Python
Robots Exclusion Standard/Protocol Parser for Web Crawling/Scraping
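
Most of the libraries listed above do the same two things: parse a robots.txt file into per-user-agent rule groups and answer whether a given path may be crawled. As a neutral baseline (not the API of any project listed here), a minimal sketch using Python's standard-library urllib.robotparser could look like this; the robots.txt body, bot name, and URLs are made up for illustration.

```python
# Baseline sketch using only the Python standard library (assumes Python 3.8+,
# since site_maps() was added in 3.8). Content and URLs are invented examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Path testing: may this user agent fetch these URLs?
print(parser.can_fetch("ExampleBot", "https://example.com/private/report"))  # False
print(parser.can_fetch("ExampleBot", "https://example.com/blog/post"))       # True

# Non-group records are exposed as well.
print(parser.crawl_delay("ExampleBot"))  # 10
print(parser.site_maps())                # ['https://example.com/sitemap.xml']
```

The standard-library parser covers the basics (Disallow, Allow, Crawl-delay, Sitemap) but not the wildcard and precedence rules that several of the libraries above advertise; those are sketched next.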
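Several entries above mention wildcard (*) matching and RFC 9309 compliance. Under RFC 9309, "*" matches any sequence of characters, "$" anchors the end of the path, the longest matching rule wins, and an allow rule wins a tie. Below is a hedged, self-contained sketch of that matching logic; the function and rule names are illustrative and not taken from any listed project.

```python
# Illustrative sketch of RFC 9309 path matching: '*' wildcard, '$' end anchor,
# longest-match precedence, allow wins ties. Not the implementation of any
# library listed above.
import re
from typing import List, Tuple

def pattern_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a robots.txt path pattern into an anchored regex."""
    anchored_end = pattern.endswith("$")
    if anchored_end:
        pattern = pattern[:-1]
    # Escape everything, then turn the escaped '*' back into '.*'.
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored_end else ""))

def is_allowed(rules: List[Tuple[bool, str]], path: str) -> bool:
    """rules: list of (allow, pattern) for one user-agent group; path: URL path."""
    best_length = -1
    best_allow = True  # no matching rule means the path is allowed
    for allow, pattern in rules:
        if pattern and pattern_to_regex(pattern).match(path):
            # Longest pattern wins; on equal length, allow beats disallow.
            if len(pattern) > best_length or (len(pattern) == best_length and allow):
                best_length = len(pattern)
                best_allow = allow
    return best_allow

# Example rule set for one user-agent group (invented for illustration).
rules = [
    (False, "/private/"),         # Disallow: /private/
    (True,  "/private/*.html$"),  # Allow:    /private/*.html$
]
print(is_allowed(rules, "/private/report.pdf"))  # False
print(is_allowed(rules, "/private/index.html"))  # True
print(is_allowed(rules, "/blog/post"))           # True
```

The length-based tie-break is what lets a more specific Allow rule (here /private/*.html$) carve an exception out of a broader Disallow rule.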