Node.js robots.txt parser with support for wildcard (*) matching.
Updated Oct 28, 2024 - JavaScript
Gatsby plugin that automatically creates robots.txt for your site
Parser for robots.txt for Node.js
robots.txt generator for Node.js
A webpack plugin to generate a robots.txt file
🧑🏻👩🏻 "We are people, not machines" - an initiative to credit the people who build a website. A Nuxt module to statically integrate and generate a humans.txt author file - based on the HumansTxt Project.
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
Higher-order Next.js config to generate sitemap.xml and robots.txt
Generate a robots.txt file for your Eleventy site
Generate sitemap and robots.txt for Next.js using a webhook from Strapi
Generates a robots.txt file
A robots.txt script for Lambda@Edge
⚡ JavaScript-aware crawler for security researchers and bug bounty hunters. Extract hidden endpoints and internal subdomains through static and semantic analysis of JS files. Lightweight. Fast. Sneaky.
🤖 Robots.txt generator done right.
🤖 Handle and parse a site's robots.txt file and extract actionable information
Node.js web crawler
🤖 Browser extension to check for and preview a site's robots.txt in a new tab (if it exists)
Chrome extension that blocks URLs based on robots.txt (compatible with Chrome 41)
Front-end workflow to start a new project with Eleventy and Webpack.
A tool for debugging robots.txt
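Most of the parsers listed above implement some form of wildcard matching against robots.txt rules. As a minimal sketch of how that typically works (assuming the common de-facto semantics: `*` matches any character sequence, a trailing `$` anchors a rule to the end of the path, the longest matching rule wins, and Allow beats Disallow on a tie), here is a small self-contained example. The names `parseRobots`, `ruleToRegExp`, and `isAllowed` are illustrative only, not the API of any library listed here:

```javascript
// Illustrative sketch, not the API of any listed library. Assumes the
// common de-facto semantics: '*' matches any sequence, a trailing '$'
// anchors the rule to the end of the path, the longest matching rule
// wins, and Allow beats Disallow when rule lengths tie.

function parseRobots(text) {
  const groups = [];
  let current = null;
  for (const raw of text.split(/\r?\n/)) {
    const line = raw.replace(/#.*$/, "").trim(); // strip comments
    const m = line.match(/^([A-Za-z-]+)\s*:\s*(.*)$/);
    if (!m) continue;
    const field = m[1].toLowerCase();
    const value = m[2].trim();
    if (field === "user-agent") {
      // Consecutive User-agent lines share one group of rules.
      if (!current || current.rules.length > 0) {
        current = { agents: [], rules: [] };
        groups.push(current);
      }
      current.agents.push(value.toLowerCase());
    } else if ((field === "allow" || field === "disallow") && current) {
      current.rules.push({ allow: field === "allow", path: value });
    }
  }
  return groups;
}

function ruleToRegExp(path) {
  const anchored = path.endsWith("$");
  const body = anchored ? path.slice(0, -1) : path;
  const source = body
    .split("*")
    .map((part) => part.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")) // escape regex metacharacters
    .join(".*"); // each '*' in the rule matches any sequence
  return new RegExp("^" + source + (anchored ? "$" : ""));
}

function isAllowed(groups, userAgent, path) {
  const ua = userAgent.toLowerCase();
  const group =
    groups.find((g) => g.agents.some((a) => a !== "*" && ua.includes(a))) ||
    groups.find((g) => g.agents.includes("*"));
  if (!group) return true; // no applicable group: everything is allowed
  let best = null;
  for (const rule of group.rules) {
    if (!rule.path) continue; // an empty Disallow disallows nothing
    if (!ruleToRegExp(rule.path).test(path)) continue;
    if (
      !best ||
      rule.path.length > best.path.length ||
      (rule.path.length === best.path.length && rule.allow && !best.allow)
    ) {
      best = rule;
    }
  }
  return best ? best.allow : true;
}

const groups = parseRobots(`
User-agent: *
Disallow: /private/
Allow: /private/public-*.html
Disallow: /*.pdf$
`);

console.log(isAllowed(groups, "MyBot/1.0", "/index.html"));            // true
console.log(isAllowed(groups, "MyBot/1.0", "/private/secret.html"));   // false
console.log(isAllowed(groups, "MyBot/1.0", "/private/public-1.html")); // true
console.log(isAllowed(groups, "MyBot/1.0", "/docs/file.pdf"));         // false
```

The longest-match precedence shown here is the behavior RFC 9309 standardized; older or simpler parsers sometimes use first-match order instead, which is one reason results differ between the libraries listed above.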