Optimizes your site's robots.txt to reduce server load and CO2 footprint by blocking unnecessary crawlers while allowing major search engines and specific tools.
Test this plugin in the WordPress Playground
- Download the plugin right here and install it.
Warning
If a robots.txt file exists on your server, the plugin will delete it, though it will try to back it up first. When you uninstall the plugin, it will restore that backup if it can.
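If you'd rather not rely on the plugin's automatic backup, you can keep your own copy before installing. A minimal sketch, assuming your shell's working directory is the site's document root:

```shell
# Back up an existing robots.txt before installing the plugin.
# Assumes the current directory is the site's document root.
if [ -f robots.txt ]; then
    cp robots.txt robots.txt.bak
fi
```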
If you're working on this plugin, you'll probably want to run the tests and linters. You can do that with the following commands:
- PHP code style: `composer check-cs`
- PHP code style autofixer: `composer fix-cs`
- PHP lint: `composer lint`
- PHP unit tests: `composer test`
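These commands map to Composer scripts defined in the repo's `composer.json`. A hypothetical sketch of how such scripts are typically wired (the underlying tool names — PHP_CodeSniffer's `phpcs`/`phpcbf`, `parallel-lint`, and PHPUnit — are assumptions about this repo's setup, not confirmed by it):

```json
{
    "scripts": {
        "check-cs": "phpcs",
        "fix-cs": "phpcbf",
        "lint": "parallel-lint --exclude vendor .",
        "test": "phpunit"
    }
}
```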
The default output of this plugin is as follows:
# This site is very specific about who it allows crawling from.
# Our default is to not allow crawling:
User-agent: *
Disallow: /
# Below are the crawlers that are allowed to crawl this site.
# Below that list, you'll find paths that are blocked, even for them,
# and then paths within those blocked paths that are allowed.
User-agent: Applebot
User-agent: ia_archiver
User-agent: Baiduspider
User-agent: Bingbot
User-agent: DuckDuckBot
User-agent: Googlebot
User-agent: AdsBot-Google
User-agent: MediaPartners-Google
User-agent: Yandex
User-agent: Slurp
User-agent: FacebookExternalHit
User-agent: LinkedInBot
User-agent: WhatsApp
User-agent: Twitterbot
Allow: /
Disallow: /wp-includes/css/
Disallow: /wp-includes/js/
Allow: /wp-json/
Allow: /?rest_route=
Allow: /wp-admin/
Allow: /wp-content/cache/
Allow: /wp-content/plugins/
Allow: /xmlrpc.php
Allow: /wp-includes/
# XML Sitemap:
Sitemap: https://example.com/sitemap_index.xml
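You can sanity-check the semantics of this output with Python's standard `urllib.robotparser`. A minimal sketch using an abbreviated version of the rules above (note that `robotparser` applies rules in first-match order rather than Google's longest-match precedence, so treat it as a rough check, not a crawler simulation):

```python
from urllib.robotparser import RobotFileParser

# Abbreviated version of the plugin's default output:
# block everyone, then allow a list of named crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /

User-agent: Googlebot
User-agent: Bingbot
Allow: /
Disallow: /wp-includes/css/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A listed crawler can fetch the site root...
print(parser.can_fetch("Googlebot", "https://example.com/"))  # True

# ...while a crawler not on the allow-list falls back to the
# "User-agent: *" group and is blocked everywhere.
print(parser.can_fetch("SomeRandomBot", "https://example.com/any/page"))  # False
```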