
# Eco-Friendly Robots.txt

Optimizes your site's robots.txt to reduce server load and CO2 footprint by blocking unnecessary crawlers while allowing major search engines and specific tools.

Test this plugin on the WordPress playground

## Installation

> [!WARNING]
> If a robots.txt file already exists on your server, the plugin will delete it, though it will first try to back it up. When you uninstall the plugin, the backup is restored where possible.

## Development

If you're developing this plugin, you will probably want to run the tests and linters. You can do that with the following commands:

- PHP code style check: `composer check-cs`
- PHP code style autofixer: `composer fix-cs`
- PHP lint: `composer lint`
- PHP unit tests: `composer test`

The default output of this plugin is as follows:

```
# This site is very specific about who it allows crawling from. Our default is to not allow crawling:
User-agent: *
Disallow: /

# Below are the crawlers that are allowed to crawl this site.
# Below that list, you'll find paths that are blocked, even for them, and then paths within those blocked paths that are allowed.
User-agent: Googlebot
User-agent: AdsBot-Google
User-agent: MediaPartners-Google
User-agent: Applebot
User-agent: Yandex
User-agent: Baiduspider
User-agent: Bingbot
User-agent: Slurp
User-agent: DuckDuckBot
User-agent: ia_archiver
User-agent: FacebookExternalHit
User-agent: Twitterbot
User-agent: LinkedInBot
Disallow: /wp-json/
Disallow: /?rest_route=
Disallow: /wp-admin/
Disallow: /wp-content/cache/
Disallow: /wp-content/plugins/
Disallow: /xmlrpc.php
Disallow: /wp-includes/
Allow: /wp-includes/css/
Allow: /wp-includes/js/

# XML Sitemap:
Sitemap: https://example.com/sitemap_index.xml
```
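If you want to sanity-check how crawlers interpret rules like these, Python's standard `urllib.robotparser` can parse them. The sketch below uses a trimmed-down version of the output above; the bot name `SomeRandomBot` and the `example.com` URLs are placeholders, not part of the plugin:

```python
from urllib.robotparser import RobotFileParser

# A trimmed-down version of the plugin's default output.
ROBOTS_TXT = """\
User-agent: *
Disallow: /

User-agent: Googlebot
User-agent: Bingbot
Disallow: /wp-admin/
Disallow: /xmlrpc.php
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Unknown crawlers fall back to the `*` group and are blocked everywhere.
print(parser.can_fetch("SomeRandomBot", "https://example.com/a-post/"))  # False
# Allowed crawlers may fetch regular content...
print(parser.can_fetch("Googlebot", "https://example.com/a-post/"))      # True
# ...but not the explicitly blocked paths.
print(parser.can_fetch("Bingbot", "https://example.com/wp-admin/"))      # False
```

Note that `urllib.robotparser` applies rules in order of appearance, whereas Google uses longest-match precedence, so the two can differ on the `Allow:`-within-`Disallow:` paths in the full output.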