
# gatsby-plugin-robots-txt

Create `robots.txt` for your Gatsby site.

## Install

```shell
yarn add gatsby-plugin-robots-txt
```

or

```shell
npm install --save gatsby-plugin-robots-txt
```

## How to Use

`gatsby-config.js`

```js
module.exports = {
  siteMetadata: {
    siteUrl: 'https://www.example.com'
  },
  plugins: ['gatsby-plugin-robots-txt']
};
```
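
With this configuration, the plugin reads `siteUrl` from `siteMetadata` and emits a `robots.txt` at the root of your site during the build. Assuming the defaults listed under Options below, the generated file should look roughly like this:

```text
Sitemap: https://www.example.com/sitemap.xml
Host: https://www.example.com
```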

## Options

This plugin uses [`generate-robotstxt`](https://github.com/itgalaxy/generate-robotstxt) to generate the content of `robots.txt`, and it supports the following options:

| Name      | Type       | Default                                | Description            |
| --------- | ---------- | -------------------------------------- | ---------------------- |
| `host`    | `String`   | `${siteMetadata.siteUrl}`              | Host of your site      |
| `sitemap` | `String`   | `${siteMetadata.siteUrl}/sitemap.xml`  | Path to `sitemap.xml`  |
| `policy`  | `Policy[]` | `[]`                                   | List of `Policy` rules |

You can specify any option allowed by `generate-robotstxt`:

`gatsby-config.js`

```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        policy: [{ userAgent: '*', allow: '/' }]
      }
    }
  ]
};
```
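
For this configuration, the generated file should look roughly like the following; the exact formatting is up to `generate-robotstxt`:

```text
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Host: https://www.example.com
```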

## Netlify

If you would like to disable crawlers for Netlify deploy previews, you can use the following snippet:

`gatsby-config.js`

```js
// Netlify exposes the deploy context through environment variables:
// URL is the production site URL, DEPLOY_PRIME_URL is the URL of the
// current deploy (e.g. a deploy preview), and CONTEXT is the deploy
// context ('production', 'deploy-preview', 'branch-deploy', ...).
const {
  NODE_ENV,
  URL: NETLIFY_SITE_URL = 'https://www.example.com',
  DEPLOY_PRIME_URL: NETLIFY_DEPLOY_URL = NETLIFY_SITE_URL,
  CONTEXT: NETLIFY_ENV = NODE_ENV
} = process.env;
const isNetlifyProduction = NETLIFY_ENV === 'production';
const siteUrl = isNetlifyProduction ? NETLIFY_SITE_URL : NETLIFY_DEPLOY_URL;

module.exports = {
  siteMetadata: {
    siteUrl
  },
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: isNetlifyProduction
        ? // Production: leave crawlers unrestricted.
          { policy: [{ userAgent: '*' }] }
        : // Any other context (deploy previews, branch deploys):
          // block all crawlers and omit the sitemap/host entries.
          {
            policy: [{ userAgent: '*', disallow: ['/'] }],
            sitemap: null,
            host: null
          }
    }
  ]
};
```
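
With this snippet, production deploys keep crawlers unrestricted, while deploy previews and branch deploys should emit a `robots.txt` that blocks everything, roughly:

```text
User-agent: *
Disallow: /
```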