
How should I ... do ... robots.txt? Sitemap? #1996

Answered by kylehodgson
kylehodgson asked this question in Q&A

I ended up doing it just after build. I did try it with a postBuild task, but my CI process (Cloudflare Pages) wasn't triggering it, so I just stuffed it into build.

package.json:

"build": "observable build && node src/sitemapper.js | npx sitemap > dist/sitemap.xml && cat src/robots.txt > dist/robots.txt"

sitemapper.js:

import * as of from "../observablehq.config.js";

function formPath(path) {
  return "https://mysite.tld" + path;
}

const site_pages = [];

// Dynamic pages from the config's dynamicPaths generator
for await (const value of of.default.dynamicPaths()) {
  site_pages.push(formPath(value));
}

// Static pages from the config's pages tree; skip external (http...) links
of.default.pages.forEach((node) => {
  node.pages.forEach((subpage) => {
    if (subpage.path.substring(0, 4) !== "http") {
      site_pages.push(formPath(subpage.path));
    }
  });
});

// One URL per line on stdout, for `npx sitemap` to consume
console.log(site_pages.join("\n"));

Replies: 2 comments

Answer selected by kylehodgson