
Easier way to disallow robots.txt and remove collection pages from sitemap


#1

It would be super-awesome if there were a way to disallow pages in robots.txt and hide them from the sitemap at the collection level.

For example, we have a collection of team members who can appear on specific pages, but we'll never have individual team-member pages. It would be great if we didn't have to manually edit and re-upload the sitemap every time we want to change something or publish a new blog post.
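
Until a collection-level switch exists, that manual edit-and-re-upload step could at least be scripted. The sketch below is only an illustration, assuming the collection pages live under a /team-member/ folder and the exported sitemap uses the standard sitemaps.org namespace; the domain, folder, and file names are placeholders, not anything the platform provides.

# Minimal sketch: strip collection-page URLs from an exported sitemap.xml
# before re-uploading it as a custom sitemap. Domain and folder are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"        # placeholder domain
EXCLUDE_PREFIX = "https://example.com/team-member/"    # collection folder to drop
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

ET.register_namespace("", NS)  # keep the default sitemap namespace on output

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

root = tree.getroot()
for url_el in list(root.findall(f"{{{NS}}}url")):
    loc = url_el.find(f"{{{NS}}}loc")
    if loc is not None and loc.text and loc.text.startswith(EXCLUDE_PREFIX):
        root.remove(url_el)  # drop every page in the excluded collection

tree.write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("wrote filtered sitemap.xml")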


#2
User-agent: *
Disallow: /team-member

#3

I know how to edit the robots.txt file ... I'm asking for an easy on/off switch at the collection level to exclude a subfolder/collection from the sitemap.


#4