Robots.txt to block subdomain when using WordPress Plugin?

Hey everyone,

I’m currently pulling Webflow pages into my WordPress site using the plugin.

In order to remove the ‘Made in Webflow’ badge, I’ve needed to put my Webflow pages live on a subdomain.

My worry is that Google could index both the live page and the subdomain version, which could hurt my SEO.

Is there an easy way to use robots.txt to block Google from crawling the subdomain? I see there’s an option in Webflow to add robots.txt settings, but I’m not 100% sure what I should write there to make sure I block only the subdomain version and not the live site.

Hopefully my question makes sense. Please ask if anything’s unclear, and thank you in advance for the help.

Ian


Not sure if you ever got an answer, but I had the same question and did some digging. The short answer is that you’re fine to disable indexing of the Webflow subdomain.

As far as I understand it, WordPress only uses the Webflow pages you directly tell it to pull in; that does not include the robots.txt file. If you go to yoursite.webflow.io/robots.txt, the robots.txt file should display:

User-agent: *
Disallow: /

…which tells crawlers not to crawl anything on the subdomain. (Strictly speaking, Disallow blocks crawling rather than setting noindex, and Google can still index a blocked URL if other sites link to it, but in practice it keeps the duplicate out of search results.) If you go to yoursite.com/robots.txt instead, it should serve the robots.txt file generated through WordPress, something like:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

You could also test it in Google Search Console with the URL Inspection tool.
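
If you want to double-check outside of Search Console, here’s a minimal sketch using Python’s built-in urllib.robotparser to confirm each host’s robots.txt does what you expect. The yoursite hostnames are placeholders, so swap in your actual domains:

from urllib.robotparser import RobotFileParser

# Placeholder hostnames; replace with your real domains.
checks = [
    ("https://yoursite.webflow.io", "/"),    # Webflow subdomain: expect blocked
    ("https://yoursite.com", "/"),           # live site: expect crawlable
    ("https://yoursite.com", "/wp-admin/"),  # expect blocked by the WordPress rules
]

for host, path in checks:
    rp = RobotFileParser()
    rp.set_url(host + "/robots.txt")
    rp.read()  # downloads and parses that host's robots.txt
    ok = rp.can_fetch("Googlebot", host + path)
    print(host + path, "->", "crawlable" if ok else "blocked")

If the subdomain comes back “blocked” and the live homepage comes back “crawlable”, Google should only be crawling the WordPress version.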