Yesterday Google suddenly stopped crawling my site. Search Console says there's a robots.txt issue, yet every guide I've read suggests my file is fine.
I thought it might be the sitemap, but that seems fine here: https://www.fromthemurkydepths.co.uk/sitemap_index.xml
When I try to submit the sitemap in Google Search Console, it says robots.txt is blocking it and reports “couldn't fetch”.
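Fetching the sitemap myself works, though. For reference, here's the quick check I ran (a minimal sketch using Python's standard library; it only shows the file is reachable from my location, not what Googlebot sees):

```
import urllib.request

# Confirm the sitemap index is reachable and returns XML --
# from my location, which says nothing about Googlebot's view.
req = urllib.request.Request(
    "https://www.fromthemurkydepths.co.uk/sitemap_index.xml",
    headers={"User-Agent": "Mozilla/5.0"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status, resp.headers.get("Content-Type"))
    print(resp.read(300).decode("utf-8", "replace"))  # first bytes of the XML
```

It prints HTTP 200 and the expected XML for me every time.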
When I request indexing of specific pages in Search Console, it states:
“Indexing request rejected”
Google's robots.txt tester states: “robots.txt fetch failed. You have a robots.txt file that we are currently unable to fetch. In such cases we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file.”
I've read numerous guides and tried alternatives, for example generating the sitemap with Yoast instead, and nothing changes. My Google search referrals are on the floor and AdSense income has dropped hugely. Every test I run elsewhere shows no issues.
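One of those tests was this sketch of mine (hypothetical script, just comparing HTTP status codes) to see whether the server answers a browser-style user agent differently from the string Googlebot sends, in case a firewall or CDN rule is blocking the crawler specifically:

```
import urllib.request
import urllib.error

# Compare how the server answers a browser-style user agent
# versus Googlebot's UA string. A different status code would
# hint that a firewall/CDN rule targets Google's crawler.
URL = "https://www.fromthemurkydepths.co.uk/robots.txt"
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

for label, ua in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{label}: HTTP {resp.status}")
    except urllib.error.HTTPError as e:
        print(f"{label}: HTTP {e.code}")
    except urllib.error.URLError as e:
        print(f"{label}: failed ({e.reason})")
```

Both return HTTP 200 for me, though I realise Google crawls from its own IP ranges, so a server-level or CDN block could still affect only them. Any ideas what else to check?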