I wanted to begin using agencyanalytics.com to run some SEO checks on my client’s sites.
As it turns out, their crawlers aren’t getting through to the pages.
My robots.txt includes:
User-agent: *
Disallow:
Their tech support told me to add “AA-Site-Audit-Crawler” as an exception. Their exact words:
“You have to white-list our crawler user agent on the server where that site resides. We use rotating IPs, so you’ll need to white-list by name. The name to use is AA-Site-Audit-Crawler. In this case you have to enable it since it’s being blocked by your server.”
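For anyone else who gets this advice: as far as I can tell, "white-list on the server" means the block is happening at the web-server or firewall level, not in robots.txt. I gather that on Apache 2.4 with mod_setenvif, a by-name exception looks roughly like the sketch below — "SomeBlockedBot" is just a placeholder for whatever pattern the server is already denying:

# Flag unwanted bots by user agent, then clear the flag for the audit crawler
SetEnvIfNoCase User-Agent "SomeBlockedBot" deny_bot
SetEnvIfNoCase User-Agent "AA-Site-Audit-Crawler" !deny_bot
<RequireAll>
    Require all granted
    Require not env deny_bot
</RequireAll>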
OK, that's what I thought, because that's how it was originally set. Still, their crawler wasn't getting through, so I figured I'd add it as an explicitly allowed agent. Is that redundant, then? And how exactly do I white-list a user agent by name?
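For reference, the by-name robots.txt group I was looking at is below. As far as I can tell it is indeed redundant when * already allows everything, since an empty Disallow blocks nothing either way:

# Explicit group for the audit crawler (redundant here, since * already allows all)
User-agent: AA-Site-Audit-Crawler
Disallow:

# Everyone else is also allowed
User-agent: *
Disallow: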
Update: I ended up removing the rules completely, which I believe is effectively the same as the empty Disallow (nothing blocked either way). Then I realized the crawler was pointed at a URL that redirects, smh, so that was the real problem and it's all working now!
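In case it helps anyone else: a quick way to check what a crawler sees is to request the page with its user-agent string and look at the status line. Something like this (example.com is a placeholder for your URL) would have shown me the redirect, via a 3xx status, right away:

# Send a HEAD request pretending to be the audit crawler
curl -I -A "AA-Site-Audit-Crawler" https://www.example.com/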