I wanted to begin using agencyanalytics.com to run some SEO checks on my client’s sites.
As it turns out, their crawler isn’t getting through to the pages.
My robots.txt includes:
Their tech support told me to add “AA-Site-Audit-Crawler” as an exception. Their exact words:
“You have to white-list our crawler user agent on the server where that site resides. We use rotating IPs, so you’ll need to white-list by name. The name to use is AA-Site-Audit-Crawler. In this case you have to enable it since it’s being blocked by your server.”
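If the block is happening in robots.txt (rather than at the server itself), the usual fix is to add a group for that specific user agent: well-behaved crawlers use the most specific matching group, so a group naming AA-Site-Audit-Crawler overrides the wildcard rules for that bot. A hedged sketch of what that might look like — the `Disallow` line under the wildcard group is a placeholder for whatever your file already contains:

```
# Hypothetical robots.txt excerpt. The AA-Site-Audit-Crawler group takes
# precedence over the wildcard group for that crawler, letting it in while
# existing rules keep applying to everyone else.
User-agent: AA-Site-Audit-Crawler
Allow: /

# Placeholder for your existing wildcard rules
User-agent: *
Disallow: /private/
```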
The site is www.fantasyfootballmetrics.com
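That said, their support’s wording (“blocked by your server”) suggests the block may be at the web-server or firewall level rather than in robots.txt, since robots.txt can only ask a crawler not to visit — it can’t actively block one. If the site runs on Apache 2.4 and a user-agent block list is the culprit, an exception might be sketched like this. This is an assumption about the setup — the `SomeBlockedBot` pattern stands in for whatever your existing blocking rule matches:

```apache
# Hypothetical .htaccess sketch, assuming Apache 2.4 with a user-agent
# block list. Flag the crawler we want to allow, and any bots we block.
SetEnvIfNoCase User-Agent "AA-Site-Audit-Crawler" allowed_bot
SetEnvIfNoCase User-Agent "SomeBlockedBot" bad_bot

# Let the named crawler through unconditionally; everyone else is allowed
# unless they matched a blocking pattern.
<RequireAny>
    Require env allowed_bot
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</RequireAny>
```

On nginx or behind a WAF/CDN (Cloudflare, etc.) the equivalent exception would live in that layer’s rules instead, so it may be worth checking with your host about where user agents are being filtered.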
Is there a way to resolve this on my end?
Thanks in advance for the help!