Hello fellas, I've got an issue… well, actually I've got several issues that I can't fix on my own.
To give a bit of context: I work for a company that already had a website running (https://www.kimmy.pt), but they wanted to give it a new look, so I chose Webflow to build a new one (hosted with Webflow, of course). Now that I'm trying to speed up Google's indexing so it picks up the new links etc., I've come across a few issues. I'll describe them below.
Problem numero uno: In Google Search Console, when I try to submit the new sitemap.xml, it gives me a "General HTTP error" that I can't fix. In my Webflow project settings I have the "Auto-generate sitemap" option on, so Search Console should accept the new sitemap.xml, right?
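For reference, this is roughly what a Webflow auto-generated sitemap should look like (the page URLs below are just placeholders for illustration, not my actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.kimmy.pt/</loc>
  </url>
  <url>
    <loc>https://www.kimmy.pt/about</loc>
  </url>
</urlset>
```

As far as I understand, the file has to be reachable at https://www.kimmy.pt/sitemap.xml and return a 200 status; if Search Console hits a redirect or an error page when fetching it, that could be the cause of the "General HTTP error".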
Second problem: In the URL Inspection tab, when I inspect "https://www.kimmy.pt", Google detects it and tells me it's on Google, but the content shown refers to the old website (old links etc.). When I request re-indexing, it literally rejects the request and tells me problems were detected with the URL indexing. When I click to see the test results, it tells me the URL is not on Google… So I don't get it: the previous version of the URL was on Google, but the new one isn't? What gives here? Or maybe I'm missing something?
Problemo number three: When I open up Google's robots.txt tester, it tells me robots.txt doesn't even exist, and when I try to submit one it tells me to refresh in a minute, which I do, but then the same error happens. However, when I click the "See live robots.txt" link, it shows me the code I added in the Webflow project settings.
So it detects the code, but it doesn't detect the actual robots.txt file, which should be created automatically by Webflow?
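For what it's worth, the robots.txt content I'd expect Webflow to serve looks something like this (a minimal sketch; the Sitemap line is my assumption based on the auto-generated sitemap URL, not something I've confirmed):

```
User-agent: *
Allow: /

Sitemap: https://www.kimmy.pt/sitemap.xml
```

As far as I know, the file has to be served from the root of the domain (https://www.kimmy.pt/robots.txt) for the tester to pick it up, and changes made in the Webflow project settings only go live after republishing the site.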
I've googled these issues a lot, but I haven't been able to fix any of them. Any help would be highly appreciated.
Cheers, Cristiano M.