I am working on a project that just rebranded.
In Google Search Console I am getting some odd errors: even though my robot.txt is served correctly, Google cannot fetch it for some reason.
My sitemap is here: https://example.com/sitemap_index.xml
and my robot.txt is here: https://example.com/robot.txt
When I try to get my site indexed on Google, this is what I get:
If I click on "Open sitemaps", the sitemap opens just fine.
This is what Google says in the URL inspection:
I have requested indexing multiple times, but nothing changed.
The site has been live for over a month now and is still not indexed, despite backlinks pointing to it from LinkedIn and elsewhere.
Where could this be coming from? I asked Google support with no luck, asked my DNS provider to double-check everything (it all seems fine), and even hired a DevOps engineer to review my server configuration, but apparently everything is fine there too.
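In case it helps, here is the local sanity check I ran with Python's standard `urllib.robotparser` to confirm the rules themselves are parseable and don't block Googlebot. The rules below are a simplified stand-in for my actual file, and example.com is just a placeholder domain:

```python
from urllib import robotparser

# Simplified stand-in for the live file's contents (placeholder rules,
# not the exact production file).
lines = [
    "User-agent: *",
    "Allow: /",
    "Sitemap: https://example.com/sitemap_index.xml",
]

rp = robotparser.RobotFileParser()
rp.parse(lines)

# With "Allow: /" for all user agents, Googlebot should be allowed
# to crawl the homepage.
print(rp.can_fetch("Googlebot", "https://example.com/"))  # True
```

This prints `True` for me, so the rules themselves seem fine; the problem appears to be Google fetching the file, not its contents.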