Today I got a notice from Google that one file was being blocked by robots.txt, and had been for several days. I checked in Search Console, and when I run the check, it says, “There is no robots.txt file.”
If I run ‘Fetch as Google’, I get what I expect.
I’ve submitted feedback to Google.
I’ve checked other answers here, and so far all of them deal with formatting/syntax inside a robots.txt file. I haven’t found anything covering this combination: a URL reported as blocked when there is no robots.txt file at all.
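For reference, I understand that a rule which would actually block a page looks something like this (the path here is just a made-up example, not one from my site):

```
User-agent: *
Disallow: /example-blocked-page/
```

But since no robots.txt file exists on my server at all, nothing like this should be in play.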
Is this one of those things that just happens, and goes away?