Robots.txt not blocking, yet GSC URL Inspection Tool says it is

Recently I realized Disallow: / was set on my staging sites.
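For anyone unfamiliar, that's a blanket block, i.e. a file along these lines (the exact user-agent line is an assumption on my part, but the Disallow: / is what matters):

```txt
User-agent: *
Disallow: /
```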

I didn’t realize until I got a warning in Search Console…

Mistakes happen

FYI for those who may read this:

You never want to disallow a staging/development site from Google.

The best policy is to set meta robots to noindex, follow and list the corresponding page on your main site as the canonical URL for every page.
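In practice that means something like this in the head of each staging page (example.com and the path are placeholders for your production URL):

```html
<!-- Keep staging pages out of the index but let crawlers follow links -->
<meta name="robots" content="noindex, follow">
<!-- Point consolidation signals at the production version of this page -->
<link rel="canonical" href="https://example.com/some-page/">
```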

We want Google to be able to crawl the staging site and understand what we're doing.

I’ve removed the directive and cleared all caches at least 10 times.
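To sanity-check the file itself, independent of any cache on Google's end, you can parse it locally, e.g. with Python's urllib.robotparser (the hostname is a placeholder, and the file body below stands in for my fixed robots.txt):

```python
# Sketch: check whether a given robots.txt body blocks a URL for Googlebot.
# This tests the file's rules directly, bypassing any cached copy.
from urllib.robotparser import RobotFileParser

robots_body = """\
User-agent: *
Allow: /
"""  # the fixed file, with the Disallow: / line removed

parser = RobotFileParser()
parser.parse(robots_body.splitlines())

# Prints True once the blanket disallow is gone
print(parser.can_fetch("Googlebot", "https://staging.example.com/"))
```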

However, when I use the URL inspection tool I’m still getting the following:

[Screenshot: Google Search Console URL Inspection Tool showing that the page is blocked by robots.txt]

Never experienced this before – and I’ve been doing SEO for a long time. Usually I’d see this reflected in a live test right away.

Anyone experienced this before? Is there some sort of processing time or cache on Google’s end? Shouldn’t be if the "live test" is actually live.