How to fix Core Web Vitals 0 good URLs
I know that Google's (publicly announced) algorithm updates are just recommendations, but how "seriously" are you taking the Core Web Vitals update on May 1st?
This is a rolling update spread over a month, but my question is this: all our competitors (we are in mass/large-scale eCommerce) hover around the same bad scores, basically all performing "badly" in GTmetrix and Google PageSpeed…
Now, I know that there is a difference between "lab" and "field" data, but still: how seriously should we be taking this?
We rank for half a million keywords, and I can't imagine us being knocked down the rankings because we score low on a lab-testing machine?
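One thing worth keeping in mind about the lab/field split: Search Console's "good"/"poor" labels come from field data (CrUX), judged at the 75th percentile of real visits, not from a single lab run. A minimal sketch of the published thresholds (the metric names and cutoffs below are the ones Google documented for this update):

```typescript
// Sketch of the published Core Web Vitals thresholds.
// [goodMax, poorMin] per metric; LCP and FID are in ms, CLS is unitless.
type Rating = "good" | "needs improvement" | "poor";

const THRESHOLDS = {
  LCP: [2500, 4000],
  FID: [100, 300],
  CLS: [0.1, 0.25],
} as const;

function rate(metric: keyof typeof THRESHOLDS, value: number): Rating {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs improvement";
  return "poor";
}

console.log(rate("LCP", 2300)); // "good" — a fast lab run
console.log(rate("CLS", 0.3));  // "poor" — what the field report may show
```

So a lab tool and the field report can legitimately disagree: the lab machine sees one synthetic load, while the field rating reflects the slowest quarter of real visitors.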
What are your thoughts?
I have a SPA (single-page application) built with React + Firebase, and I've been getting these Core Web Vitals errors (see images below).
My website loads normally on both desktop and mobile, and I think it renders in a very reasonable time; at least it feels way faster than most websites I visit, even though it's client-side rendered.
I'm guessing these Core Web Vitals errors are being triggered by the spinner that runs while the app is loading its data. For example, the report is probably measuring the loaded content replacing the spinner as a layout shift, because I can guarantee that my app otherwise has ZERO layout shift: once the spinner is gone and you see content on your screen, the app is 100% ready for you to browse and interact with.
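For context on why a spinner swap still counts: CLS is accumulated from the browser's "layout-shift" performance entries over the whole page lifetime, so one big shift at load time dominates the score even if the page is rock-solid afterwards. A minimal sketch (the entry shape mirrors the Layout Instability API's `value` and `hadRecentInput` fields; the session-window grouping Google added in mid-2021 is omitted for brevity):

```typescript
// Sketch of CLS accumulation from layout-shift entries.
interface LayoutShiftEntry {
  value: number;           // shift score for this single shift
  hadRecentInput: boolean; // shifts within 500 ms of user input are excluded
}

// Sum every shift that was not triggered by recent user input.
function cumulativeLayoutShift(entries: LayoutShiftEntry[]): number {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((sum, e) => sum + e.value, 0);
}

// A small spinner replaced by taller content produces one big shift at
// load, even if nothing ever moves again afterwards:
const spinnerSwap: LayoutShiftEntry[] = [
  { value: 0.32, hadRecentInput: false }, // spinner -> content swap
];
console.log(cumulativeLayoutShift(spinnerSwap)); // 0.32 — above the 0.25 "poor" cutoff
```

The usual fix without SSR is to reserve the content's final dimensions up front (e.g. a fixed-height skeleton container that the spinner and the loaded content both render into), so the swap moves nothing.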
Maybe to get rid of those errors I would have to do SSR + hydration, which I really don't want to do: it's a dynamic website, and I would have to either remove caching completely or risk content (stale vs. fresh) flickering on the screen once it's fully hydrated.
Should I care about these results? Is anybody else who manages a SPA getting these kinds of errors? Is there a way to fix this?
Google Search Console’s “Core Web Vitals” is showing these two graphs.
Notice that the number of "good" URLs in one graph exactly matches the number of "bad" URLs in the other. Each day has the same count on both graphs, so it's unlikely to be a random coincidence.
The reports provide only one example, and it is the same URL in both cases (https://rbutterworth.nfshost.com/Tables/compose/). The page is static, with no scripts or forms.
The site has hundreds of other pages (all also static, without forms), so what is so special about these reported pages that every one of them would be "good" in one context and "bad" in the other?