Imagine I have a blogging / ecommerce website with 1,000 posts / products, and I've built a dynamically generated sitemap for it. Basically it's a list with a bunch of <url> entries, each containing a <loc> and a <lastmod> date.
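For reference, a typical entry in my generated sitemap looks roughly like this (the URL and date are placeholders, following the sitemaps.org protocol format):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/posts/my-blog-post</loc>
    <lastmod>2023-05-01</lastmod>
  </url>
  <!-- ...roughly 1,000 more <url> entries like this one... -->
</urlset>
```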
I’m pretty sure that crawlers expect me to update the <lastmod> date for any product or blog post whose text content (or images) I edit: adding something new, updating information, etc. Basically, anything that users will SEE differently when they visit my page. This makes sense.
But my question is:
I have a dynamic single-page website, so I don’t keep static pages; I generate and render them server-side at run time. So what if I decide that all of my blog posts should now render inside a <main> or <article> tag instead of a div? Or what if I add structured metadata with price and review properties for my products, or structured data for breadcrumbs?
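To illustrate the kind of structured data I mean, here is a JSON-LD block with price and review properties (the property names follow schema.org’s Product type; the values are made up):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "12"
  }
}
```

Dropping a block like this into the page changes nothing a user sees, but it gives the crawler a lot of new information to interpret.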
You see what I mean? The content the user sees hasn’t changed, but I’ve updated some tags that the CRAWLER will interpret differently. The text/image content is the same, yet the HTML has changed. And this could even have an impact on my ranking, since the new tags might get me better SEO.
But now what should I do? From the crawler’s perspective, these changes render all 1,000 posts / products differently, with the new tags. Should I update the <lastmod> date for ALL 1,000 URLs in my sitemap, even though users will still see the same text/image content and won’t notice any difference?
And if I do update all 1,000 <lastmod> dates, won’t the crawler think it’s “weird” that all of my URLs were suddenly updated on the same day, since they’ll all carry the same <lastmod> value? Does it make sense?
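To make the setup concrete, here is a minimal sketch of how my sitemap generation works. It assumes a hypothetical posts list where each record stores a content_updated_at date; note there is no field tracking when the template/markup changed, which is exactly the ambiguity I'm asking about:

```python
from datetime import date

# Hypothetical post records; in reality these would come from the database.
# content_updated_at only changes when the visible text/images are edited.
posts = [
    {"slug": "first-post", "content_updated_at": date(2023, 5, 1)},
    {"slug": "new-product", "content_updated_at": date(2023, 9, 17)},
]

def build_sitemap(posts, base_url="https://example.com"):
    """Render the sitemap XML; <lastmod> reflects content edits only."""
    entries = []
    for post in posts:
        entries.append(
            "  <url>\n"
            f"    <loc>{base_url}/posts/{post['slug']}</loc>\n"
            f"    <lastmod>{post['content_updated_at'].isoformat()}</lastmod>\n"
            "  </url>"
        )
    body = "\n".join(entries)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}\n"
        "</urlset>"
    )

print(build_sitemap(posts))
```

If I bump content_updated_at for every record after a template change, all 1,000 entries end up with today's date, which is the scenario I'm worried about.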
Please, any help is appreciated. Thanks