Changed URL for a page that was indexed by Googlebot. Will redirect 301 from the old URL to the new one. But what to do with my Sitemap?

I’m planning to change the URL of one of my site’s pages.

Google has already indexed the old URL.

Google’s documentation says that to avoid losing page ranking, the old URL should respond with a 301 (Moved Permanently) redirect pointing to the new URL.



I get that I should redirect (301) from the old URL to the new one, so that when Google re-crawls, it will see the change. But what should be in my sitemap: the old URL, the new one, or both?

I tend to think it would be best to keep only the new URL in my sitemap. But what if Google crawls the new URL before it sees the redirect from the old one? Wouldn’t the new URL start off as a brand-new page (from Google’s index perspective) with zero ranking? How does Googlebot handle that? What is the recommended practice?
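For reference, the sitemap protocol expects the sitemap to list the URLs you want crawled and indexed, i.e. the canonical ones; after a permanent move that would be the new URL only, with the old URL handled by the 301 itself rather than by a sitemap entry. A minimal entry for the new location might look like this (the domain, path, and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- New location only; the old URL is covered by the 301 redirect -->
  <url>
    <loc>https://example.com/new-url</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```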

Google Search Console cannot fetch my sitemap. How to force Google to index a site?

I am working on a project that just rebranded.

Using Google Search Console, I am getting some weird errors. Despite my sitemap_index.xml and robots.txt working properly, Google cannot seem to fetch the sitemap for some reason.

My sitemap is here:

and my robots.txt:

When I try to get my site indexed on Google, this is what I get:

If I click “Open sitemaps”, the file opens just fine.

This is what Google is saying in the URL inspection:

I requested reindexing multiple times, but nothing changed.

The site has been live for over a month now and is still not indexed, despite having backlinks pointing to it from LinkedIn and elsewhere.

Where can this be coming from? I asked Google support with no luck, and I asked my DNS provider to double-check everything, but it seems fine. I’ve also hired a DevOps engineer to check my server configuration, and apparently everything is fine there too.
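Since the sitemap opens fine in a browser, one thing worth ruling out (a generic diagnostic sketch, not a diagnosis — the URL and user-agent strings below are placeholders) is that a firewall or CDN is serving a different response to bot user agents than to browsers:

```python
# Sketch: fetch a URL with different User-Agent headers and compare the
# HTTP status codes. A 200 for a browser UA but 403/5xx for a bot UA
# would point at user-agent based blocking in front of the site.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code for `url` requested with `user_agent`."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as e:
        return e.code

# Example usage (placeholder URL):
# browser_ua = "Mozilla/5.0"
# bot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
# print(fetch_status("https://example.com/sitemap_index.xml", browser_ua))
# print(fetch_status("https://example.com/sitemap_index.xml", bot_ua))
```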

Add URL to sitemap to be available in the future

I generate the sitemap.xml automatically when content is published on my website; however, some of the content will only be published starting from a specific date and time.

Is there any tag I can add to the sitemap, or any other way to handle this, so that the Google/Bing… bots would know to index the content only once its publish date is in the past?

I know I could use a task scheduler to update the sitemap file when the content’s publish date is reached, but I was trying to avoid that solution.
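As far as the sitemap protocol goes, there is no tag that tells crawlers to wait until a future date; a sitemap is expected to list only live URLs. The usual alternative to a scheduler is to filter at generation time, so future-dated content simply never appears until the sitemap is rebuilt after it goes live. A minimal sketch, assuming a hypothetical list of (url, publish_at) pairs:

```python
# Sketch: exclude not-yet-published content from the sitemap URL list.
# The data model (url, publish_at) is an assumption for illustration.
from datetime import datetime, timezone

def sitemap_urls(posts, now=None):
    """posts: iterable of (url, publish_at) pairs with timezone-aware datetimes.
    Returns only the URLs whose publish date has already passed."""
    now = now or datetime.now(timezone.utc)
    return [url for url, publish_at in posts if publish_at <= now]

posts = [
    ("https://example.com/live-post", datetime(2020, 1, 1, tzinfo=timezone.utc)),
    ("https://example.com/future-post", datetime(2999, 1, 1, tzinfo=timezone.utc)),
]
print(sitemap_urls(posts))  # only the live post is listed
```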

Custom Post Type – Category Rewrite – Remove Rewrite from Sitemap

I’ve got a “case_studies” post type with a “case_studies_categories” taxonomy for it, and a rewrite to include the category in the URL.

Everything works, but for some reason the rewrite URL is in the sitemap (as the first URL):

    /case-studies/%case_studies_categories%/

and the rest is fine:

    /case-studies/category-name/post-name/
    /case-studies/category-name/post-name/
    /case-studies/category-name/post-name/

How do I remove /case-studies/%case_studies_categories%/ from the sitemap?

    add_action('init', 'case_studies_init');

    function case_studies_init() {
        $labels = array(
            'name'               => _x( 'Case Studies', 'Case Studies' ),
            'singular_name'      => _x( 'Case Study', 'Case Study' ),
            'add_new'            => _x( 'Add Case Study', 'Case Study' ),
            'add_new_item'       => __( 'Add Case Study' ),
            'edit_item'          => __( 'Edit Case Study' ),
            'new_item'           => __( 'New Case Study' ),
            'all_items'          => __( 'All Case Study' ),
            'view_item'          => __( 'View Case Study' ),
            'search_items'       => __( 'Search Case Study' ),
            'not_found'          => __( 'No Case Studies Found' ),
            'not_found_in_trash' => __( 'No Case Studies in Trash' ),
            'parent_item_colon'  => '',
            'menu_name'          => 'Case Studies'
        );
        $args = array(
            'labels'            => $labels,
            'description'       => 'Holds case studies post data',
            'public'            => true,
            'menu_position'     => 7,
            'hierarchical'      => true,
            'menu_icon'         => 'dashicons-admin-comments',
            'rewrite'           => array('slug' => 'case-studies/%case_studies_categories%', 'with_front' => false),
            'supports'          => array( 'title', 'revisions', 'thumbnail' ),
            'has_archive'       => true,
            'show_ui'           => true,
            'show_in_nav_menus' => true,
            'show_in_menu'      => true,
            'show_in_admin_bar' => true,
            'taxonomies'        => array( 'case_study_categories' ),
        );
        register_post_type('case_studies', $args);
        // flush_rewrite_rules( false );
    }

    // register a custom category taxonomy
    // so that the categories are not connected to the 'post' type taxonomies
    add_action( 'init', 'register_case_study_tax' );

    function register_case_study_tax() {
        $labels = array(
            'name'              => _x( 'Case Study Categories', 'case-studies' ),
            'singular_name'     => _x( 'Case Study Category', 'testimonials' ),
            'search_items'      => __( 'Search Case Study Categories' ),
            'all_items'         => __( 'All Case Study Categories' ),
            'parent_item'       => __( 'Parent Case Study Category' ),
            'parent_item_colon' => __( 'Parent Case Study Category:' ),
            'edit_item'         => __( 'Edit Case Study Category' ),
            'update_item'       => __( 'Update Case Study Category' ),
            'add_new_item'      => __( 'Add Case Study Category' ),
            'new_item_name'     => __( 'New Case Study Category' ),
            'menu_name'         => __( 'Case Study Categories' ),
        );
        $args = array(
            'labels'            => $labels,
            'taxonomy'          => 'case_study_categories',
            'object_type'       => 'case_studies',
            'hierarchical'      => true,
            'show_ui'           => true,
            'show_admin_column' => true,
            'query_var'         => false,
        );
        register_taxonomy('case_studies_categories', 'case_studies', $args);
    }

    /** filter URL link for post type url **/
    add_filter('post_type_link', 'case_studies_permalink_structure', 10, 4);

    function case_studies_permalink_structure($post_link, $post, $leavename, $sample) {
        if ( false !== strpos( $post_link, '%case_studies_categories%' ) ) {
            $event_type_term = get_the_terms( $post->ID, 'case_studies_categories' );
            if ( $event_type_term ) {
                $post_link = str_replace( '%case_studies_categories%', array_pop( $event_type_term )->slug, $post_link );
            }
        }
        return $post_link;
    }

I’m sure I’ve done something stupid, please assist if possible.
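One thing that may be worth checking (an assumption based on how WordPress builds archive links, not something confirmed from this code alone): with `'has_archive' => true`, the archive URL is derived from the rewrite slug, placeholder included, and sitemap generators often list the archive URL first. Giving the archive an explicit literal slug keeps the placeholder out of it:

```php
// Hypothetical tweak: use a literal archive slug instead of inheriting
// the rewrite slug that contains %case_studies_categories%.
'has_archive' => 'case-studies',
```

Separately, the post type registers `'taxonomies' => array('case_study_categories')` while the taxonomy is actually registered as `case_studies_categories`; that naming mismatch may be worth fixing as well.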


In a sitemap, should I update the lastmod tag of a URL based on the text content or the HTML content?

Imagine I have a blogging/ecommerce website with 1000 posts/products, and I’ve built a dynamically generated sitemap for it: basically a list with a bunch of <url> and <lastmod> tags.

I’m pretty sure that crawlers expect me to update the <lastmod> date for whatever product or blog post I edit: changing the text content or the images, adding something new, updating information, etc. Basically, anything that users will SEE differently when they enter my page. This makes sense.

But my question is:

I have a dynamic single-page website, so I don’t keep static pages; I generate and render them server-side at run time. So what if I decide that all of my blog posts should now render inside a <main> or <article> tag instead of a div? Or what if I add some structured metadata, to add price and review properties for my products, or structured data for breadcrumbs?

You see what I mean? The content that the user sees hasn’t changed, but I’ve updated some tags that the CRAWLER will interpret differently. The text/image content is the same, but the HTML has changed. This could even affect my ranking, since the new tags might get me better SEO.

But now what should I do? The changes will render all 1000 posts/products differently, from the crawler’s perspective, with the new tags. Should I update the <lastmod> tag for ALL 1000 URLs in my sitemap, even though users will see the same text/image content and won’t notice any difference?

If I do update all 1000 <lastmod> tags, won’t the crawler find it “weird” that all of my URLs were updated on the same day, since they’ll all carry the same <lastmod> value? Does that make sense?
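One pragmatic approach (an assumption, not official crawler guidance) is to tie <lastmod> to a fingerprint of the rendered HTML rather than to the source text, so that both copy edits and markup changes bump the date, and nothing else does:

```python
# Sketch: fingerprint the rendered HTML so <lastmod> is bumped whenever
# the crawler-visible output changes (text OR markup), and left alone
# otherwise. Store the fingerprint alongside each URL's lastmod date.
import hashlib

def content_fingerprint(rendered_html: str) -> str:
    """Stable hash of the exact bytes a crawler would receive."""
    return hashlib.sha256(rendered_html.encode("utf-8")).hexdigest()

old = content_fingerprint("<div><p>Hello</p></div>")
new = content_fingerprint("<article><p>Hello</p></article>")
# Same visible text, different markup -> different fingerprint -> bump <lastmod>
print(old != new)  # True
```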

Please, any help is appreciated. Thanks

do onpage SEO optimization, meta description, speed optimization and xml sitemap for $40

On-page SEO optimization is the process of optimizing each and every web page of your site in order to rank higher in the Search Engine Results Pages (SERPs). This white-hat SEO optimization service can boost the Google ranking of your website or blog traffic by 90% or more. Our onsite optimization services include:

- Free: website SEO audit report
- Long-tail, LSI & focused keyword optimization
- Competitor backlink analysis
- Install & configure the WordPress Yoast SEO plugin
- Compelling meta descriptions, titles and effective tags
- Heading tag (H1, H2, H3) optimization
- Image alt tags and speed optimization
- XML sitemap / robots.txt file creation & submission
- Google webmaster tools verification
- Check broken links & redirect broken links to the homepage or parent page
- Social media meta tags, hyperlink and anchor text optimization
- Index all pages on Google, Bing and Yahoo
- Search-engine-friendly titles, URLs and website structure

Why us? 100% client satisfaction and money-back guarantee. WordPress, Wix, Shopify and Amazon specialists. If you have any questions, please don’t hesitate to get in touch.

by: Emiliadavid41
Created: —
Category: Onsite SEO & Research
Viewed: 228

Dynamic Pages & SEO Friendly URL in sitemap

I have a website built with C# that has some dynamic pages, and I’ve recently added some code to allow SEO-friendly URLs.

For example, I have a page “/products.aspx?itID=1”, and you can access it as “/Product-Name” as well.

Should I put the SEO-friendly URL in my sitemap instead of the real URL? I think the answer is yes, but I’m very new to this and would really appreciate confirmation from someone with more experience.
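Either way, the two URLs should not compete as duplicates in the index. A common pattern (a sketch; the domain and path are placeholders) is to list only the friendly URL in the sitemap and declare it as canonical on the page itself, so both URL forms resolve to one indexed address:

```html
<!-- In the <head> of the page, served for both /products.aspx?itID=1
     and /Product-Name -->
<link rel="canonical" href="https://example.com/Product-Name" />
```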

Does the order of “Disallow” and “Sitemap” lines in robots.txt matter?

One can order robots.txt this way:

    User-agent: DESIRED_INPUT
    Sitemap:
    Disallow: /

or this way:

    User-agent: DESIRED_INPUT
    Disallow: /
    Sitemap:

I assume both are okay, because generally all crawlers will likely parse the file in the correct order.
Is it best practice to put Disallow: before Sitemap:, to guard against the extremely unlikely bug of a crawler starting to crawl before it has processed the Disallow:?