Custom post type not showing in XML sitemap

I created a custom post type in WordPress and it's working. My issue is that when I generate my XML sitemap, I get all the pages and posts, but not the custom post type.

Would you help me out with this?

// Register Custom Post Type blog
function create_blog_cpt() {
    $labels = array(
        'name' => _x( 'blogs', 'Post Type General Name', 'bloglist' ),
        'singular_name' => _x( 'blog', 'Post Type Singular Name', 'bloglist' ),
        'menu_name' => _x( 'blogs', 'Admin Menu text', 'bloglist' ),
        'name_admin_bar' => _x( 'blog', 'Add New on Toolbar', 'bloglist' ),
        'archives' => __( 'blog Archives', 'bloglist' ),
        'attributes' => __( 'blog Attributes', 'bloglist' ),
        'parent_item_colon' => __( 'Parent blog:', 'bloglist' ),
        'all_items' => __( 'All blogs', 'bloglist' ),
        'add_new_item' => __( 'Add New blog', 'bloglist' ),
        'add_new' => __( 'Add New', 'bloglist' ),
        'new_item' => __( 'New blog', 'bloglist' ),
        'edit_item' => __( 'Edit blog', 'bloglist' ),
        'update_item' => __( 'Update blog', 'bloglist' ),
        'view_item' => __( 'View blog', 'bloglist' ),
        'view_items' => __( 'View blogs', 'bloglist' ),
        'search_items' => __( 'Search blog', 'bloglist' ),
        'not_found' => __( 'Not found', 'bloglist' ),
        'not_found_in_trash' => __( 'Not found in Trash', 'bloglist' ),
        'featured_image' => __( 'Featured Image', 'bloglist' ),
        'set_featured_image' => __( 'Set featured image', 'bloglist' ),
        'remove_featured_image' => __( 'Remove featured image', 'bloglist' ),
        'use_featured_image' => __( 'Use as featured image', 'bloglist' ),
        'insert_into_item' => __( 'Insert into blog', 'bloglist' ),
        'uploaded_to_this_item' => __( 'Uploaded to this blog', 'bloglist' ),
        'items_list' => __( 'blogs list', 'bloglist' ),
        'items_list_navigation' => __( 'blogs list navigation', 'bloglist' ),
        'filter_items_list' => __( 'Filter blogs list', 'bloglist' ),
    );
    $args = array(
        'label' => __( 'blog', 'bloglist' ),
        'description' => __( '', 'bloglist' ),
        'labels' => $labels,
        'menu_icon' => 'dashicons-admin-comments',
        'supports' => array( 'title', 'editor', 'thumbnail', 'custom-fields' ),
        'taxonomies' => array(),
        'public' => true,
        'show_ui' => true,
        'show_in_menu' => true,
        'menu_position' => 5,
        'show_in_admin_bar' => true,
        'show_in_nav_menus' => true,
        'can_export' => true,
        'has_archive' => true,
        'hierarchical' => false,
        'exclude_from_search' => true,
        'show_in_rest' => true,
        'publicly_queryable' => true,
        'capability_type' => 'post',
    );
    register_post_type( 'blog', $args );
}
add_action( 'init', 'create_blog_cpt', 0 );
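For what it's worth, if the sitemap in question is the one WordPress core generates (5.5 and later), whether a post type appears in it is controlled by the wp_sitemaps_post_types filter. A minimal sketch of explicitly adding the blog post type that way (assuming core sitemaps rather than an SEO plugin such as Yoast) would look like this:

// Sketch only: force the 'blog' post type into WordPress core sitemaps (WP 5.5+).
// Only relevant if no SEO plugin is generating the sitemap instead.
add_filter( 'wp_sitemaps_post_types', function ( $post_types ) {
    $post_types['blog'] = get_post_type_object( 'blog' );
    return $post_types;
} );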

Use XML Based Sitemap and/or Static Page Sitemap?

I have a question about which sitemap to use: an XML-based sitemap that is submitted to Google Search Console, or a static HTML-based sitemap that is linked in the website footer.
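For context, the two are different artifacts: the HTML version is just an ordinary page of links that visitors can click, while the XML version is a machine-readable file roughly like the sketch below (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder entry; a real file lists every canonical URL on the site -->
  <url>
    <loc>https://www.example.com/dump-trucks-for-sale</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>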

If you search "dump trucks for sale", you will find this result in the third position: https://www.commercialtrucktrader.com/Dump/trucks-for-sale?category=Dump%20Truck%7C2000609

Our website uses faceted navigation to filter inventory results, like the Commercial Truck Trader website example I posted regarding a search query.

I see the Commercial Truck Trader website has a sitemap link added to their footer. This is a static HTML based sitemap that can help the user navigate parts of the website.

Do you think the Commercial Truck Trader website ranks in the third position on the SERP for "dump trucks for sale" because of the static HTML-based sitemap linked in its footer, or because of an XML-based sitemap submitted to Google?

Yandex not crawling compressed sitemap index

I have submitted a sitemap index file (one that links to other sitemaps that contain the actual URLs search engines are instructed to crawl). It is GZip compressed.
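For reference, a sitemap index of this kind looks roughly like the sketch below (file names and dates are placeholders); the compressed version is simply this file served gzipped.

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points at a child sitemap that contains the actual URLs -->
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml.gz</loc>
    <lastmod>2023-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml.gz</loc>
  </sitemap>
</sitemapindex>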

The Yandex sitemap validation tool tells me it is valid, with 202 links and no errors.

However, in Yandex Webmaster it shows up with a small, grey sign in the status column. When clicked it says ‘Not indexed’.

Yandex is not indexing the URLs provided in the file, which are all new, even though it states it has consulted the sitemap.

Any ideas what may be wrong?

Google Search Console cannot read my XML: Sitemap appears to be an HTML page

I'm working on a web application written with Angular (v8) and deployed behind apache2, which uses a proxy to forward requests (frontend, API, backoffice).

My problem is that I'm trying to submit the sitemap ({website}/sitemap.xml) to Google, but Google Search Console keeps saying that it's not valid: Google can read the link, but it seems to be an HTML page.

[Screenshot: Google Search Console error]

My sitemap: [screenshot]

I tried to validate that XML on many websites and didn't find any errors.

I mention apache2 because maybe, when Google tries to fetch the URL, Apache serves another page before the XML, but I cannot prove that. I have tried in many ways, and the first page I see when opening the URL is the sitemap and nothing else.

In my angular.json I added the file to the assets as follows:

"assets": ["src/favicon.ico", "src/assets", "src/sitemap.xml"],

What could it be?

Thank you

Sitemap: Should I dynamically update the sitemap for dynamic content or create a page containing all the dynamic links?

Say I have the following route: http://<my-domain>/{category}/subjects/{id}/

The parts in brackets are dynamic. I'm struggling with which approach is better, or whether there is a better way to let Google crawl all these dynamic links.

Approach 1: do the job manually by adding or removing the record in the sitemap and updating <lastmod> (see the sketch below).

Approach 2: create a page that includes all those links and reference that page in sitemap.xml

The page in the second approach could be a plain HTML file generated by the server app, or a simple WebForms .aspx page that generates those links dynamically without having to create an HTML file.
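For Approach 1, the per-route entry being added or removed would look something like this sketch (the domain, category and id are placeholders):

<url>
  <!-- One dynamic route; one such entry per {category}/{id} combination -->
  <loc>https://my-domain.example/cars/subjects/42/</loc>
  <lastmod>2023-01-01</lastmod>
</url>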

Sitemap for one website in multiple domains

I have 2 domains (A and B). The first one (A) is the website.

In the sitemap.xml of domain "A", I reference a sub-sitemap hosted on domain "B" that contains "A" URLs.
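Concretely, the setup looks something like this sketch (domains and file names are placeholders):

<!-- https://a.example/sitemap.xml : sitemap index on domain A -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://a.example/sitemap-pages.xml</loc>
  </sitemap>
  <!-- Sub-sitemap hosted on domain B, but listing domain A URLs -->
  <sitemap>
    <loc>https://b.example/sitemap-a-urls.xml</loc>
  </sitemap>
</sitemapindex>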

I have followed the documentation (https://support.google.com/webmasters/answer/75712?hl=en), so both domains are verified in Google Search Console, but Google does not index the sub-sitemap (the one on domain "B" with "A" URLs). The other sub-sitemaps on domain "A" are OK.

FYI: the sitemap is valid, because if I submit it manually in Search Console (on the "B" domain property), the URLs are displayed in Google search results. And for multiple reasons, I can't submit it manually every time.

Do you have any idea? Thanks.

Changed URL for a page that was indexed by Googlebot. Will redirect 301 from the old URL to the new one. But what to do with my Sitemap?

I'm planning to change the URL of one of my site's pages.

Example:

From: https://www.example.com/old-post-slug

To: https://www.example.com/new-post-slug

The fact is that Google has already indexed the old URL: https://www.example.com/old-post-slug

And from these docs, we see that to avoid losing page ranking we should respond with a 301 (Moved Permanently) from the old URL pointing to the new URL.

https://support.google.com/webmasters/answer/6033049?hl=en

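For example, assuming an Apache server (the question doesn't say which stack is in use), the redirect would be a single line like this:

# Sketch: permanent redirect from the old slug to the new one (mod_alias)
Redirect 301 /old-post-slug https://www.example.com/new-post-slug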

QUESTION

I get that I should redirect (301) from the old URL to the new one. So when Google re-crawls, it will see that change. But what should be on my Sitemap? The old URL or the new one? Or both?

I tend to think that it would be best to keep only the new URL in my Sitemap. But what if Google crawls the new URL before it sees the redirect from the old one? Wouldn't the new page URL start off as a new page (from Google's index perspective) with zero ranking points? How does Googlebot handle that? What is the recommended practice?

Google Search Console cannot fetch my sitemap. How do I force Google to index a site?

I am working on a project that just rebranded.

In Google Search Console I am getting some weird errors. Despite my sitemap_index.xml and robot.txt working properly, Google cannot seem to fetch the sitemap for some reason.

my sitemap is here: https://example.com/sitemap_index.xml

and my robot.txt: https://example.com/robot.txt
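For reference, the file Google actually requests is /robots.txt (with an "s"); a minimal version that points crawlers at the sitemap would look like this (using the URLs from the question):

# Served at https://example.com/robots.txt — the path Google requests
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap_index.xml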

When I try to get my site referenced on Google, this is what I get: [two screenshots]

If I click on Open sitemaps it opens just fine.

This is what Google is saying in the URL inspection: [screenshot]

I tried reindexing multiple times but nothing changed.

The site has been live for over a month now and is still not referenced, despite having backlinks pointing to it from LinkedIn and more.

Where could this be coming from? I asked Google support with no luck, and asked my DNS provider to double-check everything, but it seems fine. I've also hired a DevOps engineer to check my server configuration, but apparently everything is fine.

Add URL to sitemap to be available in the future

I automatically generate the sitemap.xml when content is published on my website; however, some of the content will be published starting from a specific date and time.

Is there any tag that I can add to the sitemap, or any other way to cater for this, so that the Google/Bing… bots would know to only index the content once the date is greater than 'now'?
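For context, the only date-related tag in the standard sitemap protocol is <lastmod>, which records when a page was last modified rather than when it becomes available; a typical entry looks like this (URL and date are placeholders):

<url>
  <loc>https://example.com/upcoming-article</loc>
  <!-- lastmod describes a past modification time, not a future publish date -->
  <lastmod>2023-01-01T10:00:00+00:00</lastmod>
</url>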

I know that I could use a task scheduler to update the sitemap file when the content publish date is reached but I was trying to avoid that solution.