Google Search Console – indexing error

I have a site built with WordPress. Because the template’s default content had been indexed by Google, I removed those URLs through the Removals section in Google Search Console, and I also enabled the nofollow and noindex tags for all posts and pages via the Yoast plugin. After that, a site:domain.com search returned no results.

When I then tried to get the home page, https://example.com, indexed again from Search Console, I ran into a noindex error, even though I had already removed the tags. Other pages do not give this error, but for them the live test reports “URL will be indexed only if certain conditions are met.”

I don’t know why everything was normal in the beginning, even though I hadn’t created a sitemap, and the pages were indexed quickly, but after I deleted the previous URLs and added new ones, the pages don’t get indexed. The last time Google crawled the home page was May 26.
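One thing worth ruling out before blaming Search Console: a noindex can be served either as an X-Robots-Tag HTTP header or as a robots meta tag, and a caching plugin can keep serving the old HTML after the Yoast setting is changed. A minimal sketch of such a check (Node 18+ for the global fetch; the URL is a placeholder):

```typescript
// Check whether a noindex is still being served, either as an
// X-Robots-Tag header or as a robots meta tag in the HTML.

async function checkNoindex(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "follow" });

  // A noindex can arrive as an HTTP header...
  const xRobots = res.headers.get("x-robots-tag");
  if (xRobots && /noindex/i.test(xRobots)) {
    console.log(`X-Robots-Tag blocks indexing: ${xRobots}`);
  }

  // ...or as a robots meta tag in the page markup.
  const html = await res.text();
  for (const tag of html.match(/<meta[^>]+name=["']robots["'][^>]*>/gi) ?? []) {
    if (/noindex/i.test(tag)) {
      console.log(`robots meta tag blocks indexing: ${tag}`);
    }
  }
}

checkNoindex("https://example.com/").catch(console.error);
```

If this prints nothing for the home page, the noindex really is gone and the error is more likely stale crawl data on Google’s side.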

Make Google Search Console ignore missing review and rating

Google Search Console complains that the “review” and “aggregateRating” fields are missing (they are optional) on the Product schema.

A lot of posts say “Just ignore it, it’s optional”

But we are never ever going to get reviews. As in never.

Is there really no way of turning off these warnings, so I don’t see the 746 products with these useless warnings, which hide the few products that DO have errors in their descriptions?

I’m tempted to add a 5-star rating and a review saying “we like this product” to all products to get rid of the warnings, but I’m not sure it’s a good idea 😉
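For reference, this is roughly where the two fields would sit in the Product JSON-LD; a minimal sketch with made-up product data:

```typescript
// Where "review" and "aggregateRating" would sit on a Product.
// Both are optional per schema.org; Search Console warns when they are
// absent, but a warning is not an error.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Widget", // made-up product
  description: "A sturdy example widget.",
  offers: {
    "@type": "Offer",
    price: "19.95",
    priceCurrency: "EUR",
    availability: "https://schema.org/InStock",
  },
  // Filling these in would silence the warning, but Google's review
  // snippet guidelines require genuine user reviews, so fabricated ones
  // risk a manual action:
  // review: [...],
  // aggregateRating: { "@type": "AggregateRating", ratingValue: 4.5, reviewCount: 11 },
};

console.log(
  `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`
);
```

That last point is why the joke fix above is probably a bad idea: self-authored 5-star reviews violate the review-snippet guidelines.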

Configuring Azure Console for _external_ authentication/SSO/IdP? [migrated]

I’m looking for pointers on how to configure Azure “IAM” to trust an external IdP/authentication server, but finding my way around the Azure docs is… not easy. Help would be much appreciated.

Some more context:

The challenge I have to solve should be “easy”: I need to use a 3rd-party authentication/MFA solution to manage access to the Azure “cloud” console, to control which users can access the console, etc.

So my first idea is to configure the Azure console/IAM to use an external IdP for user access/SSO. Looking at the docs, I can see lots of info on how to use Azure AD as an IdP for other systems, but not so much on how it can act as an SP for an external IdP. Also, I find all the different “flavours” of Azure AD that seem to be available somewhat confusing…

The closest I’ve been able to find is this: https://docs.microsoft.com/en-us/azure/active-directory/b2b/direct-federation, but I’m not sure if that’s the approach to follow…

There are other articles, like https://docs.microsoft.com/en-us/azure/active-directory/hybrid/how-to-connect-fed-saml-idp, that seem to apply to using a SAML IdP for access to Office or other MS services, but not to the Azure “tenant” itself?

ANY tip would be much appreciated 😉!

How can I record, or where can I find, the error history in the console with Eclipse?

I hope to find where errors such as “Exception in thread "main" java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0” are recorded. Or can I create a file to record such information?

I have found the answers “Finding the Error Log” (I can’t understand the errors there or find what I want) and “Retrieve Console History in Eclipse” (it only records the output, not the errors), but these are not what I am looking for.

Thanks a lot! -Jiduan

Google Search Console: issues detected “Top Errors” Server Error (5xx). BUG?

Just got this email from Google Search Console:

[screenshot of the Search Console email]

These were the issues:

[screenshot of the reported Server Error (5xx) URLs]

But when I inspected those URLs live (using the Search Console URL inspection tool), they came back as “available to Google”, and one of them was in fact already indexed.

[screenshot of the URL inspection results]

My app is a Single Page App built with React and Firebase.

I’m doing server-side rendering for robots using a Firebase Cloud Function, ssrApp.
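The function is roughly this kind of user-agent dispatch; a minimal sketch, not the exact code (the bot list and the renderRouteToHtml helper are illustrative):

```typescript
import * as functions from "firebase-functions";

// Crawlers that should receive server-rendered HTML (illustrative list).
const BOT_UA = /googlebot|bingbot|baiduspider|twitterbot|facebookexternalhit/i;

export const ssrApp = functions.https.onRequest(async (req, res) => {
  const ua = String(req.headers["user-agent"] ?? "");

  if (BOT_UA.test(ua)) {
    // Render the route to HTML for the crawler. If this throws or times
    // out, the crawler receives a 5xx response from the platform, which
    // is one way Search Console could see errors that my own logging misses.
    const html = await renderRouteToHtml(req.path); // hypothetical helper
    res.status(200).send(html);
  } else {
    // Regular visitors get the client-side SPA shell (simplified here).
    res.status(200).send('<!doctype html><div id="root"></div><script src="/bundle.js"></script>');
  }
});

// Hypothetical stand-in; the real app would call ReactDOMServer.renderToString.
async function renderRouteToHtml(path: string): Promise<string> {
  return `<!doctype html><html><body>rendered ${path}</body></html>`;
}
```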

There are no errors showing up in my function’s logs: not on those dates (the last-crawled dates in the screenshot above), and not on any other dates.

QUESTION

I’m thinking this is a bug from Google Search. Has anyone seen this before?

Recently I got another email from Google saying that some of my pages were blocked by robots.txt, which was clearly a bug (see the link below). But the pages are different this time; they are not the same as the ones from the other error.

Google Search Console Warning: “Indexed, though blocked by robots.txt” (BUG)

Google Search Console: Separate Subdomain Property from Top Level Property

I’m working with a website and there’s a subdomain which I don’t have access to.

In Google Search Console, the “Property” consists of the entire domain, including this subdomain. (It’s all part of 1 Property).

The subdomain is handled by a different part of the company — and it has a bazillion issues which show up in Google Search Console for the domain as a whole.

I’d like to separate these two domains into two different Search Console properties so that the top level site doesn’t “see” the errors from the subdomain. And I’d like the subdomain to be its own property in Search Console.

(Also, I’m a little afraid that the poor-quality subdomain may be hurting the search performance of the top-level domain? I’m not sure about this.)

What I’ve done so far:

I used “Add Property” to add the subdomain to Search Console. That part was easy enough.

Now I have a listing for the top level domain and the subdomain.

What I want to do next

Is there a way to tell Search Console to then ‘ignore’ the subdomain from the main, top-level Search Console property?

Effectively this would give me two totally different Search Console properties. One for the top-level (minus the subdomain) and one for the subdomain.

Is this possible?

What happens if I remove a page from the Google index after the new Search Console update?


According to the new update in Search Console, a URL can be removed temporarily (for roughly six months or less), and a page can also be removed from the site permanently.

I have pages that are already indexed and ranked, and they also get a fair amount of visitors. But now I don’t want those pages on the site; I want to delete them.

After I delete them, will they also be removed from Google, given that the pages were indexed approximately two years ago?
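For context, the standard way to make a deletion stick is to have the server answer 404 or 410 for the removed URLs; a minimal Express sketch (the paths are made up):

```typescript
import express from "express";

const app = express();

// URLs deleted on purpose; made-up paths for illustration.
const removedPaths = new Set(["/old-page-1", "/old-page-2"]);

app.use((req, res, next) => {
  if (removedPaths.has(req.path)) {
    // 410 Gone signals a permanent removal; Google tends to drop such
    // URLs from the index faster than with a plain 404.
    res.status(410).send("Gone");
    return;
  }
  next();
});

app.get("/", (_req, res) => {
  res.send("Home");
});

app.listen(3000);
```

The Removals tool only hides a URL temporarily; it is the 404/410 (or a permanent noindex) that eventually makes Google drop it for good.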

Google Search Console cannot fetch my sitemap. How can I force Google to index a site?

I am working on a project that just rebranded.

Using Google Search Console, I am getting some weird errors. Despite my sitemap_index.xml and robot.txt working properly, Google cannot seem to fetch them for some reason.

my sitemap is here: https://example.com/sitemap_index.xml

and my robot.txt: https://example.com/robot.txt

When I try to get my site indexed on Google, this is what I get: [two screenshots of the sitemap fetch error]

If I click on “Open sitemaps”, it opens just fine.

This is what Google says in the URL inspection: [screenshot of the URL inspection result]

I tried reindexing multiple times but nothing changed.

The site has been live for over a month now and is still not indexed, despite having backlinks pointing to it from LinkedIn and elsewhere.

Where can this be coming from? I asked Google support, but no luck, and I asked my DNS provider to double-check everything, but it seems fine. I’ve also hired a DevOps engineer to check my server configuration, and apparently everything is fine there too.
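One thing a browser test doesn’t show is what a non-browser client receives; firewalls and bot protection sometimes serve crawlers an HTML challenge page instead of the XML. A minimal sketch of checking what actually comes back (Node 18+; the user-agent string is made up):

```typescript
// Fetch the sitemap the way a non-browser client would and inspect
// the status, content type, and body shape.

async function checkSitemap(url: string): Promise<void> {
  const res = await fetch(url, {
    headers: { "user-agent": "Mozilla/5.0 (compatible; sitemap-check/1.0)" },
  });

  console.log(`status: ${res.status}`);
  console.log(`content-type: ${res.headers.get("content-type")}`);

  const body = await res.text();
  console.log(`bytes: ${body.length}`);

  // A sitemap index should be XML starting with <?xml or <sitemapindex>;
  // an HTML body here would suggest a firewall or bot-protection page.
  if (!/^\s*(<\?xml|<sitemapindex)/i.test(body)) {
    console.log("Response does not look like XML.");
  }
}

checkSitemap("https://example.com/sitemap_index.xml").catch(console.error);
```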

Why is JavaScript executed manually from the browser console not allowed to access everything?

Why is JavaScript executed manually from the browser console not allowed to access “everything”, especially the “visited” status of links (see this question)? What kind of security threat would that pose?

Usually, users have full access to their own environment, sometimes with a little bump in the form of entering the root password or similar. Why is this an exception?

(I am not saying that scripts downloaded from a web page should have this access. I understand why that is a threat to the user’s privacy etc.)
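To make the restriction concrete: browsers deliberately report the unvisited style for every link, and this applies to console scripts just as much as to page scripts, because the engine cannot tell who typed the code. A small sketch to try in a devtools console on any page with links (plain enough to run as-is):

```typescript
// Browsers intentionally report the *unvisited* computed style for all
// links, so even console scripts cannot sniff the user's history.

const links = Array.from(document.querySelectorAll("a"));

for (const link of links) {
  const color = getComputedStyle(link).color;
  // Even if a:visited { color: purple } applies and the link is visited,
  // the color logged here is the unvisited one.
  console.log(link.href, color);
}
```

The threat model is that a page script could otherwise probe thousands of URLs this way and reconstruct your browsing history; since the console runs in the same JavaScript engine as the page, the same restriction covers both.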