I hope to find where errors such as “Exception in thread ‘main’ java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0” are recorded. Or can I create a file to record such information myself?
I have already found “Finding the Error Log” (I can’t understand the errors there or find what I want) and “Retrieve Console History in Eclipse” (it only records the output, not the errors); these are not what I am looking for.
Thanks a lot! -Jiduan
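To make the question concrete: one way to record such exceptions yourself is to redirect System.err to a file, since stack traces are written to that stream. A minimal sketch, assuming a file name of errors.log (the name and the deliberately provoked exception are just examples):

```java
import java.io.FileOutputStream;
import java.io.PrintStream;
import java.util.List;

public class ErrorLogDemo {
    public static void main(String[] args) throws Exception {
        // Open errors.log in append mode with auto-flush, and route
        // everything printed to System.err (stack traces included) into it.
        PrintStream log = new PrintStream(new FileOutputStream("errors.log", true), true);
        System.setErr(log);
        try {
            // Provokes "Index 0 out of bounds for length 0"
            List.of().get(0);
        } catch (IndexOutOfBoundsException e) {
            e.printStackTrace(); // recorded in errors.log instead of the console
        }
        System.out.println("done; see errors.log");
    }
}
```

An uncaught exception’s trace would land in the same file, since the JVM prints it to System.err as well.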
Just got this email from Google Search Console:
These were the issues:
But when I inspected those live URLs (using the Google Search Console tool), they came back as “available to Google”, and one of them was in fact already indexed.
My app is a Single Page App built with React and Firebase.
I’m doing Server Side Rendering to robots using a Firebase Cloud Function
There are no errors showing up in my function’s logs, neither on those dates (the last-crawled dates in the screenshot above) nor on any other dates.
I’m thinking this is a bug from Google Search. Has anyone seen this before?
Recently I got another email from Google saying that some of my pages were blocked by robots.txt, which was clearly a bug (see link below). But the pages are different this time; they are not the same as the ones from the other error.
Google Search Console Warning: “Indexed, though blocked by robots.txt” (BUG)
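For reference, serving server-rendered HTML only to robots generally boils down to checking the request’s User-Agent header. My function actually runs on Node.js in Firebase, so the sketch below only illustrates the detection idea in Java, with a hypothetical (and deliberately short) list of crawler tokens:

```java
import java.util.List;

public class BotDetector {
    // Hypothetical crawler tokens; real deployments keep much longer lists.
    private static final List<String> BOT_TOKENS =
            List.of("googlebot", "bingbot", "duckduckbot", "baiduspider");

    // True if the User-Agent string looks like a known crawler.
    public static boolean isBot(String userAgent) {
        if (userAgent == null) return false;
        String ua = userAgent.toLowerCase();
        return BOT_TOKENS.stream().anyMatch(ua::contains);
    }

    public static void main(String[] args) {
        System.out.println(isBot("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
        System.out.println(isBot("Mozilla/5.0 (Windows NT 10.0) Chrome/120")); // false
    }
}
```

When isBot returns true, the function returns pre-rendered HTML; otherwise it serves the normal SPA shell. Note that this kind of User-Agent sniffing can itself confuse indexing if the two responses diverge, which is worth ruling out when debugging crawl errors.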
I’m working with a website and there’s a subdomain which I don’t have access to.
In Google Search Console, the “Property” consists of the entire domain, including this subdomain. (It’s all part of 1 Property).
The subdomain is handled by a different part of the company — and it has a bazillion issues which show up in Google Search Console for the domain as a whole.
I’d like to separate these two domains into two different Search Console properties so that the top-level site doesn’t “see” the errors from the subdomain. And I’d like the subdomain to be its own property in Search Console.
(Also, I’m a little afraid that the poor-quality subdomain may be impacting the search performance of the top-level domain? I’m not sure about this.)
What I’ve done so far:
I used “Add Property” to add the subdomain to Search Console. That part was easy enough.
Now I have a listing for the top level domain and the subdomain.
What I want to do next
Is there a way to tell Search Console to then ‘ignore’ the subdomain from the main, top-level Search Console property?
Effectively this would give me two totally different Search Console properties. One for the top-level (minus the subdomain) and one for the subdomain.
Is this possible?
According to the new update in Search Console, a URL can be temporarily removed for up to 6 months, or for a shorter period such as 3 months. It can also be removed from the site permanently.
I have pages that are already indexed and ranked, and they also get a fair amount of visitors. But now I don’t want to have those pages on the site; I want to delete them.
After deleting them, will they be removed from Google as well? The pages were indexed approximately 2 years ago…
I realized that my bank handles sensitive information in the web console log, so my concern is whether any browser extension could read the log.
I am working on a project that just rebranded.
Using Google Search Console, I am getting some weird errors: despite my robot.txt working properly, Google cannot seem to fetch it for some reason.
my sitemap is here: https://example.com/sitemap_index.xml
and my robot.txt: https://example.com/robot.txt
When I try to get my site referenced on Google, this is what I get:
If I click on Open sitemaps, it opens just fine.
This is what Google is saying in my URL inspection:
I tried reindexing multiple times but nothing changed.
The site has been live for over a month now and is still not referenced, despite having backlinks pointing to it from LinkedIn and elsewhere.
Where can this be coming from? I asked Google support but had no luck, and I asked my DNS provider to double-check everything, but it seems fine. I’ve also hired a DevOps engineer to check my server configuration, but apparently everything is fine there too.
Usually, users have full access to their environment, sometimes with a little bump in the form of entering the root password or similar. Why is this an exception?
(I am not saying that scripts downloaded from a web page should have this access. I understand why that is a threat to the user’s privacy etc.)
The following is a snapshot of my website’s Google Search Console history over the past 6 months. Unfortunately, I don’t have any reference to understand whether this looks bad, healthy/typical, or even great. Based on your experience working with different sites, would you say this is bad, typical, or great progress?
I have an issue after updating to the latest WP version. Now I have these errors on my dashboard:
t_Terms_Order_Walker::walk($elements, $max_depth) should be compatible with Walker::walk($elements, $max_depth, ...$args) in /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_walkers.php on line 135
Warning: Cannot modify header information - headers already sent by (output started at /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_interface-class.php:25) in /home/c4ddwork/staging/2/wp-includes/functions.php on line 5946
Warning: Cannot modify header information - headers already sent by (output started at /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_interface-class.php:25) in /home/c4ddwork/staging/2/wp-admin/includes/misc.php on line 1252
Warning: Cannot modify header information - headers already sent by (output started at /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_interface-class.php:25) in /home/c4ddwork/staging/2/wp-admin/admin-header.php on line 9
Warning: Cannot modify header information - headers already sent by (output started at /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_interface-class.php:25) in /home/c4ddwork/staging/2/wp-includes/option.php on line 958
Warning: Cannot modify header information - headers already sent by (output started at /home/c4ddwork/staging/2/wp-content/plugins/post-terms-order/include/pto_interface-class.php:25) in /home/c4ddwork/staging/2/wp-includes/option.php on line 959
When I deactivate the post-terms-order plugin, all these errors disappear.
Can anyone help me, please?
I just finished building my first SharePoint Online page. Just a simple page with a few standard web parts (image, news, document library, list, events, file embed). Everything seems to work okay (except for what appears to be a
margin-right CSS issue on the news web part) but I’m getting a nonstop trickle of console errors coming in the longer I stay on the page.
For example, I’m in Chrome and did a fresh reload of the page when I started typing up this question, and in 2-3 minutes I’m up to 85 errors and 12 warnings. The errors are all over the place, from
Warning! Use of this tool exposes you to potential security threats which can result in others gaining access to your personal Office 365 data (documents, emails, conversations and more). Make sure you trust the person or organization that asked you to access this tool before proceeding.
to mixed content errors such as
Mixed Content: The page at ” was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint ”. This request has been blocked; the content must be served over HTTPS.
to errors that seem to be related to the PowerPoint embed I have:
Access to XMLHttpRequest at ‘https://browser.pipe.aria.microsoft.com/ping’ from origin ‘https://powerpoint.officeapps.live.com’ has been blocked by CORS policy: No ‘Access-Control-Allow-Origin’ header is present on the requested resource.
Given my level of experience with SharePoint (literally nonexistent), I wouldn’t be totally surprised if I’m doing something wrong but given that I haven’t used any custom web parts or done any other theming, I’m a little perplexed as to what it could be.
Is it normal to get all these console errors?