Google DNS cannot ping my web server [closed]

I have manually set my local PC to use the DNS server 8.8.8.8 (dns.google). In my GoDaddy account I have pointed my A record at the external IP address of my server, and in DNS Manager I have changed my Forward Lookup Zones to point to my internal IP address and my server name. When I run:

nslookup
> set type=soa
> mydomainnameblahblah.com

It returns

primary name server = ns55.domaincontrol.com (the default GoDaddy name server)

responsible mail addr = dns.jomav.net (default GoDaddy)

followed by the serial number, refresh interval, TTL, etc.

But when I ping mydomainnameblahblah.com, it resolves to my external IP address, yet every request times out and all packets are lost.

From what I understand (and I could be wrong), everything seems to be correct, yet dns.google cannot reach my website. My understanding is that dns.google should consult the primary name server, which is the default GoDaddy name server (ns55.domaincontrol.com); that should point it to my external IP address (24.234.xxx.xx), and my Forward Lookup Zones should then direct it to the internal IP address of the server (10.0.xx.xx).

In DNS Manager, under Forward Lookup Zones, the SOA Primary Server is set to my server name. Under Name Servers, the server's fully qualified domain name (FQDN) is set to mydomainnameblahblah.com, with the IP address set to my server's internal address (10.0.xx.xx). On the A record, Host is set to (same as parent folder), the FQDN to mydomainblahblah.com, and the IP address to my server's internal address (10.0.xx.xx).

Am I completely wrong about this? And should I change any of my Forward Lookup Zone information?

Edit: this server is hosted on a local network. The person who set it up tells me that HTTP and HTTPS traffic is allowed and that there are no rules denying it. I have verified with him that the firewall forwards ports 80 and 443 to the internal IP address (10.0.xx.xx), which is where I thought the Forward Lookup Zones would come into play.

Match types don’t apply after moving to the “Create Ads” step in Google Ads

I’m trying to set up a Google Ads campaign. On step 2, "Set up ad groups", I add keywords to my ad groups with match types:

+ipad +apps
"ipad apps"
[ipad apps]

Then I click "Save and Continue" and on Step 3 my keywords appear changed to:

ipad apps, +ipad +apps

Then I click back to Step 2 and I see the following:

ipad apps
ipad apps
+ipad +apps

For some reason, my keywords are being changed to broad match. How can I prevent this behavior? What do I need to do to keep the match types I chose?

Recover Google rankings after overwriting one page with another and getting the duplicate content excluded

One of our pages was on page 1 of Google for a very important keyword. A mistake by our developer team caused a disastrous indexing issue for that page: the parent page and the child page served the same content for a few days. Google then removed the child page from its index and kept only the parent, reporting "Duplicate, submitted URL not selected as canonical".

I understand the issue, but I don’t know how to solve it. I have already restored the original content on both pages and asked Google to reindex them through the URL Inspection tool in Search Console, but it doesn’t work: Google no longer indexes the child page, which is really crucial for us, because it has identified the child page as a duplicate of the parent. I have rewritten the entire content of the page and asked Google to reindex it again. No luck yet.

Do you have any solution for this issue?
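One standard lever in "not selected as canonical" cases (an assumption here, since the question doesn't show the pages' markup) is to make the canonical signals unambiguous: each page declares itself as its own canonical, so Google has no reason to keep folding the child into the parent. A minimal sketch with placeholder URLs:

```html
<!-- In the parent page's <head> (placeholder URLs) -->
<link rel="canonical" href="https://example.com/parent/">

<!-- In the child page's <head> -->
<link rel="canonical" href="https://example.com/parent/child/">
```

It's also worth confirming the child page is internally linked and listed in the XML sitemap; even with correct signals, canonical reassessment after a duplicate-content episode often takes weeks.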

Why is my website talking to Google and Facebook servers? [closed]

When I visit my website and look at the trackers in uBlock Origin’s report, I can see that some of the JavaScript on my website is trying to connect to Google and Facebook servers.


When I inspected the network tab, I found that the scripts are trying to connect to graph.facebook.com in particular (I couldn’t find any requests to a google.com subdomain). I thought these requests came from a plugin I had installed to add share buttons for different social media platforms, but removing the plugin didn’t change anything. I was wondering whether, and how, I can stop these requests.

Google news domain approved, adsense approved site – chavellenge.com

Hi,
I have a website approved for AdSense and for Google News.
It's a new website: https://chavellenge.com

Registrar: Namecheap
Registered on: 2020-09-20
Expires on: 2021-09-20

See Ahrefs: https://drive.google.com/file/d/1QaqhaXC2jaV79szoM9m62vMmZG2ukYtO/view?usp=sharing
See the Google News approval
Newsstand: https://newsstand.google.com/publications/CAAqBwgKMOKlngsw-a-2Aw?nsro=true
News.google.com: …


SEO – onload components seen as separate pages by Google

I have tried to optimize my blog by loading some components after the page load to improve performance. Since I did this, performance has improved, but I now see that those components have been indexed in Google search.

I used the following code to load my components:

window.onload = function (e) {
  loadComments();
  loadFeeds();
};

and then one of the functions:

function loadComments() {
  console.log('Loading comments');
  // Cache-buster appended in JavaScript; the original code had leftover
  // PHP (".rand(0,1500); ?>) concatenated into the URL string, which is
  // a syntax error in the browser.
  const cacheBuster = Math.floor(Math.random() * 1500);
  fetch('https://www.laurentwillen.be/gadgets/xiaomi-mi-10-lite-5g-test-avis/?module=comments&r=' + cacheBuster, {
    method: 'GET',
    headers: new Headers()
  })
    .then(response => response.text())
    .then((response) => {
      document.getElementById('comments-content').innerHTML = response;
      // PREFEED COMMENT FORM
      const replyLinks = document.querySelectorAll('.feed_form');
      for (let x = 0; x < replyLinks.length; x++) {
        replyLinks[x].addEventListener('click', feedComment);
      }
    })
    .catch((err) => console.log(err));
}

I can see that the URL https://www.laurentwillen.be/gadgets/xiaomi-mi-10-lite-5g-test-avis/?module=comments is now indexed in Google, and that’s not what I want.

Should I load the page differently? Or should I add something to the loaded component?
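One common fix (an assumption, not something stated in the question) is to serve the fragment URLs with an "X-Robots-Tag: noindex" response header, so Google drops them from the index while the main pages are unaffected. The helper below only decides which requests should get the header; wire it into whatever serves the ?module=... endpoint (on a PHP backend the equivalent would be calling header('X-Robots-Tag: noindex') in that code path).

```javascript
// Decide which responses should carry "X-Robots-Tag: noindex".
// Assumption: any URL with a "module" query parameter is a page fragment
// fetched after onload, never a page a user should land on from search.
function robotsHeaderFor(pathWithQuery) {
  // Base URL only needed so relative paths parse; never contacted.
  const url = new URL(pathWithQuery, 'https://example.invalid');
  return url.searchParams.has('module') ? 'noindex' : null;
}

console.log(robotsHeaderFor('/gadgets/xiaomi-mi-10-lite-5g-test-avis/?module=comments')); // → noindex
console.log(robotsHeaderFor('/gadgets/xiaomi-mi-10-lite-5g-test-avis/')); // → null
```

Blocking the URLs in robots.txt instead would be counterproductive here: Google could no longer crawl them to see the noindex, and already-indexed fragment URLs might linger.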

Thanks

Google not working

I’ve been using ScrapeBox for the past month, but I’ve come across an error where the Google search engine isn’t working. I tried setting the delay to 30 seconds, but 10 seconds after that it gives an error. I’ve used private and public HQ proxies, but nothing works. I also reloaded the default search engines, to no avail.

Google Analytics: View filter not picking up any traffic

I have created a new view (a copy of the main one) and set a filter on it that matches a particular string in the URL.

[screenshot of the filter settings]

However, if I visit any URL containing that string, nothing is picked up in the real-time report of the new view, although it is picked up in the main view, which has no filters.

I can’t click the verify link, as it says the service isn’t available.

Any ideas?

Why is Google Cache showing 404 on my React-based web pages?

My web page is client-side rendered with React. When I open the Google cache for it, it shows a 404 instead of the rendered page.

[Example screenshot]

When I inspect the page URL in Search Console, it parses and renders my page properly, both in the Google Index view and in the live test.

[Example screenshots]

Why does the cached page show a 404? Is client-side React responsible? If not, what is the issue? Or will it resolve itself in a few days?

Note: it’s a new site that went live just last week.