How to prevent discovery of URLs by search engines?

I crawled my WordPress site with Screaming Frog and noticed a few things that might have SEO implications. I’m hoping someone can help identify the problem (if there is one).

N.B.: I have obscured the domain name in the image as I do not want it made public.

The canonical URL for my domain is prefixed with https://www, but as you can see from the tree graph image below, a number of URLs are accessible over plain http and on the non-www version of the domain. Those URLs are non-indexable, but I’m curious why they’re accessible at all; that shouldn’t happen if the proper redirects were in place.

In the WordPress admin settings I have entered the correct version of my domain for both the WordPress Address and the Site Address (i.e., https://www).

My .htaccess file might require editing, so I have pasted it below:

# BEGIN LSCACHE
## LITESPEED WP CACHE PLUGIN - Do not edit the contents of this block! ##
<IfModule LiteSpeed>
RewriteEngine on
CacheLookup on
RewriteRule .* - [E=Cache-Control:no-autoflush]
RewriteRule \.object-cache\.ini - [F,L]

### marker CACHE RESOURCE start ###
RewriteRule wp-content/.*/[^/]*(responsive|css|js|dynamic|loader|fonts)\.php - [E=cache-control:max-age=3600]
### marker CACHE RESOURCE end ###

### marker FAVICON start ###
RewriteRule favicon\.ico$ - [E=cache-control:max-age=86400]
### marker FAVICON end ###

### marker DROPQS start ###
CacheKeyModify -qs:fbclid
CacheKeyModify -qs:gclid
CacheKeyModify -qs:utm*
CacheKeyModify -qs:_ga
### marker DROPQS end ###

</IfModule>
## LITESPEED WP CACHE PLUGIN - Do not edit the contents of this block! ##
# END LSCACHE
# BEGIN NON_LSCACHE
## LITESPEED WP CACHE PLUGIN - Do not edit the contents of this block! ##
### marker BROWSER CACHE start ###
<IfModule mod_expires.c>
ExpiresActive on
ExpiresByType application/pdf A31557600
ExpiresByType image/x-icon A31557600
ExpiresByType image/vnd.microsoft.icon A31557600
ExpiresByType image/svg+xml A31557600
ExpiresByType image/jpg A31557600
ExpiresByType image/jpeg A31557600
ExpiresByType image/png A31557600
ExpiresByType image/gif A31557600
ExpiresByType image/webp A31557600
ExpiresByType video/ogg A31557600
ExpiresByType audio/ogg A31557600
ExpiresByType video/mp4 A31557600
ExpiresByType video/webm A31557600
ExpiresByType text/css A31557600
ExpiresByType text/javascript A31557600
ExpiresByType application/javascript A31557600
ExpiresByType application/x-javascript A31557600
ExpiresByType application/x-font-ttf A31557600
ExpiresByType application/x-font-woff A31557600
ExpiresByType application/font-woff A31557600
ExpiresByType application/font-woff2 A31557600
ExpiresByType application/vnd.ms-fontobject A31557600
ExpiresByType font/ttf A31557600
ExpiresByType font/otf A31557600
ExpiresByType font/woff A31557600
ExpiresByType font/woff2 A31557600
</IfModule>
### marker BROWSER CACHE end ###
## LITESPEED WP CACHE PLUGIN - Do not edit the contents of this block! ##
# END NON_LSCACHE
#This Apache config file was created by Duplicator Installer on 2021-02-17 10:08:29.
#The original can be found in archived file with the name .htaccess__[HASH]

# BEGIN WordPress
# The directives (lines) between "BEGIN WordPress" and "END WordPress" are
# dynamically generated, and should only be modified via WordPress filters.
# Any changes to the directives between these markers will be overwritten.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

[Image: Screaming Frog crawl tree graph showing URLs reachable over http:// and on the non-www host]
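For what it’s worth, the pasted .htaccess contains no protocol or host canonicalisation rules at all, which would explain why the http and non-www URLs answer with 200 instead of redirecting. Below is a minimal sketch of the kind of rule block that typically enforces an https://www canonical origin; it assumes Apache/LiteSpeed with mod_rewrite, TLS terminating at the web server, and example.com standing in for the obscured domain. It would normally go above the WordPress block:

<IfModule mod_rewrite.c>
RewriteEngine On
# Redirect any request that is not already on https://www.example.com
# to the canonical origin, keeping the path and query string.
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
</IfModule>

LiteSpeed honours mod_rewrite syntax in .htaccess, but if TLS actually terminates at a proxy or CDN in front of the server, the HTTPS check would need adjusting.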

Google change of address tool from .co.uk to .com resulted in huge derank

In December 2020 I moved a site from a .co.uk domain to a .com. No other changes: a simple 301 redirect and use of the Change of Address tool in Google Search Console. The site had existed since 2006 and was the de facto site in its vertical, meaning I had plenty of position-one rankings on Google for medium-tail keywords.

Now, 2.5 months later, I am still hugely deranked for tens of thousands of keywords, and as a result I’ve lost 40% of my traffic.

I was wondering whether anyone had any advice. Obviously this is a huge issue for me, as it’s the livelihood I’ve built up over 15 years of hard work. I followed Google’s directions to the letter (my background is actually web dev and SEO) and the site has still been deranked.

I’m not sure of the linking policy here, but the site is tyrereviews dot com (it used to be tyrereviews dot co dot uk). For example, we used to rank 1 or 2 for the search "michelin primacy 4" on Google UK; we are now on page 2.

The new .com domain is geo-targeted correctly to the UK in Search Console too.

The same is true for many, many medium-tail keywords.
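For completeness, the kind of domain-wide 301 described above is usually a single rule block on the old host. This is only a sketch, assuming Apache and placeholder domain names, not the site’s actual configuration:

<IfModule mod_rewrite.c>
RewriteEngine On
# Send every request on the old .co.uk host to the same path on the
# new .com with a permanent (301) redirect; query strings carry over.
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
</IfModule>

A redirect of this shape forwards deep URLs one-to-one to their new equivalents, which is what the Change of Address tool expects to find.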

SoftwareApplication/WebApplication not shown in search console enhancements

I have a pricing page with SoftwareApplication structured data that has the sub-type WebApplication set. The structured data renders properly and is saved in the indexed page content, but I see only these enhancements in Search Console:

  • Core Web Vitals
  • Mobile Usability
  • Breadcrumbs
  • FAQ

Is there a way to get SoftwareApplication/WebApplication included in the Enhancements section of Google Search Console?

Is cloudlayar a legit protection service? [closed]

I recently came across a very cheap protection service similar to Cloudflare called Cloudlayar: https://cloudlayar.com/

It’s a bit suspicious: the prices seem too good to be true, there’s no PayPal option (so refunds are at their mercy), and they use Cloudflare to protect their own website…

Does anyone know what this company is? How good is their service? Is it a scam?

P.S.: I’m not asking for an "opinion"; I just want to know whether this service actually works and has customers.

Thanks in advance

I’m being outranked by an iframe of my own site

My website is being outranked by a third-party website that features my site in an iframe on a basic page consisting of just a header, the iframe, and a footer.

A canonical tag is set on my website. They are ranking for my keywords, and I no longer show up on Google for most of them.

Could I have been demoted because Google deems their website more trustworthy or better, or is this negative SEO at work? I have also found hundreds of sites that are direct clones of mine; could that be a factor?

I guess I’m grasping at straws looking for the cause.
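One mitigation worth considering, as a sketch of a common anti-framing setup rather than a confirmed fix for the ranking problem: response headers can stop third-party pages from rendering the site in an iframe at all. Assuming Apache with mod_headers, something like this in .htaccess:

<IfModule mod_headers.c>
# Only allow framing by pages on the same origin.
Header always set X-Frame-Options "SAMEORIGIN"
# The modern CSP equivalent; browsers that support frame-ancestors
# ignore X-Frame-Options when both are present.
Header always set Content-Security-Policy "frame-ancestors 'self'"
</IfModule>

Once the third-party page can no longer render the content, its iframe displays nothing, which removes whatever value it was extracting from framing the site.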

What can cause a “The requested URL / was not found on this server” error with an addon domain?

Let’s say I have a site hosted at a.example.com. I add an addon domain, b.example.com, to the (shared hosting) server and point this domain at the server’s IP address.

Visiting b.example.com serves the same site as a.example.com, since both point to the same server. That’s fine.

Then I add another addon domain, c.example.com, and point it at the server IP. When visiting c.example.com I get the error: “The requested URL / was not found on this server.”

Can you tell me how this could happen? I don’t get it.
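One plausible explanation, offered as an assumption about how the shared host generates its Apache configuration (that configuration isn’t visible from the question): an addon domain normally gets its own VirtualHost with its own DocumentRoot, so c.example.com may not be pointed at a.example.com’s files at all. A sketch of what the generated config often looks like, with placeholder paths:

# Main account domain; if b.example.com was added as an alias or
# parked domain, it serves the same DocumentRoot, which matches
# the observed behaviour.
<VirtualHost *:80>
    ServerName a.example.com
    ServerAlias b.example.com
    DocumentRoot /home/user/public_html
</VirtualHost>

# A true addon domain gets its own document root. If that directory
# was never actually created, Apache answers every request with 404:
# "The requested URL / was not found on this server."
<VirtualHost *:80>
    ServerName c.example.com
    DocumentRoot /home/user/public_html/c.example.com
</VirtualHost>

Checking in the hosting control panel how b and c were created (alias versus addon) and whether c’s document root folder actually exists should confirm or rule this out.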

Can Cloudflare workers be used for heavy traffic?

I tried to find details about traffic limits on Cloudflare Workers; the only limit I found is on the number of requests.

Can I use them to serve extremely heavy traffic: 10, 20, 30, 50, 100, or even 1000 TB/month?

I’ve tried to find any ToS specific to Workers, but no luck… Can I use them for codec-critical video that I don’t want Cloudflare to transcode via their Stream service? Can I use them to serve images or other large files?

What are the limitations to avoid "hurting" the service and other users that may be sharing the same resources?

Using own SMTP server with GSuite

We’ve been having problems recently with our email ending up in junk folders. After some testing it turns out that (some of) the G Suite sending IPs have been blacklisted, which explains the partial deliverability.

Email is integral to our business, so we’d happily pay for a dedicated IP, but Google doesn’t offer that option.

Is there a way to force outgoing email to go through a custom SMTP server that we’ll set up with another provider that can offer dedicated IPs?

How does Facebook Ads bidding strategy decide ties?

If two entities target the same audience and set exactly the same bid strategy and bid amount, how does Facebook decide who wins the auction? What’s the tie-breaker logic?

Example:

Entity A sets its bid strategy to "Lowest Cost" and targets an audience with an interest in "Computers".

Entity B sets its bid strategy to "Lowest Cost" and targets an audience with an interest in "Computers".

Who wins the bid here? Entity A or Entity B?