I crawled my WordPress site using the Screaming Frog program and noticed a few issues that might have SEO implications. I was hoping someone could help identify the problem (if there is one).
N.B: I have obscured the domain name in the image as I do not want it made public.
The canonical URL for my domain is prefixed by https://www, but as you can see from the tree graph image below, there are a number of URLs accessible under the http protocol and the non-www version of the domain. Those URLs are non-indexable, but I’m curious as to why they’re accessible at all. I was thinking that shouldn’t happen if the proper redirects were in place.
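For comparison, this is a common set of .htaccess rules for forcing both HTTPS and the www host in a single 301 redirect (example.com stands in for the real, obscured domain; this is a generic sketch, not the poster’s actual file):

```apache
# Redirect any request that is either non-HTTPS or non-www
# to the canonical https://www host in one hop.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```

If rules like these are missing (or placed after WordPress’s own rewrite block), the http and non-www variants can remain reachable exactly as the crawl shows.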
In the WordPress admin settings I have entered the correct version of my domain for the WordPress and site addresses (i.e.,
It’s possible my .htaccess file might require editing, so I have pasted it below:
This is my website’s URL structure; is it good for SEO?
WTP (Water Treatment Plant)
In December 2020 I moved a site from a .co.uk to a .com. No other changes: a simple 301 redirect and use of the tool in Google Webmaster Tools. The site had existed since 2006 and was the de facto site in the vertical, meaning I had plenty of P1s (position-one rankings) on Google for medium-tail keywords.
Now, 2.5 months later, I am still hugely deranked for tens of thousands of keywords, and as a result I’ve dropped 40% of my traffic.
I was wondering whether anyone had any advice. Obviously this is a huge issue for me, as it’s my livelihood, built up over 15 years of hard work. I followed Google’s direction perfectly (my background is actually web dev and SEO) and it has still deranked me.
I’m not sure of the linking policy here, but the site is tyrereviews dot com (it used to be tyrereviews dot co dot uk). For example, we used to rank 1 or 2 for the search "michelin primacy 4" in Google UK; we are now on page 2.
The new .com domain is targeted properly to the UK in webmaster tools too.
This is the same for many, many medium-tail keywords.
I have a pricing page with SoftwareApplication structured data that has the sub-type WebApplication set. The structured data renders properly and is saved to the indexed page content, but I see only these enhancements in Search Console:
- Core Web Vitals
- Mobile Usability
Is there a way to include SoftwareApplication/WebApplication in the Enhancements section of Google Search Console?
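For reference, a minimal JSON-LD sketch of the markup described, declaring WebApplication as a sub-type of SoftwareApplication (the name, category, and price here are hypothetical placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": ["SoftwareApplication", "WebApplication"],
  "name": "Example App",
  "applicationCategory": "BusinessApplication",
  "operatingSystem": "Web",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD"
  }
}
```

Note that the Enhancements section only surfaces reports for markup types Google currently detects as rich-result candidates on the site, so valid markup alone does not guarantee a dedicated report appears there.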
I recently came across a very cheap protection service similar to Cloudflare called Cloudlayar: https://cloudlayar.com/
It’s a bit weird because the prices seem too good to be true, there’s no PayPal (so refunds are at their mercy), and they use Cloudflare to protect their own website…
Does anyone know what this company is? How good is their service? Is it a scam?
P.S.: I’m not asking for an "opinion"; I just want to know whether this service actually works and has customers.
Thanks in advance
My website is being outranked by a third-party website through an iframe of my site. Their website features mine in an iframe on a basic page consisting of a header, the iframe, and a footer.
The canonical is set on my website. They are ranking for my keywords, and I no longer show up on Google for most of them.
Could I be demoted because Google has deemed this website more trustworthy, or better? Or is this negative SEO at work? I have also found hundreds of sites that are direct clones of mine; could that do something?
I guess I’m grasping at straws looking for the cause.
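One common defensive step, independent of the ranking question, is to stop third-party pages from framing the site at all with an anti-framing response header. A sketch, assuming Apache with mod_headers enabled (adjust for your actual server):

```apache
# Ask browsers to refuse rendering this site inside
# frames on other origins.
Header always set X-Frame-Options "SAMEORIGIN"
# Modern equivalent via Content-Security-Policy:
Header always set Content-Security-Policy "frame-ancestors 'self'"
```

With these headers in place, the third-party page’s iframe would render empty in modern browsers, though this does not by itself undo any ranking effect already in place.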
Let’s say I have a site hosted on a.example.com. I’m adding an addon domain to the server (shared hosting), b.example.com, and pointing this domain to the server IP address. b.example.com serves the same site as a.example.com, since they point to the same server. That’s OK.
Then I add another addon domain, c.example.com, and point it to the server IP. When visiting c.example.com I get the error:
The requested URL / was not found on this server.
Can you tell me how this could happen? I don’t get it.
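One way this can happen with name-based virtual hosting: the server picks a vhost by matching the request’s Host header against each ServerName, and an unmatched host falls through to a default vhost whose document root may not contain the expected files. A sketch of what the relevant Apache config might look like (paths and vhost layout are assumptions, since shared hosts generate this automatically):

```apache
# Name-based virtual hosts sharing one IP address.
# If no ServerName matches the Host header, Apache serves
# the FIRST vhost defined, which may 404 for unknown paths.
<VirtualHost *:80>
    ServerName a.example.com
    DocumentRoot /var/www/a-site
</VirtualHost>
<VirtualHost *:80>
    ServerName c.example.com
    # This docroot must exist and contain an index document;
    # a missing or empty directory yields "URL / not found".
    DocumentRoot /var/www/c-site
</VirtualHost>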
I tried to find the details about traffic limit on Cloudflare workers, the only limit I found is by the number of requests.
Can I use it to serve extremely heavy traffic: 10, 20, 30, 50, 100, or even 1000 TB/month?
I’ve tried to find any ToS for Workers, but no luck… Can I use them for codec-critical video that I don’t want Cloudflare to transcode via their Stream service? Can I use them to serve images, or other large files?
What are the limitations to avoid "hurting" the service and other users that may be sharing the same resources?
We’ve been having problems recently with our email ending up in junk and after doing some testing it turns out that (some of) the GSuite IPs have been blacklisted, which explains the partial deliverability.
Email is integral to our business, so we’d happily pay for a dedicated IP, but Google doesn’t offer that option.
Is there a way to force outgoing email through a custom SMTP relay that we’ll set up with another provider that offers dedicated IPs?
If two entities are targeting the same audience and set exactly the same bid strategy and bid amount, how does Facebook decide who wins the bid? What’s the tie-breaker logic?
Entity A set bid strategy as "Lowest Cost" and targets an audience who has an interest in "Computers".
Entity B set bid strategy as "Lowest Cost" and targets an audience who has an interest in "Computers".
Who wins the bid here? Entity A or Entity B?