Help me to prevent my site from negative SEO

Hi guys,

I’m not sure if this is the right place to post my thread, but I’m desperate.
Since most of you here are familiar with software that creates automatic backlinks, I hope to find some help.

A few months ago my website started losing rankings in Google, and when I checked Google Search Console I found that my site had been hit by a negative SEO campaign – someone blasted a few thousand backlinks (mostly in Russian) with anchors like “forex”, “trading” and so on. I want to point out that my site has nothing to do with forex – it is a London-based man and van company. Just open Ahrefs and check my site and you will see these spam anchors are still there.

Then they somehow removed these spam links, and over 70% of my traffic disappeared. How is that possible?

I have checked all the backlinks one by one and added the spam ones to a disavow file, but the site doesn’t recover – traffic is now 70% lower than before the negative SEO campaign.
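For anyone checking their own disavow file against the same problem: Google’s format is plain text, one entry per line, with domain: entries to disavow an entire host and bare URLs for single pages. The domains below are made-up placeholders, not the actual spam sources:

```text
# spam domains from the negative SEO blast
domain:spam-example-1.ru
domain:spam-example-2.ru
# a single spammy page
http://spam-example-3.com/forex-links.html
```

Disavowing the whole domain is usually safer than listing individual URLs when a site links to you from thousands of pages.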

My question is: how can I protect the site from such negative SEO campaigns and, most importantly, how do I recover it?

Robots.txt for a multilanguage site where root is redirected

I have a site that offers two languages, English and Spanish. When the user navigates to the home page, it redirects you to /es if your browser language is Spanish, and to /en otherwise.

At the moment the robots.txt I have is:

User-agent: *
Allow: /

Sitemap:

because I’m defining all the hreflang alternate URLs in sitemap_languages.xml, and all URLs are also listed in sitemap.xml. My question is really about the configuration of robots.txt, because I’m not sure whether I should be allowing any user agent to crawl the / page at all. Since that page always redirects to the home page of either /en or /es, I believe it should be disallowed.
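For context, a sitemap entry declaring both language alternates looks like this (example.com stands in for the real domain):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/"/>
  </url>
  <url>
    <loc>https://example.com/es/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/"/>
  </url>
</urlset>
```

Each URL lists every alternate, including itself, so the annotations stay reciprocal.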

Should I then do:

User-agent: *
Disallow: /
Allow: /es
Allow: /en

Sitemap:

I’m not sure whether that could cause a crawl issue, or whether there is another way to achieve the same result.
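Whether those rules block /es and /en depends on rule precedence: Google picks the longest matching rule, so Allow: /es beats Disallow: /, but some parsers (including Python’s urllib.robotparser) apply rules in file order. Listing the Allow lines first keeps both interpretations in agreement. A quick sanity check with the standard library (paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Proposed rules, with Allow lines first so that both longest-match
# parsers (Google) and first-match parsers agree on the outcome.
rules = """\
User-agent: *
Allow: /en
Allow: /es
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/"))        # the redirecting root -> False
print(rp.can_fetch("*", "/es/"))     # Spanish home -> True
print(rp.can_fetch("*", "/en/faq"))  # an English page -> True
```

Note that disallowing / only stops crawling of the root URL; Google can still see the redirect target via the sitemap entries.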

Is running bash script that is taking arguments from site dialog box a good idea?

I’m building a site that will use the YouTube API to keep track of playlist changes. For third parties to use it, I would supply a dialog box in which the user types his/her playlist ID – this would be read and then passed as an argument to a bash script, which in turn runs curl/Python scripts to connect to the API (running on my machine) and another bash script that creates directories (mkdir) on my disk.

Does this potentially endanger me or my files somehow? Could someone input some magic command that would run “rm * -f” or a similar malicious endeavor? Should I use an external server instead of my machine?

I know nothing about security. I’ve read a few topics here but didn’t find a similar problem.
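The usual defence against exactly this risk is to whitelist-validate the input and never let it reach a shell as part of a command string. A minimal sketch in Python, assuming a hypothetical fetch_playlist.sh wrapper script and a typical playlist-ID shape (adjust the pattern to the IDs you actually see):

```python
import re
import subprocess

# YouTube playlist IDs are alphanumeric plus "-" and "_"; 13-42 chars
# covers the common formats. This is an assumption -- tighten as needed.
PLAYLIST_ID_RE = re.compile(r"[A-Za-z0-9_-]{13,42}")

def is_valid_playlist_id(candidate: str) -> bool:
    """Accept only strings that look like a playlist ID."""
    return bool(PLAYLIST_ID_RE.fullmatch(candidate))

def fetch_playlist(playlist_id: str) -> None:
    if not is_valid_playlist_id(playlist_id):
        raise ValueError("rejected: not a plausible playlist ID")
    # Pass the ID as its own argv element with shell=False, so the string
    # is never parsed by a shell -- "; rm -rf /" stays inert text.
    subprocess.run(["./fetch_playlist.sh", playlist_id],
                   shell=False, check=True)
```

The two layers are independent: even if the regex let something odd through, argv-style invocation means the value is only ever a literal argument, never interpreted syntax. The remaining risk is what fetch_playlist.sh itself does with "$1" – quote it there too.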

Should I use Two or One Instance of WordPress for Main & Blog Site? [closed]

I have a main site and a blog under a subfolder. Should I use two separate instances of WordPress for them, or a single instance?

I have asked this question on , but it didn’t draw much attention, so I’m asking here.

How can a site bypass VPN without using geolocation?

[I was instructed to repost if related questions didn’t answer my question. I’ll try to explain my question better.]

There’s a website and it shows two pieces of information when you chat with another user:

“IP location” and “Detected location”.

When I used a VPN (with WebRTC off), the “IP location” changed to the VPN country (as expected), but the “Detected location” showed my real country. So the site was able to bypass the VPN.

Does anyone understand how this is possible?

I tried several other sites, but not a single one was able to detect my real location. Sites like only saw the VPN location, so I know the VPN is solid. Some sites use geolocation (e.g., but for that the browser shows a notification asking for permission. In related questions there’s discussion about Google Maps, but on a fresh browser even GM can’t bypass the VPN (it shows the VPN location). Then, if you click on the small circle, the browser will ask to allow location permission. So unless you allow it, GM can’t get your location.

But in the case of this site, the browser didn’t ask for location permission (nor is there any permission indicator in the address bar).

So I’m curious to understand how this is possible.

Ready to make Money, Wallpaper Site

Grab this chance to own a wallpaper website with a beautiful design and unique content.

Ready to make money: there are 101 articles already published. The website is built with the latest WordPress, which is well regarded by search engines like Google. The domain name was chosen carefully – I only choose the best, high-value domain names; this is a premium domain name, so it has the potential to become a large and profitable business…


Selling a novel/manga site for a specific series

Why are you selling this site?
New projects that need $$$.

How is it monetized?
AdSense. Monthly revenue is about $80-$100. The revenue is still growing very quickly as the traffic is booming.

Does this site come with any social media accounts?

How much time does this site take to run?
1-2h every week.

What challenges are there with running this site?
None. Anyone can do this because the work is very simple.

Some other stats (PM me for…


Video hosting and CMS site


I have an exercise video program website currently built with Magento; the videos are hosted on Wistia.

Managing and updating the custom CMS that we have for the videos and members area is taking up a lot of development resources.

Does anyone know of any solution where we could host the videos and have a CMS area for the members? The user/member list would need to link up with our Magento store. We would still have the front-end Magento store; the customer…
