I have in excess of 500k verified URLs in sitelist_Blog Comment-General Blogs, with a lot of do-follows.
I want to attempt to post a comment to all the do-follows, and I don't care about keyword matching etc. I'm doing some parasite linking. When I try to set it up, it goes through the list eventually but only posts to about 100. I just don't get it. In Scrapebox you can just import a list of good URLs and post to them. How do I do that with GSA?
Hello everyone. I'm pretty new to GSA SER. I used it 3-4 years ago and thought I had good results back then, but I haven't used it since and everything is very rusty. So I'm watching tutorials, and I thought I had my first project ready to go, but it shows 0 submitted and over 300 verified. I think the verified are sites SER is finding from my keywords to target. But why isn't it showing any submissions?
Maybe my tutorial is outdated, I don't know. I would greatly appreciate any feedback, and possibly an up-to-date step-by-step I can go over to make sure everything is done correctly.
One other question while I'm here: when choosing to submit to articles and directories, why does it force you to enter content in the German and Polish categories? I don't remember being forced to do that 4 years ago.
SER Links' scraping, filtering, and syncing process is based on daily usage and the performance of our resources. We guarantee a monthly count of at least 500,000 verified links and 80,000 high-quality Majestic CF/TF links.
Our top priority is the quality of each link. We ensure that our links do not contain any pornographic, illegal, or unethical targets: our filters are optimized to keep such spam links out while processing only high-quality verified links.
SER Links only guarantees the quality of the links while they remain in the member's area. We take no part in any projects or activities conducted after they are used.
When checking the search-engine bot access logs on my VPS, I can see several Bingbot entries, but when I use the Verify Bingbot tool (in Bing Webmaster Tools), it says these IPs do NOT belong to Bingbot. Then I ran a whois lookup, and it shows the IPs belong to Microsoft Corporation.
What could be the reason? Is it a fake bot? If so, why does it add a "bingbot/2.0" string to the access log, and why are the IPs owned by Microsoft Corporation?
Here are a few of the IPs in question:
18.104.22.168 22.214.171.124 126.96.36.199
188.8.131.52 – – [25/Oct/2020:02:34:48 +0530] "GET /2019/08/install-sony-update.html/amp HTTP/1.1" 301 5 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
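Bing Webmaster Tools aside, Microsoft's documented way to verify Bingbot is a two-step DNS check: reverse-resolve the IP and confirm the hostname ends in search.msn.com, then forward-resolve that hostname and confirm it maps back to the same IP. A spoofed User-Agent string fails one of the two steps. Here is a minimal sketch in Python (the function names are mine, not from any tool):

```python
import socket

def is_bingbot_host(hostname: str) -> bool:
    # Genuine Bingbot reverse-DNS names end in search.msn.com
    return hostname.endswith(".search.msn.com")

def verify_bingbot(ip: str) -> bool:
    """Two-step DNS check for verifying Bingbot:
    1. Reverse-DNS the IP; the hostname must end in search.msn.com.
    2. Forward-resolve that hostname; it must map back to the same IP.
    A crawler that merely fakes "bingbot/2.0" in its User-Agent
    fails step 1 or step 2.
    """
    try:
        hostname = socket.gethostbyaddr(ip)[0]  # step 1: reverse DNS
    except OSError:
        return False
    if not is_bingbot_host(hostname):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # step 2: forward DNS
    except OSError:
        return False
    return ip in forward_ips
```

Running `verify_bingbot()` on each suspicious IP from the log should settle whether the traffic is genuine; an IP that is registered to Microsoft but does not reverse-resolve under search.msn.com may belong to another Microsoft service rather than the Bing crawler.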
The “Insert up to X verified links from project” function is excellent. However, when I choose 10, for example, all 10 URL anchors are the same! The 10 URLs created in the article are all different, but they all have the same anchor, “plumber Detroit”. This looks quite suspicious and can be quite a footprint.
So, if possible, can we have randomised keywords from the Keywords section? If I have 10 URLs created in a single article, then maybe it uses 10 different keywords. That way it looks more natural.
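GSA SER's internals aren't public, but the behaviour being requested is simple to sketch: draw distinct anchors at random from the keyword pool instead of reusing one. A minimal illustration in Python (the function name is mine):

```python
import random

def pick_anchors(keywords, n):
    """Return up to n distinct anchor texts drawn at random from the
    keyword pool, so multiple links in one article don't all share
    the same anchor (a recognisable footprint).
    """
    pool = list(dict.fromkeys(keywords))  # de-duplicate, keep order
    random.shuffle(pool)                  # randomise the selection
    return pool[:n]
```

If the pool holds fewer unique keywords than links, some repetition is unavoidable, which is why a reasonably large Keywords list would matter for this feature.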
It has now happened three times that my stats in the project area are gone. Yesterday evening I started the campaign and everything was fine.
Lots of links were submitted, but now when I look at the file where those get saved (Show URLs -> Submitted), only one URL is left. All the rest were somehow deleted, or are just gone; I don't know what happened.
See the image: 0 | 0
Is there anything that I’ve missed here maybe?
The software is up to date and everything appears to be working, but then, when I come back, the stats are simply gone.
Thanks for any help 🙂