I’ve been using SER for a few weeks now, so I’m still wet behind the ears. Excuse me if this question has been answered before, but I searched and couldn’t find the answer.
Something strange is happening with re-verification. When SER re-verifies links (I have it set to 1 day, or 2 days in some cases), it removes around 90% of previously verified links as not verified. The problem is, the links are still good — I can load them into the Scrapebox link verifier and it finds them right away.
I’ve also been digging into some of the links that failed the first verification, and it seems like around 30-40% of those are actually good too, even 2-3 weeks after first being flagged as dead.
Any thoughts on what’s happening? I’m using 50 Blazing dedicated proxies and good emails (catchalls from domains I own and from inbox.eu). I use GSA Captcha Breaker and 2Captcha for captcha solving, not that I think these have anything to do with it.
One last thing: is there any way to add incorrectly removed links back into a project as verified, just so it doesn’t mess up my tiering?
Thanks in advance for any help.