RESULTS FROM USERS
Review from Matt Borden (Scrapebox Support)
Wanted to post back and say that I have found great value in the list and am just now getting started. I find myself using the list regularly; my only issue is that sometimes I have to stop and focus, because this list provides so many options that I get overwhelmed, lol.
I have mostly used English, but I have merged in some of the other languages on tests. I love the idea of so many languages; for me, scraping with such diversity is huge. I'm testing Scrapebox 2.0 at the moment, and it's 64-bit (or it has a 64-bit version), so I have been able to merge more footprints with the large keyword lists and come up with millions of combinations. I plan on taking some of my good footprints and just working through the various languages.
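The footprint-merging step Matt describes, crossing each footprint with each keyword to form search queries, is essentially a Cartesian product, which is why the counts multiply into millions. A minimal sketch, using made-up placeholder footprints and keywords rather than anything from the actual lists:

```python
from itertools import product

# Hypothetical example data; real footprints and keyword lists
# would be loaded from files.
footprints = ['"powered by wordpress"', 'inurl:guestbook']
keywords = ["coffee", "hiking", "guitar"]

# One query per (footprint, keyword) pair, so the total number of
# queries is len(footprints) * len(keywords).
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

for q in queries:
    print(q)
```

With 2 footprints and 3 keywords this yields 6 queries; a few hundred footprints crossed with a multi-million-keyword list is where the "millions of combinations" come from.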
It's been weeks now and I am still only tapping less than 5% of the potential of this list. It's good stuff.
VIDEO RUN-THROUGH BY MATT HERE
Video Review from Devin s4ntos (GSA Support)
Using this list to build up my global site lists with SER. I loaded up the included English keyword list and some article footprints. So far I've scraped 7+ million URLs and haven't even scraped 1% yet.
Looks like I have enough keywords to scrape for the next year or so.
Review from kvmcable (BHW Reputable VIP)
I was fortunate enough to receive a review copy of this list, and to be honest I thought, yeah yeah yeah, another keyword list, big deal. Well, I downloaded it today, and at first glance I saw the download was 73 MB and immediately thought WTF. I have a lot of keyword lists, but nothing more than a few MB. What kind of list could possibly be 73 MB in size?
So, still a bit skeptical, I downloaded and unzipped the list; yes, 73 MB is the compressed size. I unzipped the file and was impressed once again: after extraction, the Keyword Lists totaled 312 MB!
Holy crap, I couldn't wait to check the lists. I have a lot of tools that use keyword lists, but currently I'm on a Tumblr scraping kick, so I grabbed a 55K list and dropped it into Gscraper to see what it dug up in unique Tumblr blogs. After about an hour I had the results: 208,807 unique Tumblr blogs.
I went about my usual routine and ran an HTTP test to see how many dead blogs were in my 208K list. It came in right at my usual 1-2% expectation, with 2,479 dead Tumblr blogs scraped from this small sample of FuryKyle's Keyword Lists.
Now came the real test: how many of my dead Tumblr blogs still had valid PR available? I was quite impressed to see this small (55K) keyword list uncover 523 dead Tumblr blogs with PR1-4 available, all in about an hour's worth of work!
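The HTTP test kvmcable describes, checking each scraped blog URL for a dead response, can be sketched with Python's standard library. This is a simplified illustration of the idea, not the tool he actually used, and the URL in the usage comment is hypothetical:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def is_dead(url, timeout=10):
    """Return True if the URL no longer resolves to a live page.

    A deleted Tumblr blog typically returns HTTP 404, so any 4xx/5xx
    response counts as dead here; a failed DNS lookup or refused
    connection is also treated as dead.
    """
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status >= 400
    except HTTPError as err:
        return err.code >= 400
    except URLError:
        return True  # unresolvable host / connection failure

# Usage (hypothetical URL; a live check requires network access):
# print(is_dead("https://some-deleted-blog.tumblr.com/"))
```

Running a check like this over the 208K scraped URLs is what separates the 2,479 dead blogs from the live ones; the PR lookup on those dead domains would be a separate step.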
I'm really impressed, and I haven't even put a dent in the lists, which contain over 1 billion keywords (for sure there are that many). I'm now considering another license of Gscraper just to work over these keyword lists. You could scrape 24/7 for weeks and not run out of keywords. I'm truly impressed and can't wait to toss some of these keywords into GSA.
For $20, this is a no-brainer for guys using tools like Gscraper and GSA for scraping and backlinking. The only thing that surprises me is why it isn't priced much higher. Honestly, it's that good!