Improving: Insert up to X verified links from project

The “Insert up to X verified links from project” function is excellent; however, when I choose 10, for example, all 10 URL anchors are the same! All 10 URLs created in the article are different, but they all have the same anchor, “plumber Detroit”. This looks quite suspicious and can be quite a footprint.

So, if possible, can we have randomised keywords from the Keywords section? Then, if 10 URLs are created in a single article, it would use 10 different keywords, which looks much more natural.
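Just to illustrate what I mean, here is a rough sketch in PHP (purely illustrative; pick_random_anchors() and the keyword list are hypothetical and not part of GSA SER):

// Illustrative only: choose a different anchor for each inserted URL by
// sampling the project's keywords without replacement.
function pick_random_anchors( array $keywords, int $count ) {
    shuffle( $keywords );                       // randomise the keyword order
    return array_slice( $keywords, 0, $count ); // one distinct anchor per URL
}

$keywords = array( 'plumber Detroit', 'Detroit plumbing company', 'emergency plumber Detroit' );
$anchors  = pick_random_anchors( $keywords, 3 ); // e.g. 3 different anchors for 3 URLs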

Scrapebox + Automator: How to grab links from hundreds of websites?

Hello,
I have Scrapebox and the premium plugin, Automator.
I have a text file with 800 domains (one domain per line).
I would like to Grab Links (a function in Automator) from each domain, one by one, with a crawl level of 4.
At the moment I create the “Grab Links” task manually, one task at a time.
So for 800 domains, I would need to create this task in Automator 800 times, with a different domain each time.
Does anyone know how I can grab links from 800 domains in Automator more quickly, please?
Thank you very much!
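In case it helps explain what I am after, here is the logic I am trying to automate, sketched as a standalone PHP script (not Scrapebox or Automator code; domains.txt and the depth value are just my placeholders):

<?php
// Illustrative sketch only: read one domain per line and collect links
// up to a crawl depth of 4, writing the results to one file per domain.
$domains = file( 'domains.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES );

function grab_links( $url, $depth, array &$seen ) {
    if ( $depth < 0 || isset( $seen[ $url ] ) ) {
        return array();
    }
    $seen[ $url ] = true;

    $html = @file_get_contents( $url );
    if ( false === $html ) {
        return array();
    }

    $doc = new DOMDocument();
    @$doc->loadHTML( $html );

    $links = array();
    foreach ( $doc->getElementsByTagName( 'a' ) as $a ) {
        $href = $a->getAttribute( 'href' );
        if ( '' === $href ) {
            continue;
        }
        $links[] = $href;
        // Only follow absolute links that stay on the same host.
        if ( parse_url( $href, PHP_URL_HOST ) === parse_url( $url, PHP_URL_HOST ) ) {
            $links = array_merge( $links, grab_links( $href, $depth - 1, $seen ) );
        }
    }
    return $links;
}

foreach ( $domains as $domain ) {
    $seen  = array();
    $links = array_unique( grab_links( 'http://' . $domain, 4, $seen ) );
    file_put_contents( $domain . '_links.txt', implode( "\n", $links ) );
}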

How to increase the number of daily links and LPM with global site lists?

Hello,
I have a question.
Is there any way to increase the number of daily links that GSA SER creates from global site lists? I use lists from https://www.serpowerlists.com/ and they are great, but usually I get only about 200-300 verified links per day, and LPM is 0.81. I do not set any limits on daily links in the settings:
https://prnt.sc/uz9uon
All filtering options are unchecked:
https://prnt.sc/uz9w8p
I connected dropbox folders to GSA SER:
Identified – Contextual_URLs
Submitted – Top_Tier_URLs
Failed – Verified_URLs
https://prnt.sc/uz9y63
I checked Identified, Submitted and Failed under “Use URLs from global site lists if enabled”:
https://prnt.sc/uz96tm
I run GSA SER on a Windows VPS with 4 cores and 8 GB RAM at 40 threads, with rotating dedicated proxies from https://stormproxies.com/rotating_reverse_proxies.html (the 40-thread proxy package), one catch-all e-mail from catchallboxes.com and XEvil for captcha recognition.
For some reason GSA SER mostly runs at 7-10 threads instead of the 40 set in the settings:
https://prnt.sc/uz99pq
CPU and RAM usage stay below 50%.
To see whether it would change the situation, I uploaded target URLs directly into the project (Import target URLs – From site lists – Identified, Submitted and Failed). GSA SER speeds up to 40 threads, but LPM is still low:
https://prnt.sc/uz9ejj
In total that is a large number of target URLs (713K), but I still get only about 200-300 daily links from the uploaded targets.
Sometimes I see the message “No targets to post to (no search engines chosen, no url extraction chosen, no scheduled posting)” in my projects:
https://prnt.sc/uz9q76
At the same time, the Dropbox folders with site lists are updated every few hours.
If I click “Show URLs – Show remaining target URLs”, it usually shows 0 URLs, or 8-10 URLs.
Is it possible to solve this problem? Am I doing something wrong?
Thank you!

I have a problem with broken links

I have a problem with broken links and I don’t know how to fix it. I will attach photos of the problem.

1. This image is from the view-page-source: https://ibb.co/tLhNKHT

2. This is a screenshot of the scan on the Semrush site: https://ibb.co/4Vf5fwV

The page I need help with: https://mwe100.com/lifestyle/the-values-of-a-healthy-lifestyle-examples-of-a-healthy-lifestyle/

Post pagination links ordered by meta value

Is there a way to order the next/previous post links, or any other pagination function, by meta value, as I have done in the posts listing page shown below?

$events = get_posts( array(
    'posts_per_page'   => -1,
    'post_type'        => 'post',
    'meta_key'         => 'date_time',
    'orderby'          => 'meta_value',
    'order'            => 'ASC',
    'suppress_filters' => false,
) );

The main goal is to have the next/previous event buttons in chronological order on the single post page (ordered not by the post date, but by the event date stored in a custom field).
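The rough workaround I am considering looks like the sketch below (just an illustration, assuming the same 'date_time' meta key and that every post has a value for it; my_get_adjacent_event() is my own hypothetical helper, not a WordPress function):

// Sketch: fetch all post IDs ordered by the 'date_time' custom field,
// find the current post in that list, and return its neighbour.
function my_get_adjacent_event( $direction = 'next' ) {
    $events = get_posts( array(
        'posts_per_page'   => -1,
        'post_type'        => 'post',
        'meta_key'         => 'date_time',
        'orderby'          => 'meta_value',
        'order'            => 'ASC',
        'fields'           => 'ids',
        'suppress_filters' => false,
    ) );

    $position = array_search( get_the_ID(), $events, true );
    if ( false === $position ) {
        return null;
    }

    $target = ( 'next' === $direction ) ? $position + 1 : $position - 1;

    return isset( $events[ $target ] ) ? get_post( $events[ $target ] ) : null;
}

// In single.php: link to the next event, if there is one.
$next_event = my_get_adjacent_event( 'next' );
if ( $next_event ) {
    printf( '<a href="%s">Next event</a>', esc_url( get_permalink( $next_event ) ) );
}

Querying every post on each page load feels wasteful, though, so I am wondering whether there is a cleaner or built-in way, e.g. something equivalent to next_post_link() but ordered by meta value.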

Long after the demise of Google Authorship, is it now both valid and viable for a document to include multiple <link rel="author"> elements?

When Google Authorship was very much still a thing several years ago, the conclusion was that it was better not to include more than one <link rel="author"> on any given page.

See:

  • 2012 – How to implement rel="author" on a page with multiple authors?
  • 2013 – Is Google OK with multiple rel="author" links?

Google Authorship is now a distant memory (Mountain View stopped using it several centuries ago in 2016) but I’m concerned that there may still be something invalid or nonsensical about including more than one <link rel="author"> in the <head> of a given document.

My use case involves referencing both an About Page and humans.txt:

<link rel="author" href="https://example.com/about-us/" /> <link rel="author" href="https://example.com/humans.txt" type="text/plain" /> 

Is there anything from the WHATWG – I can’t find explicit confirmation – to confirm that this is valid usage?

Or is there a viable alternative to using more than one <link rel="author"> element?