possible new Target URLs from present accounts.

I keep getting the same message thousands of times. I have deleted/blocked the domain/URL in the global system as well as in the specific project (after turning off all projects except one to isolate the problem).
The message that repeats again and again is:
15:46:00: [-] 1/1 PR-0 too low – http://www.gomaze-play.de/index.php?page=Register&action=register
15:46:00: [+] 001 possible new Target URLs from present accounts.
The domain is already listed under
project > options > skip sites with the following words in url/domain

Auto import urls

Hi (sorry for the noob question, but I haven't used this in a long time):
If I want to scrape URLs with Scrapebox or GScraper and feed them directly into SER (scraping continuously, unfiltered), how do I import the constantly refreshed list of URLs without interruption?
I remember you had to use the site list. But since it is a single file, I have a doubt: if I import it into my SER projects directly, will that do the trick (without the process stopping)?
Thank you

Use archive template for CPT but not generate urls for posts items

I need to be able to use the archive template for a custom post type while at the same time preventing URLs from being created for the "posts" themselves, and keeping those posts publicly visible.

I created the proper archive template and it works just fine, and the slug for the archive works great, but when I set rewrite to false I get a 404 error on the archive page. So it appears that method won't do. I could always create a page and query the posts in a page template, but I would prefer not to.

So is there a way to use the archive template while also keeping WordPress from creating URLs for the "posts" I create?

Below is the code I’m using to generate the CPT.

    function cptui_register_my_cpts_multi_fam_prop() {
        /**
         * Post Type: Multi-Family Properties.
         */
        $labels = array(
            "name"          => __( "Properties", "custom-post-type-ui" ),
            "singular_name" => __( "Property", "custom-post-type-ui" ),
        );
        $args = array(
            "label"                 => __( "Properties", "custom-post-type-ui" ),
            "labels"                => $labels,
            "description"           => "",
            "public"                => true,
            "publicly_queryable"    => true,
            "show_ui"               => true,
            "delete_with_user"      => false,
            "show_in_rest"          => true,
            "rest_base"             => "",
            "rest_controller_class" => "WP_REST_Posts_Controller",
            "has_archive"           => "multi-family-management/properties",
            "show_in_menu"          => "mf-menu",
            "show_in_nav_menus"     => true,
            "exclude_from_search"   => true,
            "capability_type"       => "post",
            "map_meta_cap"          => true,
            "hierarchical"          => false,
            "rewrite"               => false,
            "query_var"             => true,
            "supports"              => array( "title", "editor", "thumbnail" ),
        );
        register_post_type( "multi_fam_prop", $args );
    }
    add_action( 'init', 'cptui_register_my_cpts_multi_fam_prop' );

First time posting here, so school me up if I’m missing anything.

Only scrape if target urls < x

Right now I'm scraping far more URLs than I post to; the target URLs list just keeps getting bigger and bigger, and it's not submitting as much as I want.

It would be great to have more control over when it scrapes and posts, maybe an option to only scrape if the target URLs are below a certain threshold.

580000+ .edu & .gov URLs


I am sharing a list with you: 580,000+ .edu and .gov URLs.
Download Link
This is a password-protected file; please ask me for the password and I will give it to you personally, because I don't want to share the file publicly.
I did not filter the list, so there may be some duplicate URLs; please deduplicate it yourself. I really don't have much time for this. If you do that, please give the cleaned list back to me 😀
Thank you so much.

What benefit does Amazon get from including search results in product URLs? [closed]

https://www.amazon.com/JOYO-Wireless-Infinite-Sustainer-Handheld/dp/B07WZL5ZDK/ref=mp_s_a_1_6?keywords=heet+ebow&qid=1580579507&sr=8-6

Above is a link to a product. You can see in the URL the search query I used to find it. Amazon presumably gets enough benefit from this to have made that design decision. What are the possible benefits, if they're not already known? And what are the risks?
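To make concrete what data the URL carries, here is a short Python sketch (standard library only) that extracts the tracking parameters from the example link. The parameter names come from the URL itself; the meanings in the comments are informal guesses, not documented by Amazon:

```python
from urllib.parse import urlparse, parse_qs

# The example product URL from above.
url = ("https://www.amazon.com/JOYO-Wireless-Infinite-Sustainer-Handheld/"
       "dp/B07WZL5ZDK/ref=mp_s_a_1_6?keywords=heet+ebow&qid=1580579507&sr=8-6")

# parse_qs decodes the query string into a dict of lists ("+" becomes a space).
params = parse_qs(urlparse(url).query)

print(params["keywords"])  # -> ['heet ebow']   (the search terms typed)
print(params["qid"])       # -> ['1580579507']  (looks like a query timestamp/id)
print(params["sr"])        # -> ['8-6']         (appears to encode result position)
```

So anyone who shares or clicks such a link is also passing along the search context that led to the product, which is the kind of data the question is asking about.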

Multiple URLs in Google Analytics Goals Funnel Step

I have a landing page with 3 packages: 1 unit, 3 units, or 6 units of the product. Each option has its own checkout URL, but all of them will have /thank-you in the URL once purchased.

So they all start on the same LP but there are 3 possible paths to the goal.

If I select Regular Expression what exactly would I put in the Destination? Just /thank-you ?

And if you select regular expression, from what I understand you are supposed to use regex in the funnel steps as well.

Would it then be possible to use the or symbol, |, to separate the 3 possible second steps in the funnel step 2 field?

Something like: /(redoxol-h2|redoxol-h2-3-bottle-bundle|redoxol-h2-6-bottle-bundle)
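As a quick sanity check of that alternation (the bundle slugs here are taken from the example above; substitute your real checkout paths), a small Python sketch shows the pattern matching all three step-2 URLs, and a plain /thank-you pattern matching the converged goal page. This assumes partial (substring) matching, which is how Analytics evaluates regex in goals:

```python
import re

# Funnel step 2: alternation of the three checkout paths.
step2 = re.compile(
    r"/(redoxol-h2|redoxol-h2-3-bottle-bundle|redoxol-h2-6-bottle-bundle)"
)

checkout_pages = [
    "/redoxol-h2",
    "/redoxol-h2-3-bottle-bundle",
    "/redoxol-h2-6-bottle-bundle",
]
# All three checkout URLs match the single step-2 pattern.
assert all(step2.search(p) for p in checkout_pages)

# Goal destination: a plain /thank-you matches any URL containing it,
# which is what lets all three paths converge on one goal.
goal = re.compile(r"/thank-you")
assert goal.search("/redoxol-h2-3-bottle-bundle/thank-you")
```

Note that with substring matching, the short /redoxol-h2 alternative also matches the two bundle URLs as a prefix; that is harmless here because all three are valid step-2 pages anyway.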

I tried just adding the 3 paths as 3 different steps, but the data was all wrong. If I separate it into 3 different goals the data won't be accurate either.

Any ideas?