Suggestion for a NoSQL database for global users

We are planning to build a social media application, let's say similar to Instagram but at a much smaller scale, of course.
Users could be in many geographies (US, Europe, Asia) and could interact on the same post.
We are evaluating NoSQL databases for a use case where we want:

  • optimal reads, while keeping writes (let's say comments/likes) from becoming noticeably slow;
  • a flexible schema, preferably JSON documents;
  • modest transaction support, since it is needed in only a few operations and we can take a latency hit there.

We evaluated MongoDB. With replica sets we can serve reads from the local geography (say, an NA replica for NA users), but since there is a single master/primary, writes could be slow for some geographies (e.g., the primary is in Europe and the write comes from Australia).
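
For context, here is a minimal sketch of the setup we mean, shown with the PHP driver purely for illustration; the hostnames, database, and collection names are made up, not our real deployment:

    <?php
    require 'vendor/autoload.php'; // composer require mongodb/mongodb

    // One replica set spanning regions; readPreference=nearest lets each app
    // server read from whichever member has the lowest latency to it.
    $client = new MongoDB\Client(
        'mongodb://na-node.example.com,eu-node.example.com,ap-node.example.com'
        . '/?replicaSet=rs0&readPreference=nearest'
    );

    $comments = $client->selectDatabase('app')->selectCollection('comments');

    // Reads can be served from the nearest replica...
    $recent = $comments->find(['postId' => 42], ['limit' => 20]);

    // ...but every write still round-trips to the single primary, wherever
    // it happens to live. This is the part we are worried about.
    $comments->insertOne(['postId' => 42, 'user' => 'alice', 'text' => 'Nice shot!']);

Even with nearest reads, the insertOne above is the concern, since it always has to reach the primary.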
Any help on this would be great.

Note: If you feel this question is too broad or needs more info, please comment before downvoting! I will make sure to add any details that are needed.

Use global variables with Elementor page builder

I have created a shortcode in PHP, plus a sidebar widget that controls the results of that shortcode, and I need some global variables to pass data from the shortcode to the sidebar widget. Everything works fine, but when I use the Elementor page builder the global variables disappear! With another page builder everything works well. Any help would be appreciated, as I would like to use Elementor on my site. Cheers, Mo
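
For reference, the pattern I am using looks roughly like this (function, class, and variable names are illustrative, not my real code); note that every function touching the variable declares it with the global keyword first:

    <?php
    // Shortcode callback: writes the shared state.
    function mo_results_shortcode( $atts ) {
        global $mo_shared_filter;                     // declare before writing
        $mo_shared_filter = shortcode_atts( array( 'type' => 'all' ), $atts );
        return '<div class="mo-results">results markup here</div>';
    }
    add_shortcode( 'mo_results', 'mo_results_shortcode' );

    // Sidebar widget: reads the shared state.
    class Mo_Results_Widget extends WP_Widget {
        public function __construct() {
            parent::__construct( 'mo_results_widget', 'Mo Results Controls' );
        }
        public function widget( $args, $instance ) {
            global $mo_shared_filter;                 // declare before reading
            $type = $mo_shared_filter['type'] ?? 'all';
            echo $args['before_widget'] . esc_html( $type ) . $args['after_widget'];
        }
    }
    add_action( 'widgets_init', function () {
        register_widget( 'Mo_Results_Widget' );
    } );

One thing I suspect (purely a guess on my part) is render order: if Elementor outputs the widget area before it processes the post content, the shortcode has not run yet when the widget reads the global.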

Singleton Pattern & global var access alternatives

I know this is a controversial topic that has been discussed a lot, but I have not found a clear answer yet. There is a "Grid" in my game, which I implement as an array of GameObjects. This array has to be accessed by different scripts in my game. My approach is to have a GameManager, which is a singleton; the array is public, and every script that needs to read it or change it can do something like GameManager.Instance.GridArray[i,j]; .

Singletons and public variables are heavily criticized for extensibility and coupling issues. What approach would be better in this situation?
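
To make the question concrete, here is a rough sketch of the kind of alternative I keep reading about, constructor injection instead of GameManager.Instance (sketched in PHP only for brevity; the pattern is language-agnostic, and all names are made up):

    <?php
    // The grid owns its cells and exposes only the operations consumers need,
    // instead of a public raw array.
    final class Grid {
        private array $cells = [];
        public function __construct( private int $width, private int $height ) {}
        public function get( int $x, int $y ): mixed {
            return $this->cells[ $x ][ $y ] ?? null;
        }
        public function set( int $x, int $y, mixed $value ): void {
            if ( $x < 0 || $x >= $this->width || $y < 0 || $y >= $this->height ) {
                throw new OutOfRangeException( "($x,$y) is outside the grid" );
            }
            $this->cells[ $x ][ $y ] = $value;
        }
    }

    // Consumers receive the grid explicitly, so the dependency is visible
    // and can be swapped for a test double.
    final class PathFinder {
        public function __construct( private Grid $grid ) {}
        public function isBlocked( int $x, int $y ): bool {
            return $this->grid->get( $x, $y ) !== null;
        }
    }

    // Composition root: build the grid once, hand it to everything that needs it.
    $grid   = new Grid( 10, 10 );
    $finder = new PathFinder( $grid );

In Unity itself the constructor part would differ, since MonoBehaviours are not constructed directly; the injection would presumably happen through an initialization method or a serialized field instead.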

How to increase the number of daily links and LPM with global site lists?

Hello,
I have a question.
Is there any way to increase the number of daily links that GSA SER creates from global site lists? I use lists from https://www.serpowerlists.com/ and they are great, but usually I get only about 200-300 verified links per day and LPM is 0.81. I do not set any limits on daily links in settings:
https://prnt.sc/uz9uon
All filtering options are unchecked:
https://prnt.sc/uz9w8p
I connected Dropbox folders to GSA SER:
Identified – Contextual_URLs
Submitted – Top_Tier_URLs
Failed – Verified_URLs
https://prnt.sc/uz9y63
I checked Identified, Submitted and Failed, and enabled "Use URLs from global site lists if enabled":
https://prnt.sc/uz96tm
I run GSA SER on a Windows VPS with 4 cores and 8 GB RAM at 40 threads, with rotating dedicated proxies from https://stormproxies.com/rotating_reverse_proxies.html (40-thread proxy package), one catch-all e-mail from catchallboxes.com, and XEvil for captcha recognition.
For some reason GSA SER runs mostly at 7-10 threads instead of the 40 set in settings:
https://prnt.sc/uz99pq
CPU and RAM usage are not higher than 50%.
To see whether it would change the situation, I uploaded target URLs directly into the project (Import target URLs – From site lists – Identified, Submitted and Failed). GSA SER speeds up to 40 threads, but LPM is still low:
https://prnt.sc/uz9ejj
In total that is a large number of target URLs (713K), but I still get only about 200-300 daily links even when importing the targets directly.
Sometimes I see the message “No targets to post to (no search engines chosen, no url extraction chosen, no scheduled posting)” in my projects:
https://prnt.sc/uz9q76
At the same time, the Dropbox folders with site lists are updated every few hours.
If I click “Show URLs – Show remaining target URLs”, it usually shows 0 URLs, or 8-10 URLs.
Is it possible to solve this problem? Am I doing something wrong?
Thank you!

With respect to differential privacy, how do I find the global sensitivity of queries like ‘maximum height’ or ‘average height’?

As far as I understand, for any query f, we need to take the maximum of |f(x) - f(y)| over all neighboring databases x and y.

Please explain how to find the global sensitivity of queries like average height or maximum height.
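
For concreteness, here is how far I get with the usual textbook assumptions (both of which are my assumptions, not part of the definition above): heights lie in [0, H], and neighboring databases differ by replacing one record in a database of fixed size n.

    % Global sensitivity, as defined above:
    \Delta f = \max_{x \sim y} \lvert f(x) - f(y) \rvert

    % Average height: neighbors x, y differ in a single record, and one
    % height can change by at most H, so the sum moves by at most H and
    \Delta f_{\text{avg}} = \max_{x \sim y} \lvert \bar{h}(x) - \bar{h}(y) \rvert \le \frac{H}{n}

    % Maximum height: a single replaced record can move the maximum from 0
    % all the way up to H (x all zeros, y with one record equal to H), so
    \Delta f_{\text{max}} = H

Is that the right way to go about it?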

WooCommerce multisite Global Search Premium integration

I'm trying to integrate the Global Search Premium plugin (link to plugin) into my site, but some errors occur! This is the PHP function to modify; can someone help me integrate it? PS: When I contacted the seller, they answered with this guide and offered no other kind of support!

    public function ajax_suggestions() {
        if ( apply_filters( 'basel_search_by_sku', basel_get_opt( 'search_by_sku' ) ) && basel_woocommerce_installed() ) {
            add_filter( 'posts_search', array( $this, 'product_ajax_search_sku' ), 10 );
        }

        $allowed_types = array( 'post', 'product', 'portfolio' );
        $post_type     = 'product';

        $query_args = array(
            'posts_per_page' => 5,
            'post_status'    => 'publish',
            'post_type'      => $post_type,
            'no_found_rows'  => 1,
        );

        if ( ! empty( $_REQUEST['post_type'] ) && in_array( $_REQUEST['post_type'], $allowed_types ) ) {
            $post_type               = strip_tags( $_REQUEST['post_type'] );
            $query_args['post_type'] = $post_type;
        }

        if ( $post_type == 'product' && basel_woocommerce_installed() ) {
            $product_visibility_term_ids = wc_get_product_visibility_term_ids();
            $query_args['tax_query'][]   = array(
                'taxonomy' => 'product_visibility',
                'field'    => 'term_taxonomy_id',
                'terms'    => $product_visibility_term_ids['exclude-from-search'],
                'operator' => 'NOT IN',
            );

            if ( ! empty( $_REQUEST['product_cat'] ) ) {
                $query_args['product_cat'] = strip_tags( $_REQUEST['product_cat'] );
            }
        }

        if ( 'yes' === get_option( 'woocommerce_hide_out_of_stock_items' ) && $post_type == 'product' ) {
            $query_args['meta_query'][] = array(
                'key'     => '_stock_status',
                'value'   => 'outofstock',
                'compare' => 'NOT IN',
            );
        }

        if ( ! empty( $_REQUEST['query'] ) ) {
            $query_args['s'] = sanitize_text_field( $_REQUEST['query'] );
        }

        if ( ! empty( $_REQUEST['number'] ) ) {
            $query_args['posts_per_page'] = (int) $_REQUEST['number'];
        }

        $results = new WP_Query( apply_filters( 'basel_ajax_search_args', $query_args ) );

        if ( basel_get_opt( 'relevanssi_search' ) && function_exists( 'relevanssi_do_query' ) ) {
            relevanssi_do_query( $results );
        }

        $suggestions = array();

        if ( $results->have_posts() ) {
            if ( $post_type == 'product' && basel_woocommerce_installed() ) {
                $factory = new WC_Product_Factory();
            }

            while ( $results->have_posts() ) {
                $results->the_post();

                if ( $post_type == 'product' && basel_woocommerce_installed() ) {
                    $product = $factory->get_product( get_the_ID() );

                    $suggestions[] = array(
                        'value'     => get_the_title(),
                        'permalink' => get_the_permalink(),
                        'price'     => $product->get_price_html(),
                        'thumbnail' => $product->get_image(),
                        'sku'       => $product->get_sku() ? esc_html__( 'SKU:', 'basel' ) . ' ' . $product->get_sku() : '',
                    );
                } else {
                    $suggestions[] = array(
                        'value'     => get_the_title(),
                        'permalink' => get_the_permalink(),
                        'thumbnail' => get_the_post_thumbnail( null, 'medium', '' ),
                    );
                }
            }

            wp_reset_postdata();
        } else {
            $suggestions[] = array(
                'value'     => ( $post_type == 'product' ) ? esc_html__( 'No products found', 'basel' ) : esc_html__( 'No posts found', 'basel' ),
                'no_found'  => true,
                'permalink' => '',
            );
        }

        echo json_encode( array(
            'suggestions' => $suggestions,
        ) );

        die();
    }

What are some ways I can use Project and Global Verified lists?

I really haven’t played with cross-posting on different projects’ verified lists, or used the Global Verified (or the other three) lists.
I understand that the global list may have posts on niche-specific targets, like pinball machines, tulip bulbs, and lightbulbs, so would it be weird to post a new site that focuses on vegan food using the global list, or one of the other niche projects’ lists?
Are there any negatives to using my own verified lists?
Could it be perceived as spam if I post multiple links to different pages of the same domain on a single target?

What are some of the alternatives to having global state?

I watched some of the Google Tech Talks and became convinced that global state is an awful thing. Now I’m not sure what the alternative should be.

Suppose, for example, that I have a logger and a graph, and for some reason each node in the graph can log its value to the logger. Before, with global state, I could just have a global log object and call the “log” function on it from every node. But now, since I don’t have global state, I can’t think of any solution other than keeping a reference to the log object inside each node. If the graph has only 100 nodes, that won’t be such a big deal. But suppose the graph has a couple of million nodes: I’m not sure whether a reference to the log object in every node is a good idea.

What are some of the alternatives to a global object that is used by a lot of other objects? Having a reference in each of them is one, but are there other ways that would be more efficient?
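
One shape I have considered, to make the question concrete (a sketch with made-up names): instead of each node storing a logger reference, the traversal that needs logging takes the logger as a parameter, so millions of nodes cost zero extra references:

    <?php
    interface Logger {
        public function log( string $message ): void;
    }

    final class Node {
        /** @var Node[] */
        public array $neighbors = [];
        public function __construct( public int $value ) {}
    }

    final class Graph {
        /** @var Node[] */
        private array $nodes = [];

        public function add( Node $node ): void {
            $this->nodes[] = $node;
        }

        // The logger is an argument of the operation, not a field of Node:
        // no per-node storage, and no global.
        public function logValues( Logger $logger ): void {
            foreach ( $this->nodes as $node ) {
                $logger->log( (string) $node->value );
            }
        }
    }

With this shape the graph and its nodes store no logger at all; only the call site needs one.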

Can global adversaries ‘de-anonymize’ any Tor user in a day?

Tor traffic correlation attacks by global adversaries

I know what a traffic correlation attack is, but I find it hard to understand this article or how it reached its conclusions about ‘de-anonymizing’ a ‘typical web user’ of Tor within a day, given only the ability to monitor enough web traffic. I also don’t get how dark markets and child porn still exist on onion sites if all it takes is a little cooperation to de-anonymize everyone. The article is from 2013, so governments have had a long time to do it.

In the end, all the ‘global adversaries’ can see is traffic volume and timing (affected by some timing noise), right? So if you are just a typical web user who connected a few times (let’s say 5 times) to an average website/webpage with an average size of 700 kB, who is to say that you are one of the few Tor users who visited a website/webpage of that size 5 times around the time you did? Am I missing something here?
At the end all what the ‘global adversaries’ can see is traffic volume and timing(that is affected by some timing noise) right? so if you are just a typical web user who just connected few times(lets say 5 times) to an average website/webpage with an average size of 700kb who said that you are one of the few TOR users who visited a website/webpage in this size 5 times around the time you did it?. I am missing something here?.