How can a site bypass a VPN without using geolocation?

[I was instructed to repost if related questions didn’t answer my question. I’ll try to explain my question better.]

There’s a website that shows two pieces of information when you chat with another user:

“IP location” and “Detected location”.

When I used a VPN (with WebRTC turned off), the “IP location” changed to the VPN country (as expected), but the “Detected location” still showed my real country. So the site was able to bypass the VPN.

Does anyone understand how this is possible?

I tried several other sites, and not a single one was able to detect my real location; they only saw the VPN location. So I know the VPN is solid. Some sites use the Geolocation API, but for that the browser shows a notification asking for permission. Related questions discuss Google Maps, but in a fresh browser even Google Maps can’t bypass the VPN (it shows the VPN location). Only if you click the small locate circle does the browser ask for location permission, so unless you allow it, Google Maps can’t get your location.

But in the case of this site, the browser never asked for location permission (nor is there any permission indicator in the address bar).

So I’m curious: how is this possible?
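One signal that fits what the question describes is the browser’s IANA time zone: it is exposed to every page via Intl with no permission prompt, and a VPN does not change it. A minimal sketch of how a site could compare it against the IP-derived location (the helper names here are illustrative, not from any particular site):

```javascript
// The browser reports its configured time zone with no permission prompt.
// A VPN changes the IP, but not this value.
function detectedRegion() {
  // e.g. "Europe/Berlin" even when the VPN exit node is in the US
  return Intl.DateTimeFormat().resolvedOptions().timeZone;
}

// Comparing the IP-derived time zone with the browser-reported one
// reveals a likely VPN user when the continents disagree.
function looksLikeVpn(ipCountryTz, browserTz) {
  // ipCountryTz: a time zone consistent with the IP's geolocation
  // browserTz: the time zone reported by the browser itself
  return ipCountryTz.split("/")[0] !== browserTz.split("/")[0];
}
```

Other permissionless signals (locale, language headers, clock offset) can be combined the same way; none of them require the Geolocation API or its prompt.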

Are geolocation-aware actors possible?

Is it possible to have geolocation-aware actors in the Actor model?

To be specific, is it possible to achieve this using Akka?

Basically, is it possible to have Akka actors that are aware they are deployed to the EU zone, for instance, while other actors are aware they are deployed to North America, with all of them part of the same distributed application?

Is this something that is already possible “out of the box”? Or not? Or is this even a redundant question to ask in the context of actors and distributed systems?
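Setting Akka specifics aside, the general idea the question asks about can be sketched language-agnostically: each actor is constructed with a zone taken from its deployment configuration, and a router dispatches messages to actors in the requested zone. This is a minimal illustrative sketch (not Akka, and none of these names come from any Akka API):

```javascript
// Minimal sketch of zone-aware actors. Each actor carries the zone it
// was deployed to; a router picks an actor by zone.
class ZoneAwareActor {
  constructor(name, zone) {
    this.name = name;
    this.zone = zone; // e.g. "eu-west" or "na-east", from deployment config
  }
  receive(msg) {
    return `${this.name}@${this.zone} handled: ${msg}`;
  }
}

class ZoneRouter {
  constructor(actors) { this.actors = actors; }
  route(zone, msg) {
    const actor = this.actors.find(a => a.zone === zone);
    if (!actor) throw new Error(`no actor deployed in zone ${zone}`);
    return actor.receive(msg);
  }
}
```

The design point is that zone awareness is just metadata injected at deployment time plus zone-based routing; nothing in the actor model itself forbids it.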

Geolocation and SEO – Store/State based websites

We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we geolocate the closest store to the customer and direct them to that store (selected based on a cookie and geolocation). All our stores carry a consistent range of products, with variation in the 25% range. I have a few questions:

1) How do we build a sitemap? Since a store must always be selected, and the flow is the same for bots and users, should the sitemap include all products across all stores? We allow users to find any product in any store if they search by product identifier, but they can only see the products available in a particular store if they go through the hierarchical journey of the website.

2) Will the bot crawl pages across all stores, or, since it will be geolocated to one store, will only that store’s content be indexed?

3) We also allow customers to search for older products they may have bought a few years ago that are no longer part of our catalogue. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
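On question 1, one common approach is to give every store/product pair its own crawlable URL and list them all in the sitemap, so the geolocated redirect never hides a store’s catalogue from crawlers. A minimal sketch, assuming a hypothetical `/stores/<id>/products/<sku>` URL scheme (not from the question):

```javascript
// Emit one sitemap URL per store/product pair so every store's
// catalogue is reachable regardless of geolocated redirects.
function sitemapEntries(baseUrl, stores) {
  const entries = [];
  for (const store of stores) {
    for (const sku of store.products) {
      entries.push(`${baseUrl}/stores/${store.id}/products/${sku}`);
    }
  }
  return entries;
}

// Wrap the URLs in the standard sitemaps.org XML envelope.
function toSitemapXml(urls) {
  const body = urls.map(u => `  <url><loc>${u}</loc></url>`).join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${body}\n</urlset>`;
}
```

Whether duplicate products across stores then need canonical URLs is a separate SEO decision that depends on how similar the per-store pages are.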

I have done quite a bit of research on geolocation, but I have only found material about country-level websites and SEO factors, not about targeting within a country or state.

Any help will be greatly appreciated.

Thanks – Costa

Rails 5: Omniauth with devise and html5 geolocation issue

I’m using Devise and have just added the OmniAuth feature to a Rails app.

gem 'omniauth-facebook'
gem 'omniauth-twitter'
gem 'activerecord-session_store'


devise_for :users, controllers: {
  registrations: 'users/registrations',
  sessions: 'users/sessions',
  omniauth_callbacks: 'users/omniauth_callbacks'
}


config.omniauth :facebook, 'App_id', 'App_secret',
                callback_url: "https://localhost:3000/users/auth/facebook/callback",
                scope: 'email', info_fields: 'email, name'
config.omniauth :twitter, 'App_id', 'App_secret',
                callback_url: "https://localhost:3000/users/auth/twitter/callback"


devise :database_authenticatable, :registerable, :recoverable, :timeoutable,
       :rememberable, :validatable, :trackable, :lockable, :omniauthable,
       omniauth_providers: [:facebook, :twitter], authentication_keys: {email: true}

def self.create_from_facebook_data(facebook_data)
  where(provider: facebook_data.provider, uid: facebook_data.uid).first_or_create do |user|
    user.email = facebook_data.info.email
    user.name = facebook_data.info.name
    user.image = facebook_data.info.image
    user.password = Devise.friendly_token[0, 20]
  end
end

def self.create_from_twitter_data(twitter_data)
  where(provider: twitter_data.provider, uid: twitter_data.uid).first_or_create do |user|
    user.email = twitter_data.info.email
    user.name = twitter_data.info.name
    user.image = twitter_data.info.image
    user.password = Devise.friendly_token[0, 20]
  end
end


class Users::OmniauthCallbacksController < Devise::OmniauthCallbacksController
  # facebook callback
  def facebook
    @user = User.create_from_facebook_data(request.env['omniauth.auth'])
    if @user.persisted?
      sign_in_and_redirect @user
      set_flash_message(:notice, :success, kind: 'Facebook') if is_navigational_format?
    else
      flash[:error] = 'There was a problem signing you in through Facebook. Please register or try signing in later.'
      redirect_to new_user_registration_url
    end
  end

  # twitter callback
  def twitter
    @user = User.create_from_twitter_data(request.env['omniauth.auth'])
    if @user.persisted?
      sign_in_and_redirect @user
      set_flash_message(:notice, :success, kind: 'Twitter') if is_navigational_format?
    else
      flash[:error] = 'There was a problem signing you in through Twitter. Please register or try signing in later.'
      redirect_to new_user_registration_url
    end
  end

  def failure
    flash[:error] = 'There was a problem signing you in. Please register or try signing in later.'
    redirect_to new_user_registration_url
  end
end

I can log in using a social media account and everything works just fine up to this point, but I have noticed a bug that I can’t figure out.

I’m using HTML5 geolocation to get the user’s location and then run a search based on it. To request and use the geolocation permission, I implemented this:

function getLocation() {
  if (navigator.geolocation) {
    navigator.geolocation.getCurrentPosition(setGeoCookie, displayError);
  } else {
    alert("Geolocation is not supported by this browser.");
  }
}

function setGeoCookie(position) {
  var cookieName = "lat_lng";
  var now = new Date();
  var time = now.getTime();
  time += 3600 * 5000;
  now.setTime(time);
  var cookie_val = position.coords.latitude + "|" + position.coords.longitude;
  document.cookie = cookieName + "=" + cookie_val + '; expires=' + now.toUTCString() + '; path=/';
  $("#search-form").trigger('submit.rails');
}

function displayError(error) {
  var errors = {
    1: 'You need to click on allow.',
    2: 'Position unavailable',
    3: 'Request timeout'
  };
  alert("Error: " + errors[error.code]);
  location.reload();
}

Now, when I test on localhost using Safari, something weird happens: I can log in with a social media account, but the error 2: 'Position unavailable' pops up when I try to run a search. This only happens in Safari; in Firefox it works just fine.

Any ideas what might be causing this issue in Safari on localhost, and how I can solve it?
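Whatever the root cause in Safari turns out to be, the current `displayError` handler makes it worse by calling `location.reload()` on every failure. A gentler sketch, assuming an illustrative `FALLBACK_COORDS` default (not from the original code), is to fall back when the browser reports `POSITION_UNAVAILABLE` (code 2) instead of reloading:

```javascript
// Fallback used when the browser cannot resolve a position.
// The 0/0 coordinates are a placeholder assumption for illustration.
const FALLBACK_COORDS = { latitude: 0, longitude: 0 };

// Wraps getCurrentPosition: delivers real coords on success, and the
// fallback when geolocation is unsupported or reports code 2
// (POSITION_UNAVAILABLE), the error seen in Safari on localhost.
function resolvePosition(geolocation, onCoords) {
  if (!geolocation) { onCoords(FALLBACK_COORDS); return; }
  geolocation.getCurrentPosition(
    pos => onCoords(pos.coords),
    err => {
      if (err.code === 2) onCoords(FALLBACK_COORDS);  // don't reload; degrade
      else alert("Geolocation error: " + err.code);
    }
  );
}
```

In the page this would be called as `resolvePosition(navigator.geolocation, setGeoCookieFromCoords)`, keeping the cookie-writing logic separate from the error handling.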

SharePoint Online Multi-Geo: SPAppWebUrl is not updated after moving a site to a new geo location

I am developing a provider-hosted app on SharePoint Online. When I move a site that has my app installed to a different geo location (e.g. from the US to Japan), the app moves to the new location as well. However, when I load the app in the new location, the SPAppWebUrl is still the URL of the old location, so my .aspx pages are fetched from the old location and return 404 Not Found. Is there anything I can do so that SPAppWebUrl uses the new location?
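One thing worth checking is where the app reads SPAppWebUrl from: SharePoint passes it (alongside SPHostUrl) as a query-string parameter on each request to a provider-hosted page, so a value persisted in a session or database before the move will keep pointing at the old geo. A minimal sketch of reading it fresh on every load (the function name is illustrative):

```javascript
// Read SPAppWebUrl from the current request's query string on every
// page load, rather than reusing a value stored before the geo move.
function getSpAppWebUrl(search) {
  const params = new URLSearchParams(search);
  const url = params.get("SPAppWebUrl");  // already percent-decoded
  if (!url) throw new Error("SPAppWebUrl not present in query string");
  return url;
}
```

In the page this would be `getSpAppWebUrl(window.location.search)`; if the parameter itself still carries the old URL after the move, the stale value is coming from SharePoint rather than from app-side caching.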

Incorrect display language based on user geolocation in Google Maps

Good day. I could not find an answer to this question and want to know what the experts know about this problem. I created a Google map with markers in Greece, put it on my website, and asked website users from different countries to report what language they see the map and markers in:

- from Germany: English text
- from England: Russian text
- from Ukraine: Russian text
- from Ukraine: English text

Since the site is international, I expected Google Maps to determine each user’s geolocation and show the map in their language. Instead, people see city names in languages completely inconsistent with their location. Our users mostly speak English, and of course other languages, but Russian is definitely not on the priority list, so I don’t want it shown to users who don’t know it. How is this problem solved? Has anyone already solved it? Does it have a solution, or is it not possible yet? Google Maps support directed me to ask all technical questions on this site, so I hope Google technical support, or users who are aware of this problem, can answer or suggest whether it has a solution.
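Assuming the map is embedded via the Maps JavaScript API, the API does accept a documented `language` parameter on its script URL, which overrides Google’s server-side guess. A sketch of building that URL from the visitor’s own browser language (`YOUR_API_KEY` is a placeholder):

```javascript
// Build the Maps JavaScript API script URL with an explicit language,
// taken from the visitor's browser setting instead of Google's guess.
function mapsScriptUrl(apiKey, navLanguage) {
  const language = (navLanguage || "en").split("-")[0];  // "de-DE" -> "de"
  return "https://maps.googleapis.com/maps/api/js" +
         `?key=${apiKey}&language=${language}`;
}
// In the page: create a <script> tag whose src is
// mapsScriptUrl("YOUR_API_KEY", navigator.language).
```

The language is fixed at load time, so a page that wants to switch languages has to reload the API script; this is a limitation of the API rather than of the sketch.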