How do I get Google or Bing Maps to give me turn-by-turn directions on a laptop?

Our company uses rugged laptops in our customers’ delivery trucks.

I can map the destination without an issue and it will give us directions to it, but we cannot get it to do live turn-by-turn directions. It works without issue on our phones, but the map doesn’t follow us when we use the laptops.

Is there something that needs to be enabled on the laptops? They have internet via a cellular network. Could that have something to do with it?
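One thing I can check (just a guess on my end, not a confirmed cause): whether the browser on the laptops reports a live position at all. A quick test using the standard browser Geolocation API:

    // Quick test (run in the browser console on one of the laptops):
    // live turn-by-turn needs a stream of position updates, not a one-off
    // IP/cell-based fix. watchPosition should keep firing as the truck moves.
    if ('geolocation' in navigator) {
        const watchId = navigator.geolocation.watchPosition(
            (pos) => {
                // GPS-grade fixes are usually accurate to a few metres;
                // a cellular/IP-only fix can be off by hundreds of metres.
                console.log(pos.coords.latitude, pos.coords.longitude,
                    'accuracy (m):', pos.coords.accuracy);
            },
            (err) => console.error('No live position:', err.message),
            { enableHighAccuracy: true }
        );
        // navigator.geolocation.clearWatch(watchId); // stop the test when done
    } else {
        console.log('This browser does not expose the Geolocation API at all.');
    }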

Any information would be greatly appreciated!

How to deal with online position errors for the Alberta Rural Addressing system in Google Maps, Bing Maps, Apple Maps, and Waze

Alberta has a system for rural addresses described here:

Alberta Rural Addressing System

This system allows a location to be calculated from the address, the Dominion Land Survey grid, and the name of a nearby municipality. For example, the address

50042 Range Road 31, Warburg, Alberta is 42/80 of a mile north of Township Road 500, near Warburg. (Addresses are unique only within roughly a 60-mile radius.)
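Just to illustrate the arithmetic, here is a rough sketch of the rule as stated above (my own placeholder code, not anything the mapping services expose; the real system has more cases than this):

    // Sketch of the stated rule: in "50042 Range Road 31", the leading digits
    // name the township road to the south (500) and the trailing two digits
    // give the distance north of it in 1/80ths of a mile (42/80 mi).
    interface RuralAddress {
        townshipRoad: number;  // e.g. 500
        milesNorth: number;    // e.g. 42 / 80 = 0.525
        rangeRoad: number;     // e.g. 31
    }

    function parseRuralAddress(address: string): RuralAddress | null {
        const match = /^(\d{3})(\d{2})\s+Range Road\s+(\d+)/i.exec(address.trim());
        if (!match) {
            return null; // not in the "NNNNN Range Road NN" form
        }
        return {
            townshipRoad: parseInt(match[1], 10),
            milesNorth: parseInt(match[2], 10) / 80,
            rangeRoad: parseInt(match[3], 10),
        };
    }

    // parseRuralAddress('50042 Range Road 31, Warburg, Alberta')
    //   -> { townshipRoad: 500, milesNorth: 0.525, rangeRoad: 31 }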

If the address is one that the mapping service already knows about, all of them seem to plot it correctly. However, if you give an address that is unknown to the service, each plots an arbitrary point on a segment of the road near the municipality.

This can be extreme. For example:

  • 52420 Range Road 12, Stony Plain, Alberta

  • 51020 Range Road 12, Stony Plain, Alberta (10 miles south of the first location)

  • 62420 Range Road 12, Stony Plain, Alberta (60 miles north of the first location)

all plot in the wrong place.

Worse still: none of the four services mentioned tells you that the address couldn’t be found.

I have sent feedback to Google and to Bing about this on several occasions.

Submit your website to 775 search engines and directories – Google, Bing and many more for $7

Dears, submitting your website to many search engines will increase your website traffic. Our service covers all the major search engines, including Google and Bing, plus hundreds of relevant free directories from around the world that your website can be listed in. Our proprietary search engine submission automatically requests inclusion of your website, saving you hours of manually submitting to each directory one by one. All I need is your website URL; leave the submitting to me, and you will receive a full list of the search engines and directories your website was submitted to. Regards



Do search engines like Google or Bing search through all web pages for a single query every time?

Here is what I imagine would happen when I type in a search query in Google or Bing:

  1. The search query is converted into a vector by some pre-trained machine learning model. The vector captures semantic features, etc.

  2. The search engine goes through all the webpages it has ever crawled and computes a similarity score between each webpage and my search query, based on the vectors of the query and the webpage. (Assuming the search engine has already pre-computed the semantic feature vector for every webpage it crawled.)

  3. The search engine ranks all the webpages by similarity score and returns the ranked list to me (rough code sketch below).
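In code, the brute-force version I am imagining looks roughly like this (embed() and pageVectors are placeholders for things I assume exist internally; this is not any real Google or Bing API):

    // Naive version of steps 1-3: embed the query, score every crawled page by
    // cosine similarity against its pre-computed vector, sort, return the top K.
    type PageVector = { url: string; vector: number[] };

    declare function embed(text: string): number[];  // pre-trained model (placeholder)
    declare const pageVectors: PageVector[];         // every crawled page (placeholder)

    function cosine(a: number[], b: number[]): number {
        let dot = 0, normA = 0, normB = 0;
        for (let i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    function search(query: string, topK: number = 10): PageVector[] {
        const q = embed(query);                                       // step 1
        return pageVectors
            .map((page) => ({ page, score: cosine(q, page.vector) })) // step 2
            .sort((a, b) => b.score - a.score)                        // step 3
            .slice(0, topK)
            .map((x) => x.page);
    }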

To me, searching through all webpages like this just seems too expensive, and I find it hard to believe the searching and ranking are all done in a few milliseconds.

Can someone comment?

SPFx webpart with Bing Maps

I have SharePoint 2019, and I am trying to create a webpart that will display a Bing Map. I tried this in a React app created using create-react-app, and it works fine. Then I took the code and added it to SPFx; I am getting no errors, but the div that’s supposed to render the map is empty.

Here’s my code for the root component:

    return (
        <div>
            <BingMap />
        </div>
    );

Then I have a utility function that’ll load scripts when called:

    const loadDynamicScript = (config, callback) => {
        const existingScript = document.getElementById(config.id);

        if (!existingScript) {
            const script = document.createElement('script');
            script.src = config.link;
            script.id = config.id;
            document.body.appendChild(script);

            script.onload = () => {
                if (callback) callback();
            };
        }

        if (existingScript && callback) callback();
    };

    export default loadDynamicScript;

Then I have my BingMap component:

    import * as React from 'react';
    import scriptLoader from '../../utils/scriptLoader';

    const config = {
        id: 'bingmaps',
        link: 'https://www.bing.com/api/maps/mapcontrol?callback=GetMap&key=[mykey]'
    };

    declare global {
        interface Window { Microsoft: any; GetMap: any; }
    }

    window.Microsoft = window.Microsoft || {};
    window.GetMap = window.GetMap || {};

    export default class BingMap extends React.Component<{}, {}> {
        myRef: any;

        // Called by the Bing Maps script (via ?callback=GetMap) once it has loaded.
        GetMap = () => {
            new window.Microsoft.Maps.Map(this.myRef, {});
        }

        componentDidMount() {
            console.log(this.myRef);
            // Expose the instance callback globally so the Bing Maps script can find it.
            window.GetMap = this.GetMap;
            scriptLoader(config, null);
        }

        render() {
            return (
                <div ref={(myDiv) => { this.myRef = myDiv; }} className="map"></div>
            );
        }
    }

So when my component is mounted, it calls the script loader, which loads Bing Maps and then executes my callback function. This works in create-react-app, but in SPFx it does not, and I am not getting any errors in the console.

It just renders an empty div with the “map” class and nothing inside it.

Any suggestion is appreciated.

Scraping Bing without proxies

Hello!

I could use some advice…

I want to talk about the feasibility of scraping Bing without proxies. I’ve honestly never been able to get the hang of using the proxy scraper, and I can’t justify $10/month for shared proxies, but DeathByCaptcha, on the other hand, is super affordable. So wouldn’t it be more cost-effective to simply have DeathByCaptcha solve a couple of captchas if and when they pop up, rather than to buy proxies… especially for someone like me who is not doing this every month?

With a setup like this, what would be an acceptable delay setting for scraping Bing? Or are proxies an absolute must? I mean… I can call up Bing from my browser and download a SERP manually without issues, so surely it must be possible to use ScrapeBox on Bing without proxies, right? Or can Bing tell that I’m using software?

Thanks!