Scraping Google “reasonably” without triggering the spam filter

My goal is to run some scrapes to get a list of businesses per city.

I need to send about 500 emails per day. I use the inurl:contact operator to make sure the results have a contact page, and then I extract the email address from that page.
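
To give an idea of what I mean by the extraction step, something like this rough Python sketch (the URL is just a placeholder and it's regex-only, not the actual ScrapeBox workflow):

```python
import re
import requests

# Rough sketch: fetch a contact page and pull out anything that looks
# like an email address. The URL below is just a placeholder.
EMAIL_RE = re.compile(r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}")

def extract_emails(url):
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    # De-duplicate while keeping the order they appear on the page
    return list(dict.fromkeys(EMAIL_RE.findall(resp.text)))

print(extract_emails("https://example.com/contact"))
```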

So I figure I might do batches of about 2,000 URLs, though I'm not sure yet…

Can I safely crawl 1,000-5,000 URL results a day with the default proxies that come with ScrapeBox (using the “harvest proxies” option)?
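
For reference, the kind of pacing I have in mind looks roughly like the sketch below (the proxy addresses and delays are just placeholders for illustration; ScrapeBox handles the rotation itself):

```python
import random
import time
import requests

# Placeholder proxy list -- in practice this would come from wherever
# the proxies are harvested (e.g. ScrapeBox's "harvest proxies" list).
PROXIES = [
    "http://111.111.111.111:8080",
    "http://222.222.222.222:3128",
]

def fetch(url):
    # Pick a random proxy for each request
    proxy = random.choice(PROXIES)
    return requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        timeout=10,
    )

urls = ["https://example.com/contact"]  # placeholder batch of URLs

for url in urls:
    try:
        resp = fetch(url)
        print(url, resp.status_code)
    except requests.RequestException as exc:
        print(url, "failed:", exc)
    # Random delay between requests to keep the rate conservative
    time.sleep(random.uniform(5, 15))
```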