In my web application I generate links in the following format:
According to my specs, each link is sent via email and must not be scrapeable by robots. To discourage bot visits I placed the following in my robots.txt:

```
User-agent: *
Disallow: /
```
And inside the page's `<head>` tag I placed:

```html
<meta name="robots" content="noindex">
<meta name="robots" content="nofollow">
```
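As far as I know, the two directives above can equivalently be combined into a single tag (assumption on my part that crawlers honor the comma-separated form the same way):

```html
<!-- Combined form of the two tags above -->
<meta name="robots" content="noindex, nofollow">
```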
The question is: how can I ensure that a link is opened only when a user has actually clicked it, and not when some random bot/spider scrapes it? Do the length and randomness of the `token` in the URL factor into bot-visit prevention?
In my application the `token` is a cryptographically random 5-byte value that, once generated, is hex-encoded. So, if the token length plays a significant role in keeping this link non-scrapeable, what is the recommended length for the random token?
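To make the current scheme concrete, here is a minimal sketch of what I described (my app is PHP/Laravel, but the sketch is in Python purely for illustration): 5 cryptographically random bytes hex-encode to 10 characters, giving 2^40 possible values.

```python
import secrets

# Sketch of the current scheme: 5 cryptographically random bytes,
# hex-encoded -> a 10-character hex string, 2**40 possible values.
token = secrets.token_hex(5)

print(token)          # e.g. "9f1c02ab7d" (random each run)
print(len(token))     # 10
print(2 ** (5 * 8))   # 1099511627776 possible token values
```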
My application is written in Laravel, and I use an nginx and php-fpm combo to serve the content to the browser.