Given a website with the fictional domain `example.com`.

The owner of this website added a subdomain: `x.example.com`.

- After one year, the owner changed `x` to `y`, so as to have `y.example.com`.
- After two years, the owner changed `y` to `z`, so as to have `z.example.com`.

None of these three changes involved updating the corresponding `example.com` entries in robots.txt, so the owner ended up with a serious long-term SEO problem: crawling software was still being directed to scan non-existing webpages (the `x` and `y` ones respectively).
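For concreteness, I imagine the robots.txt looked something like this (the paths and sitemap location are hypothetical, just to illustrate URLs hard-coded to an old subdomain):

```
# Hypothetical robots.txt served on example.com
User-agent: *
Allow: /

# Sitemap URL hard-coded to the original subdomain and never updated:
Sitemap: https://x.example.com/sitemap.xml
```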
What regex prophylaxis could the owner have used beforehand to prevent this SEO problem?
Is there a regex way to match, in general, all possible subdomains in robots.txt?
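To illustrate the kind of pattern I have in mind, here is a sketch in Python (the hostname regex is my own guess at what "all possible subdomains of `example.com`" would look like; whether robots.txt itself can accept anything like this is exactly what I am asking):

```python
import re

# Hypothetical pattern intended to match any single-label subdomain
# of example.com, e.g. x.example.com, y.example.com, z.example.com,
# but not the bare apex domain example.com itself.
SUBDOMAIN_RE = re.compile(r"^[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.example\.com$")

for host in ["x.example.com", "y.example.com", "z.example.com", "example.com"]:
    print(host, bool(SUBDOMAIN_RE.match(host)))
```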