I have a site which offers two languages, English and Spanish. When the user navigates to the home page, say www.example.com, the page redirects you to /es if your browser language is Spanish, or to /en otherwise.
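For context, the redirect logic is roughly equivalent to this sketch (the function name and the simplified Accept-Language parsing are illustrative, not my actual code; quality values like `q=0.9` are ignored):

```python
def redirect_target(accept_language: str) -> str:
    """Pick the localized home page from the Accept-Language header.

    Simplified sketch: inspects language tags left to right and
    ignores quality values (q=).
    """
    for part in accept_language.split(","):
        tag = part.split(";")[0].strip().lower()
        # Match "es" and regional variants such as "es-ES", "es-MX"
        if tag == "es" or tag.startswith("es-"):
            return "/es"
    return "/en"  # anything else falls through to the English home page
```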
At the moment the robots.txt I have is:

    User-agent: *
    Allow: /
    Sitemap: https://www.example.com/sitemap_index.xml
because I’m defining all alternate URLs in the sitemap_languages.xml, and all URLs are also listed in the sitemap.xml. My question is about the configuration of the robots.txt, because I’m not sure whether I should be allowing any user agent to crawl the / page at all. Since that page always redirects to the home page of either /es or /en, I believe it should be disallowed.
Should I then do:

    User-agent: *
    Disallow: /
    Allow: /es
    Allow: /en
    Sitemap: https://www.example.com/sitemap_index.xml
I’m not sure if that could cause a crawl issue or whether there is another way to achieve the same result.
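To reason about it, I tried to model my understanding of how Google evaluates Allow/Disallow rules (the most specific, i.e. longest, matching prefix wins, and on a tie the Allow rule wins, per the robots.txt spec, RFC 9309). This is just a sketch to check the proposed rules, not the real parser, and it does not handle wildcards:

```python
def allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    """Evaluate robots.txt rules for a path using longest-match precedence.

    rules: list of ("allow" | "disallow", prefix) pairs.
    Wildcard patterns (*, $) are not handled in this sketch.
    """
    best = None  # (matched prefix length, is_allowed)
    for kind, prefix in rules:
        if path.startswith(prefix):
            length = len(prefix)
            verdict = (kind == "allow")
            # Longer match wins; on equal length, Allow beats Disallow
            if best is None or length > best[0] or (length == best[0] and verdict):
                best = (length, verdict)
    return True if best is None else best[1]

# The proposed robots.txt from above
rules = [("disallow", "/"), ("allow", "/es"), ("allow", "/en")]
```

Under this model, / would be disallowed while /es and /en (and everything under them) would stay crawlable, which is the behavior I’m after, but I’d like confirmation that real crawlers treat it the same way.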