Can I indicate (via robots.txt, meta robots, or another approach) that one or more query-string parameters should be ignored by crawlers?

I’ve written my own SiteSearch script in PHP.

The SiteSearch script parses two GET parameters from the query string:

  • search // the search phrase
  • filters (optional) // which parts of the site to include or exclude
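
For context, a minimal sketch of the parameter handling (simplified; the variable names and runSiteSearch() are placeholders, not my actual code):

    <?php
    // Read the two GET parameters described above.
    $search  = isset($_GET['search'])  ? trim($_GET['search']) : '';
    $filters = isset($_GET['filters']) ? $_GET['filters'] : null;

    // Placeholder for the real search routine.
    $results = runSiteSearch($search, $filters);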

I don’t mind Googlebot and other crawlers reading the search parameter.

But I would like to advise crawlers to ignore the filters parameter, because a huge number of different configurations of that parameter return exactly the same results, and therefore identical, duplicate pages as far as the crawlers are concerned.
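
For example (hypothetical URLs; assume the script lives at /search.php), these return the same results but look like distinct pages to a crawler:

    /search.php?search=widgets&filters=news,blog
    /search.php?search=widgets&filters=blog,news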


As much as I would like to add to my robots.txt file something like:

    User-agent: *
    IgnoreParameter: filters

this isn’t an option.
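
The closest thing robots.txt actually supports, as far as I know, is a wildcard Disallow (Googlebot and other major crawlers honor the * wildcard):

    User-agent: *
    Disallow: /*filters=

But that blocks every URL containing the parameter outright instead of consolidating them with their parameter-less equivalents, and blocked URLs can still end up indexed (without content) if something links to them. So it isn't really what I'm after.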

And a meta robots directive like:

    <meta name="robots" content="ignoreparams[filters]">

isn’t an option either.

Is there any creative way I can keep the page crawlable and have crawlers ignore the filters parameter in the query string?

Or am I stuck with a directive as unrefined as:

    <meta name="robots" content="noindex">

if I don’t want to risk pages with identical search parameters (but different filters parameters) being indexed as duplicates?
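
If I do end up there, the least destructive fallback I can think of is emitting noindex conditionally, only when filters is present in the query string (a sketch, assuming the meta tag is printed from the same PHP script that renders the results page):

    <?php if (isset($_GET['filters'])): ?>
        <meta name="robots" content="noindex, follow">
    <?php endif; ?>

That would at least keep the plain ?search= pages indexable, but it feels like a workaround rather than a real answer.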