Currently we try to keep spiders out by adding various rules to robots.txt:

    Disallow: /project/*/list/?*order=
    Disallow: /project/*/list/?*series=
    Disallow: /project/*/list/?*submitter=
    Disallow: /project/*/list/?*delegate=
    Disallow: /project/*/list/?*param=

The idea is that we do want the patches themselves to be indexed so they can be easily found, but we don't want spiders crawling through every possible combination of listing results. With a sitemap we would hand search engines an explicit list of URLs and would no longer need complicated robots.txt rules.
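
As a rough illustration of the sitemap approach, the sketch below uses Django's django.contrib.sitemaps framework (Patchwork is a Django application). The Patch model, its date field, and the URL wiring are assumptions for the example, not Patchwork's actual code, and django.contrib.sites would need to be enabled for the framework to resolve absolute URLs.

    # sitemaps.py -- minimal sketch, assuming a Patch model with a
    # `date` timestamp and a get_absolute_url() method (names illustrative).
    from django.contrib.sitemaps import Sitemap
    from patchwork.models import Patch  # assumed import path

    class PatchSitemap(Sitemap):
        changefreq = 'weekly'
        priority = 0.5

        def items(self):
            # Expose only the canonical patch detail pages, not the
            # filtered/sorted list views we are hiding from crawlers.
            return Patch.objects.all()

        def lastmod(self, patch):
            return patch.date  # assumed last-modified field

    # urls.py -- serve the generated sitemap at /sitemap.xml
    from django.contrib.sitemaps.views import sitemap
    from django.urls import path

    urlpatterns = [
        path('sitemap.xml', sitemap, {'sitemaps': {'patches': PatchSitemap}},
             name='django.contrib.sitemaps.views.sitemap'),
    ]

With something like this in place, robots.txt could be reduced to little more than a pointer at the sitemap, for example (hostname illustrative):

    Sitemap: https://patchwork.example.org/sitemap.xml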