
An extremely useful set of examples showing how to use the wildcard (*) in robots.txt for sites that use dynamic query parameters.
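A minimal sketch of the kind of wildcard rules the guide covers; the parameter names (`sessionid`, `sort`) are illustrative, not taken from the article:

```
User-agent: *
# Block any URL carrying a session-id query parameter (hypothetical name)
Disallow: /*?sessionid=
Disallow: /*&sessionid=
# Block sorted duplicates of listing pages (hypothetical parameter)
Disallow: /*?sort=
```

Note that wildcard support is an extension to the original robots.txt protocol; major crawlers such as Googlebot honor `*` (and `$` as an end-of-URL anchor), but not every crawler does.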
3 Comments


Moderator
from kerimorgret 3517 Days ago #
Votes: 0

I’ve found Google takes a long, long time to get rid of pages with parameters. Unfortunately, they have no way to remove URLs with specific parameters.

Yahoo Site Explorer has a way to specify what your parameters are and how you would like Yahoo to deal with them. They also don’t have a way to force removal of specific parameters, but I’ve found they do a better job at dealing with them, since you can specify your dynamic parameters.

From their page: "Specify up to 10 dynamic parameters that you want us to treat specially whenever these are seen in URLs belonging to [url]. We will automatically rewrite the URLs containing these parameters as specified below. You can choose to: Remove these parameters from the URLs; for example, in the case of session ids, you could ask to remove ’sid’ from URLs. Use a default value for the parameter; for example, you could set the ’src’ parameter to be ’yhoo_srch’." More help at
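While waiting for the engines to drop those parameterized URLs, wildcard rules in robots.txt can at least keep them from being recrawled. A sketch, assuming a session parameter named `sid` as in Yahoo’s example above:

```
User-agent: *
# Keep crawlers out of session-id duplicates,
# whether 'sid' is the first or a later query parameter
Disallow: /*?sid=
Disallow: /*&sid=
```

This only blocks crawling; it does not rewrite or remove URLs already in the index the way Yahoo’s parameter settings do.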

from robots 3517 Days ago #
Votes: 1

Nice guide with specific examples on one of the lesser-known features of the robots.txt protocol - thanks!

from wilreynolds 3517 Days ago #
Votes: 0

Great guide, wildcards have always been a bit of a question mark for me. Thanks!
