You can do some other things to throw off the robots.[ul]
[li]The first would be to encapsulate your page in a frameset. Many robots do not follow links in a frameset.
[li]You could also do a meta-refresh or javascript refresh from a doorway page to your real content; many robots will not follow the redirect and so never reach your real page.
[li]Another thing you could do is start with a blank page containing just a layer and some javascript code that inserts your content into the layer. Most robots do not execute client-side scripts, so they cannot index content generated this way.
[li]Or you could implement a CGI script that inspects the browser's user-agent string, compares it against a custom list of known robots or robot-like heuristics, and returns a special robots page if there is a match, or the real page otherwise.[/ul]
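As a rough sketch of that last approach, here is what a CGI-style user-agent check might look like in Python. The robot signature list and the page filenames are illustrative placeholders, not a definitive list; real crawlers can spoof any user-agent, so this is a heuristic at best:

```python
#!/usr/bin/env python3
# Minimal CGI-style sketch: serve a decoy page to known robots and the
# real page to everyone else, based on a User-Agent substring match.
# The signature list and filenames below are illustrative placeholders.
import os

ROBOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "crawler", "spider"]

def is_robot(user_agent: str) -> bool:
    """Heuristic check: does the User-Agent look like a known crawler?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in ROBOT_SIGNATURES)

def choose_page(user_agent: str) -> str:
    """Pick which page file to serve for this User-Agent."""
    return "robots_page.html" if is_robot(user_agent) else "user_page.html"

if __name__ == "__main__":
    # CGI servers pass the client's User-Agent via this environment variable.
    ua = os.environ.get("HTTP_USER_AGENT", "")
    print("Content-Type: text/html\r\n\r\n", end="")
    # A real script would read and emit the chosen file's contents here.
    print("<!-- serving: %s -->" % choose_page(ua))
```

Note that this only matches crawlers that identify themselves honestly; a robot sending a browser-like user-agent will get the real page.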