You can protect your site from crawlers that respect robots.txt, but most crawlers can be configured to ignore it.
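To illustrate, here is a minimal sketch using Python's standard urllib.robotparser with a hypothetical robots.txt: a well-behaved crawler checks the rules before every fetch, while a misbehaving one simply never performs this step.

```python
import urllib.robotparser

# Hypothetical robots.txt: bans one crawler entirely and keeps
# everyone else out of /private/.
robots_txt = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A polite crawler calls can_fetch() before each request...
print(rp.can_fetch("BadBot", "https://example.com/index.html"))   # False
print(rp.can_fetch("GoodBot", "https://example.com/index.html"))  # True
print(rp.can_fetch("GoodBot", "https://example.com/private/x"))   # False
# ...but nothing forces a crawler to call it at all.
```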
You can also protect your site by returning an error to certain "User-Agent" values, but most scrapers have an option to change the "User-Agent" string they send.
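As a sketch of what that looks like on the server side (using Python's standard http.server; the blocklist entries and port are made up for the example), the check is just a string match on one request header:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_AGENTS = ("wget", "curl", "HTTrack")  # hypothetical blocklist

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        # Reject requests whose User-Agent matches the blocklist...
        if any(bad.lower() in agent.lower() for bad in BLOCKED_AGENTS):
            self.send_error(403, "Forbidden")
            return
        # ...and serve everyone else normally.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

Any client can get past this by sending a browser-like header, e.g. `curl -A "Mozilla/5.0" http://localhost:8000/`, which is exactly why the check provides so little protection.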
In general, you shouldn't bother with this at all. If somebody wants to download your site, they will find a way to do it.