Password protection is the mechanism actually designed for access control. Everything else is a workaround:
[ul]
[li]Disallow access in robots.txt. Some downloaders will obey, and search engine crawlers certainly will not index them.[/li]
[li]Check the HTTP_REFERER header. Some downloaders will be excluded, but you will also deny access to some legitimate human visitors.[/li]
[li]Check the USER_AGENT header. Some downloaders will be excluded, but it is common practice to fake it.[/li]
[li]Check the interval between successive requests. Some downloaders will be excluded, but this is unlikely to work reliably.[/li]
[/ul]
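The header and rate checks above can be sketched as a small framework-agnostic helper. This is a minimal illustration, not a real implementation: the function name, the downloader-agent list, and the MIN_INTERVAL threshold are all made up for the example, and as noted above, every one of these heuristics produces false positives and is trivial to defeat.

```python
MIN_INTERVAL = 0.5  # seconds between requests before we get suspicious (arbitrary)

# A few User-Agent substrings used by common bulk downloaders.
KNOWN_DOWNLOADER_AGENTS = ("wget", "curl", "httrack", "webzip")

def looks_like_downloader(headers, seconds_since_last_request):
    """Return True if the request trips any of the heuristics above.

    headers: dict of request headers, e.g. {"Referer": ..., "User-Agent": ...}
    seconds_since_last_request: time since this client's previous request.
    """
    referer = headers.get("Referer", "")
    agent = headers.get("User-Agent", "").lower()

    # An empty Referer is common for downloaders, but also for some
    # privacy-conscious browsers -- hence the legitimate visitors you lose.
    if not referer:
        return True

    # A User-Agent matching a known bulk-downloader string (easily faked).
    if any(name in agent for name in KNOWN_DOWNLOADER_AGENTS):
        return True

    # Requests arriving faster than a human would click.
    if seconds_since_last_request < MIN_INTERVAL:
        return True

    return False
```

For example, a request with no Referer, or with "Wget/1.21" as its User-Agent, would be flagged, while a normal browser request clicking through pages at human speed would pass.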
Anyway, a visitor could just visit your site, then save the images from the browser cache. Or from a proxy cache.
There is no real way to stop someone harvesting your website if you want the general public to be able to view it. The whole point of a website is to let browsers download content for display to users, so if a browser can download your images, someone can do the same thing programmatically.
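To illustrate that point: a script can send exactly the headers a browser would, and the server has no way to tell the difference. The URL and header values below are placeholders, not a real site.

```python
import urllib.request

# Build a request that looks like it came from a browser clicking a
# gallery page: browser-style User-Agent plus a plausible Referer.
req = urllib.request.Request(
    "https://example.com/images/photo.jpg",
    headers={
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "Referer": "https://example.com/gallery.html",
    },
)

# urllib.request.urlopen(req) would now fetch the image exactly as a
# browser would; every check from the list above is satisfied.
```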
Oops. Now I see that I dropped the most important word from the first solution:
[ul]
[li]Disallow access in robots.txt. Some downloaders will obey, but search engine crawlers certainly will [red]not[/red] index them.[/li]
[/ul]
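For completeness, a robots.txt entry that asks crawlers to skip an image directory looks like this (the /images/ path is just an example). Well-behaved crawlers such as search engine bots will honor it; bulk downloaders are free to ignore it.

```
User-agent: *
Disallow: /images/
```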