
How do I prevent users from accessing my server?

Status
Not open for further replies.

HSand

Programmer
Sep 17, 2002
IE
Hello,

I am running Apache 2.0.39 on a SuSE Linux 7.3 machine, and I have noticed some odd behaviour in the error logs.

It's mostly just stupid stuff: people trying to get an MS-DOS command prompt or entering huge variables in the browser window. My question is this: is there any way of keeping these unsavoury clients away from my web server, e.g. a list of disallowed IP addresses? Any ideas?

Thanks in advance!
 
Those log entries are not from users. They are from Internet worms trying to infect your server. Most likely Nimda or Code Red (nothing to worry about there, since those only target Microsoft IIS), but there is an Apache+SSL infecting worm going around right now.

Take a look at the Apache module mod_access and its configuration directives.
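For example, a minimal sketch of mod_access's Order/Allow/Deny directives in httpd.conf (the document root path and IP addresses below are placeholders, adjust them for your own setup):

```apacheconf
<Directory "/srv/www/htdocs">
    # Order Allow,Deny: Allow directives are evaluated first, then Deny;
    # a client matched by a Deny line is refused even if it matched an Allow.
    Order Allow,Deny
    Allow from all
    # Block a single host and an entire /24 network (placeholder addresses).
    Deny from 192.0.2.10
    Deny from 198.51.100.0/24
</Directory>
```

Reload Apache after editing (e.g. `apachectl graceful`) for the changes to take effect. Keep in mind that blocking addresses won't silence worm noise in your logs for long, since infected hosts come from ever-changing IPs.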

______________________________________________________________________
TANSTAAFL!
 
Thanks for that.

I just found something else in my logs. Someone has been requesting the following from my website:

/robots.txt

What's up with this "robots.txt" file? Is it another worm or something like that? I'm pretty sure it's not a problem with my site (a bad href or whatever), and it's starting to worry me...

Thanks again for the help
 
The robots.txt file is the file that search engines request to see which parts of your site you want them to spider and which they should leave alone. This really isn't anything to worry about. If you don't have a robots.txt, a search engine will simply spider everything it can find, which, I suppose, is a good thing. //Daniel
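A minimal robots.txt sketch, placed in the web server's document root (the directory names here are hypothetical examples):

```
# Applies to all crawlers.
User-agent: *
# Keep well-behaved crawlers out of these (placeholder) directories.
Disallow: /cgi-bin/
Disallow: /private/
```

With no robots.txt at all (or an empty Disallow line), crawlers will index everything they can find. Note that robots.txt is purely advisory: it does not actually block access, it only asks polite robots to stay away.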
 
