Re: How to rid a pest?

Charles Michener wrote:
I have a couple of spider bots hitting my server that I do not wish to have access to my pages. They ignore robots.txt, so I finally put them on my 'deny from xxxxx' list. This does deny them access, but they keep trying anyway: each page address at least 30 times, several hits per second. Is there a standard method to forward them to some black hole, or the FBI, or ...?
---------------- End original message. ---------------------

This is the kind of thing a router/firewall will handle for you.

Stopping these requests before they reach your machine is the best way to handle them. That said, sending a forbidden response back to the offenders doesn't really hurt the server's performance. It takes a little processing, but the cost per request is pretty insignificant.
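For reference, a minimal sketch of the 'deny from' approach the original poster describes, using Apache 2.2-style mod_authz_host directives (the directory path and IP addresses below are placeholders, not the poster's actual values; on Apache 2.4 the equivalent would be "Require not ip"):

```apache
# Sketch only: deny two misbehaving bots by source address.
# Substitute the offenders' real IPs or CIDR blocks.
<Directory "/var/www/html">
    Order allow,deny
    Allow from all
    Deny from 192.0.2.10        # single offending address (placeholder)
    Deny from 198.51.100.0/24   # or a whole netblock (placeholder)
</Directory>
```

As noted above, this still costs the server a little work per request, because httpd must accept the connection and answer 403 Forbidden each time.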

Hopefully they will eventually give up, but if they don't, look into using a firewall to deny them at the edge of your network.
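On a Linux box, that firewall-level deny might look like the following iptables rules (again, the addresses are placeholders; DROP is used rather than REJECT because it sends nothing back, which is about as close to a "black hole" as you can get):

```shell
# Sketch only: silently drop the offenders' HTTP traffic before
# it ever reaches httpd. Substitute the real source addresses.
iptables -A INPUT -s 192.0.2.10 -p tcp --dport 80 -j DROP
iptables -A INPUT -s 198.51.100.0/24 -p tcp --dport 80 -j DROP
```

Better still, if you have a dedicated router/firewall in front of the server, put the rules there so the packets never touch the web server's network stack at all.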

Dragon

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Venimus, Saltavimus, Bibimus (et naribus canium capti sumus)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
  "   from the digest: users-digest-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx

