Re: How to rid a pest?

On Dec 15, 2007, Charles Michener (nospam-micheck123@xxxxxxxxxxxx) typed:

Charles:  I have a couple of spider bots hitting my server that I do
Charles:  not wish to have access to my pages - they ignore
Charles:  robots.txt, so I finally put them on my 'deny from xxxxx'
Charles:  list. This does deny them access but they persist to keep
Charles:  trying - trying each page address at least 30 times -
Charles:  several hits per second.  Is there a standard method to
Charles:  forward them to some black hole or the FBI or ...?



I've been through that.  I just Deny them and eventually learned to
ignore the log entries.
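
For reference, a minimal sketch of the kind of Deny rules I mean
(Apache 2.2 syntax; the addresses below are documentation placeholders,
not the actual bots):

```apache
# In httpd.conf (or .htaccess, if AllowOverride permits it).
# Denied clients get a 403 but will still show up in the logs.
<Directory "/var/www/html">
    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.10
    Deny from 198.51.100.0/24
</Directory>
```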

You could wrap httpd with TCP Wrappers or the like.  Or, if you have
control over the network traffic, drop it at the router level.
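
If the box or router runs Linux, a netfilter rule would look something
like this (again, placeholder addresses).  Unlike a Deny, DROP never
answers, so the client times out and the requests never reach httpd:

```shell
# Drop all packets from the offending addresses before httpd sees them.
iptables -A INPUT -s 192.0.2.10 -j DROP
iptables -A INPUT -s 198.51.100.0/24 -j DROP
```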

I seriously doubt the authorities will get involved; it's not as if
the spiders are cracking you.  It sounds like they might be
misconfigured, given that they ignore robots.txt.

Hope that helps.


Thanks
 Birl

Please do not CC me responses to my own posts.
I'll read the responses on the list.

Archives   http://mail-archives.apache.org/mod_mbox/httpd-users/

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
   "   from the digest: users-digest-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx

