Protection against impolite bots

Hi all,

I was wondering if httpd has a module that can be used to protect
against malicious clients that send too many requests at a high rate.
Sometimes clients (or robots) send a large number of requests (e.g.,
20 per second) to our application, which runs on rather limited
resources. Is there a way to limit the number of requests per IP per
second? Or, even better, to notify the system admin if someone
downloads over 1000 pages in less than a minute?
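
One direction I have been guessing at (not tested, so please correct
me if this is wrong) is the third-party mod_evasive module, which I
understand can enforce per-IP request thresholds and mail the admin
when an IP gets blocked. A rough sketch of what I had in mind, with
the numbers purely made up for illustration:

    # mod_evasive is a third-party module; the values below are only guesses
    LoadModule evasive20_module modules/mod_evasive20.so
    <IfModule mod_evasive20.c>
        # no more than 5 requests for the same page within a 1-second interval
        DOSPageCount        5
        DOSPageInterval     1
        # no more than 20 requests to the whole site within a 1-second interval
        DOSSiteCount        20
        DOSSiteInterval     1
        # block an offending IP for 60 seconds and notify the admin by mail
        DOSBlockingPeriod   60
        DOSEmailNotify      webmaster@example.com
    </IfModule>

But I am not sure whether this is the recommended approach, or whether
something that ships with httpd itself would be a better fit.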

Any pointers will be appreciated.

thanks
Nilesh

-- 
Nilesh Bansal.
http://queens.db.toronto.edu/~nilesh/
