Re: Stopping bots from sucking files

On Fri, Oct 24, 2008 at 08:29:43PM +0100, Mário Gamito wrote:
> Hi,
> 
> I have this site that has a directory with some files.
> A few weeks ago, two web bots started sucking those files at an impressive rate.


Use a robots.txt file in your site's document root (the top-level directory Apache serves, so it is reachable as /robots.txt — not your home directory).

http://www.robotstxt.org/
http://en.wikipedia.org/wiki/Robots.txt
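A minimal sketch of such a file, assuming the files live under a directory called /files/ and that one of the bots identifies itself as "BadBot" in your access log (both names are placeholders — substitute the real path and User-agent strings you see):

```
# Ask all compliant crawlers to stay out of the file directory
User-agent: *
Disallow: /files/

# Ban one specific bot from the whole site, by the name
# it reports in the User-agent header of your access log
User-agent: BadBot
Disallow: /
```

Note that robots.txt is purely advisory: well-behaved crawlers honor it, but nothing forces a bot to.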

If they ignore it, then use iptables to block them.  That takes the
strain off httpd.
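For the iptables route, something along these lines should do — the addresses below are documentation placeholders (203.0.113.x); take the real ones from your access log, and run the commands as root:

```
# Drop all traffic from the two offending hosts before it
# ever reaches Apache
iptables -A INPUT -s 203.0.113.10 -j DROP
iptables -A INPUT -s 203.0.113.20 -j DROP

# Or, if the bot hops around inside one netblock, block just
# HTTP (port 80) for the whole range
iptables -A INPUT -s 203.0.113.0/24 -p tcp --dport 80 -j DROP

# Confirm the rules took effect
iptables -L INPUT -n
```

Remember that rules added this way are lost on reboot; save them with your distribution's mechanism (e.g. iptables-save) if you want them to persist.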


-- 
/*********************************************************************\
**
** Joe Yao				jsdy@xxxxxxx - Joseph S. D. Yao
**
\*********************************************************************/

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
   "   from the digest: users-digest-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx

