Re: Search engines

Linux Advanced Routing and Traffic Control

Hello Sjaak,

Monday, December 29, 2003, 5:10:51 PM, you wrote:

SN> Hi

SN> Does anyone have experience or advice, and can set me on the right track:
SN> is it useful to slow down search engine spiders?
SN> I want to limit some spiders that are eating lots of bandwidth from my server.
SN> Is it possible and a good idea to do this with tc, and what would be the
SN> right script?

It is a bad idea to do this with tc.
Read about Internet robots and the special robots.txt file instead.

Put these lines in a 'robots.txt' file in your web root:
User-agent: *
Disallow: /

This disallows all robots that honor 'robots.txt'.
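If you only want to slow down one hungry spider rather than block everything,
a sketch like this may work (the 'ExampleBot' name is hypothetical, and
Crawl-delay is a non-standard directive that only some crawlers honor):

User-agent: ExampleBot
Crawl-delay: 10

User-agent: *
Disallow:

Spiders that ignore robots.txt will not be affected, of course.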


P.Krumins

_______________________________________________
LARTC mailing list / LARTC@xxxxxxxxxxxxxxx
http://mailman.ds9a.nl/mailman/listinfo/lartc HOWTO: http://lartc.org/
