Search engines

Linux Advanced Routing and Traffic Control

Hi

Does anyone have experience or advice that could set me on the right track? Is it
useful to slow down search engine spiders?
I want to limit some spiders that are eating lots of bandwidth on my server.
Is it possible (and a good idea) to do this with tc, and what would the right
script look like?
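
For context, something along these lines is roughly what I had in mind (a minimal
sketch only; the interface name eth0, the 100mbit uplink rate, and the example
spider network 66.249.64.0/19 are assumptions to adjust for your own setup and
the crawler addresses you actually see in your logs):

#!/bin/bash
# Minimal sketch: shape outgoing traffic to a suspected spider network
# down to 64kbit, leaving all other traffic at full rate.
DEV=eth0                      # assumption: outgoing interface
SPIDER_NET=66.249.64.0/19     # assumption: example crawler network

# Root HTB qdisc; unclassified traffic falls into class 1:10
tc qdisc add dev $DEV root handle 1: htb default 10

# Full-rate class for normal traffic (assumed 100mbit uplink)
tc class add dev $DEV parent 1: classid 1:10 htb rate 100mbit

# Slow class for spiders
tc class add dev $DEV parent 1: classid 1:20 htb rate 64kbit ceil 64kbit

# Send packets destined for the spider network into the slow class
tc filter add dev $DEV parent 1: protocol ip prio 1 u32 \
   match ip dst $SPIDER_NET flowid 1:20

Since tc shapes egress, matching on the destination address should catch the
responses the spiders download, which is where the bandwidth actually goes.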



With thanks in advance


Sjaak Nabuurs


_______________________________________________
LARTC mailing list / LARTC@xxxxxxxxxxxxxxx
http://mailman.ds9a.nl/mailman/listinfo/lartc HOWTO: http://lartc.org/
