Real-time protection against cloud scans

Does anyone have a suggestion on how to block cloud crawlers/bots? Obviously I would like search engine bots to keep their access, but I want to lose all the other crap. Only 'real users'.

What is best practice for this? Just grabbing the Amazon, googleusercontent, DigitalOcean, and Azure IP ranges and putting them into something like ipset, or are there currently better ways of doing this?
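For what it's worth, here is a minimal sketch of the ipset approach described above, not a definitive setup. It assumes curl, jq, ipset, and iptables are available; the set name "cloudblock" and the temp-file path are made up for the example. AWS publishes its ranges at https://ip-ranges.amazonaws.com/ip-ranges.json and Google Cloud at https://www.gstatic.com/ipranges/cloud.json; Azure and DigitalOcean publish similar downloadable lists. Google also lists Googlebot's own ranges separately (googlebot.json on developers.google.com), so dropping the cloud ranges should not lock out the search crawler.

  #!/bin/sh
  # Sketch: build an ipset of AWS prefixes and drop traffic from them.
  # "cloudblock" and /tmp/aws-prefixes.txt are example names, not a
  # standard convention.
  set -e

  # Fetch AWS's published prefixes and extract the IPv4 CIDRs.
  curl -s https://ip-ranges.amazonaws.com/ip-ranges.json \
    | jq -r '.prefixes[].ip_prefix' > /tmp/aws-prefixes.txt

  # Fill a temporary set, then swap it in atomically so the live set
  # is never empty mid-update.
  ipset create cloudblock-tmp hash:net -exist
  ipset flush cloudblock-tmp
  while read -r net; do
    ipset add cloudblock-tmp "$net" -exist
  done < /tmp/aws-prefixes.txt

  ipset create cloudblock hash:net -exist
  ipset swap cloudblock-tmp cloudblock
  ipset destroy cloudblock-tmp

  # Drop matching sources; -C checks first so the rule is only
  # inserted once across repeated runs.
  iptables -C INPUT -m set --match-set cloudblock src -j DROP 2>/dev/null \
    || iptables -I INPUT -m set --match-set cloudblock src -j DROP

Run from cron to keep the set current; the create/swap/destroy dance means refreshes never leave a window with an empty set. The same loop would extend to the other providers' feeds, though each publishes in a slightly different JSON layout, so the jq filter will differ per provider.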


