Re: realtime protection against cloud scans

Marc wrote:
> Anyone having a suggestion on how to block cloud crawlers/bots? Obviously I
> would like search engine bots to have access, but all the other crap I want to
> lose. Only 'real users'.

I take a three-pronged approach, using the nftables firewall and some scripts.

1. db-ip.com publishes a list of IP ranges by geocode, updated monthly. I block
the geocodes CN, VN, RU, HK, SG, IN, KR, TW, BR, JP, and ID. I arrived at this
list by observing where most intrusion attempts were actually coming from. (US,
DE, FR, and NL are also frequent sources, but blocking them would interfere with
intended use.)
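
For illustration, here is a minimal Python sketch of loading such a geocode
block list into an nftables set. The CSV path, the table/set names, and the
batch size are assumptions for the sketch (db-ip's free country list is a CSV
of start-address, end-address, country-code rows); adjust to your own setup.

    #!/usr/bin/env python3
    # Sketch: load db-ip country ranges for blocked geocodes into an nftables set.
    # Assumes the set was created beforehand, e.g.:
    #   nft add table inet filter
    #   nft add set inet filter geo_block '{ type ipv4_addr; flags interval; }'
    import csv
    import subprocess

    BLOCKED = {"CN", "VN", "RU", "HK", "SG", "IN", "KR", "TW", "BR", "JP", "ID"}
    CSV_PATH = "/var/lib/geoblock/dbip-country-lite.csv"  # refreshed monthly (assumed path)

    def main():
        ranges = []
        with open(CSV_PATH, newline="") as fh:
            for start, end, cc in csv.reader(fh):
                # Keep IPv4 rows for the blocked geocodes only.
                if cc in BLOCKED and ":" not in start:
                    ranges.append(f"{start}-{end}")
        # Start from an empty set, then load in batches to keep each command line short.
        subprocess.run(["nft", "flush", "set", "inet", "filter", "geo_block"], check=True)
        for i in range(0, len(ranges), 500):
            batch = ", ".join(ranges[i:i + 500])
            subprocess.run(["nft", "add", "element", "inet", "filter", "geo_block",
                            "{ " + batch + " }"], check=True)

    if __name__ == "__main__":
        main()

A drop rule referencing the set (e.g. nft add rule inet filter input ip saddr
@geo_block drop) does the actual blocking.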

2. blocklist.de keeps a list of malicious IPs, updated in near real time. I
block these as they appear.
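
A minimal sketch of automating this from cron follows; the list URL and the set
name are my assumptions (blocklist.de also publishes per-service lists):

    #!/usr/bin/env python3
    # Sketch: pull blocklist.de's aggregate list and reload an nftables set from it.
    # Assumes an existing set:  nft add set inet filter blocklist_de '{ type ipv4_addr; }'
    import subprocess
    import urllib.request

    LIST_URL = "https://lists.blocklist.de/lists/all.txt"  # aggregate list (assumed URL)

    def main():
        with urllib.request.urlopen(LIST_URL, timeout=30) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        # One IP per line; keep IPv4 only for this sketch.
        ips = [ln.strip() for ln in text.splitlines() if ln.strip() and ":" not in ln]
        subprocess.run(["nft", "flush", "set", "inet", "filter", "blocklist_de"], check=True)
        for i in range(0, len(ips), 500):
            batch = ", ".join(ips[i:i + 500])
            subprocess.run(["nft", "add", "element", "inet", "filter", "blocklist_de",
                            "{ " + batch + " }"], check=True)

    if __name__ == "__main__":
        main()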

3. I maintain my own list of log signatures that signal malice, such as "GET
/.env" in the Apache log, requests from crawlers I do not sanction, or probes of
services such as SSH, SMTP, and IMAP. I block the originating IPs as they
appear; about 50 intrusion attempts per day are blocked this way.
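
A sketch of this prong, run from cron against the access log; the log path, the
signature list, and the set name are assumptions, and the real pattern list is
longer and tuned to what actually shows up in the logs:

    #!/usr/bin/env python3
    # Sketch: scan the Apache access log for hostile signatures and block the sources.
    # Assumes combined log format (client IP in the first field) and an existing set:
    #   nft add set inet filter log_block '{ type ipv4_addr; }'
    import re
    import subprocess

    LOG = "/var/log/apache2/access.log"          # assumed path
    SIGNATURES = [
        r"GET /\.env",                           # credential-file probe
        r"GET /wp-login\.php",                   # example of an unsanctioned probe
    ]
    PATTERN = re.compile("|".join(SIGNATURES))

    def main():
        offenders = set()
        with open(LOG, errors="replace") as fh:
            for line in fh:
                if PATTERN.search(line):
                    ip = line.split()[0]
                    if ":" not in ip:            # IPv4 only for this sketch
                        offenders.add(ip)
        for ip in sorted(offenders):
            # An IP already in the set is not an error worth stopping for.
            subprocess.run(["nft", "add", "element", "inet", "filter", "log_block",
                            "{ " + ip + " }"], check=False)

    if __name__ == "__main__":
        main()

The SSH, SMTP, and IMAP probes mentioned above would feed the same set from
their own service logs.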

With these measures in place, nearly all my Apache traffic is intended use.

-- 
Cheers!
Edward

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx



